Class Is in Session: AI App Schools on English Pronunciation

Isthmus. Nuclear. Anemone. Tricky English pronunciations are a challenge for many immigrants to the U.S. and native-born speakers alike. ELSA — which stands for English Language Speech Assistant — aims to help with that.

The three-year-old Silicon Valley startup offers an English-pronunciation app, dubbed ELSA Speak, that’s geared toward American English and available for Android and iOS devices.

Vu Van, ELSA’s co-founder and CEO, said English pronunciation trips up many people pursuing careers. A Vietnamese immigrant, Van picked up English early and went on to earn an MBA from Stanford University in 2011, but she still struggled with certain words, an experience that led her to start ELSA.

ELSA’s app is designed to be a personalized coach for practicing English, particularly for non-native speakers. It offers bite-sized lessons intended to improve pronunciation with 10 minutes a day of practice.

While most language apps emphasize grammar, “we are very focused on pinpointing your pronunciation errors,” Van said. “It’s supposed to help people with accents.”

She said that having an accent can crush one’s confidence and that working on pronunciation is difficult without the aid of an expensive tutor.

That’s where ELSA comes in. The app uses AI and speech-recognition technology to help people practice English for professional and everyday situations.

Practice Makes Perfect

The coaching app, which enables people to set daily practice reminders, has a slick interface that makes learning easy and fun. The app coach shows a sentence and prompts you to tap the microphone icon and say it. It gives positive feedback in bold, writing EXCELLENT in big green lettering for good pronunciations.

Perhaps even more valuable, it counters mispronunciations with helpful tips to get it right. For example, the language coach offers a number of pointers that help users understand where to place their tongue in their mouth and how to hold their lips when saying particular words.

ELSA is geared to help non-native speakers focus on sentences commonly used in a new job or at a conference, among other professional settings.

The app first takes people through a five-minute assessment test to identify challenges. It offers more than 600 two-minute English lessons and more than 3,000 words for people to practice.

Users’ conversational English lessons are recorded and scored in the app to help gauge their pronunciation level on specific words.

ELSA’s Coaching Evolution

To train its pronunciation model, the company fed thousands of hours of spoken English into a recurrent neural network. It’s now fine-tuning the algorithm and continually training it with data from the app’s users, said Van.
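ELSA hasn’t published its model architecture, so purely as an illustration of the idea, here’s the shape of a recurrent network that scores speech frame by frame, sketched in Python with toy dimensions and random weights standing in for a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 13 MFCC-like acoustic features per audio frame,
# 16 hidden units, and one "pronounced correctly" score per frame.
n_features, n_hidden = 13, 16

# Randomly initialized weights stand in for a trained model.
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_features))
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
W_hy = rng.normal(scale=0.1, size=(1, n_hidden))

def score_utterance(frames):
    """Run a vanilla RNN over acoustic frames; return per-frame scores in (0, 1)."""
    h = np.zeros(n_hidden)
    scores = []
    for x in frames:
        h = np.tanh(W_xh @ x + W_hh @ h)                  # recurrent state update
        scores.append(1 / (1 + np.exp(-(W_hy @ h)[0])))   # sigmoid score per frame
    return scores

# Score a 50-frame utterance of synthetic features.
utterance = rng.normal(size=(50, n_features))
scores = score_utterance(utterance)
print(f"mean pronunciation score: {np.mean(scores):.2f}")
```

A production system would use trained weights, phoneme-level labels and far richer acoustic features; the sketch shows only the recurrent scoring loop.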

“The more NVIDIA GPUs we have, the more experiments we can run on the model,” she said.

Launched in 2016, ELSA’s apps have been downloaded more than 2 million times. The service is free for the first week, and then requires a subscription to continue beyond limited access. Subscriptions cost $3.99 for a month, $8.99 for three months or $29.99 for a year.

ELSA is a member of the NVIDIA Inception program, a virtual accelerator that offers hardware grants, marketing support and training with deep learning experts.

The startup recently scooped up $3.2 million in venture funding. The founders are seeking additional AI talent to help further build out the service.

The post Class Is in Session: AI App Schools on English Pronunciation appeared first on The Official NVIDIA Blog.

NVIDIA Technology Powers New Home Gaming System, Nintendo Switch

The first thing to know about the new Nintendo Switch home gaming system: it’s really fun to play. With great graphics, loads of game titles and incredible performance, the Nintendo Switch will provide people with many hours of engaging and interactive gaming entertainment.

But creating a device so fun required some serious engineering. The development encompassed 500 man-years of effort across every facet of creating a new gaming platform: algorithms, computer architecture, system design, system software, APIs, game engines and peripherals. They all had to be rethought and redesigned for Nintendo to deliver the best experience for gamers, whether they’re in the living room or on the move.

A Console Architecture for the Living Room and Beyond

Nintendo Switch is powered by the performance of the custom Tegra processor. The high-efficiency scalable processor includes an NVIDIA GPU based on the same architecture as the world’s top-performing GeForce gaming graphics cards.

The Nintendo Switch’s gaming experience is also supported by fully custom software, including a revamped physics engine, new libraries and advanced game tools. NVIDIA additionally created new gaming APIs to fully harness this performance. The newest API, NVN, was built specifically to bring lightweight, fast gaming to the masses.

Gameplay is further enhanced by hardware-accelerated video playback and custom software for audio effects and rendering.

We’ve optimized the full suite of hardware and software for gaming and mobile use cases. This includes custom operating system integration with the GPU to increase both performance and efficiency.

NVIDIA gaming technology is integrated into all aspects of the new Nintendo Switch home gaming system, which promises to deliver a great experience to gamers.

The Nintendo Switch will be available in March 2017. More information is available at https://www.nintendo.com/switch.

Nintendo Switch is a trademark of Nintendo.

The post NVIDIA Technology Powers New Home Gaming System, Nintendo Switch appeared first on The Official NVIDIA Blog.

At Asia’s Sprawling Computex Show, NVIDIA Technology Stretches from VR to Virtualization

Ground zero for Computex Taipei, Asia’s largest technology show, is the sprawling half-million square feet of Nangang Exhibition Hall.

The region’s entrepreneurial energy is on vivid display within the hall, which is partially wrapped with NVIDIA branding. Fledgling company DuckyChannel touts its range of keyboards tied to the Chinese zodiac. Groovy Technology Corp. draws the curious with low-cost digital signage. In a carpeted corner, Be Quiet! displays its sound-dampened desktops.

And major hometown Taiwan names from Acer to Zotac are demo’ing NVIDIA technology. They’re among more than a dozen companies doing so, including ASUS, Clevo, Colorful, EVGA, Galaxy, Gigabyte, Innovision, Inwin, Leadtek, MSI, Supermicro and Thermaltake, as well as Microsoft.

MSI was one of the major hometown names demo’ing NVIDIA technology at Computex.

One of the biggest hits is NVIDIA’s Pascal-based GPU architecture, which takes gaming and VR to a new level, with the GeForce GTX 1080. Unveiled three weeks ago to broad acclaim and just now shipping, the GeForce GTX 1080 performs 2x faster than Titan X, with 3x its power efficiency.

It’s often getting paired up here with the Oculus Rift and HTC Vive headsets. And it’s driving experiences like EVE Valkyrie, featuring post-apocalyptic dogfighting vaulted into space; The Unspoken, where players conjure fireballs in their right hand while brawling on a Chicago construction site; and Edge of Nowhere, based on tracking a lost expedition in vast stretches of Antarctica.

G-SYNC monitors lit up the exhibition hall.

Also lighting up the exhibition hall are new G-SYNC monitors, which synchronize the display’s refresh rate to the GPU’s frame rate to do away with tearing and stuttering. Operating at 180Hz, the eSports monitors by Acer and ASUS set a new standard for the fastest gaming experience.

NVIDIA’s reach beyond the consumer markets is clear, with our technologies featuring in automobiles and the enterprise space.

Audi’s A4 sedan features a virtual dashboard powered by the NVIDIA Tegra system on a chip.

German automaker Audi is debuting to Taiwan’s wealthy consumers its new A4 sedan, featuring a virtual dashboard powered by the NVIDIA Tegra system on a chip. It’s also giving consumers a peek at its new virtual showroom experience. It uses VR to enable consumers to tour any Audi vehicle – from its ferocious R8 coupe to its capacious Q7 SUV – as if it’s in the room, and to see it in varying styles and colors.

MIT’s Media Lab is showing off the fruits of its collaboration with Taiwan’s Institute for Information Industry on a new type of vehicle for urban commuting. Called the Persuasive Electric Vehicle, or PEV, it looks like an electric tricycle with a protective roof. Based on NVIDIA technology, it’s expected to go on trial in Taiwan next year.

Supermicro showed off a system based on NVIDIA’s Tesla M10 GPU accelerators.

And in a sign of how Computex and Taiwan as a whole have evolved far beyond consumer electronics, Gigabyte and Supermicro are displaying systems based on the Tesla M10 GPU accelerators, which drive virtualized applications across multiple desktops.


The post At Asia’s Sprawling Computex Show, NVIDIA Technology Stretches from VR to Virtualization appeared first on The Official NVIDIA Blog.

ARMv7 now has a bootloader

Progress on the armv7 platform continues, and Jonathan Gray writes in to the arm@ mailing list with some promising news:

There is now a bootloader for armv7 thanks to kettenis@. Recent armv7 snapshots will configure disks to use efiboot and install device tree dtb files on a FAT partition at the start of the disk.

u-boot kernel images are no longer part of the release but can still be built for the time being. We are going to start assuming the kernel has been loaded with a dtb file to describe the hardware sometime soon. Those doing new installs can ignore the details but here they are.

Read more...

Inside Job: Student Turns to GPUs to Create Drones for the Great Indoors

Marc Gyongyosi isn’t your average college student. The junior computer science major at Northwestern University’s McCormick School of Engineering has thrown himself into the world of lightweight robotics in a way that reaches far beyond the classroom.

Not only has Gyongyosi spent the past two years working with BMW’s robotics research department on developing robotic systems to help factory workers, he’s also involved in two startups. One of those, MDAR Technologies, is working on 3D vision systems for autonomous vehicles.

But it’s his work with the second company, IFM Technologies, which he founded, that landed him on a stage at our annual GPU Technology Conference.

IFM has been working on an autonomous drone that can be reliably operated indoors. Most drones today fly only outdoors because a) they’re too large and clunky to be flown safely indoors, and b) the GPS systems they rely on don’t work indoors. Further complicating the market for outdoor drones is the fact that the FAA must approve them for flight. That’s not the case with indoor drones.

Gyongyosi looked at that convergence of facts and determined that there’s a huge potential market for a commercially available indoor drone. He told GTC attendees that he estimates there are multi-billion-dollar opportunities in areas such as warehouse analytics, utility analysis, insurance inspections, and commercial real estate and construction.

And make no mistake, he’s not in this just to identify those opportunities; he wants to seize them. “We don’t want to just be a research project,” Gyongyosi said during his talk. “We want to be something that goes from problem to solution.”

His solution, however, has presented technical challenges. To start, he had to find an alternative to the GPS built into outdoor drones. He said others have tried motion capture or radio beacons as GPS substitutes, but because he’s trying to keep IFM’s drone small and light, he didn’t want the extra weight. Plus, those options tend to be expensive and need constant calibration.

Similarly, other drones rely on onboard sensors to detect physical objects around them to avoid collision. But that also has presented a major space challenge on IFM’s small drone, as the amount of data that has to be processed is enormous.

“The processing power you need onboard is large,” he said. “That’s why these platforms are very large.”

To combat these issues, Gyongyosi did two things: First, he opted to mount a single camera on the IFM, sacrificing stereoscopic vision but preserving space and keeping the weight down. Then, he chose to incorporate feature tracking that operates somewhat like those sensors but instead uses data from the camera.

When the performance of that configuration came up short of his expectations, he turned to the GPU, specifically NVIDIA’s Jetson TK1, which is now part of the vehicle’s physical design.

The results speak for themselves. GPUs are processing the data nearly four times as fast as a CPU. Plus, the feature-tracking rate nearly doubled, from 5.5 Hz to 9.8 Hz. And if that’s not enough, it also improved accuracy and created enough spare space that Gyongyosi was able to add a second camera, which is mounted at a 45-degree angle to the first, trading stereoscopic sight for a larger field of vision.
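IFM hasn’t released its tracking code, but the camera-based feature detection that such trackers build on can be illustrated with a minimal Harris corner detector, written here in plain NumPy with a toy image (the 3x3 window and the k value are illustrative choices, not IFM’s parameters):

```python
import numpy as np

def harris_response(img, k=0.05):
    """Harris corner response for a grayscale image (float array)."""
    Iy, Ix = np.gradient(img.astype(float))      # image gradients (rows, cols)
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box3(a):
        # 3x3 neighborhood sum via zero-padding and shifted slices.
        p = np.pad(a, 1)
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box3(Ixx), box3(Iyy), box3(Ixy)
    det = Sxx * Syy - Sxy * Sxy                  # determinant of structure tensor
    trace = Sxx + Syy
    return det - k * trace * trace               # corner score per pixel

# A synthetic image with one bright square: corners score high, edges don't.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
R = harris_response(img)
print(f"corner response: {R[8, 8]:.3f}, edge response: {R[8, 15]:.3f}")
```

A real pipeline pairs a detector like this with frame-to-frame matching (for example, Lucas-Kanade optical flow), which is the feature-tracking stage where the 5.5 Hz to 9.8 Hz speedup was measured.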

To further illustrate the potential impact of IFM’s design, Gyongyosi pointed to the colossal failure that is Berlin’s long-planned futuristic airport, a project that was supposed to open years ago but remains non-operational after design flaws were found in the fire detection system during inspection.

Gyongyosi believes indoor drones could have prevented the fiasco by detecting the issue long before inspection, and he hopes IFM’s drones will be performing such tasks soon.


The post Inside Job: Student Turns to GPUs to Create Drones for the Great Indoors appeared first on The Official NVIDIA Blog.

Ready for Takeoff: Off-the-Shelf Jetson TX1 Carrier Boards Let Developers Soar

Developers looking to bring deep learning to drones, robots and other embedded computing applications can now get a head start.

Connect Tech Inc. (CTI), an Ontario, Canada-based embedded-computing group, has launched the Astro Carrier for Jetson TX1. It’s the first commercially available, deployment-ready carrier board for our supercomputing module.

Deploying Jetson TX1 into a final product requires developers to design and manufacture custom carrier boards. But not everyone has the expertise or resources to do so. That changes with the Astro Carrier for Jetson TX1, which CTI introduced today at Embedded World, in Nuremberg, Germany.

Measuring only 57 x 87 mm — about the size of a playing card — Astro Carrier operates in temperatures ranging from -40°C to +85°C. This lets it pack Jetson TX1’s supercomputing punch across a wide range of environmental conditions.

CTI Astro Carrier for Jetson TX1

Astro Carrier connects with off-the-shelf or custom breakout boards, and offers a full array of features, including:

  • 2 Gigabit Ethernet ports (1 from Jetson TX1 and another from an on-board controller)
  • 1 USB 3.0 and 1 USB 2.0 port
  • 1 HDMI port
  • Up to 3 camera serial interface channels
  • Mini PCIe expansion support

CTI also announced a lower cost carrier board, aptly named Elroy. It offers a lighter feature set while retaining the durability the company is known for. Elroy is slated to be available in April.

Designed for Developers

When we launched Jetson TX1 in November, we also released a complete suite of developer documentation that lets third parties create their own carrier boards for it. The CTI Astro Carrier marks the first entry into the Jetson TX1 ecosystem, making it easier than ever for entrepreneurs, researchers and technologists to deploy our module.

If you’re at Embedded World this week, stop by the Connect Tech booth (Hall 2, Stand 2-318) or the NVIDIA booth (Hall 4A, Stand 4A-646) to see the CTI Astro Carrier for Jetson TX1 in person.

Pricing and availability of Astro Carrier for Jetson TX1 will be announced shortly, with delivery expected in the coming weeks.

The post Ready for Takeoff: Off-the-Shelf Jetson TX1 Carrier Boards Let Developers Soar appeared first on The Official NVIDIA Blog.

The Vulkan Graphics API Is Here—and Your NVIDIA GPU Is Ready

If you’re a GeForce gamer, you already have what you need to take advantage of what the Vulkan API can do. If you’re a developer, you now have a new tool that gives you more control, and greater performance, on a broad range of devices.

Our day-one support for Vulkan, not just on multiple platforms but in cutting-edge games such as The Talos Principle, has some of the industry’s most respected observers taking notice.

“To be able to play a game like The Talos Principle on the same day an API launches is an unheard-of achievement,” said Jon Peddie, president of Jon Peddie Research. “NVIDIA’s multi-platform compatibility and fully conformant driver support across many operating systems is a testament to the company’s leadership role in Vulkan’s development.”

GeForce gamers will be the first to play the Vulkan version of The Talos Principle, a puzzle game from Croteam that shipped today.

What Is Vulkan?

Vulkan is a low-level API that gives developers who want the ultimate in control direct access to the GPU. With a simpler, thinner driver, Vulkan has less latency and overhead than traditional OpenGL or Direct3D. Vulkan also has efficient multi-threading capabilities, so multi-core CPUs can keep the graphics pipeline loaded, enabling a new level of performance on existing hardware.

Vulkan is the first new-generation, low-level API that is cross-platform. This allows developers to create applications for a variety of PC, mobile and embedded devices using diverse operating systems. Like OpenGL, Vulkan is an open, royalty-free standard available for any platform to adopt. For developers who prefer to remain on OpenGL or OpenGL ES, NVIDIA will continue to drive innovations on those traditional APIs too.

Who’s Behind Vulkan?

Vulkan was created by the Khronos Group, a standards organization that brings together a wide range of hardware and software companies, including NVIDIA, for the creation of open standard, royalty-free APIs for authoring and accelerated playback of dynamic media on a wide variety of platforms and devices. We’re proud to have played a leadership role in creating Vulkan. And we’re committed to helping developers use Vulkan to get the best from our GPUs.

Why You Should Care

Vulkan is great for developers. It reduces porting costs and opens up new market opportunities for applications across multiple platforms. Best of all, the NVIDIA drivers needed to take advantage of Vulkan are already here. On launch day we have Vulkan drivers available for Windows, Linux, and Android platforms. See our Vulkan driver page for all the details.

Here’s what Vulkan will mean for you:

  • For gamers with GeForce GPUs: Vulkan’s low latency and high efficiency let developers add more details and more special effects to their games, while still maintaining great performance. Because a Vulkan driver is thinner with less overhead, application developers will get fewer performance surprises. This translates to smoother, more fluid experiences.

    NVIDIA is shipping fully-conformant Vulkan drivers for all GeForce boards based on Kepler or Maxwell GPUs running Windows (Windows 7 or later) or Linux. “We have been using NVIDIA hardware and drivers on both Windows and Android for Vulkan development, and the reductions in CPU overhead have been impressive,” said Oculus Chief Technology Officer John Carmack.

    GeForce gamers will be the first to play the Vulkan version of The Talos Principle, a puzzle game from Croteam that also shipped today. “We’ve successfully collaborated with the NVIDIA driver support team in the past, but I was amazed with the work they did on Vulkan,” said Croteam Senior Programmer Dean Sekuliuc. “They promptly provided us with the latest beta drivers so we were able to quickly implement the new API into Serious Engine and make The Talos Principle one of the first titles supporting Vulkan. Smooth!”

  • For professional application developers using Quadro: Our Vulkan and OpenGL drivers use an integrated binary architecture that enables the use of GLSL shaders in Vulkan. Developers also have the flexibility to continue using OpenGL or plan a smooth transition from OpenGL to Vulkan to take advantage of Vulkan’s new capabilities. For example, Vulkan’s multi-threaded architecture can enable multiple CPU cores to prepare massive amounts of data for the GPU faster than before. For design and digital content creation applications, this means enhanced interactivity with large models.
  • For mobile developers using Tegra: We’re making Vulkan available to developers for both Android and Linux. Vulkan will ship alongside OpenGL ES as a core API in a future version of Android. This means that standard Android will have a state-of-the-art API with integrated graphics and compute, ultimately unleashing the GPU in Tegra for cutting-edge vision and compute applications, as well as awesome gaming graphics. Developers can use Vulkan on NVIDIA SHIELD Android TV and SHIELD tablets for Android coding, and Jetson for embedded Linux development.

How to Learn More About Vulkan

To learn more, click here or stop by our upcoming GPU Technology Conference in San Jose, CA, April 4-7, where we’ll have a full slate of Vulkan sessions.

We can’t wait to see what you do with the combination of Vulkan, NVIDIA drivers, and NVIDIA GPUs.

The post The Vulkan Graphics API Is Here—and Your NVIDIA GPU Is Ready appeared first on The Official NVIDIA Blog.

Media Alert: Intel to Outline Path to 5G at Mobile World Congress 2016

Billions of increasingly smart and connected devices, data-rich personalized services, and cloud applications are bringing amazing experiences to our daily lives. This places unprecedented demands on today’s wireless networks and connected devices and makes faster, smarter, more efficient 5G wireless networks and technology critical. At Mobile World Congress 2016, Intel will announce new developments that will accelerate the road to 5G and help make amazing experiences of the future possible.

Intel Press Conference

What

Join Aicha Evans, corporate vice president and general manager of the Intel Communication and Devices Group, to hear about new partnerships and technologies that are paving the way to 5G.

When

Monday, Feb. 22, 2016, 4:00 – 5:00 p.m. CET
Product and technology demonstrations will also be available in the booth.

Where

Intel Booth, Fira Gran Via, Hall 3, Booth #3D30
Barcelona, Spain

Additional Speaking Sessions & Panels

Join Intel executives at the GSMA panels to discuss topics such as 5G and the Internet of Things, including:

Keynote Session: Mobile is Disruption
Feb. 22, 2016, 12:15 – 1 p.m. CET
Hall 4, Auditorium 1

Intel CEO Brian Krzanich and executives from Ericsson and AT&T will discuss the future of mobility.

Designing and Engineering the Industrial Internet
Feb. 24, 2016, 1:45 – 3 p.m. CET
The Innovation Zone, Fira Gran Via

Doug Davis, senior vice president and general manager, IoT Group at Intel, will join industry leaders and PricewaterhouseCoopers to discuss the opportunities and challenges associated with designing and engineering the next-generation Industrial Internet.

Media Lounge

Enjoy a comfortable space built exclusively for media and analysts in Intel’s booth from Tuesday, Feb. 23, through Wednesday, Feb. 24. Wi-Fi and refreshments will be available. All attendees with media or analyst badges will be admitted.

Media Contacts

Lindsey Sech, Intel Global Communications (U.S.)
+1 408 552 3597, lindsey.a.sech@intel.com

Alistair Kemp, Intel Global Communications (Europe)
+447789746205, Alistair.kemp@intel.com

More Information

Find more information at www.intel.com/newsroom/mwc. Join the conversation at Mobile World Congress by tagging #Intel and #MWC16.

The post Media Alert: Intel to Outline Path to 5G at Mobile World Congress 2016 appeared first on Intel Newsroom.

Chip Shot: Intel and Qualcomm Collaborate on 802.11ad Ecosystem

Intel and Qualcomm Atheros* have achieved a milestone in making 802.11ad WiGig* a mainstream technology, as announced in a joint blog. Both companies successfully demonstrated multi-gigabit interoperability between their respective 802.11ad WiGig solutions, paving the way for industry development of 802.11ad WiGig devices that communicate and connect seamlessly with each other. 802.11ad represents an important step in the evolution of Wi-Fi*, enabling new user capabilities such as wire-equivalent docking, high-quality low-latency video streaming and multimedia kiosks, while increasing network capacity.

The post Chip Shot: Intel and Qualcomm Collaborate on 802.11ad Ecosystem appeared first on Intel Newsroom.