Coming to a Desktop Near You: The Future of Self-Driving

There has never been a better time to learn how AI will transform the way people, goods and services move.

During GTC Digital, anyone can experience the latest developments in AI technology for free, from the comfort of their own home. Hundreds of expert talks and training sessions covering autonomous vehicles, robotics, healthcare, finance and more will soon be available at the click of a button.

Beginning March 25, GTC Digital attendees can tune in to sessions hosted by autonomous driving leaders from Ford, Toyota, Zoox and more, as well as receive virtual training from NVIDIA experts on developing AI for self-driving.

Check out what’s in store for the NVIDIA autonomous vehicle ecosystem.

Learn from Leaders

GTC Digital talks let AI experts and developers delve into their latest work, sharing key insights on how to deploy intelligent vehicle technology.

This year’s automotive speakers are covering the latest topics in self-driving development, including AI training, simulation and software.

  • Neda Cvijetic, senior manager of Autonomous Vehicles at NVIDIA, will apply an engineering focus to widely acknowledged autonomous vehicle challenges, and explain how NVIDIA is tackling them. This session will air live with a question-and-answer segment to follow.
  • Clement Farabet, vice president of AI Infrastructure at NVIDIA, will discuss NVIDIA’s end-to-end AI platform for developing NVIDIA DRIVE software. This live talk will cover how to scale the infrastructure to train self-driving deep neural networks and include a Q&A session.
  • Tokihiko Akita, project research fellow, Toyota Technical Institute, will detail how deep neural networks can be used for autonomous vehicle object recognition and tracking in adverse weather with millimeter-wave radar sensors.
  • Nikita Jaipuria, research scientist for computer vision and machine learning, and Rohan Bhasin, research engineer, both at Ford Motor Company, will discuss in their April 2 session how to leverage generative adversarial networks to create synthetic data for autonomous vehicle training and validation.
  • Zejia Zheng and Jeff Pyke, software engineers at Zoox, will outline the Zoox TensorRT conversion pipeline to optimize deep neural network deployment on high-performance NVIDIA GPUs (the general flow of such a pipeline is sketched below).
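For readers new to this kind of workflow, here is a minimal sketch, under assumed names and shapes, of what a PyTorch-to-TensorRT conversion pipeline generally looks like; it is not Zoox's code. A trained network is exported to the ONNX format, then compiled into an optimized engine that the deployed software can load at runtime.

```python
# Hedged sketch of a generic TensorRT conversion pipeline (not Zoox's pipeline).
# The model, file names and input shape below are placeholders.
import torch
import torchvision

model = torchvision.models.resnet18(pretrained=True).eval()
dummy = torch.randn(1, 3, 224, 224)  # example input that fixes the graph shape

# Step 1: export the trained PyTorch network to the ONNX interchange format.
torch.onnx.export(
    model, dummy, "model.onnx",
    input_names=["input"], output_names=["logits"],
    opset_version=11,
)

# Step 2: build an optimized TensorRT engine from the ONNX file, for example
# with the trtexec tool that ships with TensorRT (FP16 uses Tensor Cores):
#
#   trtexec --onnx=model.onnx --saveEngine=model.plan --fp16
```

Building the engine offline this way lets the deployed software load a pre-optimized plan file instead of re-optimizing the network every time it starts.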

Virtual Hands-On Training

Developers will also get a chance to dive deeper into autonomy with NVIDIA Deep Learning Institute courses as well as interact virtually with experts in autonomous vehicle development.

DLI will offer a variety of instructor-led training sessions addressing the biggest developer challenges in areas such as autonomous vehicles, manufacturing and robotics. Receive intensive instruction on topics such as autonomous vehicle perception and sensor integration in our live sessions this April.

Get detailed answers to all your development questions in our Connect with Experts sessions on intelligent cockpits, autonomous driving software development and validation, and more with NVIDIA’s in-house specialists.

Register to access these sessions and more for free and receive the latest updates on GTC Digital.


Virtually Free GTC: 30,000 Developers and AI Researchers to Access Hundreds of Hours of No-Cost Sessions at GTC Digital

Just three weeks ago, we announced plans to take GTC online due to the COVID-19 crisis.

Since then, a small army of researchers, partners, customers and NVIDIA employees has worked remotely to produce GTC Digital, which kicks off this week.

GTC typically packs hundreds of hours of talks, presentations and conversations into a five-day event in San Jose.

Our goal with GTC Digital is to bring some of the best aspects of this event to a global audience, and make it accessible for months.

Hundreds of our speakers — among the most talented, experienced scientists and researchers in the world — agreed to participate. Apart from the instructor-led, hands-on workshops and training sessions, which require a nominal fee, we’re delighted to bring this content to the global community at no cost. And we’ve incorporated new platforms to facilitate interaction and engagement.

Accelerating Blender Cycles with NVIDIA RTX: Blender is an open-source 3D software package that comes with the Cycles renderer. Cycles is already a GPU-enabled path tracer, now supercharged by the latest generation of RTX GPUs. To further speed up rendering, RTX AI features such as the OptiX Denoiser infer rendering results for a truly interactive ray-tracing experience.
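As a rough illustration of what this looks like in practice, the snippet below switches Cycles to GPU rendering with the OptiX backend and enables the OptiX denoiser through Blender's Python API. The property names follow recent Blender releases and may differ between versions, and render_with_optix is simply a helper defined here, not a Blender API call.

```python
# Hedged sketch: enable OptiX path tracing and AI denoising for Cycles.
# Property names are based on recent Blender releases and may vary by version.
import bpy

def render_with_optix(output_path="/tmp/frame.png"):
    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = "OPTIX"   # use the RTX ray-tracing backend
    prefs.get_devices()                   # refresh the detected device list
    for device in prefs.devices:
        device.use = True                 # enable all detected GPUs

    scene = bpy.context.scene
    scene.render.engine = "CYCLES"
    scene.cycles.device = "GPU"
    scene.cycles.use_denoising = True
    scene.cycles.denoiser = "OPTIX"       # AI denoiser infers the final image
    scene.render.filepath = output_path
    bpy.ops.render.render(write_still=True)

render_with_optix()
```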

We provided refunds to those who purchased a GTC 2020 pass, and those tickets have been converted to GTC Digital passes. Passholders just need to log in with GTC 2020 credentials to get started. Anyone else can attend with free registration.

Most GTC Digital content is for a technical audience of data scientists, researchers and developers. But we also offer high-level talks and podcasts on various topics, including women in data science, AI for business and responsible AI.

What’s in Store at GTC Digital

The following activities will be virtual events that take place at a specific time (early registration recommended). Participants will be able to interact in real time with the presenters.

Training Sessions:

  • Seven full-day, instructor-led workshops, from March 25 to April 2, on data science, deep learning, CUDA, cybersecurity, AI for predictive maintenance, AI for anomaly detection, and autonomous vehicles. Each full-day workshop costs $79.
  • Fifteen 2-hour training sessions running April 6-10, on various topics, including autonomous vehicles, CUDA, conversational AI, data science, deep learning inference, intelligent video analytics, medical imaging, recommendation systems, deep learning training at scale, and using containers for HPC. Each two-hour instructor-led session costs $39. 

Live Webinars: Seventeen 1-hour sessions, from March 24-April 8, on various topics, including data science, conversational AI, edge computing, deep learning, IVA, autonomous machines and more. Live webinars will be converted to on-demand content and posted within 48 hours. Free. 

Connect with Experts: Thirty-eight 1-hour sessions, from March 25-April 10, where participants can chat one-on-one with NVIDIA experts to get answers in a virtual classroom. Topics include conversational AI, recommender systems, deep learning training and autonomous vehicle development. Free. 

The following activities will be available on demand:

Recorded Talks: More than 150 recorded presentations with experts from leading companies around the world, speaking on a variety of topics such as computer vision, edge computing, conversational AI, data science, CUDA, graphics and ray tracing, medical imaging, virtualization, weather modeling and more. Free. 

Tech Demos: We’ll feature amazing demo videos, narrated by experts, highlighting how NVIDIA GPUs are accelerating creative workflows, enabling analysis of massive datasets and helping advance research. Free. 

AI Podcast: Several half-hour interviews with leaders across AI and accelerated computing will be posted over the next four weeks. Among them: Kathie Baxter, of Salesforce, on responsible AI; Stanford Professor Margot Gerritsen on women in data science and how data science intersects with AI; Ryan Coffee, of the SLAC National Accelerator Lab, on how deep learning is advancing physics research; and Richard Loft, of the National Center for Atmospheric Research, on how AI is helping scientists better model climate change. Free.

Posters: A virtual gallery of 140+ posters from researchers around the world showing how they are solving unique problems with GPUs. Registrants will be able to contact and share feedback with researchers. Free. 

For the Einsteins and Da Vincis of Our Time

The world faces extraordinary challenges now, and the scientists, researchers and developers focused on solving them need extraordinary tools and technology. Our goal with GTC has always been to help the world’s leading developers — the Einsteins and Da Vincis of our time — solve difficult challenges with accelerated computing. And that’s still our goal with GTC Digital.

Whether you work for a small startup or a large enterprise, in the public or private sector, wherever you are, we encourage you to take part, and we look forward to hearing from you.


As GTC Goes Digital, AI Podcast Continues

It was a simple formula: Get a microphone, go to NVIDIA’s GPU Technology Conference, interview as many people as possible about what they’re doing.

Year in and year out, the result has been great conversations with people using AI to take on some of the greatest challenges of our time — from fusion energy research and astronomy to cybersecurity and transportation.

Those conversations won’t be happening face-to-face, but they’ll continue this year. In the weeks ahead, we’ll be releasing interviews with members of the GTC community about their work.

It’s just one facet of the premier deep learning and AI conference that you’ll be able to engage with, online.

You’ll be able to join live webinars, training and Connect with Experts sessions starting Tuesday, March 24.

You can also choose from a library of talks, panels, research posters and demos that you can view on your own schedule, at your own pace.

New on-demand content will be announced every Thursday starting March 26. And registration for GTC Digital is free.

Meanwhile, here’s a selection of some of the most interesting podcast interviews we’ve done at past GTCs.

How Deep Learning Can Accelerate the Quest for Cheap, Clean Fusion Energy 

Clean, cheap fusion energy would change everything for the better. AI Podcast guest William Tang has spent a career at the forefront of that field, currently as principal research physicist at the Princeton Plasma Physics Laboratory. He’s also one of the world’s foremost experts on how the science of fusion energy and high performance computing intersect. He talks about how new tools — deep learning and artificial intelligence — are being put to work to enable big-data-driven discovery in key scientific endeavors, such as the quest to deliver fusion energy.

Astronomers Turn to AI as New Telescopes Come Online

The good news: astronomers are getting new tools to let them see further, better than ever before. The bad news: they’ll soon be getting more data than humans can handle. To turn the vast quantities of data that will be pouring out of these instruments into world-changing scientific discoveries, Brant Robertson, a visiting professor at the Institute for Advanced Study in Princeton and an associate professor of astronomy at UC Santa Cruz, is turning to AI.

How Airbus A³ Plans to Bring Autonomous Air Taxis to Urban Skies

With self-driving cars generating so much buzz, it’s hard to believe that a self-piloting air taxi is, err, flying under the radar. But not for long. Arne Stoschek, head of autonomous systems at Airbus A³ (pronounced “A-cubed”), the Silicon Valley-based advanced products and partnerships outpost of Airbus Group, discusses a plan to bring self-piloted air taxis to the Bay Area’s skies.

How Syed Ahmed Taught AI to Translate Sign Language

We all know how far AI, and in particular deep learning, has pushed speech recognition, whether that’s with Apple Siri, Amazon Alexa or Google Assistant. Syed Ahmed is directing the power of AI toward another form of communication: American Sign Language. He has set up a deep learning model that translates ASL into English.

Tune in to the AI Podcast

Get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn. If your favorite isn’t listed here, drop us a note.


Make the AI Podcast Better

Have a few minutes to spare? Fill out this listener survey. Your answers will help us make a better podcast.


GTC News Can Wait

We have exciting products and news to share with you.

But this isn’t the right time. We’re going to hold off on sharing our GTC news for now.

That way, our employees, partners, the media and analysts who follow us, and our customers around the world can focus on staying safe and reducing the spread of the virus.

We will still stream tons of great content from researchers and developers who have prepared great talks.

This is a time to focus on our family, our friends, our community. Our employees are working from home. Many hourly workers will not need to work, but they’ll all be fully paid.

Stay safe everyone. We will get through this together.


Look Under the Hood of Self-Driving Development at GTC 2020

The progress of self-driving cars can be seen in test vehicles on the road. But the major mechanics for autonomous driving development are making tracks in the data center.

Training, testing and validating self-driving technology requires enormous amounts of data, which must be managed by a robust hardware and software infrastructure. Companies around the world are turning to high-performance, energy efficient GPU technology to build the AI infrastructure needed to put autonomous driving deep neural networks (DNNs) through their paces.

At next month’s GPU Technology Conference in San Jose, Calif., automakers, suppliers, startups and safety experts will discuss how they’re tackling the infrastructure component of autonomous vehicle development.

By attending sessions on topics such as DNN training, data creation, and validation in simulation, attendees can learn the end-to-end process of building a self-driving car in the data center.

Mastering Learning Curves

Without a human at the wheel, autonomous vehicles rely on a wide range of DNNs that perceive the surrounding environment. To recognize everything from pedestrians to street signs and traffic lights, these networks require exhaustive training on mountains of driving data.

Tesla has delivered nearly half a million vehicles with AI-assisted driving capabilities worldwide. These vehicles gather data while continuously receiving the latest models through over-the-air updates.

At GTC, Tim Zaman, machine learning infrastructure engineering manager at Tesla, will share how the automaker built and maintains a low-maintenance, efficient and lightning-fast, yet user-friendly, machine-learning infrastructure that its engineers rely on to develop Tesla Autopilot.

As more test vehicles outfitted with sensors drive on public roads, the pool of training data can grow by terabytes. Ke Li, software engineer at Pony.ai, will talk about how the self-driving startup is building a GPU-centric infrastructure that can process the increasingly heavy sensor data more efficiently, scale with future advances in GPU compute power, and integrate with other heterogeneous compute platforms.

For NVIDIA’s own autonomous vehicle development, we’ve built a scalable infrastructure to train self-driving DNNs. Clement Farabet, vice president of AI Infrastructure at NVIDIA, will discuss Project MagLev, an internal end-to-end AI platform for developing NVIDIA DRIVE software.

The session will cover how MagLev enables autonomous AI designers to iterate training of new DNN designs across thousands of GPU systems and validate the behavior of these designs over multi-petabyte-scale datasets.

Virtual Test Tracks

Before autonomous vehicles are widely deployed on public roads, they must be proven safe for all possible conditions the car could encounter — including rare and dangerous scenarios.

Simulation in the data center presents a powerful solution to what has otherwise been an insurmountable obstacle. By tapping into the virtual world, developers can safely and accurately test and validate autonomous driving hardware and software without leaving the office.

Zvi Greenstein, general manager at NVIDIA, will give an overview of the NVIDIA DRIVE Constellation VR simulation platform, a cloud-based solution that enables hardware-in-the-loop testing and large-scale deployment in data centers. The session will cover how NVIDIA DRIVE Constellation is used to validate safe autonomous driving and how companies can partner with NVIDIA and join the DRIVE Constellation ecosystem.

Having data as diverse and random as the real world is also a major challenge when it comes to validation. Nikita Jaipuria and Rohan Bhasin, research engineers at Ford, will discuss how to generate photorealistic synthetic data using generative adversarial networks (GANs). These simulated images can be used to represent a wide variety of situations for comprehensive autonomous vehicle testing.
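For intuition about the technique, here is a minimal GAN training loop in PyTorch; it is not Ford's model, and the tiny image size, network widths and the make_real_batch placeholder are assumptions made for the sketch. A generator maps random noise to small synthetic "camera frames" while a discriminator learns to tell them from real ones.

```python
# Minimal, illustrative GAN in PyTorch (a sketch, not a production model).
import torch
import torch.nn as nn

LATENT = 100  # size of the noise vector fed to the generator

# Generator: noise (N, 100, 1, 1) -> synthetic 3x16x16 image in [-1, 1].
generator = nn.Sequential(
    nn.ConvTranspose2d(LATENT, 128, 4, 1, 0), nn.BatchNorm2d(128), nn.ReLU(True),
    nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),
    nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh(),
)

# Discriminator: image -> single real/fake logit.
discriminator = nn.Sequential(
    nn.Conv2d(3, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),
    nn.Conv2d(64, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2, True),
    nn.Conv2d(128, 1, 4, 1, 0), nn.Flatten(),
)

bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4, betas=(0.5, 0.999))

def make_real_batch(n=16):
    # Placeholder for a real data loader of camera frames normalized to [-1, 1].
    return torch.rand(n, 3, 16, 16) * 2 - 1

for step in range(100):
    real = make_real_batch()
    noise = torch.randn(real.size(0), LATENT, 1, 1)
    fake = generator(noise)

    # Discriminator step: push real toward 1, generated toward 0.
    d_loss = bce(discriminator(real), torch.ones(real.size(0), 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(real.size(0), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: fool the discriminator into predicting 1 for fakes.
    g_loss = bce(discriminator(fake), torch.ones(real.size(0), 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Once trained on real driving imagery, the generator can be sampled to produce additional scenes, which is what makes GANs attractive for filling gaps in testing and validation datasets.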

Regulators and third-party safety agencies are also using simulation technology to evaluate self-driving cars. Stefan Merkl, mobility regional manager at TÜV SÜD America, Inc., will outline the agency’s universal framework to help navigate patchwork local regulations, providing a unified method for the assessment of automated vehicles.

In addition to these sessions, GTC attendees will hear the latest NVIDIA news and experience demos and hands-on training for a comprehensive view of the infrastructure needed to build the car of the future. Register before Feb. 13 to take advantage of early rates and receive 20% off with code CMAUTO.


NVIDIA DRIVE Ecosystem Charges into Next Decade of AI

Top transportation companies are using NVIDIA DRIVE to lead the way into the coming era of autonomous mobility.

Electric vehicle makers, mapping companies and mobility providers announced at the GPU Technology Conference in Suzhou, China, that they’re leveraging NVIDIA DRIVE in the development of their self-driving solutions.

By joining the DRIVE ecosystem, each of these companies can contribute industry experience and expertise to a worldwide community dedicated to delivering safer and more efficient transportation.

“NVIDIA created an open platform so the industry can team up together to realize this autonomous future,” NVIDIA CEO Jensen Huang said during his GTC keynote. “The rich ecosystem we’ve developed is a testament to the openness of this platform.”

Driving Autonomy Forward

DiDi Chuxing, the world’s leading mobile transportation platform, announced that it will use NVIDIA AI technologies to bring Level 4 autonomous vehicles and intelligent ride-hailing services to market. The company, which provides more than 30 million rides a day, will use the NVIDIA DRIVE platform and NVIDIA AI data center solutions to develop its fleets and provide services in the DiDi Cloud.

As part of the centralized AI processing of DiDi’s autonomous vehicles, NVIDIA DRIVE enables data to be fused from all types of sensors (cameras, lidar, radar, etc.) using numerous deep neural networks to understand the 360-degree environment surrounding the car and plan a safe path forward.

To train these DNNs, DiDi will use NVIDIA GPU data center servers. For cloud computing, DiDi will also build an AI infrastructure and launch virtual GPU cloud servers for computing, rendering and gaming.

DiDi autonomous vehicle on display at GTC.

Automakers, truck manufacturers and software startups on the GTC show floor displayed the significant progress they’ve achieved on the NVIDIA DRIVE platform. Autonomous pilots from Momenta and WeRide continue to grow in sophistication, while autonomous trucking company TuSimple expands its highway trucking routes.

Next-generation production vehicles from Xpeng and Karma Automotive will leverage the DRIVE AGX platform for AI-powered driver assistance systems.

Karma Automotive is developing AI-assisted driving on the NVIDIA DRIVE AGX platform.

Triangulating Success with HD Mapmakers

At GTC, Amap and Kuandeng announced that their high-definition maps are now compatible with DRIVE Localization, an open, scalable platform that enables autonomous vehicles to localize themselves within centimeters on roads worldwide.

Localization makes it possible for a self-driving car to pinpoint its location so it can understand its surroundings and establish a sense of the road and lane structures. This enables it to plan lane changes ahead of what’s immediately visible and determine lane paths even when markings aren’t clear.

DRIVE Localization makes that centimeter-level positioning possible by matching semantic landmarks in the vehicle’s environment with features from HD maps built by companies like Amap and Kuandeng to determine exactly where the vehicle is in real time.
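To make the idea concrete, here is a toy numerical sketch of map-based localization; it is not the DRIVE Localization implementation, and the landmark coordinates and function names are invented for illustration. Detected landmarks are associated with their nearest HD-map counterparts, and the average residual yields a correction to the vehicle's position estimate.

```python
# Toy illustration of landmark-to-map matching for localization (not DRIVE
# Localization itself). All data and names below are made up for the sketch.
import numpy as np

# 2D positions (meters) of landmarks stored in the HD map, e.g. signs and poles.
map_landmarks = np.array([[10.0, 2.0], [20.0, -1.5], [35.0, 3.0]])

# The same landmarks as perceived from the vehicle, offset by the (unknown)
# error in the vehicle's current position estimate, plus a little sensor noise.
true_offset = np.array([0.8, -0.3])
detected = map_landmarks - true_offset + np.random.normal(0, 0.05, (3, 2))

def localize(detected, map_landmarks):
    # Associate each detection with its nearest map landmark ...
    dists = np.linalg.norm(detected[:, None, :] - map_landmarks[None, :, :], axis=2)
    matches = dists.argmin(axis=1)
    # ... and average the residuals to get a correction to the pose estimate.
    residuals = map_landmarks[matches] - detected
    return residuals.mean(axis=0)

print("estimated position correction (m):", localize(detected, map_landmarks))
# Prints roughly [0.8, -0.3], recovering the injected offset.
```

A real system would fuse many such measurements over time and estimate orientation as well; this sketch only shows the core matching step.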

With more than 100 million daily active users of its maps services, Amap is one of the leading mapping companies in China. It has already collected HD map data on more than 320,000 km of roadways and is working with automakers such as General Motors, Geely and FAW to provide commercial maps.

Kuandeng is dedicated to providing critical infrastructure data services for autonomous vehicles. Its HD map solution and high-precision localization product provide core technical and data support for automakers, suppliers and startups. Kuandeng is also building a cloud-based, crowdsourced platform for real-time HD map updates, creating a closed data loop.

Mapping the world with such a high level of precision is a virtually impossible task for one company to accomplish alone. By partnering with the top regional mapmakers around the world, NVIDIA is helping develop highly accurate and comprehensive maps for autonomous vehicle navigation and localization.

“HD maps make it possible to pinpoint an autonomous vehicle’s location with centimeter-level accuracy,” said Frank Jiang, vice general manager of the Automotive Business Department at Amap. “By making our maps compatible with NVIDIA DRIVE Localization, we can enable highly precise localization for autonomous vehicles.”

By growing the NVIDIA DRIVE ecosystem with leading companies around the world, we’re working to deliver safer, more efficient transportation to global roads, sooner.


2020 Vision: See the Future of Transportation at GTC

The next decade of autonomous transportation is just starting to come into focus. GTC attendees will be among the first to see what’s new for safer, more efficient mobility.

The NVIDIA GPU Technology Conference in San Jose, Calif., draws autonomous vehicle developers, researchers, press and analysts from around the world. The annual event is a forum to exhibit and discuss innovations in AI-powered transportation, as well as to form lasting connections across the industry.

This March, NVIDIA CEO Jensen Huang will kick off GTC with a keynote address on the rapid growth of AI computing, mapping out how GPU-powered technology is changing the automotive industry in addition to healthcare, robotics, retail and more.

Attendees will get the opportunity to dive deeper into these topics in dedicated sessions, as well as experience them firsthand on the exhibit floor. NVIDIA experts will also be onsite for hands-on training and networking events.

Expert AI Sight

GTC talks let AI experts and developers delve into their latest work, sharing key insights on how to deploy intelligent vehicle technology. The automotive speakers for GTC 2020 represent nearly every facet of the industry — including automakers, suppliers, startups and universities — and will cover a diversity of topics sure to satisfy any autonomous driving interest.

TuSimple founder and president Xiaodi Hou hosts a session on autonomous trucking at GTC 2019.

Here’s a brief look at some of the 40+ automotive sessions at GTC 2020:

  • Nikita Jaipuria, research scientist for computer vision and machine learning, and Rohan Bhasin, research engineer, both at Ford Motor Company, will discuss how to leverage generative adversarial networks to create synthetic data for autonomous vehicle training and validation.
  • Mate Rimac, CEO of Rimac, will outline how AI is transforming performance vehicles, from advanced driver assistance to intelligent coaching to bringing a video game-like experience to the track.
  • Wadim Kehl, research scientist, and Arjun Bhargava, machine learning engineer, both at Toyota Research Institute, will detail how they’ve combined PyTorch with NVIDIA TensorRT to strike the delicate balance of algorithm complexity, data management and energy efficiency needed for safe self-driving operation.
  • Neda Cvijetic, senior manager of Autonomous Vehicles at NVIDIA, will apply an engineering focus to widely acknowledged autonomous vehicle challenges, including drivable path perception and handling intersections, and then explain how NVIDIA is tackling them.

Zoom In 

Developers at GTC will also get a chance to dive deeper into autonomy with NVIDIA Deep Learning Institute courses as well as interact with experts in autonomous vehicle development.

Throughout the week, DLI will offer more than 60 instructor-led training sessions, 30 self-paced courses and six full-day workshops addressing the biggest developer challenges in areas such as autonomous vehicles, manufacturing and robotics.

Stuck in a self-driving rut? Visit our Connect with Experts sessions on intelligent cockpits, vehicle perception, autonomous driving software development and more to ask NVIDIA’s in-house specialists.

Not Just a Vision

The GTC exhibit hall will be the place to see and interact with autonomous driving technology firsthand.

At the NVIDIA booth, attendees can watch NVIDIA DRIVE in action, including its various deep neural networks, simulation and DRIVE AGX platforms. The dedicated autonomous vehicle zone will feature the latest vehicles from the DRIVE ecosystem, which includes automakers, suppliers, robotaxi companies, truckmakers, sensor companies and software startups.

Autonomous driving software company AutoX showcased its self-driving prototype to attendees at GTC 2019.

GTC goers will get the exclusive chance to experience all this as well as learn about AI developments in industries such as healthcare, energy, finance and gaming with a variety of networking opportunities throughout the week.

Register in 2019 for GTC to take advantage of early bird pricing — the first 1,000 people to sign up for conference passes will get priority access to Huang’s keynote.

Mark your calendar for March 2020 and see for yourself how AI is changing the next decade of autonomous transportation.


Transportation’s Capital: Autonomous Vehicles Get a Move on at GTC DC

Neither snow nor rain nor heat nor gloom of night stays the coming of autonomous vehicle technology, according to the U.S. Postal Service.

The USPS Wednesday discussed the results of its autonomous mail delivery pilot with NVIDIA DRIVE ecosystem member TuSimple at GTC DC.

The discussion follows the announcement earlier this week that the USPS — the world’s largest delivery service — will use NVIDIA end-to-end AI technology for mail processing.

The intersection of transportation, public policy, and AI was a major focus at this week’s conference, which brought more than 3,500 government officials, technology experts, media and analysts together in the nation’s capital.

Amid discussions on the integration of AI into industries like healthcare, robotics and education, attendees agreed on one trend: safer, more efficient transportation is well underway.

Brent Raney, director of surface transportation at USPS, said the human-supervised autonomous trucks delivered mail on its Dallas-Phoenix route up to two hours faster than the typical human driver.

TuSimple’s trucks navigated tricky traffic situations during its autonomous pilot with the USPS.

Despite rain, construction zones and 40 mph crosswinds, the TuSimple trucks were able to navigate the route and consistently arrive earlier than expected.

And as the trucking industry faces growing driver shortages, self-driving vehicles are arriving just in time.

“For us, the sooner we can get to autonomy, the better,” Raney said.

Delivering Autonomy, Safely

The USPS isn’t the only delivery expert turning to AI-powered solutions. Volvo Group, one of the largest commercial vehicle manufacturers in the world, detailed how it is developing its next generation of autonomous trucks to ensure safer transportation.

Volvo Group is developing its next generation of vehicles, such as the Vera truck, for the autonomous era.

In June, the manufacturer announced that it’s using the NVIDIA DRIVE autonomous driving platform to train, test and deploy self-driving AI vehicles, targeting public transport, freight transport, refuse and recycling collection, construction, mining, forestry and more.

As part of this collaboration, Julia Ng, senior computer vision engineer at NVIDIA, spoke at GTC DC on how we’re prioritizing safety — which includes measures such as the Safety Force Field driving policy and the integration of diversity and redundancy at every level of the autonomous driving system.

Dawn Fenton, director of sustainability and public affairs at Volvo Group, joined Ng on stage and said the NVIDIA approach aligns with the manufacturer’s long-time legacy of safety in transportation.

“For this technology to be successful, we must be able to prove it’s safe,” Fenton said, adding that consumer education is a crucial step toward public road deployment.

Virtual Tools for Real-World Results

Other speakers shifted the conversation from the open road to virtual highways.

While public road drives are a key component of testing and validating autonomous vehicle technology, experts agreed that simulation is necessary to achieve the scale and variability needed to ensure safe operation.

NVIDIA DRIVE Constellation is a cloud-based solution that enables millions of miles to be driven in virtual environments across a broad range of scenarios — from routine driving to rare or even dangerous situations — with greater efficiency, cost-effectiveness and safety than what is possible in the real world.

Stefan Merkl, mobility manager at safety agency TÜV SÜD America, said platforms like DRIVE Constellation are critical to determining the capabilities of autonomous driving systems.

DRIVE Constellation is a hardware-in-the-loop simulation platform for safe and scalable autonomous vehicle validation.

“For functional safety, we need simulation,” Merkl said. “It allows us to achieve the scale and randomization needed for comprehensive validation.”

With the progress achieved thus far, and the vision and tools available for future development, this year’s GTC DC proved that the coming era of autonomous transportation is truly a capital idea.

And there will be even more to explore next year. GTC Silicon Valley will host four days of broad-ranging autonomous vehicle discussions this March. Register here.

 


U.S. Government CTO, CIO Among Leaders Flocking to GTC DC

The U.S. government’s CTO and CIO on Tuesday joined other key tech decision makers, lawmakers, and industry leaders at the start of the two-day GPU Technology Conference in Washington D.C.

Federal CIO Suzette Kent led a panel of civilian agency leaders explaining how they’re using AI. Moments later, U.S. CTO Michael Kratsios led a discussion of how the federal government is supporting U.S. AI leadership.

And Moira Bergin, the House Committee on Homeland Security’s subcommittee director for cybersecurity and infrastructure protection, joined a discussion of how Congress and the administration are addressing new AI cybersecurity capabilities.

The talks were among the more than 160 sessions — led by a cross-section of Washington leaders from government and industry — that have drawn more than 3,500 to downtown D.C. this week.

GTC DC — hosted by NVIDIA and its partners, including Booz Allen Hamilton, Dell, IBM, Lockheed Martin and others — has quickly become the capital’s largest AI event. And it’s research, not rhetoric, attendees will tell you, that makes DC an AI accelerator like no other.

The conference is packed with representatives from more than a score of federal agencies — among them the U.S. Department of Energy, NASA, and the National Institutes of Health — together able to marshal scientific efforts on a scale far beyond that of anywhere else in the world.

Putting AI to Work

The conference opened with a keynote from Ian Buck, NVIDIA’s vice president for accelerated computing.

Buck — known for creating the CUDA computing platform that puts GPUs to work powering everything from supercomputing to next-generation AI — detailed the broad range of AI tools NVIDIA makes available to help organizations advance their work.

“The challenge is how do we take AI from innovation to actually applying AI,” Buck said during his keynote address Wednesday morning. “Our challenge, NVIDIA’s challenge, and my challenge is ‘How can I bring AI to industries and activate it?’”

Buck’s message was buttressed by Kent, who led a panel of civilian agency leaders discussing how they’re using AI to improve government services.

“We’re using these AI capabilities to act faster,” Kent said. “In the areas where we’re keeping citizens safe, whether it’s reacting to weather or a problem caused by humans — the speed at which we help is increasing.”

Meanwhile, Kratsios led a discussion about how the U.S. government — which has a decades-long history of supporting key technology advances — is working to extend U.S. technology leadership in the AI age.

“We fundamentally believe that AI is something that’s going to touch every industry in the United States,” Kratsios said. “We view artificial intelligence as a tool that can empower workers to do their jobs better, safer, faster, and more effectively.”

Wrapping up the day, the House’s Bergin joined Coleman Mehta, senior director of U.S. policy at Palo Alto Networks; Daniel Kroese, associate director of the national risk management center at the Cybersecurity and Infrastructure Security Agency; and Joshua Patterson, GM of data science at NVIDIA.

In a panel moderated by NVIDIA’s Iain Cunningham, VP of intellectual property and cybersecurity, the four spoke about the new AI capabilities, potential countermeasures, and preparations being made by the administration and Congress.

Bergin said she’s “excited” about the prospects for AI after what she described as a decade of underinvestment in R&D.

“There’s a lot of demystification that needs to happen about what AI actually is, what its capabilities are now, and what its capabilities will be later,” Bergin said.

Scores more discussions are slated through Wednesday afternoon.

Underscoring the role AI can play for good, speakers from the Johns Hopkins University Applied Physics Laboratory and the Joint AI Center will discuss how they’re harnessing AI to provide humanitarian assistance and disaster relief.

Expect their discussion — of how they harnessed airborne and satellite imagery data after Hurricane Florence hit North and South Carolina in 2018 — to point the way to more groundbreaking AI feats to come.


The Buck Starts Here: NVIDIA’s Ian Buck on What’s Next for the AI Revolution

AI is still young, but software is available to help even relatively unsophisticated users harness it.

That’s according to Ian Buck, general manager of NVIDIA’s accelerated computing group, who shared his views in our latest AI Podcast.

Buck, who helped lay the foundation for GPU computing as a Stanford doctoral candidate, will deliver the keynote address at GTC DC on Nov. 5. His talk will give an audience inside the Beltway a software-flavored update on the status and outlook of AI.

Like the tech industry, the U.S. government is embracing deep learning. “A few years ago, there was still some skepticism, but today that’s not the case,” said Buck.

Federal planners have “gotten the message for sure. You can see from the executive orders coming out and the work of the Office of Science and Technology Policy that they are putting out mandates and putting money into budgets — it’s great to see that literally billions of dollars are being invested,” he said.

The next steps will include nurturing a wide variety of AI projects to come.

“We have the mandate and budget, now we have to help all the agencies and parts of the government down to state and local levels take advantage of this disruptive technology in areas like predictive maintenance, traffic congestion, power-grid management and disaster relief,” Buck said.

From Computer Vision to Tougher Challenges

On the commercial horizon, users already deeply engaged in AI are moving from work in computer vision to tougher challenges in natural language processing. The neural network models needed to understand human speech can be hundreds of thousands of times larger than the early models used, for example, to identify breeds of cats in the seminal 2012 ImageNet contest.

“Conversational AI represents a new level of complexity and a new level of opportunity with new use cases,” Buck said.

AI is definitely hard, he said. The good news is that companies like NVIDIA are bundling 80 percent of the software modules users need to get started into packages tailored for specific markets such as Clara for healthcare or Metropolis for smart cities.

Unleashing GPUs

Software is a field close to Ian Buck’s heart. As part of his PhD work, he developed the Brook language to harness the power of GPUs for parallel computing. His efforts evolved into CUDA, the GPU programming platform at the foundation of offerings such as Clara, Metropolis and NVIDIA DRIVE software for automated vehicles.

Users “can program down at the CUDA level” or at the higher level of frameworks such as PyTorch and TensorFlow, “or go up the stack to work with our vertical market solutions,” Buck said.

It’s a journey that’s just getting started.

“AI will be pervasive all the way down to the doorbell and thermostat. NVIDIA’s mission is to help enable that future,” Buck said.

To hear our full conversation with Buck and other AI luminaries, tune into our AI Podcast wherever you download your podcasts.

(You can see Buck’s keynote live by attending GTC DC. Use the promotional code GMPOD for a 20 percent discount.) 

Help Make the AI Podcast Better

Have a few minutes to spare? Fill out this short listener survey. Your answers will help us make a better podcast.

How to Tune in to the AI Podcast

Get the AI Podcast through iTunes, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Stitcher and TuneIn. Your favorite not listed here? Email us at aipodcast [at] nvidia [dot] com.
