More Space, Less Jam: Transportation Agency Uses NVIDIA DRIVE for Federal Highway Pilot

It could be just a fender bender or an unforeseen rain shower, but a few seconds of disruption can translate to extra minutes or even hours of mind-numbing highway traffic.

But how much of this congestion could be avoided with AI at the wheel?

That’s what the Contra Costa Transportation Authority is working to determine in one of three federally funded automated driving system pilots planned for the next few years. Using vehicles retrofitted with the NVIDIA DRIVE AGX Pegasus platform, the agency will estimate just how much intelligent transportation can improve the efficiency of everyday commutes.

“As the population grows, there are more demands on roadways and continuing to widen them is just not sustainable,” said Randy Iwasaki, executive director of the CCTA. “We need to find better ways to move people, and autonomous vehicle technology is one way to do that.”

The CCTA was one of eight awardees – and the only local agency – of the U.S. Department of Transportation’s Automated Driving System Demonstration Grants Program, which aims to test the safe integration of self-driving vehicles onto U.S. roads.

The Bay Area agency is using the funds for the highway pilot, as well as two other projects to develop robotaxis equipped with self-docking wheelchair technology and test autonomous shuttles for a local retirement community.

A More Intelligent Interstate

From the 101 to the 405, California is known for its constantly congested highways. In Contra Costa, Interstate 680 is one of those high-traffic corridors, funneling many of the area’s 120,000 daily commuters. This pilot will explore how the Highway Capacity Manual – which sets assumptions for modeling freeway capacity – can be updated to incorporate future automated vehicle technology.

Iwasaki estimates that half of California’s congestion is recurrent, meaning demand for roadways is higher than supply. The other half is non-recurrent, attributable to things like weather, special events — such as concerts or parades — and accidents. Eliminating human driver error, which the National Highway Traffic Safety Administration estimates is a factor in 94 percent of traffic accidents, would make the system more efficient and reliable.

Autonomous vehicles don’t get distracted or drowsy, two of the biggest causes of human driving error. They also use redundant and diverse sensors, along with high-definition maps, to perceive the road and plan much farther ahead than a human driver can.

These attributes make it easier to maintain constant speeds and leave room for vehicles to merge in and out of traffic, making for a smoother daily commute.

Driving Confidence

The CCTA will be using a fleet of autonomous test vehicles retrofitted with sensors and NVIDIA DRIVE AGX to gauge how much this technology can improve highway capacity.

The NVIDIA DRIVE AGX Pegasus AI compute platform uses the power of two Xavier systems-on-a-chip and two NVIDIA Turing architecture GPUs to achieve an unprecedented 320 trillion operations per second of supercomputing performance. The platform is designed and built for Level 4 and Level 5 autonomous systems, including robotaxis.
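For a rough sense of where that figure comes from: NVIDIA cites about 30 TOPS per Xavier SoC, which implies roughly 130 TOPS per Turing GPU. Here’s a quick back-of-the-envelope check (the per-GPU figure is inferred from the total, not an official spec):

```python
# Back-of-the-envelope check of the DRIVE AGX Pegasus performance figure.
# Assumed per-component numbers: ~30 TOPS per Xavier SoC (NVIDIA's published
# figure) and ~130 TOPS per Turing GPU (implied by the 320 TOPS total).
xavier_tops = 30
turing_gpu_tops = 130

total_tops = 2 * xavier_tops + 2 * turing_gpu_tops
print(f"Estimated platform total: {total_tops} TOPS")  # -> 320 TOPS
```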

Iwasaki said the agency tapped NVIDIA for this pilot because the company’s vision matches its own: to solve real problems that haven’t been solved before, using proactive safety measures every step of the way.

With half of adult drivers reporting they’re fearful of self-driving technology, this approach to autonomous vehicles is critical to gaining public acceptance, he said.

“We need to get the word out that this technology is safer and let them know who’s behind making sure it’s safer,” Iwasaki said.

Safe Travels: Voyage Intros Ambulance-Grade, Self-Cleaning Driverless Vehicle Powered by NVIDIA DRIVE

Self-driving cars continue to amaze passengers as a truly transformative technology. However, in the time of COVID-19, a self-cleaning car may be even more appealing.

Robotaxi startup Voyage introduced its third-generation vehicle, the G3, this week. The autonomous vehicle, a Chrysler Pacifica Hybrid minivan retrofitted with self-driving technology, is the company’s first designed to operate without a driver and is equipped with an ambulance-grade ultraviolet light disinfection system to keep passengers healthy.

The new vehicles use the NVIDIA DRIVE AGX Pegasus compute platform to run the startup’s self-driving AI for robust perception and planning. The automotive-grade platform puts safety at the core of Voyage’s autonomous fleet.

Given the enclosed space and the proximity of the driver and passengers, ride-hailing currently poses a major risk in a COVID-19 world. By implementing a disinfecting system alongside driverless technology, Voyage is helping ensure self-driving cars continue to develop as a safer, more efficient option for everyday mobility.

The G3 vehicle uses an ultraviolet-C system from automotive supplier GHSP to destroy pathogens in the vehicle between rides. UV-C works by inactivating a pathogen’s DNA, blocking its reproductive cycle. It’s been proven to be up to 99.9 percent effective and is commonly used to sterilize ambulances and hospital rooms.

The G3 is production-ready and currently being tested on public roads in San Jose, Calif., with production vehicles slated to arrive next year.

G3 Compute Horsepower Takes Off with DRIVE AGX Pegasus

Voyage has been using the NVIDIA DRIVE AGX platform in its previous-generation vehicles to power its Shield automatic emergency braking system.

With the G3, the startup is unleashing the 320 TOPS of performance from NVIDIA DRIVE AGX Pegasus to process sensor data and run diverse and redundant deep neural networks simultaneously for driverless operation. Voyage’s onboard computers are automotive grade and safety certified, built to handle the harsh vehicle environment for safe daily operation.

NVIDIA DRIVE AGX Pegasus delivers the compute necessary for level 4 and level 5 autonomous driving.

DRIVE AGX Pegasus is built on two NVIDIA Xavier systems-on-a-chip. Xavier is the first SoC built for autonomous machines and was recently determined by global safety agency TÜV SÜD to meet all applicable requirements of ISO 26262. This stringent assessment means it meets the strictest standard for functional safety.

Xavier’s safety architecture combined with the AI compute horsepower of the DRIVE AGX Pegasus platform delivers the robustness and performance necessary for the G3’s fully autonomous capabilities.

Moving Forward as the World Shelters in Place

As the COVID-19 pandemic continues to limit the way people live and work, transportation must adapt to keep the world moving.

In addition to the UV-C lights, Voyage has also equipped the car with HEPA-certified air filters to ensure safe airflow inside the car. The startup uses its own employees to manage and operate the fleet, enacting strict contact tracing and temperature checks to help minimize virus spread.

While these measures are in place to specifically protect against the COVID-19 virus, they demonstrate the importance of an autonomous vehicle as a place where passengers can feel safe. No matter the condition of the world, autonomous transportation translates to a worry-free voyage, every time.

Fleet Dreams Are Made of These: TuSimple and Navistar to Build Autonomous Trucks Powered by NVIDIA DRIVE

Self-driving trucks are coming to an interstate near you.

Autonomous trucking startup TuSimple and truck maker Navistar recently announced they will build self-driving semi trucks powered by the NVIDIA DRIVE AGX platform. The collaboration, one of the first of its kind to develop autonomous trucks, is set to begin production in 2024.

Over the past decade, self-driving truck developers have relied on traditional trucks retrofitted with the sensors, hardware and software necessary for autonomous driving. Building these trucks from the ground up, however, allows companies to custom-build them for the needs of a self-driving system, as well as take advantage of the infrastructure of a mass-production truck manufacturer.

This transition is the first step from research to widespread deployment, said Chuck Price, chief product officer at TuSimple.

“Our technology, developed in partnership with NVIDIA, is ready to go to production with Navistar,” Price said. “This is a significant turning point for the industry.”

Tailor-Made Trucks

Developing a truck to drive on its own takes more than a software upgrade.

Autonomous driving relies on redundant and diverse deep neural networks, all running simultaneously to handle perception, planning and actuation. This requires massive amounts of compute.
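To make the idea concrete, here’s a minimal sketch of redundancy in action: several independent detectors (hypothetical stand-ins, not TuSimple’s actual networks) process the same frame in parallel, and their outputs are cross-checked:

```python
# Illustrative sketch: run diverse, redundant detectors on the same frame in
# parallel, then cross-check their outputs. All model names are hypothetical.
from concurrent.futures import ThreadPoolExecutor

def camera_detector(frame):   # stand-in for a camera-based DNN
    return {"vehicle_ahead": True}

def lidar_detector(frame):    # stand-in for a lidar-based DNN
    return {"vehicle_ahead": True}

def radar_detector(frame):    # stand-in for a radar-based DNN
    return {"vehicle_ahead": False}

DETECTORS = [camera_detector, lidar_detector, radar_detector]

def redundant_perception(frame):
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda d: d(frame), DETECTORS))
    # Simple majority vote: disagreement among redundant detectors can be
    # escalated rather than silently ignored.
    votes = sum(r["vehicle_ahead"] for r in results)
    return votes >= 2, results

obstacle, raw = redundant_perception(frame={"pixels": "..."})
print("Brake for obstacle:", obstacle)
```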

The NVIDIA DRIVE AGX platform delivers high-performance, energy-efficient compute to enable AI-powered and autonomous driving capabilities. TuSimple has been using the platform in its test vehicles and pilots, such as its partnership with the United States Postal Service.

Building dedicated autonomous trucks makes it possible for TuSimple and Navistar to develop a centralized architecture optimized for the power and performance of the NVIDIA DRIVE AGX platform. The platform is also automotive grade, meaning it is built to withstand the wear and tear of years driving on interstate highways.

Invaluable Infrastructure

In addition to a customized architecture, developing an autonomous truck in partnership with a manufacturer opens up valuable infrastructure.

Truck makers like Navistar provide nationwide support for their fleets, with local service centers and vehicle tracking. This network is crucial for self-driving trucks that will crisscross the country on long-haul routes, keeping them maintained and running efficiently.

TuSimple is also building out an HD map network of the nation’s highways for the routes its vehicles will travel. Combined with the widespread fleet management network, this infrastructure makes its autonomous trucks appealing to a wide variety of partners — UPS, U.S. Xpress, Penske Truck Leasing and food service supply chain company McLane Inc., a Berkshire Hathaway company, have all signed on to this autonomous freight network.

And backed by the performance of NVIDIA DRIVE AGX, these vehicles will continue to improve, delivering safer, more efficient logistics across the country.

“We’re really excited as we move into production to have a partner like NVIDIA with us the whole way,” Price said.

Driving the Future: What Is an AI Cockpit?

From Knight Rider’s KITT to Iron Man’s JARVIS, intelligent copilots have been a staple of forward-looking pop culture.

Advancements in AI and high-performance processors are turning these sci-fi concepts into reality. But what, exactly, is an AI cockpit, and how will it change the way we move?

AI is enabling a range of new software-defined, in-vehicle capabilities across the transportation industry. With centralized, high-performance compute, automakers can now build vehicles that become smarter over time.

A vehicle’s cockpit typically requires a collection of electronic control units and switches to perform basic functions, such as powering entertainment or adjusting temperature. Consolidating these components with an AI platform such as NVIDIA DRIVE AGX simplifies the architecture while creating more compute headroom to add new features. In addition, NVIDIA DRIVE IX provides an open and extensible software framework for a software-defined cockpit experience.

Mercedes-Benz released the first such intelligent cockpit, the MBUX AI system, powered by NVIDIA technology, in 2018. The system is currently in more than 20 Mercedes-Benz models, with the second generation debuting in the upcoming S-Class.

MBUX and other such AI cockpits orchestrate crucial safety and convenience features much more smoothly than the traditional vehicle architecture. They centralize compute for streamlined functions, and they’re constantly learning. By regularly delivering new features, they extend the joy of ownership throughout the life of the vehicle.

Always Alert

But safety is the foremost benefit of AI in the vehicle. AI acts as an extra set of eyes on the 360-degree environment surrounding the vehicle, as well as an intelligent guardian for drivers and passengers inside.

One key feature is driver monitoring. As automated driving functions become more commonplace across vehicle fleets, it’s critical to ensure the human at the wheel is alert and paying attention.

Using interior-facing cameras, AI-powered driver monitoring can track driver activity, head position and facial movements to analyze whether the driver is paying attention, drowsy or distracted. The system can then alert the driver, bringing attention back to the road.
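As a simplified illustration of this logic, the sketch below applies rule-of-thumb thresholds to stand-in values that a real system would derive from trained gaze and head-pose DNNs; the thresholds are illustrative only:

```python
# Minimal sketch of driver-monitoring logic. The DriverState fields stand in
# for outputs of real gaze/head-pose DNNs; the thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class DriverState:
    head_yaw_deg: float      # 0 = facing the road
    eye_closure: float       # 0.0 = wide open, 1.0 = fully closed
    seconds_in_state: float  # how long this state has persisted

def assess_driver(state: DriverState) -> str:
    if state.eye_closure > 0.8 and state.seconds_in_state > 1.5:
        return "ALERT: driver appears drowsy"
    if abs(state.head_yaw_deg) > 30 and state.seconds_in_state > 2.0:
        return "ALERT: driver looking away from road"
    return "ok"

print(assess_driver(DriverState(head_yaw_deg=45, eye_closure=0.1, seconds_in_state=3.0)))
```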

This system can also help keep those inside and outside the vehicle safe and alert. By sensing whether a passenger is about to exit a car and using exterior sensors to monitor the outside environment, AI can warn of oncoming traffic or pedestrians and bikers potentially in the path of the opening door.

It also acts as a guardian in emergency situations. If a passenger is not sitting properly in their seat, the system can prevent an airbag activation that would harm rather than help them. It can also use AI to detect the presence of children or pets left behind in the vehicle, helping prevent heat stroke.
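Here’s a hedged sketch of such guardian checks, with hypothetical boolean inputs standing in for the outputs of cabin- and exterior-monitoring DNNs:

```python
# Sketch of the "guardian" checks described above. The boolean inputs are
# hypothetical; a real system derives them from cabin and exterior DNNs.
def safe_exit_warning(passenger_reaching_for_door: bool,
                      cyclist_approaching: bool) -> bool:
    # Warn before the door opens into the path of oncoming traffic.
    return passenger_reaching_for_door and cyclist_approaching

def rear_seat_reminder(vehicle_parked: bool, occupant_detected: bool) -> bool:
    # Flag a child or pet left behind after the driver leaves.
    return vehicle_parked and occupant_detected

print(safe_exit_warning(True, True))    # True -> hold door / chime
print(rear_seat_reminder(True, True))   # True -> alert owner
```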

An AI cockpit constantly watches over a vehicle’s occupants, adding an extra level of safety with full-cabin monitoring so they can enjoy the ride.

Constant Convenience

In addition to safety, AI helps make the daily drive easier and more enjoyable.

With crystal-clear graphics, drivers can receive information about their route, as well as what the sensors on the car see, quickly and easily. Augmented reality heads-up displays and virtual reality views of the vehicle’s surroundings deliver the most important data (such as parking assistance, directions, speed and oncoming obstacles) without disrupting the driver’s line of sight.

These visualizations help build trust in the driver assistance system as well as understanding of its capabilities and limitations for a safer and more effective driving experience.

Using natural language processing, drivers can control vehicle settings without taking their eyes off the road. Conversational AI enables easy access to search queries, like finding the best coffee shops or sushi restaurants along a given route. The same system that monitors driver attention can also interpret gesture controls, providing another way for drivers to communicate with the cockpit without having to divert their gaze.
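As a toy illustration of intent routing (a production conversational AI system uses trained language models, not keyword matching):

```python
# Toy intent router for in-cabin voice commands. Keyword matching here just
# illustrates the control flow; real systems use trained NLP models.
INTENT_HANDLERS = {
    "temperature": lambda cmd: f"Setting cabin temperature ({cmd!r})",
    "navigate":    lambda cmd: f"Routing to destination ({cmd!r})",
    "find":        lambda cmd: f"Searching along route ({cmd!r})",
}

def handle_utterance(utterance: str) -> str:
    text = utterance.lower()
    for keyword, handler in INTENT_HANDLERS.items():
        if keyword in text:
            return handler(utterance)
    return "Sorry, I didn't catch that."

print(handle_utterance("Find the best sushi restaurants on my route"))
```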

These technologies can also be used to personalize the driving experience. Biometric user authentication and voice recognition allow the car to identify who is driving, and adjust settings and preferences accordingly.

AI cockpits are being integrated into more models every year, making vehicles smarter and safer while constantly adding new features. High-performance, energy-efficient AI compute platforms consolidate in-car systems into a centralized architecture, enabling the open NVIDIA DRIVE IX software platform to meet future cockpit needs.

What used to be fanciful fiction will soon be part of our daily driving routine.

No Supervision Required: Helm.ai Aims to Streamline Self-Driving Development

Removing the human driver from behind the steering wheel may require removing the human being from the development lab.

Helm.ai, a startup based in Menlo Park, Calif., that recently came out of stealth, seeks to dramatically reduce the bottlenecks in autonomous vehicle development cycles with an AI training method known as unsupervised learning. Like other advanced learning tools such as active learning, unsupervised learning takes an intelligent approach to training to lessen the burden on human annotators.

An autonomous vehicle must learn from a massive amount of data — measured in the petabytes, or millions of gigabytes — to safely drive without a human at the wheel. Even lower levels of autonomy, such as level 2+ and level 3 AI-assisted driving, require intensive training to operate effectively.

Embedded in this massive amount of data are millions of frames containing pedestrians, cars, signs and other objects. For supervised learning, all of these must be labeled so that deep neural networks can learn to recognize them on their own. It’s an incredibly expensive, time- and labor-intensive process, making it difficult to develop and iterate on new capabilities quickly.

Helm.ai uses the high performance of NVIDIA data center GPUs to run its unsupervised learning techniques to train its self-driving algorithms. The startup is also relying on NVIDIA inside the vehicle, running its self-driving software on NVIDIA DRIVE AGX Xavier.

Removing the Training Wheels

Unsupervised learning is a method of training neural networks without labeled data. It’s an open area of AI research and comes in a variety of flavors, with some previous approaches aiming to identify new patterns in data or to pre-process a pool of data.
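Helm.ai hasn’t published the specifics of its approach, but a classic illustration of the general idea is an autoencoder: a network that learns a compressed representation of unlabeled data by reconstructing its own input. A minimal sketch in PyTorch:

```python
# Generic unsupervised-learning illustration (not Helm.ai's method): a small
# autoencoder learns a compressed representation of unlabeled data.
import torch
from torch import nn

unlabeled = torch.randn(1024, 64)      # stand-in for unlabeled sensor features

model = nn.Sequential(
    nn.Linear(64, 16), nn.ReLU(),      # encoder -> 16-dim bottleneck
    nn.Linear(16, 64),                 # decoder
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(20):
    recon = model(unlabeled)
    loss = nn.functional.mse_loss(recon, unlabeled)  # no labels needed
    opt.zero_grad(); loss.backward(); opt.step()

print(f"final reconstruction loss: {loss.item():.4f}")
```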

Helm.ai is working to expand the scope of unsupervised learning as well as mathematical modeling to efficiently scale autonomous vehicle training.

Rather than pursue traditional approaches, the startup is looking to master unsupervised learning to remove the need for large-scale fleet data and armies of human annotators. The result is scalable AI software that can achieve autonomous driving capabilities on an improved timeline and budget.

“We identified some key challenges that we felt weren’t being addressed with the traditional approaches, in particular regarding the scalability and accuracy of AI software,” said Vladislav Voroninski, founder and CEO of Helm.ai. “We built some prototypes early on that made us believe that we can actually take this all the way.”

Innovating with High-Performance Compute

To achieve these breakthroughs in AI training, Helm.ai is relying on industry-leading computing in the data center and in the vehicle.

NVIDIA V100 Tensor Core GPUs in the data center make it possible to process petabytes of data without experiencing costly roadblocks, enabling advanced learning techniques like Helm.ai’s.

Once trained, the startup’s level 2+ self-driving software can then be tested in the vehicle using the NVIDIA DRIVE AGX Xavier AI compute platform. DRIVE AGX Xavier delivers 30 trillion operations per second for level 2+ and level 3 automated driving.

At its core is the Xavier system-on-a-chip, the first-ever production auto-grade SoC, which incorporates six different types of processors for high-performance, energy-efficient AI compute.

With more accurate and cost-effective AI training, Helm.ai and NVIDIA are enabling the industry to safely deploy transportation technologies that will transform the way people and goods move.

Step Inside Our AI Garage: NVIDIA Experts Present Insights into Self-Driving Software and Infrastructure

Intelligent vehicles require intelligent development.

That’s why NVIDIA has built a complete AI-powered portfolio — from data centers to in-vehicle computers — that enables software-defined autonomous vehicles. And this month during GTC Digital, we’re providing an inside look at how this development process works, plus how we’re approaching safer, more efficient transportation.

Autonomous vehicles must be able to operate in thousands of conditions around the world to be truly driverless. The key to reaching this level of capability is mountains of data.

To put that in perspective, a fleet of just 50 vehicles driving six hours a day generates about 1.6 petabytes of sensor data a day. If all that data were stored on standard 1GB flash drives, they’d cover more than 100 football fields. This data must then be curated and labeled to train the deep neural networks (DNNs) that will run in the car, performing a variety of dedicated functions, such as object detection and localization.
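Those numbers are easy to sanity-check. Assuming a per-vehicle sensor rate of roughly 1.5 GB/s (plausible for a multi-camera, lidar and radar suite, though the exact rate varies by configuration):

```python
# Back-of-the-envelope check of the fleet data volume. The ~1.5 GB/s
# per-vehicle rate is an assumption; real rates depend on the sensor suite.
vehicles = 50
hours_per_day = 6
gb_per_second = 1.5

total_gb = vehicles * hours_per_day * 3600 * gb_per_second
print(f"{total_gb / 1e6:.2f} PB per day")  # ~1.62 PB, close to the 1.6 PB cited
```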

The infrastructure to train and test this software must include high-performance supercomputers to handle these enormous data needs. To run efficiently, the system must be able to intelligently curate and organize this data. Finally, it must be traceable — making it easy to find and fix bugs in the process — and repeatable, going over the same scenario over and over again to ensure a DNN’s proficiency.

As part of the GTC Digital series, we present this complete development and training infrastructure as well as some of the DNNs it has produced, driving progress toward deploying the car of the future.

Born and Raised in the Data Center

While today’s vehicles are put together on the factory floor assembly line, autonomous vehicles are born in the data center. In a GTC Digital session, Clément Farabet, vice president of AI Infrastructure at NVIDIA, details this high-performance, end-to-end platform for autonomous vehicle development.

The NVIDIA internal AI infrastructure includes NVIDIA DGX servers that store and process the petabytes of driving data. For comprehensive training, developers must work with five to 10 billion frames to develop and then evaluate a DNN’s performance.

High-performance data center GPUs help speed up the time it takes to process this data. In addition, Farabet’s team optimizes development times using advanced learning methods such as active learning.

Rather than rely solely on humans to curate and label driving data for DNN training, active learning makes it possible for the DNN to choose the data it needs to learn from. A dedicated neural network goes through a pool of frames, flagging those in which it demonstrates uncertainty. The flagged frames are then labeled manually and used to train the DNN, ensuring that it’s learning from the exact data that’s new or confusing.
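A common way to implement this selection step is uncertainty sampling: score each frame by the entropy of the model’s predicted class probabilities and flag the highest-entropy frames for annotation. A simplified sketch (not NVIDIA’s exact pipeline):

```python
# Simplified active-learning selection: flag frames where the model is least
# certain (highest softmax entropy) and send only those for human labeling.
import numpy as np

def entropy(probs: np.ndarray) -> float:
    p = np.clip(probs, 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

# Stand-in for per-frame class probabilities from a trained DNN.
frame_predictions = {
    "frame_001": np.array([0.98, 0.01, 0.01]),   # confident
    "frame_002": np.array([0.40, 0.35, 0.25]),   # uncertain -> worth labeling
    "frame_003": np.array([0.90, 0.05, 0.05]),
}

budget = 1  # how many frames the labeling team can take
ranked = sorted(frame_predictions,
                key=lambda k: entropy(frame_predictions[k]), reverse=True)
print("Send for labeling:", ranked[:budget])     # -> ['frame_002']
```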

Once trained, these DNNs can then be tested and validated on the NVIDIA DRIVE Constellation simulation platform. The cloud-based solution enables millions of miles to be driven in virtual environments across a broad range of scenarios — from routine driving to rare or even dangerous situations — with greater efficiency, cost-effectiveness and safety than what is possible in the real world.

DRIVE Constellation’s high-fidelity simulation ensures these DNNs can be tested over and over, in every possible scenario and every possible condition before operating on public roads.
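DRIVE Constellation’s actual interfaces aren’t shown here, but the idea of sweeping a scenario grid so that every combination of conditions is exercised repeatably can be sketched with a hypothetical API:

```python
# Hypothetical scenario sweep, illustrating repeatable coverage of conditions.
# `run_simulation` stands in for a real simulator API; all names are made up.
from itertools import product

weather   = ["clear", "rain", "fog", "night"]
traffic   = ["light", "dense"]
maneuvers = ["lane_change", "merge", "emergency_brake"]

def run_simulation(w, t, m, seed):
    # Placeholder: a real run would replay the scenario and score the DNN stack.
    return {"scenario": (w, t, m), "seed": seed, "passed": True}

results = [run_simulation(w, t, m, seed=42)
           for w, t, m in product(weather, traffic, maneuvers)]
print(f"{len(results)} scenarios executed, all repeatable via fixed seeds")
```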

When combined with data center training, simulation allows developers to constantly improve upon their software in an automated, traceable and repeatable development process.

DNNs at the Edge

Once trained and validated, these DNNs can then operate in the car.

During GTC Digital, Neda Cvijetic, NVIDIA senior manager of autonomous vehicles and host of the DRIVE Labs video series, gave an inside look at a sampling of self-driving DNNs we’ve developed.

Autonomous vehicles run an array of DNNs covering perception, mapping and localization to operate safely. To humans, these tasks seem straightforward; in reality, they’re all complex processes that require intelligent approaches to be performed successfully.

For example, to classify road objects, pedestrians and drivable space, one DNN uses a process known as panoptic segmentation, which labels every pixel in a scene with both a semantic class and an instance identity.
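One common way to encode a panoptic result (borrowed from the Cityscapes panoptic scripts, not necessarily NVIDIA’s internal format) packs both labels into a single id per pixel:

```python
# Panoptic ids: every pixel carries both a class and an instance identity.
# The class_id * 1000 + instance_id packing is one common convention
# (used e.g. by the Cityscapes panoptic scripts).
import numpy as np

semantic = np.array([[7, 7, 26],     # 7 = road, 26 = car (Cityscapes-style ids)
                     [7, 26, 26]])
instance = np.array([[0, 0, 1],      # 0 for "stuff", per-object ids for "things"
                     [0, 1, 1]])

panoptic = semantic * 1000 + instance
print(panoptic)
# [[ 7000  7000 26001]
#  [ 7000 26001 26001]]  -> pixel-level class AND object identity
```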

To help it perceive parking spaces in a variety of environments, developers taught the ParkNet DNN to identify a spot as a four-sided polygon rather than a rectangle, so it could discern slanted spaces as well as their entry points.
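A small data-structure sketch of that idea: four corner points instead of an axis-aligned rectangle let slanted spaces and their entry edge be described exactly. The class below is illustrative, not ParkNet’s actual output format:

```python
# Illustrative parking-spot representation: four corners instead of a
# rectangle, so slanted spots and their entry edge can be described exactly.
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class ParkingSpot:
    corners: List[Point]   # four-sided polygon, in order
    entry_edge: int        # index of the corner pair the car enters through

slanted = ParkingSpot(
    corners=[(0.0, 0.0), (2.4, 0.8), (3.9, 5.3), (1.5, 4.5)],
    entry_edge=0,          # edge between corners[0] and corners[1]
)
print(slanted.corners[slanted.entry_edge], slanted.corners[slanted.entry_edge + 1])
```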

And our LidarNet DNN addresses challenges in processing lidar data for localization by fusing multiple perspectives for accurate and complete perception information.

By combining these and other DNNs and running them on high-performance in-vehicle compute, such as the NVIDIA DRIVE AGX platform, an autonomous vehicle can perform comprehensive perception, planning and control without a human driver.

The GTC Digital site hosts these and other free sessions, with new content from NVIDIA experts and the DRIVE ecosystem added every Thursday until April 23. Stay up to date and register here.

Coming to a Desktop Near You: The Future of Self-Driving

There has never been a better time to learn how AI will transform the way people, goods and services move.

During GTC Digital, anyone can experience the latest developments in AI technology for free, from the comfort of their own home. Hundreds of expert talks and training sessions covering autonomous vehicles, robotics, healthcare, finance and more will soon be available at the click of a button.

Beginning March 25, GTC Digital attendees can tune in to sessions hosted by autonomous driving leaders from Ford, Toyota, Zoox and more, as well as receive virtual training from NVIDIA experts on developing AI for self-driving.

Check out what’s in store for the NVIDIA autonomous vehicle ecosystem.

Learn from Leaders

GTC Digital talks let AI experts and developers delve into their latest work, sharing key insights on how to deploy intelligent vehicle technology.

This year’s automotive speakers are covering the latest topics in self-driving development, including AI training, simulation and software.

  • Neda Cvijetic, senior manager of Autonomous Vehicles at NVIDIA, will apply an engineering focus to widely acknowledged autonomous vehicle challenges, and explain how NVIDIA is tackling them. This session will air live with a question-and-answer segment to follow.
  • Clément Farabet, vice president of AI Infrastructure at NVIDIA, will discuss NVIDIA’s end-to-end AI platform for developing NVIDIA DRIVE software. This live talk will cover how to scale the infrastructure to train self-driving deep neural networks and include a Q&A session.
  • Tokihiko Akita, project research fellow, Toyota Technical Institute, will detail how deep neural networks can be used for autonomous vehicle object recognition and tracking in adverse weather with millimeter-wave radar sensors.
  • Nikita Jaipuria, research scientist for computer vision and machine learning, and Rohan Bhasin, research engineer, both at Ford Motor Company, will discuss in their April 2 session how to leverage generative adversarial networks to create synthetic data for autonomous vehicle training and validation.
  • Zejia Zheng and Jeff Pyke, software engineers at Zoox, will outline the Zoox TensorRT conversion pipeline to optimize deep neural network deployment on high-performance NVIDIA GPUs. (A generic sketch of such a conversion flow appears below.)
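While the specifics of Zoox’s pipeline are the subject of the session itself, a typical TensorRT conversion flow starts by exporting a trained PyTorch network to ONNX and then building an optimized engine, for instance with TensorRT’s trtexec tool. A generic sketch:

```python
# Generic sketch of the first step of a TensorRT conversion pipeline:
# export a PyTorch model to ONNX. (Zoox's actual pipeline details differ;
# pretrained weights are omitted here for brevity.)
import torch
import torchvision

model = torchvision.models.resnet18().eval()
dummy = torch.randn(1, 3, 224, 224)           # example input shape

torch.onnx.export(model, dummy, "model.onnx",
                  input_names=["input"], output_names=["logits"])

# The ONNX file can then be compiled into a TensorRT engine, e.g.:
#   trtexec --onnx=model.onnx --saveEngine=model.plan --fp16
```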

Virtual Hands-On Training

Developers will also get a chance to dive deeper into autonomy with NVIDIA Deep Learning Institute courses as well as interact virtually with experts in autonomous vehicle development.

DLI will offer a variety of instructor-led training sessions addressing the biggest developer challenges in areas such as autonomous vehicles, manufacturing and robotics. Receive intensive instruction on topics such as autonomous vehicle perception and sensor integration in our live sessions this April.

Get detailed answers to all your development questions in our Connect with Experts sessions on intelligent cockpits, autonomous driving software development, validation and more with NVIDIA’s in-house specialists.

Register to access these sessions and more for free and receive the latest updates on GTC Digital.
