Audiences are making a round-trip to the moon with a science documentary that showcases China’s recent lunar explorations.
Fly to the Moon, a series produced by the China Media Group (CMG) entirely in NVIDIA Omniverse, details the history of China’s space missions and shares some of the best highlights of the Chang’e-4 lunar lander, which achieved humanity’s first soft landing on the far side of the moon in January 2019.
CMG wanted to celebrate recent achievements of science and technology by creating an immersive visual experience for audiences. But designing a realistic lunar surface required extensive collaboration across different teams throughout the development and production cycles.
To handle the massive amount of data and assets needed to build space scenes, and to create detailed, stunning graphics in high resolution, CMG turned to Omniverse, a real-time simulation and collaboration platform.
CMG used the newly released NVIDIA Omniverse Machinima, an app that simplifies cinematic 3D storytelling, to bring viewers into immersive cosmic environments. NVIDIA RTX real-time ray tracing and AI helped the team enhance content creation workflows and produce photorealistic graphics for the documentary.
Content Creation That’s Out of This World
NVIDIA Omniverse lets CMG’s artists and designers easily unite their 3D assets and multiple software applications in a single, virtual collaboration space. There, they could work together and create simultaneously on shared scenes. They could also easily import and export USD files, which saved time spent on model extractions.
Omniverse Nucleus, the real-time collaboration engine of the platform, helped the team manage the integrity of their 3D assets and accelerated production workflows. Omniverse Connectors for Epic Games Unreal Engine, Autodesk Maya and Adobe Substance enabled live-sync collaborative creation, while the Omniverse RTX Renderer provided real-time previews of scenes and images.
Working simultaneously across multiple applications, combined with the ability to review content with real-time ray and path-traced realism, helped CMG speed production times.
CMG also used Omniverse to let its artists create 3D models and high-fidelity renders for their immersive space environments. NVIDIA Quadro RTX 8000 and NVIDIA RTX A6000 GPUs provided large video memory for loading massive amounts of data and reducing rendering times.
Using multi-GPU rendering, real-time ray tracing and material definition language, CMG created detailed images with realistic lighting and shadows, all in ultra-high 4K resolution.
Combining deep learning, rigging, technical art and live real-time performance capture, CMG designed realistic “digital astronauts” in 4K resolution. The team also built a deep learning-based facial action coding system, running on NVIDIA GPUs, that recognizes actors’ facial features with time-of-flight cameras and drives the digital humans in real time using infrared cameras.
“We wanted to create a real-time VFX production workflow — one where visual effects producers, directors, artists and developers can see both progress and effects in the same context, so that what you see is what you get,” said Jay Wang, technical director of the VFX group at CMG. “We found that Omniverse Kit and Omniverse View help us realize this goal.”
CMG anticipates a 30-40 percent time savings across pre-production, production and post-production as it continues to implement NVIDIA Omniverse. The company plans to use NVIDIA Omniverse running on NVIDIA RTX to create stunning photorealistic images, enhance production pipelines and build more complex scientific visualizations for film and television in the future.
Researchers at The Ohio State University are aiming to take autonomous driving to the limit.
Autonomous vehicles require extensive development and testing for safe widespread deployment. A team at The Ohio State Center for Automotive Research (CAR) is building a Mobility Cyber Range (MCR) — a dedicated platform for cybersecurity testing — in a self-driving car. Researchers and students will rigorously test the platform to identify potential safety and security issues as well as use it to educate a new generation of AV developers.
The research pilot will initially focus on establishing standards and recommendations for best practices in AV safety and cybersecurity. CAR plans to use the vehicle as an ongoing tool for research, with safety and security at the center of every project.
This technology can be used in a wide range of connected and autonomous vehicle applications — recently, the CyberCAR was featured in a demonstration that used autonomous vehicles to quickly transport organs for emergency transplant surgeries.
And by conducting this development work on NVIDIA DRIVE, CAR is also training students on AI compute technology that is widespread in the AV industry.
“We are aiming to generate a workforce that understands these safety and security challenges, as well as one that is familiar with the DRIVE platform and equipped with the right AV skill set for success,” said Qadeer Ahmed, associate professor of research at the Departments of Mechanical and Aerospace Engineering, and Electrical and Computer Engineering, and associate fellow of CAR.
Tools for Success
Future cars will be packed with more technology than any computing system today. They’ll be completely programmable computers, and managing a system with multiple complex applications is incredibly difficult.
NVIDIA DRIVE compute platforms enable autonomous vehicle developers to build on centralized, high-performance and energy-efficient AI computing to process the array of deep neural networks running in the vehicle. It also allows new features to be added continuously via over-the-air updates.
The NVIDIA DRIVE platform also comes with a comprehensive software stack for developers to build an AV system — from the DRIVE OS operating system, to DriveWorks middleware, to DRIVE AV and DRIVE IX autonomous driving and intelligent cockpit software stacks.
“We needed to have a full-fledged, autonomous driving onboard computer optimized for AV functionalities for the Mobility Cyber Range,” Ahmed said. “The aim was to use those tools that would be highly accepted in the market and enhance student competitiveness.”
The Sky’s the Limit
Cybersecurity testing is just the beginning for the CAR research team.
Ahmed said the team plans to learn as much as they can from building the autonomous vehicle itself, then research the safety and security of individual modules, such as perception, planning and actuation.
This work will be used to inform industry best practices and standards to help ensure the successful widespread deployment of autonomous vehicle technology.
And by leveraging NVIDIA DRIVE solutions in the process, Ahmed’s students can continue to innovate in the field long after they’ve completed the program.
It’s time for autonomous vehicle developers to blaze new trails.
NVIDIA has agreed to acquire DeepMap, a startup dedicated to building high-definition maps for autonomous vehicles to navigate the world safely.
“The acquisition is an endorsement of DeepMap’s unique vision, technology and people,” said Ali Kani, vice president and general manager of Automotive at NVIDIA. “DeepMap is expected to extend our mapping products, help us scale worldwide map operations and expand our full self-driving expertise.”
“NVIDIA is an amazing, world-changing company that shares our vision to accelerate safe autonomy,” said James Wu, co-founder and CEO of DeepMap. “Joining forces with NVIDIA will allow our technology to scale more quickly and benefit more people sooner. We look forward to continuing our journey as part of the NVIDIA team.”
Maps that are accurate to within a few meters are good enough when providing turn-by-turn directions for humans. AVs, however, require much greater precision. They must operate with centimeter-level precision for accurate localization, the ability of an AV to locate itself in the world.
Proper localization also requires constantly updated maps. These maps must also reflect current road conditions, such as a work zone or a lane closure. These maps need to efficiently scale across AV fleets, with fast processing and minimal data storage. Finally, they must be able to function worldwide.
DeepMap was founded five years ago by James Wu and Mark Wheeler, veterans of Google, Apple and Baidu, among other companies. The U.S.-based company has developed a high-definition mapping solution that meets these requirements and has already been validated by the AV industry with a wide array of potential customers around the world.
The team, primarily located in the San Francisco Bay Area, has many decades of collective experience in mapping technology and developed a solution that treats autonomous vehicles as both map creators and map consumers. Using crowdsourced data from vehicle sensors lets DeepMap build a high-definition map that’s continuously updated as cars drive.
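The "map creators and map consumers" idea can be sketched in a few lines. This is a hypothetical illustration, not DeepMap's actual system or API: each vehicle reads the current map tile, reports what its sensors see, and the map accepts a change once enough independent vehicles agree on it.

```python
from collections import defaultdict

AGREEMENT_THRESHOLD = 3  # independent observations needed to accept a change


class HDMapServer:
    """Toy map store: vehicles both consume tiles and report updates."""

    def __init__(self):
        self.tiles = {}                   # tile_id -> current belief
        self.pending = defaultdict(int)   # (tile_id, observed) -> votes

    def get_tile(self, tile_id):
        """Map consumer side: fetch the current belief for a tile."""
        return self.tiles.get(tile_id)

    def report(self, tile_id, observed):
        """Map creator side: a vehicle reports what its sensors saw."""
        if self.tiles.get(tile_id) == observed:
            return  # observation matches the map; nothing to do
        self.pending[(tile_id, observed)] += 1
        if self.pending[(tile_id, observed)] >= AGREEMENT_THRESHOLD:
            self.tiles[tile_id] = observed  # accept the crowdsourced update
            del self.pending[(tile_id, observed)]


server = HDMapServer()
server.tiles["lane_42"] = "open"

# Three independent vehicles observe a lane closure (e.g. a work zone).
for _ in range(3):
    server.report("lane_42", "closed")

print(server.get_tile("lane_42"))  # -> closed
```

Requiring agreement before accepting an update is one simple way to keep noisy or faulty sensor reports from corrupting a shared map; production systems use far more sophisticated fusion.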
Ongoing Partner Support
NVIDIA will continue working with DeepMap’s ecosystem, investing in new capabilities and services to meet the needs of new and existing partners.
NVIDIA DRIVE is a software-defined, end-to-end platform—from deep neural network training and validation in the data center to high-performance compute in the vehicle—that enables continuous improvement and deployment via over-the-air updates.
DeepMap’s technology will further bolster the mapping and localization capabilities available on NVIDIA DRIVE, ensuring autonomous vehicles always know precisely where they are, and where they’re going.
“We are excited to welcome the DeepMap team to NVIDIA. They have a proven track record and are entrepreneurial, nimble, and engineering-focused. DeepMap meets a deep need in the industry, and together we will develop and extend these capabilities,” said Kani.
The acquisition is expected to close in the third calendar quarter of 2021, subject to regulatory approval and customary closing conditions.
Thousands of U.S. traffic lights may soon be getting the green light on AI for safer streets.
That’s because startup CVEDIA has designed better and faster vehicle and pedestrian detections to improve traffic flow and pedestrian safety for Cubic Transportation Systems. These new AI capabilities will be integrated into Cubic’s GRIDSMART Solution, a single-camera intersection detection and actuation technology solution used across the United States.
Cubic needs computer vision models trained with specialized datasets for its new pedestrian safety and traffic systems. But curating data and training models from scratch takes months, so it partnered with CVEDIA for synthetic data and model development.
CVEDIA’s synthetic algorithm technology accelerates development of object detection and image classification networks. Adopting the NVIDIA Transfer Learning Toolkit has enabled it to further compress development time. The traffic light implementation for smarter intersections is now being deployed at more than 6,000 intersections across 49 states.
NVIDIA today released Transfer Learning Toolkit version 3.0 into general availability.
“By using NVIDIA Transfer Learning Toolkit, we cut model training time in half and achieved the same level of model accuracy and throughput performance,” said Rodrigo Orph, CTIO and co-founder of CVEDIA.
Metropolis Boosts Infrastructure
CVEDIA develops applications using NVIDIA Metropolis and is a member of NVIDIA Inception, a virtual accelerator program that helps startups in AI and data science get to market faster.
NVIDIA Metropolis is an application framework for smart infrastructure. It provides powerful developer tools, including the DeepStream SDK, Transfer Learning Toolkit, pre-trained models on NGC, and NVIDIA TensorRT.
Transfer learning is a deep learning technique that enables developers to tap a pre-trained AI model used on one task and customize it for use in another domain. NVIDIA Transfer Learning Toolkit is used to build custom, production quality models faster with no coding required.
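The pattern behind transfer learning can be shown without any deep learning framework. The toy sketch below is not NVIDIA's toolkit: a "pretrained" feature extractor is kept frozen (standing in for a backbone trained on a large source dataset), and only a small new head is trained on the target task's data.

```python
import math

def pretrained_features(x):
    """Frozen 'backbone': maps a raw input to a feature vector.
    Stands in for a network pretrained on a large source dataset."""
    return [x, x * x, math.tanh(x)]

def train_head(data, lr=0.01, epochs=2000):
    """Train only the new task head (a linear layer) on target data.
    The backbone's weights are never touched."""
    w = [0.0, 0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            f = pretrained_features(x)  # frozen features
            pred = sum(wi * fi for wi, fi in zip(w, f)) + b
            err = pred - y
            # Gradient step on the head parameters only
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
            b -= lr * err
    return w, b

# Target task: y = 2*x + 1, expressible in the frozen features.
data = [(x / 10, 2 * (x / 10) + 1) for x in range(-10, 11)]
w, b = train_head(data)
f = pretrained_features(0.5)
pred = sum(wi * fi for wi, fi in zip(w, f)) + b  # approximately 2.0
```

Because only the small head is trained, this converges with far less data and compute than training the whole model from scratch, which is exactly the savings transfer learning toolkits exploit at neural network scale.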
“Safety is the most fundamental need for all drivers and vulnerable road users traveling through intersections. CVEDIA’s AI and synthetic data expertise allow us to both augment our existing AI models and rapidly iterate for new applications,” said Jeff Price, vice president and general manager of Cubic Transportation Systems’ ITS unit.
Signaling Smarter Intersections
Cubic’s GRIDSMART Solution uses 360-degree view cameras to optimize traffic flow by gathering and interpreting important traffic data. GRIDSMART empowers traffic engineers to adjust signal timing and traffic flow strategies, and enables real-time monitoring and visual assessment.
For this new system, CVEDIA is developing image classification and object detection models to follow the movement of vehicles, people, bicycles, pets and other safety concerns in intersections.
“Cubic wants to detect dangerous areas in an intersection and dangerous areas where a pedestrian might cross, and they want to better control traffic,” said Rodrigo Orph.
GFN Thursday is our weekly celebration of games streaming from GeForce NOW. This week, we’re kicking off Legends of GeForce NOW, a special event that challenges gamers to show off the best Apex Legends: Legacy moments using one of the features that makes GeForce NOW unique — NVIDIA Highlights.
Let No Victory Go Unrecorded
That one time, you did that one thing, and you wish you were recording? That’s where NVIDIA Highlights comes in.
Highlights enables automatic video capture of key moments, clutch kills and match-winning plays. GeForce NOW adds Highlights support to additional games with the use of smart pattern and image recognition from our cloud servers to automatically detect and capture epic moments in games like Apex Legends.
As you play, Highlights are saved locally, without interrupting your flow. They can be reviewed in the GeForce NOW app’s Gallery, found in the top-left corner of the app. You can even stitch clips together for the ultimate Highlight montage, and share your victories with the world on Twitter, YouTube, Discord and more.
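One common way to make "capture what already happened" possible, sketched here purely for illustration (this is not NVIDIA's implementation), is a rolling buffer: recent frames are kept in a fixed-size queue, and when the game signals a key event, the buffered frames are saved as a clip.

```python
from collections import deque

BUFFER_SECONDS = 10
FPS = 60

class HighlightRecorder:
    """Toy auto-capture: a rolling frame buffer plus an event hook."""

    def __init__(self):
        # deque with maxlen silently drops the oldest frames
        self.buffer = deque(maxlen=BUFFER_SECONDS * FPS)
        self.clips = []

    def on_frame(self, frame):
        """Called every frame; old frames fall off automatically."""
        self.buffer.append(frame)

    def on_event(self, event_name):
        """Called when the game reports a highlight-worthy event."""
        self.clips.append((event_name, list(self.buffer)))

rec = HighlightRecorder()
for i in range(1000):          # simulate ~16 seconds of gameplay
    rec.on_frame(f"frame-{i}")
rec.on_event("clutch_kill")

name, clip = rec.clips[0]
print(name, len(clip))  # -> clutch_kill 600
```

Because the buffer is bounded, memory use stays constant no matter how long the session runs, which is why this kind of capture can run continuously without interrupting play.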
Highlights is available for GeForce NOW members using the PC and Mac apps. You’ll see a prompt to turn it on when you launch a supported game through the app, or you can toggle the feature from the in-game overlay by hitting Ctrl+G on PC or Cmd+G on Mac.
But that’s not all. The GeForce NOW in-game overlay also lets you manually capture gameplay, turn on Instant Replay, take screenshots, enable NVIDIA Freestyle and a whole lot more.
NVIDIA Highlights supports many games on GeForce NOW alongside Apex Legends, including popular titles like Counter-Strike: Global Offensive, Destiny 2, Fortnite, League of Legends, Rocket League and more.
Legends of GeForce NOW
We want to see your best moments in Apex Legends: Legacy, so we’re launching “Legends of GeForce NOW.” Submit your best Apex Legends highlights and share them on social media between now and June 24, 2021, and you’ll have a chance to win some big prizes.
Entering is easy: Upload a clip captured via Highlights to GFNLegends.com, and share on social media using #GFNLegends. Each entry increases the prize pool on the site and unlocks more (and bigger) giveaways.
NVIDIA SHIELD TV Pros
Razer Kishi mobile controllers
Keyboards and mice from Logitech and Razer
Gaming headsets from ASTRO
Google Pixelbook Gos
Apple iPads and MacBook Pros
Apex Coins for Apex Legends
But hurry, this legendary event only runs through June 24. Get those epic moments submitted today. We can’t wait to see your best Highlights!
Special Offer: 3-for-Free
Celebrate E3 with a special cloud gaming upgrade. For a limited time, new GeForce NOW members who register for a free account by Tuesday, June 15, will get three free days of Priority access, while supplies last.1
GeForce NOW Priority Memberships, named for their priority access to gaming sessions, provide members front-of-the-line access to their gaming rigs in the cloud, ahead of free members.
In addition, Priority members get extended session lengths and RTX ON for beautifully ray-traced graphics and DLSS in supported games.
Following the three-day promo, memberships are available at $9.99 a month. A new annual membership option, at $99.99, provides the best value.
Newest Additions of the Week
GFN Thursday wouldn’t be complete without new games, starting with a day-and-date release from Tripwire.
We can’t wait to see all the Apex Legends highlights in the Legends of GeForce NOW event! Rise to the challenge and enter here.
1 Register for a free GeForce NOW account between Thursday, June 10, 2021 at 12:01 a.m. Pacific time and Tuesday, June 15, 2021 at 11:59 p.m. Pacific time to be eligible. On or before Friday, June 18, 2021 eligible members will receive an email with redemption instructions. Supplies are limited, only available in NVIDIA-operated regions, and will be provided on a first-come, first-served basis.