No Pain, No Grain: Autodesk VRED Accelerates Design Workflows with AI Denoising and Real-Time Rendering

Automotive design has a new driver behind the wheel: NVIDIA RTX.

Autodesk announced that the latest release of VRED 2021, its automotive 3D visualization software, now supports NVIDIA Quadro RTX, letting designers tap into powerful GPU technology to create and design like never before.

Autodesk VRED, which previously rendered only on CPUs, now leverages RTX technology to meet the high demands of its users, providing interactive ray tracing and AI-powered denoising. Users get immediate visual feedback, letting them see how a vehicle’s aesthetics interact with different environments in real time.

By transforming massive amounts of digital design and engineering data into photorealistic renders in VRED, professionals can make better-informed decisions throughout the design and development process, allowing them to bring products to market faster.

Design with Real-Time Rendering, Deliver Real-Time Results

Autodesk VRED enables users to create digital prototypes so they can gain insight into how vehicles will look and perform. To effectively guide design decisions, these digital prototypes need to look and behave as much like the real thing as possible.

“Real-time ray tracing has a huge impact on design visualization, which is why Autodesk is making our industry-leading VRED 3D visualization software even more powerful by embracing NVIDIA Quadro RTX GPUs in the latest release,” said Thomas Heermann, associate vice president of Automotive + Conceptual Design at Autodesk.

With RTX accelerating VRED, professionals can create product presentations, design reviews and virtual prototypes with photorealistic details in real time. VRED users can also achieve physically accurate scalable rendering on GPU, and they can connect several NVIDIA RTX Servers to speed up real-time and offline rendering performance.

Designing the Fastest Trains with the Fastest Tools

Alstom, a leading company in the transportation industry, designs and builds locomotives, including France’s TGV, one of the world’s fastest trains. While many automotive designers are adopting VR for their design workflows, Alstom pushes the graphics envelope further to bring massive train models to life for real-time design reviews.

“With NVIDIA Quadro RTX and VRED GPU ray tracing, we are able to achieve a significant performance increase over CPU ray tracing, accelerating our renderings from minutes to milliseconds,” said Fabien Normand, virtual reality expert at Alstom. “With RTX Server scalability and Autodesk VRED, we can now spend less time optimizing our models for ray-tracing performance and more time designing. The results look amazing on our 32-million-pixel stereoscopic power wall.”

AI Denoiser Delivers Noise-Free Images in Real Time

In addition to real-time ray tracing and rendering, VRED uses RTX to deliver AI denoising.

By combining NVIDIA’s AI-accelerated denoiser with VRED’s scalable ray tracing, designers can remove the last bit of grain in their renders and create noise-free, interactive images in real time.

This workflow provides users with immediate visual feedback so they can see and interact with their latest designs, allowing them to explore different variables like light, materials, viewing angle and shadows.

Designers can now see how a vehicle’s aesthetics will interact in different environments, which is critical to ensure automotive prototypes meet certain requirements before the design is finalized.

“Professional VRED users demand the performance Quadro RTX provides for both photorealism and real-time interactivity for design reviews and exploration on desktop, large-scale review environments and in VR,” said Heermann. “Coupled with NVIDIA Quadro RTX 6000 and RTX 8000 GPUs, our latest release of VRED delivers performance improvements that make a difference to our customers.”

Register for GTC Digital to learn more about the latest release of Autodesk VRED 2021.

Full Throttle: Altair Accelerates Engineering Simulations with NVIDIA GPUs

Whether analyzing fluid dynamics or speccing performance, engineers need to create high-quality simulations long before a single physical prototype takes shape.

To help engineers gain deeper insights into their designs, two products from engineering software company Altair — Altair AcuSolve and Thea Render — now offer enhanced support for NVIDIA GPUs, delivering significant speedups.

In addition to GPU support, Altair announced NVIDIA RTX Server validation of Altair ultraFluidX, an aerodynamics CFD software, as well as Altair nanoFluidX, a particle-based CFD software. A powerful reference design, RTX Server lets engineers apply high performance computing to simulate physics and iterate on designs, with GPU acceleration for both rendering and computer-aided engineering simulation.

AcuSolve Enhances CAE Workflows

AcuSolve is a general-purpose CFD solver that helps engineers simulate a design’s fluid flow, turbulence and heat transfer. With NVIDIA GPUs, AcuSolve users can run simulations up to 4x faster than on a CPU-only configuration.

Blog images courtesy of Altair.

Thea Render Brings Design to Life

Thea Render, a powerful 3D rendering and animation tool, enables engineers to visualize realistic previews of their work. The CUDA-based renderer can now deliver ray tracing with the NVIDIA OptiX AI-accelerated denoiser.

In addition, Altair Inspire Studio design software with Thea Render and visualization app ParaView both leverage GPU-accelerated AI to reduce the time it takes to render high-quality, noiseless images.

Altair applications use CUDA, NVIDIA’s parallel computing platform and programming model, to gain significant increases in speed and throughput. This lets engineers freely and quickly explore designs and make decisions based on more accurate results.

RTX Server Accelerates Computations Overnight

Running design validation workloads on RTX Server similarly enhances GPU-based CFD solvers, which are software “engines” that use advanced computational algorithms to predict physical performance.

With RTX Server powering Altair ultraFluidX and nanoFluidX, users can design new models faster and more efficiently during the day on NVIDIA Quadro virtual workstations. The same RTX Server can then be used to complete large-scale CFD simulations overnight instead of taking days to compute the data.

When engineers arrive the next morning, the simulations are ready for them to analyze. This allows teams to understand the performance, behavior and mechanics of their models earlier in the design process — all while using a more energy-efficient computing system that’s accessible on premises or in the cloud.

Learn more about Altair and NVIDIA RTX Server for engineering.

Working Remotely: Connecting to Your Office Workstation

With so many people working from home amid the COVID-19 outbreak, staying productive can be challenging.

At NVIDIA, some of us have RTX laptops and remote-working capabilities powered by our virtual GPU software via on-prem servers and the cloud. To help support the many other businesses with GPUs in their servers, we recently made vGPU licenses free for up to 500 users for 90 days so they can explore their virtualization options.

But many of us still require access to physical Quadro desktop workstations due to specific hardware configurations or data requirements. And we know this situation is hardly unique.

Many designers, engineers, artists and architects have Quadro RTX mobile workstations that are on par with their desktop counterparts, which helps them stay productive anywhere. However, a vast number of professionals are cut off from their office-based workstations — machines with multiple high-end GPUs, large memory and storage, as well as their applications and data.

These workstations are critical for keeping everything from family firms to multinational corporations going. And this has pushed IT teams to explore different ways to address the challenges of working from home by connecting remotely to an office workstation.

Getting Started: Tools for Remote Connections

Several publicly available remote-working tools can help you get going quickly. For details on features and licensing, contact the respective providers.

Managing Access, Drivers and Reboots

Once you’re up and running, keep these considerations in mind:

Give yourself a safety net when working on a remote system 

There are times when your remote-access tools can stop working, so it’s a good idea to have a safety net. Always install a VNC server on the machine (https://www.tightvnc.com/, https://www.realvnc.com/en/ or others), no matter what remote access tool you use. It’s also a good idea to enable access to Microsoft Remote Desktop as another option. These run quietly in the background, but are ready if you need them in an emergency.

Updating your driver remotely

We recommend using a VNC connection to upgrade your drivers. Changing the driver often changes the parts of the driver that remote access tools rely on, so you can lose the connection mid-update. VNC doesn’t hook into the driver at a low level, so it keeps working while the old driver is swapped out for the new one. Once the driver is updated, you can go back to your other remote access tools.

Rebooting your machine remotely

Normally you can reboot from the Windows menus. Give the system a few minutes to restart and then log back in. If your main remote-working tools have stopped functioning, try a VNC connection. You can also restart from a PowerShell window or command prompt on your local machine with the command: shutdown /r /t 0 /m \\[machine-name]
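
If you’d rather script the reboot-and-wait cycle than watch the clock, the sketch below wraps that same shutdown command and then polls until the machine answers again. Treat it as a rough helper, assuming Python on a Windows machine on the same network; remote_reboot and “my-office-pc” are placeholder names.

    import subprocess
    import time

    def remote_reboot(machine: str) -> None:
        """Issue the Windows remote-reboot command, then wait for the host to return."""
        # /r = reboot, /t 0 = no delay, /m \\name = target machine
        subprocess.run(["shutdown", "/r", "/t", "0", "/m", f"\\\\{machine}"], check=True)
        time.sleep(120)  # give the system a few minutes to restart
        # Poll with ping (Windows syntax: -n sets the echo count) until it responds.
        while subprocess.run(["ping", "-n", "1", machine],
                             capture_output=True).returncode != 0:
            time.sleep(15)

    remote_reboot("my-office-pc")  # placeholder machine name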

App-Specific Resources

Several software makers with applications for professionals working in the manufacturing, architecture, and media and entertainment industries have published instructions on using their applications from home.

Where to Get Help

Given the inherent variability in working from home, there’s no one-size-fits-all solution. If you run into technical issues and have questions, feel free to contact us at desktop-remoting@nvidia.com. We’ll do our best to help.

NVIDIA Expands Free Access to GPU Virtualization Software to Support Remote Workers

In challenging times, our strength comes from working together. With many companies needing to quickly support employees now working remotely, NVIDIA is expanding our free, 90-day virtual GPU software evaluation from 128 to 500 licenses.

With vGPU software licenses, companies can use their on-premises NVIDIA GPUs to provide accelerated virtual infrastructure so people can work and collaborate from anywhere. Companies can also temporarily repurpose NVIDIA GPUs being used on other projects to support their remote workers.

Every organization is working hard to address these needs: Healthcare providers are supporting care from new locations. Schools are expanding their virtual classrooms. Agencies are coordinating critical services.

Whether supporting financial professionals working with data on multiple screens, scientists conducting research, or designers working in graphics-intensive applications, enterprises are faced with different workloads that have different requirements.

NVIDIA offers a variety of customized vGPU software to meet these diverse needs. All three tiers of the company’s specialized vGPU software are available through the expanded free licensing:

  • NVIDIA GRID software delivers responsive VDI by virtualizing systems and applications for knowledge workers.
  • NVIDIA Quadro Virtual Data Center Workstation software provides workstation-class performance for creators using high-end graphics applications.
  • NVIDIA Virtual Compute Server software accelerates server virtualization with GPUs to power the most compute-intensive workflows, such as AI, deep learning and data science on a virtual machine.

Virtualized Performance, Enterprise Security and Broad Ecosystem Support

In addition to providing high performance and reducing latency for remote workers, NVIDIA vGPU software ensures protection for sensitive data and digital assets, which remain in the data center and aren’t saved to local client devices. This is an important security requirement for remote work across many industries, including visual effects and design, as well as for research and development.

NVIDIA vGPU software is certified on a broad ecosystem of hypervisors, platforms, user applications and management software to help IT teams quickly scale out support for remote workers.

Companies can deploy virtual workstations, compute and VDI from their on-prem data centers by installing the vGPU software licenses on all NVIDIA GPUs based on the Pascal, Volta and Turing architectures, including NVIDIA Quadro RTX 6000 and RTX 8000 GPUs, and NVIDIA M10 and M60 GPUs.

Get the Virtual GPU Evaluation.

NVIDIA is also providing genomics researchers studying COVID-19 free access to Parabricks software for 90 days. See our post on Parabricks to learn more.

Netherlands Cancer Institute Uses Virtualization to Enhance Patient Care, Advance Cancer Research

Cancer doesn’t take nights off. To help beat the disease, the Netherlands Cancer Institute, rated one of the top 10 comprehensive cancer centers in the world, is using virtualization to have its IT infrastructure work the late shift, as well.

The NKI, which consists of both research facilities and cancer clinics, has dual goals: to accelerate research, while also improving the efficiency and productivity of its physicians.

Whether analyzing images to diagnose breast cancer or running DNA computations with large datasets, the institute relies on innovative, flexible technology to drive new discoveries in cancer research.

To keep up with the increasing needs of doctors and researchers, NKI upgraded to a state-of-the-art, software-defined infrastructure using NVIDIA virtual GPU technology powered by NVIDIA T4 GPUs and Hewlett Packard Enterprise DL380 Gen10 servers.

During the day, this virtual desktop infrastructure provides healthcare professionals with fast, flexible and secure access to patient data. At night, researchers use the same VDI platform to run computationally intensive GPU workloads.

With this high-performance yet flexible IT infrastructure, healthcare professionals can spend more time focusing on patients, while researchers can advance breakthroughs in cancer treatment.

Virtual Desktops Enhance Security and Mobility

Before NKI moved to a virtualized platform, doctors would handle patient data the old-fashioned way: working on physical desktops that stored local apps and information. The doctors would need to go to computer labs, manually log into the system, and open applications up to 20 times a day, a tedious and time-consuming process.

Maintenance and security were also challenging. PCs had been stolen a few times, so NKI wanted better security to protect sensitive patient information and research data.

With VDI, physicians can move around the hospital more freely. With workstations available in each room, they can log into a virtual desktop session in one part of the hospital, then easily move to another area and get right back into their session with a swipe of their badge.

This ability to quickly switch allows NKI’s healthcare professionals to work much faster and more efficiently, providing clinicians with greater flexibility to move from one patient to another.

The VDI platform also stores all data in a safe, central environment rather than on individual devices. Doctors and nurses can securely access apps and information on mobile phones and tablets, whether at home or on the road.

vGPUs Bring More Power, Speed to Research 

NKI’s new infrastructure is made up of 78 HPE servers, each with three NVIDIA T4 GPUs, to provide doctors and researchers with massive GPU power for computations like DNA or image analysis.

Its clinics are busiest during the day. But most users log off in the evening, freeing up a majority of the GPU resources for cancer research. With virtualization, researchers can repurpose the T4 GPUs that aren’t in use for running complex compute workloads at night.

“Before, it would take a week for researchers to get their photos analyzed at an imaging facility,” said Roel Sijstermans, IT manager at NKI. “With the new virtual GPU infrastructure, we can send the pictures in the evening and the images will be finished by the morning.”

With data being processed overnight, researchers can analyze breast cancer tumors or increase image quality at a faster rate than before, giving doctors more time to plan their care for patients.

Register now for GTC Digital to learn more about the Netherlands Cancer Institute and how virtualization is changing the future of healthcare.

Virtual Peril: Booz Allen Hamilton Builds Immersive VR for Hazardous Job Training

How do you train someone to deactivate a landmine, work with pathogens in a lab or triage disaster victims with head injuries? Very, very carefully.

Typically, training for hazardous, high-risk jobs is an expensive production involving dozens of individuals playing roles, supervisors taking notes and an environment that doesn’t account for unexpected issues that can arise in the chaos of actual events.

A team of technologists at management and technology consulting and engineering services firm Booz Allen Hamilton is turning to immersive virtual reality environments, powered by NVIDIA GPUs, to bring a dose of realism to hazardous job training and better prepare workers for the stress of the roles.

VR allows the Booz Allen team to replicate the numerous stimuli of a life-threatening situation, with all of the sights, noises and traumatic emotions that can come with such moments.

VR for Bang Up Jobs

Traditional hazardous duty training is expensive and limiting, according to Sandra Marshall, chief technologist at Booz Allen. “There can be high travel costs, and the ability to collect metrics is less than ideal,” she said. “There’s a real need for improvement in training for high-risk jobs.”

Marshall and Elyse Heob, lead technologist at Booz Allen, are leading a team that’s applying immersive technology to hazardous job training. Their latest effort is in medical triage for battlefield settings.

The application they’ve built exposes trainees to virtual peril, with the aim of saving the expense of assembling complex live trainings, as well as making trainees much better at their jobs.

Emergency medicine application. Image courtesy of Booz Allen Hamilton.

This type of model is applicable to many more settings. Whether inside a biological lab or a field filled with unexploded landmines, virtual training has the potential to transform numerous jobs.

“It’s much easier to give routine exposure of these things in virtual reality,” Marshall said. “It’s really a life-saving tool.”

Virtual laboratory application. Image courtesy of Booz Allen Hamilton.

Goldmine of Data

Heob noted that training in a virtual setting triggers all of the learning systems in trainees’ bodies — emotional, cognitive, experiential and behavioral — simultaneously.

That feeds another goal of their work, which isn’t simply to better prepare people for managing stressful stimuli, but also to collect more data on how they respond to it. That data will help to refine and personalize their applications, making both the training and the trainees more resilient in the process.

Having more insight into trainees’ reactions can help trainers to gauge whether trainees’ stress levels are too high, and if they might not be ready to be in a particular environment, Heob said.

For instance, the information gathered during a virtual training might indicate that a prospective triage worker is traumatized by witnessing serious injuries, and thus might not be best suited for that role.

Reusable Training Components

To build these applications, Marshall and Heob’s team has been combining NVIDIA GPUs with an array of VR technologies — tethered VR platforms, biometric sensors, and eye and object trackers — and developing their apps on the open-platform Unity game-development engine.

In doing so, the Booz Allen team is building a library of reusable multi-player components that function much like those in video games. Each component is essentially a feature — one tracks reaction times, another scores responses, etc. — that can be incorporated into any training app the team develops.

“Different industries may want similar features,” said Marshall, “so we configure those tools so they can be reused.”

Counter-IED application. Image courtesy of Booz Allen Hamilton.

Wait Until They Add AI

Eventually, Marshall envisions layering deep learning-infused AI over the apps and teaching them to learn from each training session.

“I’d like to see immersive technology radically transform the way enterprises do training so we can accumulate training data from users over time and develop more intelligent applications and informed policies,” said Marshall.

While the defense market is currently most eager to adopt immersive VR technology, Heob believes it will one day benefit any job in which people need to be prepared for emergencies or hazardous environments.

Ultimately, Marshall said, it’s all about how the training applications can address real-world training deficiencies with the data they collect. Analyzing reaction times and stress levels while tracking biometric and neurofeedback data can create targeted training to capitalize on the trainee’s strengths while improving areas of weakness.

“You’re going to have a better equipped, smarter workforce, and a safer one,” said Marshall. “And you’re going to have reduced costs for training.”

AI-Listers: Oscar-Nominated Irishman, Avengers Set Stage for AI Visual Effects

This weekend’s Academy Awards show features a twice-nominated newcomer to the Oscars: AI-powered visual effects.

Two nominees in the visual effects category, The Irishman and Avengers: Endgame, used AI to push the boundaries between human actors and digital characters — de-aging the stars of The Irishman and bringing the infamous villain Thanos to life in Avengers.

Behind this groundbreaking, AI-enhanced storytelling are VFX studios Industrial Light & Magic and Digital Domain, which use NVIDIA Quadro RTX GPUs to accelerate production.

AI Time Machine

From World War II to a nursing home in the 2000s, and every decade in between, Netflix’s The Irishman tells the tale of hitman Frank Sheeran through scenes from different times in his life.

But all three leads in the film — Robert De Niro, Al Pacino and Joe Pesci — are in their 70s. A makeup department couldn’t realistically transform the actors back to their 20s and 30s. And director Martin Scorsese was against using the typical motion capture markers or other intrusive equipment that gets in the way of raw performances during filming.

To meet this requirement, ILM developed a new three-camera rig to capture the actors’ performances on set — using the director’s camera flanked by two infrared cameras to record 3D geometry and textures. The team also developed software called ILM Facefinder that used AI to sift through thousands of images of the actors’ past performances.

The tool located frames that matched the camera angle, framing, lighting and expression of the scene being rendered, giving ILM artists a relevant reference to compare against every frame in the shot. These visual references were used to refine digital doubles created for each actor, so they could be transformed into the target age for each specific scene in the film.
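
ILM hasn’t published Facefinder’s internals, but the behavior described above, ranking a library of archived frames by similarity to the shot being rendered, maps onto a standard embedding-based search. Here’s a toy sketch of that pattern, assuming the embeddings come from some learned image encoder that isn’t specified here:

    import numpy as np

    def top_matches(query: np.ndarray, library: np.ndarray, k: int = 5) -> np.ndarray:
        """Return indices of the k library frames most similar to the query.

        query:   (d,) embedding of the frame being rendered.
        library: (n, d) embeddings of archived reference frames.
        """
        # Cosine similarity: normalize both sides, then take dot products.
        q = query / np.linalg.norm(query)
        lib = library / np.linalg.norm(library, axis=1, keepdims=True)
        scores = lib @ q
        return np.argsort(scores)[::-1][:k]

In production, an artist would still inspect each retrieved reference frame against the digital double, as described above.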

“AI and machine learning are becoming a part of everything we do in VFX,” said Pablo Helman, VFX supervisor on The Irishman at ILM. “Paired with the NVIDIA Quadro RTX GPUs powering our production pipeline, these technologies have us excited for what the next decade will bring.”

Building Better VFX Villains

The highest-grossing film of all time, Marvel’s Avengers: Endgame included over 2,500 visual effects shots. VFX teams at Digital Domain used machine learning to animate actor Josh Brolin’s performance onto the digital version of the film franchise’s villain, the mighty Thanos.

A machine learning system called Masquerade was developed to take low resolution scans of the actor’s performance and facial movements, and then accurately transfer his expressions onto the high-resolution mesh of Thanos’ face. The technology saves time for VFX artists who would otherwise have to painstakingly animate the subtle facial movements manually to generate a realistic, emoting digital human.

“Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology,” said Darren Hendler, head of Digital Humans at Digital Domain. “We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”

RTX It in Post: Studios, Apps Adopt AI-Accelerated VFX 

ILM and Digital Domain are just two of a growing set of visual effects studios and apps adopting AI tools accelerated by NVIDIA RTX GPUs.

In HBO’s The Righteous Gemstones series, lead actor John Goodman looks 30 years younger than he is. This de-aging effect was achieved with Shapeshifter, custom software that uses AI to analyze face motion — how the skin stretches and moves over muscle and bone.

VFX studio Gradient Effects used Shapeshifter to transform the actor’s face in a process that, using NVIDIA GPUs, took weeks instead of months.

Companies such as Adobe, Autodesk and Blackmagic Design have developed RTX-accelerated apps that tackle other visual effects challenges with AI, including live-action scene depth reclamation, color adjustment, relighting and retouching, speed warp motion estimation for retiming, and upscaling.

Netflix Greenlights AI-Powered Predictions 

Offscreen, streaming services such as Netflix use AI-powered recommendation engines to provide customers with personalized content based on their viewing history, or a similarity index that serves up content watched by people with similar viewing habits.
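
Netflix doesn’t disclose its production models, but the similarity index described above follows the classic user-based collaborative filtering pattern. Here’s a deliberately tiny sketch with made-up viewing data:

    import numpy as np

    # Rows are users, columns are titles; 1 means the user watched the title.
    views = np.array([
        [1, 1, 0, 0, 1],
        [1, 1, 1, 0, 0],
        [0, 0, 1, 1, 0],
    ], dtype=float)

    def recommend(user: int, k: int = 2) -> np.ndarray:
        """Rank unseen titles by how heavily similar viewers watched them."""
        norms = np.linalg.norm(views, axis=1)
        sims = views @ views[user] / (norms * norms[user] + 1e-9)
        sims[user] = 0.0                    # ignore self-similarity
        scores = sims @ views               # similarity-weighted title popularity
        scores[views[user] > 0] = -np.inf   # exclude titles already watched
        return np.argsort(scores)[::-1][:k]

    print(recommend(0))  # titles user 0 hasn't seen, favored by similar viewers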

Netflix also customizes movie thumbnails to appeal to individual users, and uses AI to help optimize streaming quality at lower bandwidths. The company uses NVIDIA GPUs to accelerate its work with complex data models, enabling rapid iteration.

Rolling Out the Red Carpet at GTC 2020

Top studios including Lucasfilm’s ILMxLAB, Magnopus and Digital Domain will be speaking at NVIDIA’s GPU Technology Conference in San Jose, March 23-26.

Check out the lineup of media and entertainment talks and register to attend. Early pricing ends Feb. 13.

Feature image courtesy of Industrial Light & Magic. © 2019 NETFLIX


For 12th Year Running, NVIDIA Quadro Powers Every Oscar-Nominated Film for Best Visual Effects

For the 12th consecutive year, NVIDIA Quadro GPUs have powered the stunning visuals behind every Academy Award nominee for Best Visual Effects.

The red carpet will roll out for the five VFX-nominated films at the 92nd annual Academy Awards on Sunday, Feb. 9. They are:

  • Avengers: Endgame
  • The Irishman
  • The Lion King
  • 1917
  • Star Wars: The Rise of Skywalker

For over a decade, Quadro has been behind award-winning graphics in films, bringing the latest advancements in visual effects to studios across the industry.

Avengers: Endgame amazed audiences with over 2,500 stunning visual effects shots, breaking box office records along the way.

The visual effects development team at Digital Domain brought innovation to the film like never before, using custom machine learning technology running on NVIDIA Quadro GPUs to animate Josh Brolin’s performance onto Marvel’s infamous villain, Thanos.

“For Digital Domain’s work on Avengers: Endgame, we built a machine learning system that understood Josh Brolin’s facial movements, and we taught the system how to transfer Josh to Thanos’ face,” said Darren Hendler, head of Digital Humans at Digital Domain. “Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology. We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”

The innovators at Industrial Light & Magic also gave machine learning a starring role in Martin Scorsese’s The Irishman. The markerless de-aging of Robert De Niro, Al Pacino and Joe Pesci enabled the film to tell a story that spanned decades while allowing the actors to perform without facial markers or helmet cams.

With the new NVIDIA Quadro RTX GPUs, artists and studios can take advantage of the latest AI-driven tools and techniques. Quadro RTX accelerates many custom AI-based tools and commercial creative applications from companies like Adobe, Autodesk, Blackmagic Design and more, so professionals can take visual effects to the next level with enhanced capabilities.

Disney’s visually stunning remake of The Lion King also broke new ground by transporting live-action crews to the CG environment in virtual reality. Using Quadro GPUs, VFX supervisor Rob Legato and virtual production supervisor Ben Grossmann brought traditional filmmaking techniques into virtual worlds to create the most cinematic CG movie yet made.

“Rendering is the killer of fast turnaround and iterative creativity. You really need global illumination and ray tracing for real-time feedback and micro adjustments, just like on a live action film stage, and that’s what the NVIDIA Quadro graphics cards give you,” said Robert Legato, VFX supervisor on The Lion King.

Presenting a heart-pounding, up close and personal view of World War I, 1917 features a single-shot perspective that follows the mission of two soldiers delivering an important message to the front line. MPC used NVIDIA GPUs to create what looks like the longest single visual effects shot ever put on the big screen.

Bringing an epic saga to a close, Lucasfilm’s Star Wars: The Rise of Skywalker gave audiences iconic lightsaber duels, monumental starship battles, and a new era of heroes and villains. Industrial Light & Magic used NVIDIA GPUs to beautifully blend VFX with practical effects, creating some of the most incredible worlds and battles ever seen in film.

While only one visual effects team will take to the podium to accept an Oscar, thousands of artists around the world are bringing new worlds and characters to life every day. From powering creative applications to enabling new AI techniques, NVIDIA RTX accelerates the workstations and servers the film industry is using to define the future of visual computing.

NVIDIA engineering is also no stranger to Academy Awards — seven NVIDIANs are past recipients of Sci-Tech Awards.

See How Oscar-Nominated Visual Effects Are Created

Get behind the scenes of the world’s most advanced visual effects at the NVIDIA GPU Technology Conference in San Jose, March 23-26.

Come see Ben Grossmann, CEO of Magnopus and virtual production supervisor of Disney’s The Lion King, present the latest advancements in virtual production techniques.

Doug Roble, senior director of Software R&D at Digital Domain, will take GTC attendees on a journey featuring the most photorealistic real-time digital human ever made.

And learn from other media and entertainment industry luminaries at GTC, including Vicki Dobbs Beck from Lucasfilm’s ILMxLAB, Max Liani from Pixar, David Crabtree from DNEG and David Morin from Epic Games.

Feature image credit: © 2019 NETFLIX

What Is AI Upscaling?

Putting on a pair of prescription glasses for the first time can feel like instantly snapping the world into focus.

Suddenly, trees have distinct leaves. Fine wrinkles and freckles show up on faces. Footnotes in books and even street names on roadside signs become legible.

Upscaling — converting lower resolution media to a higher resolution — offers a similar experience.

But with new AI upscaling techniques, the enhanced visuals look more crisp and realistic than ever.

Why Is Upscaling Necessary? 

One-third of television-owning households in the U.S. have a 4K TV, known as ultra-high definition. But much of the content people watch on popular streaming services like YouTube, HBO and Netflix is only available at lower resolutions.

4K TVs can muddy visuals by having to stretch lower-resolution images to fit their screen. AI upscaling makes lower-resolution images fit with unrivaled crispness.

Standard definition video, widely used in TVs until the 1990s, was just 480 pixels high. High-definition TVs bumped that up to 720 or 1080 pixels, which remains the most common resolution for content on TV and the web.

Owners of ultra-HD displays get the most out of their screens when watching 4K-mastered content. But when watching lower-resolution content, the video must be upscaled to fill out the entire display.

For example, 1080p images, known as full HD, have just a quarter as many pixels as 4K images. To display a 1080p shot edge to edge on a 4K screen, the picture has to be stretched to match the TV’s pixel grid.
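
The arithmetic behind that one-quarter figure is quick to check:

    full_hd = 1920 * 1080     # 2,073,600 pixels
    uhd_4k = 3840 * 2160      # 8,294,400 pixels
    print(uhd_4k // full_hd)  # 4 -- each full HD pixel must cover a 2x2 block of 4K pixels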

Upscaling is done by the streaming device being used — such as a smart TV or streaming media player. But typically, media players use basic upscaling algorithms that are unable to significantly improve high-definition content for 4K TVs.

What Is Basic Upscaling? 

Basic upscaling is the simplest way of stretching a lower resolution image onto a larger display. Pixels from the lower resolution image are copied and repeated to fill out all the pixels of the higher resolution display.

Filtering is applied to smooth the image and round out unwanted jagged edges that may become visible due to the stretching. The result is an image that fits on a 4K display, but can often appear muted or blurry.
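
To illustrate the duplicate-then-filter approach described above (a generic sketch, not the exact scaler any particular device ships), here’s a minimal NumPy version: nearest-neighbor duplication followed by a 3x3 box filter. The helper name basic_upscale is hypothetical:

    import numpy as np

    def basic_upscale(img: np.ndarray, factor: int = 2) -> np.ndarray:
        """Copy each pixel into a factor-by-factor block, then smooth the result."""
        # Nearest-neighbor stretch: repeat rows, then columns.
        up = img.repeat(factor, axis=0).repeat(factor, axis=1)
        # 3x3 box filter to soften the blocky edges the duplication creates.
        pad = np.pad(up, ((1, 1), (1, 1), (0, 0)), mode="edge")
        h, w = up.shape[:2]
        smoothed = sum(
            pad[i:i + h, j:j + w].astype(np.float32)
            for i in range(3) for j in range(3)
        ) / 9.0
        return smoothed.astype(img.dtype)

    hd_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # a full HD frame
    uhd_frame = basic_upscale(hd_frame)                   # shape (2160, 3840, 3)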

What Is AI Upscaling? 

Traditional upscaling starts with a low-resolution image and tries to improve its visual quality at higher resolutions. AI upscaling takes a different approach: Given a low-resolution image, a deep learning model predicts a high-resolution image that would downscale to look like the original, low-resolution image.

To predict the upscaled images with high accuracy, a neural network model must be trained on countless images. The deployed AI model can then take low-resolution video and produce incredible sharpness and enhanced details no traditional scaler can recreate. Edges look sharper, hair looks scruffier and landscapes pop with striking clarity.
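
NVIDIA hasn’t published the SHIELD model itself, but the recipe this section describes, downscaling high-resolution frames and training a network to reconstruct them, is standard super-resolution practice. Below is a minimal PyTorch sketch under that assumption; TinySRNet and train_step are hypothetical names, and a production model would be far larger:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinySRNet(nn.Module):
        """Upscale with bicubic interpolation first, then learn the missing detail."""
        def __init__(self, scale: int = 2):
            super().__init__()
            self.scale = scale
            self.body = nn.Sequential(
                nn.Conv2d(3, 64, kernel_size=9, padding=4), nn.ReLU(),
                nn.Conv2d(64, 32, kernel_size=5, padding=2), nn.ReLU(),
                nn.Conv2d(32, 3, kernel_size=5, padding=2),
            )

        def forward(self, lr):
            base = F.interpolate(lr, scale_factor=self.scale, mode="bicubic",
                                 align_corners=False)
            return base + self.body(base)  # predict the residual detail

    def train_step(model, optimizer, hr_batch, scale=2):
        # Build training pairs by downscaling ground-truth high-res frames.
        lr_batch = F.interpolate(hr_batch, scale_factor=1 / scale, mode="bicubic",
                                 align_corners=False)
        loss = F.l1_loss(model(lr_batch), hr_batch)  # match the original frame
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()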

AI Upscaling on NVIDIA SHIELD TV

The NVIDIA SHIELD TV is the first streaming media player to feature AI upscaling. It can upscale 720p or 1080p HD content to 4K at up to 30 frames per second in real time.

Trained offline on a dataset of popular TV shows and movies, the model uses SHIELD’s NVIDIA Tegra X1+ processor for real-time inference. AI upscaling makes HD video content for top apps including HBO, Hulu, Netflix, Prime Video and YouTube appear sharper on 4K TVs, creating a more immersive viewing experience.

NVIDIA SHIELD owners can toggle between basic and AI-enhanced modes in their device settings. A demo mode allows users to see a side-by-side comparison between regular content and AI-upscaled visuals. AI upscaling can be adjusted for high, medium or low detail enhancement — adjusting the confidence level of the neural network for detail prediction.

Learn more about upscaling on the NVIDIA SHIELD TV.
