No Pain, No Grain: Autodesk VRED Accelerates Design Workflows with AI Denoising and Real-Time Rendering

Automotive design has a new driver behind the wheel: NVIDIA RTX.

Autodesk announced that the latest release of VRED 2021, its automotive 3D visualization software, now supports NVIDIA Quadro RTX, letting designers tap into powerful GPU technology to create and design like never before.

Autodesk VRED, which was previously limited to CPU rendering, now leverages RTX technology to meet customers’ demanding visualization needs with interactive ray tracing and AI-powered denoising. This gives users immediate visual feedback on how a vehicle’s aesthetics will interact with different environments in real time.

By transforming massive amounts of digital design and engineering data into photorealistic renders in VRED, professionals can make better-informed decisions throughout the design and development process, allowing them to bring products to market faster.

Design with Real-Time Rendering, Deliver Real-Time Results

Autodesk VRED enables users to create digital prototypes so they can gain insight into how vehicles will look and perform. To be effective in guiding design decisions, the digital prototypes need to look and behave as close as possible to the real thing.

“Real-time ray tracing has a huge impact on design visualization, which is why Autodesk is making our industry-leading VRED 3D visualization software even more powerful by embracing NVIDIA Quadro RTX GPUs in the latest release,” said Thomas Heermann, associate vice president of Automotive + Conceptual Design at Autodesk.

With RTX accelerating VRED, professionals can create product presentations, design reviews and virtual prototypes with photorealistic details in real time. VRED users can also achieve physically accurate scalable rendering on GPU, and they can connect several NVIDIA RTX Servers to speed up real-time and offline rendering performance.

Designing the Fastest Trains with the Fastest Tools

Alstom, a leading company in the transportation industry, designs and builds trains and locomotives, including France’s TGV, one of the world’s fastest trains. While many automotive designers are adopting VR for their design workflows, Alstom pushes the graphics envelope further to bring massive train models to life for real-time design reviews.

“With NVIDIA Quadro RTX and VRED GPU ray tracing, we are able to achieve a significant performance increase over CPU ray tracing, accelerating our renderings from minutes to milliseconds,” said Fabien Normand, virtual reality expert at Alstom. “With RTX Server scalability and Autodesk VRED, we can now spend less time optimizing our models for ray-tracing performance and more time designing. The results look amazing on our 32-million-pixel stereoscopic power wall.”

AI Denoiser Delivers Noise-Free Images in Real Time

In addition to real-time ray tracing and rendering, VRED uses RTX to deliver AI denoising.

By combining NVIDIA’s AI-accelerated denoiser with VRED’s scalable ray tracing, designers can remove the last bit of grain in their renders and create noise-free, interactive images in real time.

This workflow provides users with immediate visual feedback so they can see and interact with their latest designs, allowing them to explore different variables like light, materials, viewing angle and shadows.

Designers can now see how a vehicle’s aesthetics will interact in different environments, which is critical to ensure automotive prototypes meet certain requirements before the design is finalized.

“Professional VRED users demand the performance Quadro RTX provides for both photorealism and real-time interactivity for design reviews and exploration on desktop, large-scale review environments and in VR,” said Heermann. “Coupled with NVIDIA Quadro RTX 6000 and RTX 8000 GPUs, our latest release of VRED delivers performance improvements that make a difference to our customers.”

Register for GTC Digital to learn more about the latest release of Autodesk VRED 2021.

Full Throttle: Altair Accelerates Engineering Simulations with NVIDIA GPUs

Whether analyzing fluid dynamics or speccing performance, engineers need to create high-quality simulations long before a single physical prototype takes shape.

To help engineers gain deeper insights into their designs, two products from engineering software company Altair — Altair AcuSolve and Thea Render — now offer enhanced support for NVIDIA GPUs, delivering substantial performance gains and simulation speedups.

In addition to GPU support, Altair announced NVIDIA RTX Server validation of Altair ultraFluidX, an aerodynamics CFD software, as well as Altair nanoFluidX, a particle-based CFD software. A powerful reference design, RTX Server allows engineers to use high performance computing for simulating physics and iterating designs — all with GPU-accelerated rendering and shorter computer-aided engineering simulation times.

AcuSolve Enhances CAE Workflows

AcuSolve is general-purpose CFD software that helps engineers simulate a design’s fluid flow, turbulence and heat transfer. With NVIDIA GPUs, AcuSolve users can perform simulations up to 4x faster than on a CPU-only configuration.

Blog images courtesy of Altair.

Thea Render Brings Design to Life

Thea Render, a powerful 3D rendering and animation tool, enables engineers to visualize realistic previews of their work. The CUDA-based renderer can now deliver ray tracing with the NVIDIA OptiX AI-accelerated denoiser.

In addition, Altair Inspire Studio design software with Thea Render and visualization app ParaView both leverage GPU-accelerated AI to reduce the time it takes to render high-quality, noiseless images.

Altair applications use CUDA, NVIDIA’s parallel computing platform and programming model, to gain significant increases in speed and throughput. This lets engineers freely and quickly explore designs and make decisions based on more accurate results.

RTX Server Accelerates Computations Overnight

Running design validation workloads on RTX Server similarly enhances GPU-based CFD solvers, which are software “engines” that use advanced computational algorithms to predict physical performance.

With RTX Server powering Altair ultraFluidX and nanoFluidX, users can design new models faster and more efficiently during the day on NVIDIA Quadro virtual workstations. The same RTX Server can then be used to complete large-scale CFD simulations overnight instead of taking days to compute the data.

When engineers arrive the next morning, the simulations are ready for them to analyze. This allows teams to understand the performance, behavior and mechanics of their models earlier in the design process — all while using a more energy-efficient computing system that’s accessible on premises or in the cloud.

Learn more about Altair and NVIDIA RTX Server for engineering.

Working Remotely: Connecting to Your Office Workstation

With so many people working from home amid the COVID-19 outbreak, staying productive can be challenging.

At NVIDIA, some of us have RTX laptops and remote-working capabilities powered by our virtual GPU software via on-prem servers and the cloud. To help support the many other businesses with GPUs in their servers, we recently made vGPU licenses free for up to 500 users for 90 days to explore their virtualization options.

But many still require access to physical Quadro desktop workstations due to specific hardware configurations or data requirements. And we know this situation is hardly unique.

Many designers, engineers, artists and architects have Quadro RTX mobile workstations that are on par with their desktop counterparts, which helps them stay productive anywhere. However, a vast number of professionals don’t have access to their office-based workstations — with multiple high-end GPUs, large memory and storage, as well as applications and data.

These workstations are critical for keeping everything from family firms to multinational corporations going. And this has pushed IT teams to explore different ways to address the challenges of working from home by connecting remotely to an office workstation.

Getting Started: Tools for Remote Connections

Several publicly available remote-working tools can help you get going quickly. For details on features and licensing, contact the respective providers.

Managing Access, Drivers and Reboots

Once you’re up and running, keep these considerations in mind:

Give yourself a safety net when working on a remote system 

Your remote-access tools can stop working at times, so it’s a good idea to have a safety net. Always install a VNC server on the machine (https://www.tightvnc.com/, https://www.realvnc.com/en/ or others), no matter which remote access tool you use. It’s also a good idea to enable access via Microsoft Remote Desktop as another option. These run quietly in the background, but are ready if you need them in an emergency.
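
To confirm those fallbacks are actually reachable before you need them, a quick port check helps. Here’s a minimal Python sketch; the hostname is a placeholder, and it assumes VNC and Remote Desktop are listening on their default ports (5900 and 3389).

```python
import socket

# Default ports: VNC usually listens on 5900, Microsoft Remote Desktop on 3389.
# Replace the hostname with your workstation's address; both values below are assumptions.
WORKSTATION = "office-workstation.example.com"
FALLBACK_PORTS = {"VNC": 5900, "Remote Desktop": 3389}

def is_listening(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, port in FALLBACK_PORTS.items():
        status = "reachable" if is_listening(WORKSTATION, port) else "NOT reachable"
        print(f"{name} ({port}): {status}")
```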

Updating your driver remotely

We recommend using a VNC connection to upgrade your drivers. Changing the driver often replaces the parts of the driver that remote access tools rely on, so you can lose the connection mid-update. VNC doesn’t hook into the driver at a low level, so it keeps working while the old driver is swapped out for the new one. Once the driver is updated, you can go back to your other remote access tools.

Rebooting your machine remotely

Normally you can reboot from the Windows menus. Give the system a few minutes to restart and then log back in. If your main remote-working tools have stopped functioning, try a VNC connection. You can also restart from a PowerShell window or command prompt on your local machine with the command: shutdown /r /t 0 /m \\[machine-name]
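
If you prefer to script that last step, here’s a minimal Python sketch that wraps the same shutdown command and then polls the Remote Desktop port until the workstation is reachable again. The machine name, port and timeout are placeholders, and it assumes you have remote shutdown rights on that machine.

```python
import socket
import subprocess
import time

MACHINE = r"\\office-workstation"  # placeholder machine name
RDP_PORT = 3389                    # poll Remote Desktop to know when the box is back

def reboot(machine: str) -> None:
    """Issue the same remote restart command shown above."""
    subprocess.run(["shutdown", "/r", "/t", "0", "/m", machine], check=True)

def wait_until_back(host: str, port: int, timeout_s: int = 600) -> bool:
    """Poll host:port until it accepts connections or the timeout expires."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            with socket.create_connection((host, port), timeout=5):
                return True
        except OSError:
            time.sleep(15)  # give the machine a few minutes to restart
    return False

if __name__ == "__main__":
    reboot(MACHINE)
    host = MACHINE.lstrip("\\")
    print("Back online" if wait_until_back(host, RDP_PORT) else "Still unreachable -- try VNC")
```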

App-Specific Resources

Several software makers with applications for professionals working in the manufacturing, architecture, and media and entertainment industries have provided instructions on using their applications from home.

Where to Get Help

Given the inherent variability in working from home, there’s no one-size-fits-all solution. If you run into technical issues and have questions, feel free to contact us at desktop-remoting@nvidia.com. We’ll do our best to help.

AI-Listers: Oscar-Nominated Irishman, Avengers Set Stage for AI Visual Effects

This weekend’s Academy Awards show features a twice-nominated newcomer to the Oscars: AI-powered visual effects.

Two nominees in the visual effects category, The Irishman and Avengers: Endgame, used AI to push the boundaries between human actors and digital characters — de-aging the stars of The Irishman and bringing the infamous villain Thanos to life in Avengers.

Behind this groundbreaking, AI-enhanced storytelling are VFX studios Industrial Light & Magic and Digital Domain, which use NVIDIA Quadro RTX GPUs to accelerate production.

AI Time Machine

From World War II to a nursing home in the 2000s, and every decade in between, Netflix’s The Irishman tells the tale of hitman Frank Sheeran through scenes from different times in his life.

But all three leads in the film — Robert De Niro, Al Pacino and Joe Pesci — are in their 70s. A makeup department couldn’t realistically transform the actors back to their 20s and 30s. And director Martin Scorsese was against using the typical motion capture markers or other intrusive equipment that gets in the way of raw performances during filming.

To meet this requirement, ILM developed a new three-camera rig to capture the actors’ performances on set — using the director’s camera flanked by two infrared cameras to record 3D geometry and textures. The team also developed software called ILM Facefinder that used AI to sift through thousands of images of the actors’ past performances.

The tool located frames that matched the camera angle, framing, lighting and expression of the scene being rendered, giving ILM artists a relevant reference to compare against every frame in the shot. These visual references were used to refine digital doubles created for each actor, so they could be transformed into the target age for each specific scene in the film.
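
ILM hasn’t published how Facefinder works under the hood, but the retrieval step described here (finding archival frames whose camera angle, lighting and expression best match the shot being rendered) can be sketched as a nearest-neighbor search over per-frame feature vectors. The snippet below is a hypothetical Python illustration; the feature vectors, data and cosine-similarity matching are stand-ins, not ILM’s implementation.

```python
import numpy as np

# Hypothetical illustration only: each archival frame is represented by a feature
# vector (e.g., head pose, lighting direction, expression coefficients). The real
# tool's features and matching logic are not public.
rng = np.random.default_rng(0)
archive_features = rng.normal(size=(10_000, 128))   # stand-in for thousands of reference frames
query_feature = rng.normal(size=128)                # stand-in for the frame being rendered

def top_matches(query: np.ndarray, archive: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k archival frames most similar to the query (cosine similarity)."""
    archive_norm = archive / np.linalg.norm(archive, axis=1, keepdims=True)
    query_norm = query / np.linalg.norm(query)
    similarity = archive_norm @ query_norm
    return np.argsort(similarity)[-k:][::-1]

print(top_matches(query_feature, archive_features))
```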

“AI and machine learning are becoming a part of everything we do in VFX,” said Pablo Helman, VFX supervisor on The Irishman at ILM. “Paired with the NVIDIA Quadro RTX GPUs powering our production pipeline, these technologies have us excited for what the next decade will bring.”

Building Better VFX Villains

The highest-grossing film of all time, Marvel’s Avengers: Endgame included over 2,500 visual effects shots. VFX teams at Digital Domain used machine learning to animate actor Josh Brolin’s performance onto the digital version of the film franchise’s villain, the mighty Thanos.

A machine learning system called Masquerade was developed to take low-resolution scans of the actor’s performance and facial movements, and then accurately transfer his expressions onto the high-resolution mesh of Thanos’ face. The technology saves time for VFX artists, who would otherwise have to painstakingly animate the subtle facial movements manually to generate a realistic, emoting digital human.
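
Digital Domain hasn’t detailed Masquerade’s internals, but the core idea of learning a mapping from a sparse, low-resolution facial capture to a dense, high-resolution mesh can be illustrated with a toy regression. Everything below (the dimensions, the random training pairs, the ridge-regression model) is an assumption for illustration only, not the studio’s method.

```python
import numpy as np

# Toy illustration: map ~150 tracked facial points (x, y, z) to ~5,000
# high-resolution mesh vertices using a linear model fit on paired examples.
# All data and dimensions here are made up.
rng = np.random.default_rng(1)
n_frames, low_dim, high_dim = 500, 150 * 3, 5_000 * 3

low_res_capture = rng.normal(size=(n_frames, low_dim))   # sparse capture per frame
high_res_mesh = rng.normal(size=(n_frames, high_dim))    # matching dense mesh per frame

# Ridge regression in closed form: W = (X^T X + lambda*I)^-1 X^T Y
lam = 1e-2
gram = low_res_capture.T @ low_res_capture + lam * np.eye(low_dim)
weights = np.linalg.solve(gram, low_res_capture.T @ high_res_mesh)

new_capture = rng.normal(size=(1, low_dim))
predicted_mesh = new_capture @ weights   # dense vertex positions for a new frame
print(predicted_mesh.shape)              # (1, 15000)
```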

“Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology,” said Darren Hendler, head of Digital Humans at Digital Domain. “We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”

RTX It in Post: Studios, Apps Adopt AI-Accelerated VFX 

ILM and Digital Domain are just two of a growing set of visual effects studios and apps adopting AI tools accelerated by NVIDIA RTX GPUs.

In HBO’s The Righteous Gemstones series, lead actor John Goodman looks 30 years younger than he is. This de-aging effect was achieved with Shapeshifter, a custom software that uses AI to analyze face motion — how the skin stretches and moves over muscle and bone.

VFX studio Gradient Effects used Shapeshifter to transform the actor’s face in a process that, using NVIDIA GPUs, took weeks instead of months.

Companies such as Adobe, Autodesk and Blackmagic Design have developed RTX-accelerated apps to tackle other visual effects challenges with AI, including live-action scene depth reclamation, color adjustment, relighting and retouching, speed warp motion estimation for retiming, and upscaling.

Netflix Greenlights AI-Powered Predictions 

Offscreen, streaming services such as Netflix use AI-powered recommendation engines to provide customers with personalized content based on their viewing history, or a similarity index that serves up content watched by people with similar viewing habits.

Netflix also customizes movie thumbnails to appeal to individual users, and uses AI to help optimize streaming quality at lower bandwidths. The company uses NVIDIA GPUs to accelerate its work with complex data models, enabling rapid iteration.
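
Netflix’s production recommender is far more sophisticated, but the similarity-index idea of surfacing titles watched by people with similar habits can be shown with a toy example. The titles and viewing matrix below are made up for illustration and are not Netflix data.

```python
import numpy as np

# Toy viewing matrix: rows are users, columns are titles, 1 = watched.
# Titles and data are made up for illustration; this is not Netflix's system.
titles = ["The Irishman", "Drive to Survive", "Our Planet", "Stranger Things"]
views = np.array([
    [1, 1, 0, 0],   # user 0
    [1, 1, 0, 1],   # user 1
    [0, 0, 1, 1],   # user 2
    [1, 0, 0, 1],   # user 3
], dtype=float)

def recommend(user: int, top_k: int = 2) -> list[str]:
    """Suggest unwatched titles favored by users with similar viewing habits."""
    norms = np.linalg.norm(views, axis=1)
    sims = views @ views[user] / (norms * norms[user] + 1e-9)  # cosine similarity to each user
    sims[user] = 0.0                                           # ignore the user themselves
    scores = sims @ views                                      # weight titles by similar users' views
    scores[views[user] > 0] = -np.inf                          # drop titles already watched
    ranked = np.argsort(scores)[::-1][:top_k]
    return [titles[i] for i in ranked if np.isfinite(scores[i])]

print(recommend(user=0))   # suggests what similar viewers watched next
```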

Rolling Out the Red Carpet at GTC 2020

Top studios including Lucasfilm’s ILMxLAB, Magnopus and Digital Domain will be speaking at NVIDIA’s GPU Technology Conference in San Jose, March 23-26.

Check out the lineup of media and entertainment talks and register to attend. Early pricing ends Feb. 13.

Feature image courtesy of Industrial Light & Magic. © 2019 NETFLIX

 

For 12th Year Running, NVIDIA Quadro Powers Every Oscar-Nominated Film for Best Visual Effects

For the 12th consecutive year, NVIDIA Quadro GPUs have powered the stunning visuals behind every Academy Award nominee for Best Visual Effects.

The red carpet will roll out for the five VFX-nominated films at the 92nd annual Academy Awards on Sunday, Feb. 9. They are:

  • Avengers: Endgame
  • The Irishman
  • The Lion King
  • 1917
  • Star Wars: The Rise of Skywalker

For over a decade, Quadro has been behind award-winning graphics in films, bringing the latest advancements in visual effects to studios across the industry.

Avengers: Endgame amazed audiences with over 2,500 stunning visual effects shots, breaking box office records along the way.

The visual effects development team at Digital Domain brought innovation to the film like never before, using custom machine learning technology running on NVIDIA Quadro GPUs to animate Josh Brolin’s performance onto Marvel’s infamous villain, Thanos.

“For Digital Domain’s work on Avengers: Endgame, we built a machine learning system that understood Josh Brolin’s facial movements, and we taught the system how to transfer Josh to Thanos’ face,” said Darren Hendler, head of Digital Humans at Digital Domain. “Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology. We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”

The innovators at Industrial Light & Magic also gave machine learning a starring role in Martin Scorsese’s The Irishman. The markerless de-aging of Robert De Niro, Al Pacino and Joe Pesci enabled the film to tell a story that spanned decades while allowing the actors to perform without facial markers or helmet cams.

With the new NVIDIA Quadro RTX GPUs, artists and studios can take advantage of the latest AI-driven tools and techniques. Quadro RTX accelerates many custom AI-based tools and commercial creative applications from companies like Adobe, Autodesk, Blackmagic Design and more, so professionals can take visual effects to the next level with enhanced capabilities.

Disney’s visually stunning remake of The Lion King also broke new ground by transporting live-action crews to the CG environment in virtual reality. Through the use of Quadro GPUs, VFX supervisor Rob Legato and virtual production supervisor Ben Grossmann brought traditional filmmaking techniques in virtual worlds that gave us the most cinematic CG movie ever created.

“Rendering is the killer of fast turnaround and iterative creativity. You really need global illumination and ray tracing for real-time feedback and micro adjustments, just like on a live action film stage, and that’s what the NVIDIA Quadro graphics cards give you,” said Robert Legato, VFX supervisor on The Lion King.

Presenting a heart-pounding, up close and personal view of World War I, 1917 features a single-shot perspective that follows the mission of two soldiers delivering an important message to the front line. MPC used NVIDIA GPUs to create what looks like the longest single visual effects shot ever put on the big screen.

Bringing an epic saga to a close, Lucasfilm’s Star Wars: The Rise of Skywalker gave audiences iconic lightsaber duels, monumental starship battles, and a new era of heroes and villains. Industrial Light & Magic used NVIDIA GPUs to beautifully blend VFX with practical effects, creating some of the most incredible worlds and battles ever seen in film.

While only one visual effects team will take to the podium to accept an Oscar, thousands of artists around the world are bringing new worlds and characters to life every day. From powering creative applications to enabling new AI techniques, NVIDIA RTX accelerates the workstations and servers the film industry is using to define the future of visual computing.

NVIDIA engineering is also no stranger to Academy Awards — seven NVIDIANs are past recipients of Sci-Tech Awards.

See How Oscar-Nominated Visual Effects Are Created

Get behind the scenes of the world’s most advanced visual effects at the NVIDIA GPU Technology Conference in San Jose, March 23-26.

Come see Ben Grossmann, CEO of Magnopus and virtual production supervisor of Disney’s The Lion King, present the latest advancements in virtual production techniques.

Doug Roble, senior director of Software R&D at Digital Domain, will take GTC attendees on a journey featuring the most photorealistic real-time digital human ever made.

And learn from other media and entertainment industry luminaries at GTC, including Vicki Dobbs Beck from Lucasfilm’s ILMxLAB, Max Liani from Pixar, David Crabtree from DNEG and David Morin from Epic Games.

Feature image credit: © 2019 NETFLIX

GeForce NOW Open for All

Today we’re making GeForce NOW — and PC gaming — accessible to more gamers.

More than 1.2 billion players have made the PC the world’s largest gaming platform, yet only a fraction have a modern PC with the power to play their favorite games.

GeForce NOW lets you use the cloud to join in. It’s the power to play PC games anywhere, on any device — even the billion devices that aren’t game ready. You’re upgrading to a state-of-the-art gaming rig by virtually adding a GeForce graphics card to your PC, Mac, SHIELD or Android phone.

Just like the PC, GeForce NOW is an open platform. It’s powered by our world-class GPU architecture and uses our Game Ready Drivers for the best performance. And, because these are real PC games, GeForce NOW gives you the precision of keyboard and mouse gaming, as well as optimizations for game controllers.

It’s an opportunity for you to connect to the millions of gamers online and join your friends in your favorite PC communities. Most importantly, it instantly connects you to the games you already own on the digital game stores you own them on, or the latest free-to-play game everyone is playing.

Your games, your devices, GeForce power.

Game ON

Funny thing about gamers. They have games. Lots of them. Our GeForce NOW beta members have large libraries. Often 50 games or more. You shouldn’t have to leave them behind.

That’s why we’re working with existing PC game stores and publishers. With GeForce NOW, you can keep playing the games you already own and continue building libraries from the same stores you already use every day. That’s what it means to be an open platform.

If you buy it, you own it. If you already own it, play it. Your purchases are always yours.

GeForce NOW games

GeForce NOW is the only cloud gaming service with access to a wide range of free-to-play games, more than 30 and growing. In total, there are hundreds of games from more than 50 publishers that, once owned, are available for instant play. All these games are patched automatically in the cloud, so your library is always game ready.

While we continue to grow the library of instantly accessible games, there are also more than 1,000 games that can be played through single-session installs.

Games get added to the service weekly based on member requests, game popularity and publishers’ input. If you don’t see a game that you want to play, let us and the publisher know.

Play ON

GeForce NOW takes PC gaming and 60+ FPS action where it’s never been before. Underpowered? No problem. Incompatible? No problem.

GeForce NOW platforms

Your old laptop never gamed like this before. That Mac, which for years has seen fewer new games and lost compatibility with older ones, can now play the latest titles. Your rig doesn’t have to move from room to room to game on the largest screen in your house — your TV. And most recently, PC games go with you on the screen you take everywhere: your Android phone.

And later this year we’ll add Chromebooks, giving members yet another screen to play on.

ON for Everyone

Over 300,000 beta testers have streamed more than 70 million hours of gameplay in 30 countries throughout North America and Europe. GeForce NOW is better because of every second that’s been played. During the beta, over 80 percent of members instantly upgraded from systems without GeForce GPUs to the latest PC graphics.

We’ve reached a point in our journey where we’re ready to remove the waitlist, exit beta and open GeForce NOW up to even more gamers. Members will have the option of a free or premium experience.

GeForce NOW memberships

The free membership provides one-hour sessions with standard access to GeForce NOW servers. There’s no limit to the number of sessions you can play. All GeForce NOW beta members automatically have had their beta accounts converted to this free plan. Simply sign in and continue gaming.

Gamers can also upgrade to a no-wait, longer session-length premium experience. These members will enjoy priority access to GeForce NOW servers ahead of free members and extended session lengths of up to six hours. They’ll also have exclusive access to RTX content on GeForce NOW.

For a limited time, we’re offering the premium experience as a Founders membership. The first three months are free, followed by a discounted rate of $4.99 a month for all of 2020. This special offer is part of our commitment to working with the community that continues to help us improve GeForce NOW.

Both memberships will work across any supported device that you already own. No additional hardware needed.

RTX ON

We’ll continue to launch new features, including today’s introduction of ray tracing. The holy grail of gaming graphics, ray tracing simulates the physical behavior of light to bring real-time, cinematic-quality rendering to even the most visually intense games. Founders members will have instant access to RTX games.

RTX ON in Deliver Us The Moon

GeForce NOW continues to evolve and we’re working every day to improve your service.

Because of You, GeForce NOW is ON

We couldn’t — and wouldn’t — be here without the beta members who have helped guide us on our journey. From Mac to PC to SHIELD and then to mobile, every ounce of feedback has left an impression on GeForce NOW. Thank you.

And to everyone who wants the power of PC gaming that 200 million GeForce gamers already know and love, welcome to GeForce NOW. More power, more people — game on.

3D History in the Baking: NVIDIA RTX Accelerates Texture Design Tools in Substance Painter

When it comes to baking 3D textures and materials on designs, NVIDIA RTX gives Substance Painter an extra boost.

Nikie Monteleone, a lead look development artist in the animation industry, creates 3D graphics and images by integrating colors with complex, stylized patterns on all types of surfaces. Part of her design process is sculpting and hand painting every detail, from a chameleon’s scales to an octopus’s tentacles.

Monteleone uses Substance Painter, a 3D painting program from Substance by Adobe, to create new textures and patterns by baking maps, which she then uses to give realistic appearances to her 3D designs and models.

“If you’re doing any sort of heavy texturing or GPU rendering, having an NVIDIA RTX card in your machine will pay for itself.” — Nikie Monteleone, 3D artist

Texture baking allows artists to transfer details from one model to another — but it can be a time-consuming process, and Monteleone needed a graphics card that could keep up. This is especially important because inspiration hits when she least expects it. With NVIDIA RTX, Monteleone can quickly turn her creative ideas into reality.

More companies are incorporating RTX capabilities into their software, and Substance Painter is one of the creative applications that’s using RTX to speed up certain processes — including baking.

With RTX-powered Substance Painter, Monteleone can bake maps up to 8x faster than before. The speed and performance of the NVIDIA Quadro RTX 6000 graphics card lets Monteleone skip the 2D concept and take her ideas straight to 3D modeling.

The artist-friendly Substance Painter lets animators have fun with developing looks and textures for their designs. Images courtesy of Nikie Monteleone.

RTX-tra Features in Substance Painter

The first time Monteleone opened Substance Painter, she was immediately hooked. The interface and navigation in Substance Painter are easy to use, allowing her to create new looks and play around with designs faster than before.

“For an artist who’s constantly iterating, the RTX 6000 is proving to be quite a timesaver and is really fun to use,” said Monteleone. “I’m able to do real-time rendering and get designs approved on the fly, with the ability to perform faster on more 3D asset iterations than before.”

 

But there’s one place where RTX really shines in Substance Painter: hitting the bakes with the new GPU-accelerated features.

Monteleone relies heavily on baked maps, using them to multiply, overlay or add definition to her designs. As she started to specialize more in surfacing, she upgraded to using the Quadro RTX 6000.

With the new graphics card, Monteleone saw massive performance boosts when baking maps, and she could navigate larger scenes and load higher-resolution maps on the spot.

Previously, Monteleone would wait up to 78 minutes for a bake to finish. But with the RTX card, that time dropped to seven minutes — barely enough time for a quick coffee break.

With Quadro RTX, Monteleone can spend less time waiting for bakes to finish and more time bringing her characters and designs to life.

Learn more about NVIDIA RTX.

NVIDIA Brings the Future into Focus at CES 2020

CES 2020 will be bursting with vivid visual entertainment and smart everything, powered, in part, by NVIDIA and its partners.

Attendees packing the annual techfest will experience the latest additions to GeForce, the world’s most powerful PC gaming platform and the first to deliver ray tracing. They’ll see powerful displays and laptops, ultra-realistic game titles and capabilities offering new levels of game play.

NVIDIA’s Vegas headliners include three firsts: a 360Hz esports display, and the first 14-inch laptops and all-in-one PCs delivering the graphics realism of ray tracing.

The same GPU technologies powering next-gen gaming are also spawning an age of autonomous machines. CES 2020 will be alive with robots such as Toyota’s new T-HR3, thanks to advances in the NVIDIA Isaac platform. And the newly minted DRIVE AGX Orin promises 7x performance gains for future autonomous vehicles.

Together, they’re weaving an AI-powered Internet of Things, from the cloud to the network’s edge, that will touch everything from entertainment to healthcare and transportation.

A 2020 Vision for Play

NVIDIA’s new G-SYNC display for esports gamers delivers a breakthrough at 360Hz, projecting a vision of gameplay that’s more vivid than ever. NVIDIA and ASUS this week unveiled the ASUS ROG Swift 360, the world’s fastest display, powered by NVIDIA G-SYNC. Its 360Hz refresh rate in a 24.5-inch form factor lets esports and competitive gamers keep every pixel of action in their field of view during the heat of competition.

The 24.5-inch ASUS ROG Swift sports a 360Hz refresh rate.

Keeping the picture crisp, Acer, Asus and LG are expanding support for G-SYNC. First introduced in 2013, G-SYNC is best known for its innovative Variable Refresh Rate technology that eliminates screen tearing by synchronizing the refresh rate of the display with the GPU’s frame rate.

In 2019, LG became the first TV manufacturer to offer NVIDIA G-SYNC compatibility, bringing the must-have gaming feature to select OLED TV models. Thirteen new models for 2020 will provide a flawless gaming experience on the big screen, without screen tearing or other distracting visual artifacts.

In addition, Acer and Asus are showcasing two upcoming G-SYNC ULTIMATE displays. They feature the latest full-array direct backlight technology with 1,400 nits brightness, significantly increasing display contrast for darker blacks and more vibrant colors. Gamers will enjoy the fast response time and ultra-low lag of these displays running at up to 144Hz at 4K.

Game On, RTX On

The best gaming monitors need awesome content to shine. So today, Bethesda turned on ray tracing in Wolfenstein: Youngblood, bringing a new level of realism to the popular title. An update that sports ray-traced reflections and DLSS is available as a free downloadable patch starting today for gamers with a GeForce RTX GPU.

Bethesda joins the world’s leading publishers who are embracing ray tracing as the next big thing in their top franchises. Call of Duty Modern Warfare and Control — IGN’s Game of the Year — both feature incredible real-time ray-tracing effects.

VR is donning new headsets, games and innovations for CES 2020.

NVIDIA’s new rendering technique, Variable Rate Super Sampling, in the latest Game Ready Driver improves image quality in VR games. It uses Variable Rate Shading, part of the NVIDIA Turing architecture, to dynamically apply up to 8x supersampling to the center, or foveal region, of the VR headset, enhancing image quality where it matters most while delivering stellar performance.

In addition, Game Ready Drivers now make it possible to set the max frame rate a 3D application or game can render to save power and reduce system latency. They enable the best gaming experience by keeping a G-SYNC display within the range where the technology shines.

Creators’ Visions Coming into Focus

A total of 14 hardware OEMs introduced new RTX Studio systems at CES 2020. Combined with NVIDIA Studio Drivers, they’re powering more than 55 creative and design apps with RTX-accelerated ray tracing and AI.

HP launched the ENVY 32 All-in-One with GeForce RTX graphics, configurable with up to a GeForce RTX 2080. Acer has three new systems from its ConceptD line. And 10 other system builders across North America, Europe and China now have RTX Studio offerings.

These RTX Studio systems adhere to stringent hardware and software requirements to empower creativity at the speed of imagination. They also ship with NVIDIA’s Studio Drivers, providing the ultimate performance and stability for creative applications.

Robots Ring in the New Year

The GPU technology that powers games is also driving AI, accelerating the development of a host of autonomous vehicles and robots at CES 2020.

Toyota’s new T-HR3 humanoid partner robot will have a Vegas debut at its booth (LVCC, North Hall, Booth 6919). A human operator wearing a VR headset controls the system using augmented video and perception data fed from an NVIDIA Jetson AGX Xavier computer in the robot.

Toyota’s T-HR3 makes its Vegas debut at CES 2020.

Attendees can try out the Jetson TX2-powered autonomous wheelchair from WHILL, which won a CES 2019 Innovation Award. Sunflower Labs will demo its new home security robot, also packing a Jetson TX2. Other NVIDIA-powered systems at CES include a delivery robot from Postmates and an inspection snake robot from Sarcos.

The Isaac software development kit marks a milestone in establishing a unified AI robotic development platform we call NVIDIA Isaac, an open environment for mapping, model training, simulation and computing. It includes a variety of camera-based perception deep neural networks for functions such as object detection, 3D pose estimation and 2D human pose estimation.

This release also introduces Isaac Sim, which lets developers train on simulated robots and deploy their lessons to real ones, promising to greatly accelerate robotic development, especially for environments such as large logistics operations. Isaac Sim will add early-access availability for manipulation later this month.

Driving an Era of Autonomous Vehicles

This marks a new decade of automotive performance, defined by AI compute rather than horsepower. It will spread autonomous capabilities across today’s $10 trillion transportation industry. The transformation will require dramatically more compute performance to handle exponential growth in the AI models being developed to ensure autonomous vehicles are both functional and safe.

NVIDIA DRIVE AV, an end-to-end, software-defined platform for AVs, delivers just that. It includes a development flow, data center infrastructure, an in-vehicle computer and the highest quality pre-trained AI models that can be adapted by OEMs.

Last month, NVIDIA announced the latest piece of that platform, DRIVE AGX Orin, a highly advanced software-defined platform for autonomous vehicles.

The platform is powered by a new system-on-a-chip called Orin, which achieves 200 TOPS — nearly 7x the 30 TOPS of the previous-generation Xavier SoC. It’s designed to handle the large number of applications and DNNs that run simultaneously in autonomous vehicles, while achieving systematic safety standards such as ISO 26262 ASIL-D.

NVIDIA is now providing access to its pre-trained DNNs and cutting-edge training processes on the NGC container registry. With industry-leading networks and advanced learning techniques such as active learning, transfer learning and federated learning, developers can turbocharge the development of custom applications.

Working Together

NVIDIA’s AI ecosystem of innovators is spread across the CES 2020 show floor, including more than 100 members of Inception, a company program that nurtures cutting-edge startups that are revolutionizing industries with AI.

Among established leaders, Mercedes-Benz, an NVIDIA DRIVE customer, will open the show Monday night with a keynote on the future of intelligent transportation. And GeForce partners will crank up the gaming excitement in demos across the event.

Upon Reflection, 3D Artist Creates Realistic Renders with NVIDIA Quadro RTX

For artists working on product designs and visualizations, time is an important factor — they want to quickly iterate and deliver results without waiting hours for renders to complete.

This is why 3D artist David Baylis chose NVIDIA Quadro RTX graphics to create high-quality renders for his clients.

He’s able to cut rendering times on his visualizations, with projects that range from winter cabin architectural designs to a virtual recreation of comedian and former Tonight Show host Jay Leno’s garage, featuring Formula One champion Michael Schumacher’s F2004 and other automotive memorabilia.

Baylis’ visualizations typically require high-resolution textures and polygons. He uses Unreal Engine to render the graphics and create interactive experiences for his clients. But getting realistic details for building designs and car models requires a powerful GPU. When Baylis upgraded to the Quadro RTX 6000, he found the horsepower and speed he was looking for.

It wasn’t just the rendering performance in Unreal Engine that got major speedups. With Quadro RTX, Baylis saw accelerated workflows throughout his design process.

He was able to edit content and encode videos more smoothly with Adobe Premiere Pro running on the GPU. And in Substance Painter, the time for baking textures in 4K dropped from 30 seconds on the CPU to 18 seconds, which makes a considerable difference when working with multiple objects.

Baylis also saw performance speedups using the latest NVIDIA GPU-accelerated rendering feature in KeyShot 9. With traditional rendering on a CPU, the results were 72 samples in 90 seconds. When Baylis switched to the Quadro GPU, he was able to render 1,033 samples in 30 seconds — over 40x faster.
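
That speedup follows directly from the throughput numbers above, as a quick back-of-the-envelope check shows.

```python
cpu_rate = 72 / 90        # KeyShot samples per second on the CPU
gpu_rate = 1_033 / 30     # samples per second on the Quadro RTX 6000
print(f"{gpu_rate / cpu_rate:.0f}x")   # ~43x, consistent with "over 40x faster"
```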

After achieving cinematic-quality images with almost no wait time, Baylis experienced how RTX real-time ray tracing gives 3D artists a big advantage in creative workflows.

Quadro-powered coziness: an architectural visualization by David Baylis.

RTX Keeps It Real With New Levels of Detail

The details make all the difference when it comes to taking designs from concept to reality. With Quadro RTX, Baylis is able to create photorealistic images and bring extra fidelity to his renders. It’s especially noticeable in automotive visualizations, where Baylis heavily relies on real-time reflections.

When creating the virtual scene of Leno’s garage, Baylis used the ray-tracing features of Quadro RTX 6000 to achieve the realistic lighting and accurate reflections on the Formula One race car showcased in the environment.

Using his iPhone, he controlled the Unreal Engine camera and was able to walk around the virtual scene as if he were standing next to the car. This provided live feedback in real time — Baylis could look at the scene, make edits and then wait as little as 20 minutes for a scene to render in 4K.

“With Quadro RTX 6000, I have been pushing the boundaries of real-time ray tracing at a level I couldn’t achieve previously,” said Baylis. “RTX is a huge leap in the CG industry and I simply cannot work without it anymore.”

Learn more about NVIDIA Quadro RTX or check out a few of Baylis’ projects.
