Working Remotely: Connecting to Your Office Workstation

With so many people working from home amid the COVID-19 outbreak, staying productive can be challenging.

At NVIDIA, some of us have RTX laptops and remote-working capabilities powered by our virtual GPU software, running on on-prem servers and in the cloud. To help support the many other businesses with GPUs in their servers, we recently made vGPU licenses free for up to 500 users for 90 days so they can explore their virtualization options.

But many still require access to physical Quadro desktop workstations due to specific hardware configurations or data requirements. And we know this situation is hardly unique.

Many designers, engineers, artists and architects have Quadro RTX mobile workstations that are on par with their desktop counterparts, which helps them stay productive anywhere. However, a vast number of professionals don’t have access to their office-based workstations — with multiple high-end GPUs, large memory and storage, as well as applications and data.

These workstations are critical for keeping everything from family firms to multinational corporations going. That has pushed IT teams to explore ways of addressing the challenges of working from home by connecting people remotely to their office workstations.

Getting Started: Tools for Remote Connections

Several publicly available remote-working tools can help you get going quickly. For details on features and licensing, contact the respective providers.

Managing Access, Drivers and Reboots

Once you’re up and running, keep these considerations in mind:

Give yourself a safety net when working on a remote system 

There are times when your tools can stop working, so it’s a good idea to have a safety net. Always install a VNC server on the remote machine (https://www.tightvnc.com/, https://www.realvnc.com/en/ or others), no matter what remote access tool you use. It’s also a good idea to enable access via Microsoft Remote Desktop as another option. These run quietly in the background, but are ready if you need them in an emergency.

Updating your driver remotely

We recommend using a VNC connection to upgrade your drivers. Changing the driver often replaces the parts of the driver that remote access tools depend on, so you’ll frequently lose the connection. VNC doesn’t hook into the driver at a low level, so it keeps working while the old driver is swapped out for the new one. Once the driver is updated, you can go back to your other remote access tools.

Rebooting your machine remotely

Normally you can reboot from the Windows menus. Give the system a few minutes to restart and then log back in. If your main remote-working tools have stopped functioning, try a VNC connection. You can also restart from a PowerShell window or command prompt on your local machine with the command:

    shutdown /r /t 0 /m \\[machine-name]

App-Specific Resources

Several software makers with applications for professionals working in the manufacturing, architecture, and media and entertainment industries have provided instructions on using their applications from home. Here are links to a few recent articles:

Where to Get Help

Given the inherent variability in working from home, there’s no one-size-fits-all solution. If you run into technical issues and have questions, feel free to contact us at desktop-remoting@nvidia.com. We’ll do our best to help.

NVIDIA Expands Free Access to GPU Virtualization Software to Support Remote Workers

In challenging times, our strength comes from working together. With many companies needing to quickly support employees now working remotely, NVIDIA is expanding our free, 90-day virtual GPU software evaluation from 128 to 500 licenses.

With vGPU software licenses, companies can use their on-premises NVIDIA GPUs to provide accelerated virtual infrastructure so people can work and collaborate from anywhere. Companies can also temporarily repurpose NVIDIA GPUs being used on other projects to support their remote workers.

Every organization is working hard to address these needs: Healthcare providers are supporting care from new locations. Schools are expanding their virtual classrooms. Agencies are coordinating critical services.

Whether supporting financial professionals working with data on multiple screens, scientists conducting research, or designers working in graphics-intensive applications, enterprises are faced with different workloads that have different requirements.

NVIDIA offers a variety of customized vGPU software to meet these diverse needs. All three tiers of the company’s specialized vGPU software are available through the expanded free licensing:

  • NVIDIA GRID software delivers responsive VDI by virtualizing systems and applications for knowledge workers.
  • NVIDIA Quadro Virtual Data Center Workstation software provides workstation-class performance for creators using high-end graphics applications.
  • NVIDIA Virtual Compute Server software accelerates server virtualization with GPUs to power the most compute-intensive workflows, such as AI, deep learning and data science on a virtual machine.

Virtualized Performance, Enterprise Security and Broad Ecosystem Support

In addition to providing high performance and reducing latency for remote workers, NVIDIA vGPU software ensures protection for sensitive data and digital assets, which remain in the data center and aren’t saved to local client devices. This is an important security requirement for remote work across many industries, including visual effects and design, as well as for research and development.

NVIDIA vGPU software is certified on a broad ecosystem of hypervisors, platforms, user applications and management software to help IT teams quickly scale out support for remote workers.

Companies can deploy virtual workstations, compute and VDI from their on-prem data centers by installing the vGPU software licenses on NVIDIA GPUs based on the Pascal, Volta and Turing architectures, including NVIDIA Quadro RTX 6000 and RTX 8000 GPUs, as well as on NVIDIA M10 and M60 GPUs.

Get the Virtual GPU Evaluation.

NVIDIA is also providing genomics researchers studying COVID-19 free access to Parabricks software for 90 days. See our post on Parabricks to learn more.

AI-Listers: Oscar-Nominated Irishman, Avengers Set Stage for AI Visual Effects

This weekend’s Academy Awards show features a twice-nominated newcomer to the Oscars: AI-powered visual effects.

Two nominees in the visual effects category, The Irishman and Avengers: Endgame, used AI to push the boundaries between human actors and digital characters — de-aging the stars of The Irishman and bringing the infamous villain Thanos to life in Avengers.

Behind this groundbreaking, AI-enhanced storytelling are VFX studios Industrial Light & Magic and Digital Domain, which use NVIDIA Quadro RTX GPUs to accelerate production.

AI Time Machine

From World War II to a nursing home in the 2000s, and every decade in between, Netflix’s The Irishman tells the tale of hitman Frank Sheeran through scenes from different times in his life.

But all three leads in the film — Robert De Niro, Al Pacino and Joe Pesci — are in their 70s. A makeup department couldn’t realistically transform the actors back to their 20s and 30s. And director Martin Scorsese was against using the typical motion capture markers or other intrusive equipment that gets in the way of raw performances during filming.

To meet this requirement, ILM developed a new three-camera rig to capture the actors’ performances on set — using the director’s camera flanked by two infrared cameras to record 3D geometry and textures. The team also developed software called ILM Facefinder that used AI to sift through thousands of images of the actors’ past performances.

The tool located frames that matched the camera angle, framing, lighting and expression of the scene being rendered, giving ILM artists a relevant reference to compare against every frame in the shot. These visual references were used to refine digital doubles created for each actor, so they could be transformed into the target age for each specific scene in the film.
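
ILM hasn’t published Facefinder’s internals, but the core retrieval idea is simple: score a library of embedded reference frames against the frame being worked on and surface the closest matches. Below is a toy sketch in Python; the embeddings and their dimensionality are hypothetical stand-ins, not ILM’s actual tool.

    # Toy sketch of embedding-based reference retrieval -- a generic
    # illustration, not ILM's Facefinder, whose internals are unpublished.
    import numpy as np

    # Hypothetical: embeddings of archival frames, precomputed by some
    # image network capturing angle, framing, lighting and expression.
    archive = np.random.rand(10000, 512)   # 10,000 reference frames
    query = np.random.rand(512)            # the frame being rendered

    # Cosine similarity between the query and every archived frame
    scores = archive @ query / (
        np.linalg.norm(archive, axis=1) * np.linalg.norm(query))

    best = np.argsort(scores)[::-1][:5]    # indices of the top 5 matches
    print("closest reference frames:", best)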

“AI and machine learning are becoming a part of everything we do in VFX,” said Pablo Helman, VFX supervisor on The Irishman at ILM. “Paired with the NVIDIA Quadro RTX GPUs powering our production pipeline, these technologies have us excited for what the next decade will bring.”

Building Better VFX Villains

The highest-grossing film of all time, Marvel’s Avengers: Endgame included over 2,500 visual effects shots. VFX teams at Digital Domain used machine learning to animate actor Josh Brolin’s performance onto the digital version of the film franchise’s villain, the mighty Thanos.

A machine learning system called Masquerade was developed to take low-resolution scans of the actor’s performance and facial movements, and then accurately transfer his expressions onto the high-resolution mesh of Thanos’ face. The technology saves time for VFX artists, who would otherwise have to painstakingly animate the subtle facial movements by hand to generate a realistic, emoting digital human.
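
Digital Domain hasn’t detailed Masquerade’s architecture publicly, but the general shape of the problem, regressing dense high-resolution mesh positions from a sparse low-resolution capture, can be illustrated with a toy network. Every size and layer choice below is a hypothetical stand-in, not the studio’s system:

    # Toy illustration of low-res-capture -> high-res-mesh regression.
    # Not Digital Domain's Masquerade; all sizes are hypothetical.
    import tensorflow as tf

    NUM_LANDMARKS = 150      # sparse points tracked on the actor's face
    NUM_MESH_VERTS = 40000   # dense vertices on the character's face mesh

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(NUM_LANDMARKS * 3,)),
        tf.keras.layers.Dense(1024, activation="relu"),
        tf.keras.layers.Dense(1024, activation="relu"),
        # predict per-vertex 3D positions (or offsets) on the dense mesh
        tf.keras.layers.Dense(NUM_MESH_VERTS * 3),
    ])
    model.compile(optimizer="adam", loss="mse")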

“Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology,” said Darren Hendler, head of Digital Humans at Digital Domain. “We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”

RTX It in Post: Studios, Apps Adopt AI-Accelerated VFX 

ILM and Digital Domain are just two of a growing set of visual effects studios and apps adopting AI tools accelerated by NVIDIA RTX GPUs.

In HBO’s The Righteous Gemstones series, lead actor John Goodman looks 30 years younger than he is. This de-aging effect was achieved with Shapeshifter, custom software that uses AI to analyze facial motion — how the skin stretches and moves over muscle and bone.

VFX studio Gradient Effects used Shapeshifter to transform the actor’s face in a process that, using NVIDIA GPUs, took weeks instead of months.

Companies such as Adobe, Autodesk and Blackmagic Design have developed RTX-accelerated apps to tackle other visual effects challenges with AI, including live-action scene depth reclamation, color adjustment, relighting and retouching, speed warp motion estimation for retiming, and upscaling.

Netflix Greenlights AI-Powered Predictions 

Offscreen, streaming services such as Netflix use AI-powered recommendation engines to provide customers with personalized content based on their viewing history, or a similarity index that serves up content watched by people with similar viewing habits.
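
A similarity index of this kind is typically built with collaborative filtering. As a rough, self-contained illustration (random data, not Netflix’s production recommender), item-to-item similarity can be computed directly from a user-viewing matrix:

    # Rough illustration of item-to-item collaborative filtering --
    # not Netflix's actual system. Viewing data here is random.
    import numpy as np

    # rows = users, cols = titles; 1.0 means the user watched the title
    views = (np.random.rand(1000, 50) > 0.8).astype(float)

    # cosine similarity between titles, based on who watched them
    norms = np.linalg.norm(views, axis=0, keepdims=True)
    sim = (views.T @ views) / (norms.T @ norms + 1e-9)

    title = 7
    recs = np.argsort(sim[title])[::-1][1:6]  # 5 most similar titles
    print("viewers of title", title, "also watched:", recs)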

Netflix also customizes movie thumbnails to appeal to individual users, and uses AI to help optimize streaming quality at lower bandwidths. The company uses NVIDIA GPUs to accelerate its work with complex data models, enabling rapid iteration.

Rolling Out the Red Carpet at GTC 2020

Top studios including Lucasfilm’s ILMxLAB, Magnopus and Digital Domain will be speaking at NVIDIA’s GPU Technology Conference in San Jose, March 23-26.

Check out the lineup of media and entertainment talks and register to attend. Early pricing ends Feb. 13.

Feature image courtesy of Industrial Light & Magic. © 2019 NETFLIX

For 12th Year Running, NVIDIA Quadro Powers Every Oscar-Nominated Film for Best Visual Effects

For the 12th consecutive year, NVIDIA Quadro GPUs have powered the stunning visuals behind every Academy Award nominee for Best Visual Effects.

The red carpet will roll out for the five VFX-nominated films at the 92nd annual Academy Awards on Sunday, Feb. 9. They are:

  • Avengers: Endgame
  • The Irishman
  • The Lion King
  • 1917
  • Star Wars: The Rise of Skywalker

For over a decade, Quadro has been behind award-winning graphics in films, bringing the latest advancements in visual effects to studios across the industry.

Avengers: Endgame amazed audiences with over 2,500 stunning visual effects shots, breaking box office records along the way.

The visual effects development team at Digital Domain brought innovation to the film like never before, using custom machine learning technology running on NVIDIA Quadro GPUs to animate Josh Brolin’s performance on Marvel’s infamous villain, Thanos.

“For Digital Domain’s work on Avengers: Endgame, we built a machine learning system that understood Josh Brolin’s facial movements, and we taught the system how to transfer Josh to Thanos’ face,” said Darren Hendler, head of Digital Humans at Digital Domain. “Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology. We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”

The innovators at Industrial Light & Magic also gave machine learning a starring role in Martin Scorsese’s The Irishman. The markerless de-aging of Robert De Niro, Al Pacino and Joe Pesci enabled the film to tell a story that spanned decades while allowing the actors to perform without facial markers or helmet cams.

With the new NVIDIA Quadro RTX GPUs, artists and studios can take advantage of the latest AI-driven tools and techniques. Quadro RTX accelerates many custom AI-based tools and commercial creative applications from companies like Adobe, Autodesk, Blackmagic Design and more, so professionals can take visual effects to the next level with enhanced capabilities.

Disney’s visually stunning remake of The Lion King also broke new ground by transporting live-action crews into the CG environment in virtual reality. Using Quadro GPUs, VFX supervisor Rob Legato and virtual production supervisor Ben Grossmann brought traditional filmmaking techniques into virtual worlds, producing the most cinematic CG movie ever created.

“Rendering is the killer of fast turnaround and iterative creativity. You really need global illumination and ray tracing for real-time feedback and micro adjustments, just like on a live action film stage, and that’s what the NVIDIA Quadro graphics cards give you,” said Robert Legato, VFX supervisor on The Lion King.

Presenting a heart-pounding, up close and personal view of World War I, 1917 features a single-shot perspective that follows the mission of two soldiers delivering an important message to the front line. MPC used NVIDIA GPUs to create what looks like the longest single visual effects shot ever put on the big screen.

Bringing an epic saga to a close, Lucasfilm’s Star Wars: The Rise of Skywalker gave audiences iconic lightsaber duels, monumental starship battles, and a new era of heroes and villains. Industrial Light & Magic used NVIDIA GPUs to beautifully blend VFX with practical effects, creating some of the most incredible worlds and battles ever seen in film.

While only one visual effects team will take to the podium to accept an Oscar, thousands of artists around the world are bringing new worlds and characters to life every day. From powering creative applications to enabling new AI techniques, NVIDIA RTX accelerates the workstations and servers the film industry is using to define the future of visual computing.

NVIDIA engineering is also no stranger to Academy Awards — seven NVIDIANs are past recipients of Sci-Tech Awards.

See How Oscar-Nominated Visual Effects Are Created

Get behind the scenes of the world’s most advanced visual effects at the NVIDIA GPU Technology Conference in San Jose, March 23-26.

Come see Ben Grossmann, CEO of Magnopus and virtual production supervisor of Disney’s The Lion King, present the latest advancements in virtual production techniques.

Doug Roble, senior director of Software R&D at Digital Domain, will take GTC attendees on a journey featuring the most photorealistic real-time digital human ever made.

And learn from other media and entertainment industry luminaries at GTC, including Vicki Dobbs Beck from Lucasfilm’s ILMxLAB, Max Liani from Pixar, David Crabtree from DNEG and David Morin from Epic Games.

Feature image credit: © 2019 NETFLIX

Upon Reflection, 3D Artist Creates Realistic Renders with NVIDIA Quadro RTX

For artists working on product designs and visualizations, time is an important factor — they want to quickly iterate and deliver results without waiting hours for renders to complete.

This is why 3D artist David Baylis chose NVIDIA Quadro RTX graphics to create high-quality renders for his clients.

He’s able to cut rendering times on his visualizations, with projects ranging from winter cabin architectural designs to a virtual recreation of comedian and former Tonight Show host Jay Leno’s garage, featuring Formula One champion Michael Schumacher’s F2004 and other automotive memorabilia.

Baylis’ visualizations typically require high-resolution textures and dense geometry. He uses Unreal Engine to render the graphics and create interactive experiences for his clients. But getting realistic details in building designs and car models requires a powerful GPU. When Baylis upgraded to the Quadro RTX 6000, he found the horsepower and speed he was looking for.

It wasn’t just the rendering performance in Unreal Engine that got major speedups. With Quadro RTX, Baylis saw accelerated workflows throughout his design process.

He was able to edit content and encode videos more smoothly with Adobe Premiere Pro running on the GPU. And in Substance Painter, the time to bake 4K textures dropped from 30 seconds on the CPU to 18 seconds on the GPU, a considerable difference when working across multiple objects.

Baylis also saw performance speedups using the latest NVIDIA GPU-accelerated rendering feature in KeyShot 9. With traditional rendering on a CPU, the results were 72 samples in 90 seconds. When Baylis switched to the Quadro GPU, he was able to render 1,033 samples in 30 seconds — over 40x faster.
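
Working the samples-per-second out from those figures shows where the claim comes from:

    # Speedup implied by the KeyShot 9 numbers above
    cpu_rate = 72 / 90     # samples per second on the CPU  -> 0.8
    gpu_rate = 1033 / 30   # samples per second on the GPU  -> ~34.4
    print(f"speedup: {gpu_rate / cpu_rate:.1f}x")  # ~43x, i.e. over 40x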

After achieving cinematic-quality images with almost no wait time, Baylis experienced how RTX real-time ray tracing gives 3D artists a big advantage in creative workflows.

Quadro-powered coziness: an architectural visualization by David Baylis.

RTX Keeps It Real With New Levels of Detail

The details make all the difference when it comes to taking designs from concept to reality. With Quadro RTX, Baylis is able to create photorealistic images and bring extra fidelity to his renders. It’s especially noticeable in automotive visualizations, where Baylis heavily relies on real-time reflections.

When creating the virtual scene of Leno’s garage, Baylis used the ray-tracing features of Quadro RTX 6000 to achieve the realistic lighting and accurate reflections on the Formula One race car showcased in the environment.

Using his iPhone, he controlled the camera from Unreal Engine and was able to walk around in the virtual scene as if he were standing next to the car. This provided live feedback in real time — Baylis could look at the scene, make edits and then wait as little as 20 minutes for a scene to render in 4K.

“With Quadro RTX 6000, I have been pushing the boundaries of real-time ray tracing at a level I couldn’t achieve previously,” said Baylis. “RTX is a huge leap in the CG industry and I simply cannot work without it anymore.”

Learn more about NVIDIA Quadro RTX or check out a few of Baylis’ projects.

BERT Does Europe: AI Language Model Learns German, Swedish

BERT is at work in Europe, tackling natural-language processing jobs in multiple industries and languages with help from NVIDIA’s products and partners.

The AI model formally known as Bidirectional Encoder Representations from Transformers debuted just last year as a state-of-the-art approach to machine learning for text. Though new, BERT is already finding use at avionics, finance, semiconductor and telecom companies on the continent, said developers optimizing it for German and Swedish.

“There are so many use cases for BERT because text is one of the most common data types companies have,” said Anders Arpteg, head of research for Peltarion, a Stockholm-based developer that aims to make the latest AI techniques such as BERT inexpensive and easy for companies to adopt.

Natural-language processing will outpace today’s AI work in computer vision because “text has way more apps than images — we started our company on that hypothesis,” said Milos Rusic, chief executive of deepset in Berlin. He called BERT “a revolution, a milestone we bet on.”

Deepset is working with PricewaterhouseCoopers to create a system that uses BERT to help strategists at a chip maker query piles of annual reports and market data for key insights. In another project, a manufacturing company is using NLP to search technical documents to speed maintenance of its products and predict needed repairs.

Peltarion, a member of NVIDIA’s Inception program that nurtures startups with access to its technology and ecosystem, packed support for BERT into its tools in November. It is already using NLP to help a large telecom company automate parts of its process for responding to product and service requests. And it’s using the technology to let a large market research company more easily query its database of surveys.

Work in Localization

Peltarion is collaborating with three other organizations on a three-year, government-backed project to optimize BERT for Swedish. Interestingly, a new model from Facebook called XLM-R suggests training on multiple languages at once could be more effective than optimizing for just one.

“In our initial results, XLM-R, which Facebook trained on 100 languages at once, outperformed a vanilla version of BERT trained for Swedish by a significant amount,” said Arpteg, whose team is preparing a paper on their analysis.

Nevertheless, the group hopes to deliver a first version of a Swedish BERT model that performs well before summer, said Arpteg, who headed an AI research group at Spotify before joining Peltarion three years ago.

An analysis by deepset of its German version of BERT.

In June, deepset released an open-source version of BERT optimized for German. Although its performance is only a couple of percentage points ahead of the original model, two winners of an annual NLP competition in Germany used the deepset model.

Right Tool for the Job

BERT also benefits from optimizations for specific tasks such as text classification, question answering and sentiment analysis, said Arpteg. Peltarion researchers plan to publish in 2020 the results of an analysis of the gains from tuning BERT for fields with their own vocabularies, such as medicine and law.
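
For a concrete taste of what that task setup looks like, here is a minimal text-classification sketch using the open-source Hugging Face Transformers library and deepset’s public German BERT checkpoint. It’s a generic illustration, not Peltarion’s or deepset’s production pipeline; the two-class head is randomly initialized and only becomes useful after fine-tuning:

    # Minimal sketch: sentiment-style text classification on top of a
    # German BERT checkpoint. Generic illustration only; the classifier
    # head is untrained until you fine-tune it on labeled data.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained("bert-base-german-cased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-german-cased", num_labels=2)  # e.g., positive/negative

    inputs = tokenizer("Das Produkt ist ausgezeichnet.", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    print(logits.softmax(dim=-1))  # class probabilities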

The question-answering task has become so strategic for deepset that it created Haystack, a version of its FARM transfer-learning framework built to handle the job.

In hardware, the latest NVIDIA GPUs are among the favorite tools both companies use to tame big NLP models. That’s not surprising given NVIDIA recently broke records lowering BERT training time.

“The vanilla BERT has 100 million parameters and XLM-R has 270 million,” said Arpteg, whose team recently purchased systems using NVIDIA Quadro and TITAN GPUs with up to 48GB of memory. It also has access to NVIDIA DGX-1 servers because “for training language models from scratch, we need these super-fast systems,” he said.

More memory is better, said Rusic, whose German BERT models weigh in at 400MB. Deepset taps into NVIDIA V100 Tensor Core GPUs on cloud services and uses another NVIDIA GPU locally.

Visual Effects Studio Pioneers RTX Servers to Boost Final-Frame Rendering

Creating cinematic-quality visuals has traditionally been a time-consuming, costly process.

But that’s changing. UNIT, a London-based visual effects company, is pioneering the use of NVIDIA RTX Servers to dramatically accelerate the rendering workflow behind its post-production, design, grade, audio and computer graphics work for film, TV and commercials.

In searching for the best rendering solution, UNIT tested Maxon’s Redshift renderer and found that, with NVIDIA Quadro RTX cards, its performance was far faster than CPU-based systems.

So UNIT custom-built a render server with eight NVIDIA Quadro RTX GPUs and updated each of its artists’ workstations with a Quadro RTX card. The RTX Servers streamline the entire operation, making GPUs easily interchangeable and accessible to all the artists.

“NVIDIA RTX delivers incredible rendering speed, creating a major paradigm shift for UNIT.” — Nuno Pereira, creative director at UNIT

UNIT Speeds Through Rendering with RTX

Rendering is an important part of the creative process, but throughput for production rendering can be a challenge. UNIT wanted to boost the production pipeline while maintaining the power required for creating rich, cinematic-quality visuals.

In one of the studio’s initial tests of the accelerated rendering performance of NVIDIA RTX Server, the rendering time dropped from 80 minutes on CPU down to 55 seconds.

“Some artists are even using final-frame renders to preview shots, since their render speed is so fast now,” said Nuno Pereira, creative director of 3D and head of VFX at UNIT. “With Quadro RTX, we get the fast, reliable performance we need to push creativity even further.”

This speed made it much easier to run more iterations on a project, so UNIT was able to reach near-final renders and images much more quickly. The render times for the canary scene below, featured in Sky’s Monday Night Football commercial, shrank from 50 minutes per frame down to 2 minutes.

Image courtesy of UNIT.

With Quadro RTX powering all its workstations, UNIT’s artists get the fast, reliable and high-quality performance they need to complete creative projects faster than before.

To learn more, sign up for our upcoming webinar, NVIDIA RTX Server for Today’s Mixed Workloads.

AWS Outposts Station a GPU Garrison in Your Datacenter

All the goodness of GPU acceleration on Amazon Web Services can now also run inside your own data center.

AWS Outposts powered by NVIDIA T4 Tensor Core GPUs are generally available starting today. They bring cloud-based Amazon EC2 G4 instances inside your data center to meet user requirements for security and latency in a wide variety of AI and graphics applications.

With this new offering, AI is no longer a research project.

Most companies still keep their data inside their own walls because they see it as their core intellectual property. But for deep learning to transition from research into production, enterprises need the flexibility and ease of development the cloud offers — right beside their data. That’s a big part of what AWS Outposts with T4 GPUs now enables.

With this new offering, enterprises can install a fully managed rack-scale appliance next to the large data lakes stored securely in their data centers.

AI Acceleration Across the Enterprise

To train neural networks, every layer of software needs to be optimized, from NVIDIA drivers to container runtimes and application frameworks. AWS services like SageMaker and Elastic MapReduce, along with many others built on custom Amazon Machine Images, support model development that starts with training on large datasets. With the introduction of NVIDIA-powered AWS Outposts, those services can now be run securely in enterprise data centers.

The GPUs in Outposts accelerate deep learning as well as high performance computing and other GPU applications. They all can access software in NGC, NVIDIA’s hub for GPU-optimized software, which is stocked with applications, frameworks, libraries and SDKs that include pre-trained models.

For AI inference, the NVIDIA EGX edge-computing platform also runs on AWS Outposts and works with the AWS Elastic Kubernetes Service. Backed by the power of NVIDIA T4 GPUs, these services are capable of processing orders of magnitude more information than CPUs alone. They can quickly derive insights from vast amounts of data streamed in real time from sensors in an Internet of Things deployment, whether it’s in manufacturing, healthcare, financial services, retail or any other industry.

On top of EGX, the NVIDIA Metropolis application framework provides building blocks for vision AI, geared for use in smart cities, retail, logistics and industrial inspection, as well as other AI and IoT use cases, now easily delivered on AWS Outposts.

Alternatively, the NVIDIA Clara application framework is tuned to bring AI to healthcare providers whether it’s for medical imaging, federated learning or AI-assisted data labeling.

The T4 GPU’s Turing architecture, paired with NVIDIA TensorRT, accelerates the industry’s widest set of AI models. Its Tensor Cores support multi-precision computing, delivering up to 40x more inference performance than CPUs.
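
As a sketch of how a trained model gets onto those Tensor Cores at reduced precision, the rough shape of building an FP16 engine from an ONNX file with TensorRT’s Python API looks like this. Exact calls vary across TensorRT versions, and “model.onnx” is a placeholder:

    # Illustrative sketch of building an FP16 TensorRT engine from an
    # ONNX model. The API differs across TensorRT versions; treat this
    # as an outline rather than a version-exact recipe.
    import tensorrt as trt

    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)

    with open("model.onnx", "rb") as f:        # placeholder model file
        parser.parse(f.read())

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.FP16)      # run on Tensor Cores at FP16
    engine = builder.build_engine(network, config)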

Remote Graphics, Locally Hosted

Users of high-end graphics have choices, too. Remote designers, artists and technical professionals who need to access large datasets and models can now get both cloud convenience and GPU performance.

Graphics professionals can benefit from the same NVIDIA Quadro technology that powers most of the world’s professional workstations, not only in the public AWS cloud but now also in their own internal cloud, with AWS Outposts packing T4 GPUs.

Whether they’re working locally or in the cloud, Quadro users can access the same set of hundreds of graphics-intensive, GPU-accelerated third-party applications.

The Quadro Virtual Workstation AMI, available in AWS Marketplace, includes the same Quadro driver found on physical workstations. It supports hundreds of Quadro-certified applications such as Dassault Systèmes SOLIDWORKS and CATIA; Siemens NX; Autodesk AutoCAD and Maya; ESRI ArcGIS Pro; and ANSYS Fluent, Mechanical and Discovery Live.

Learn more about AWS and NVIDIA offerings and check out our booth 1237 and session talks at AWS re:Invent.

MIT Students Kick Self-Driving Mini-Cars into High Gear with GPU-Powered Data Science Workstations

Students at the Massachusetts Institute of Technology are learning about autonomous driving by taking NVIDIA-powered data science workstations for a spin.

In an undergraduate robotics class at MIT, 17 students were organized into three teams and given a miniature racing car. Their task: teach it how to drive by itself through a complex course inside the basement of the university’s Stata Center.

Sertac Karaman, associate professor of aeronautics and astronautics at MIT, wanted to teach students the process of imitation learning, a technique that uses human demonstrations to train a self-driving model.

NVIDIA’s Jetson AGX Xavier and Quadro RTX-powered data science workstations deliver the accelerated computing capabilities that allow Karaman and his students to create a variety of AI-powered prototypes.

Students Wheel It in with Data Science Workstations

Through the process of imitation learning, the students needed to teach their car how to autonomously drive by training a TensorFlow neural network. But first, they needed to collect as much data as they could on the indoor course so the cars could learn how to navigate through the hallways and doors of the Stata Center.

Each car was equipped with an NVIDIA Jetson AGX Xavier embedded system-on-module for performance-driven autonomous machines. Using a joystick, the students manually drove the small car around the complex course and recorded data through a camera mounted on its front end.

Then the neural network, based on the NVIDIA PilotNet architecture, processed that data, learning how to map between observation and action — so the car could estimate steering angles based on what its camera sees.
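
NVIDIA published PilotNet’s layer layout in its “End to End Learning for Self-Driving Cars” paper, so a Keras sketch of a steering model in that style is straightforward. The input size and training details here are simplified assumptions, not the exact MIT class setup:

    # PilotNet-style steering regressor, following the layer layout in
    # NVIDIA's paper. Input size and preprocessing are simplified.
    import tensorflow as tf
    from tensorflow.keras import layers

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(66, 200, 3)),        # cropped camera frame
        layers.Lambda(lambda x: x / 127.5 - 1.0),  # normalize to [-1, 1]
        layers.Conv2D(24, 5, strides=2, activation="relu"),
        layers.Conv2D(36, 5, strides=2, activation="relu"),
        layers.Conv2D(48, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dense(50, activation="relu"),
        layers.Dense(10, activation="relu"),
        layers.Dense(1),                           # steering angle
    ])
    model.compile(optimizer="adam", loss="mse")    # regress recorded angles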

The students used the advanced computing capabilities of the data science workstations, powered by NVIDIA Quadro RTX GPUs, to train their TensorFlow models, which were then deployed on the miniature race cars for on-device AI inference.

The data science workstations provided massive speedups in performance to greatly reduce iteration times. This allowed the students to quickly train and test various models to find the best one for their race car.

“The students were successful in their projects because the time it took for training the models was faster than we’ve ever seen,” said Karaman. “The accelerated computing capabilities of NVIDIA data science workstations allowed the class to iterate multiple times, and the best performing race cars were trained in only a few minutes.”

Karaman plans to teach the robotics class once again this year, using the data science workstations and the pre-installed AI software stack.

Learn more about Quadro RTX-powered data science workstations.
