Non-Stop Shopping: Startup’s AI Lets Supermarkets Skip the Line

Eli Gorovici loves to take friends sailing on the Mediterranean. As the new pilot of Trigo, a Tel Aviv-based startup, he’s inviting the whole retail industry on a cruise to a future with AI.

“We aim to bring the e-commerce experience into the brick-and-mortar supermarket,” said Gorovici, who joined the company as its chief business officer in May.

The journey starts with the sort of shopping anyone who’s waited in a long checkout line has longed for.

You fill up your bags at the market and just walk out. Magically, the store knows what you bought, bills your account and sends you a digital receipt, all while preserving your privacy.

Trigo is building that experience and more. Its magic is an AI engine linked to cameras and a few weighted shelves for small items a shopper’s hand might completely cover.

With these sensors, Trigo builds a 3D model of the store. Neural networks recognize products customers put in their bags.

When shoppers leave, the system sends the grocer the tally and a number it randomly associated with them when they chose to swipe their smartphone as they entered the store. The grocer matches the number with a shopper’s account, charges it and sends off a digital bill.
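The pseudonymous billing flow described above can be sketched in a few lines. This is a hypothetical illustration of the general pattern, not Trigo’s actual protocol; all names and data structures here are invented.

```python
import secrets

class Store:
    """Store side: sees only a random token, never the shopper's identity."""
    def __init__(self):
        self.active_tokens = {}  # token -> list of (sku, price)

    def shopper_enters(self):
        # A random token is assigned when the shopper swipes in.
        token = secrets.token_hex(16)
        self.active_tokens[token] = []
        return token

    def item_detected(self, token, sku, price):
        self.active_tokens[token].append((sku, price))

    def shopper_leaves(self, token):
        # The store hands the tally and token to the grocer's billing system.
        items = self.active_tokens.pop(token)
        return token, sum(price for _, price in items)

class Grocer:
    """Grocer side: the only party that can map a token to a real account."""
    def __init__(self):
        self.token_to_account = {}

    def register(self, token, account_id):
        # Linked when the shopper swipes their smartphone at the entrance.
        self.token_to_account[token] = account_id

    def bill(self, token, total):
        return self.token_to_account[token], total

store, grocer = Store(), Grocer()
token = store.shopper_enters()
grocer.register(token, "account-42")
store.item_detected(token, "pasta-500g", 1.99)
store.item_detected(token, "olive-oil", 6.49)
token, total = store.shopper_leaves(token)
account, charge = grocer.bill(token, total)
print(account, round(charge, 2))  # account-42 8.48
```

The point of the split is that the computer-vision system never needs to know who is shopping, only which token the items belong to.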

And that’s just the start.

An Online Experience in the Aisles

Shoppers get the same personalized recommendation systems they’re used to seeing online.

“If I’m standing in front of pasta, I may see on my handset a related coupon or a nice Italian recipe tailored for me,” said Gorovici. “There’s so much you can do with data, it’s mind blowing.”

The system lets stores fine-tune their inventory management systems in real time. Typical shrinkage rates from shoplifting or human error could sink to nearly zero.

AI Turns Images into Insights

Making magic is hard work. Trigo’s system gathers a petabyte of video data a day for an average-size supermarket.

It uses as many as four neural networks to process that data at mind-melting rates of up to a few hundred frames per second. (By contrast, your TV displays high-definition movies at 60 fps.)

Trigo used a dataset of up to 500,000 2D product images to train its neural networks. In daily operation, the system uses those models to run millions of inference tasks with help from NVIDIA TensorRT software.

The AI work requires plenty of processing muscle. A supermarket outside London testing the Trigo system uses servers in its back room with 40-50 NVIDIA RTX GPUs. To boost efficiency, Trigo plans to deliver edge servers using NVIDIA T4 Tensor Core GPUs and join the NVIDIA Metropolis ecosystem starting next year.

Trigo got early access to the T4 GPUs thanks to its participation in NVIDIA Inception, a program that gives AI startups traction with tools, expertise and go-to-market support. The program also aims to introduce Trigo to NVIDIA’s retail partners in Europe.

In 2021, Trigo aims to move some of the GPU processing to Google, Microsoft and other cloud services, keeping some latency- or privacy-sensitive uses inside the store. It’s the kind of distributed architecture businesses are just starting to adopt, thanks in part to edge computing systems such as NVIDIA’s EGX platform.

Big Supermarkets Plug into AI

Tesco, the largest grocer in the U.K., has plans to open its first market using Trigo’s system. “We’ve vetted the main players in the industry and Trigo is the best by a mile,” said Tesco CEO Dave Lewis.

Israel’s largest grocer, Shufersal, is also piloting Trigo’s system, as are other retailers around the world.

Trigo was founded in 2018 by brothers Michael and Daniel Gabay, leveraging tech and operational experience from their time in elite units of the Israeli military.

Seeking his next big opportunity in his field of video technology, Gorovici asked friends who were venture capitalists for advice. “They said Trigo was the future of retail,” Gorovici said.

Like sailing in the aqua-blue Mediterranean, AI in retail is a compelling opportunity.

“It’s a trillion-dollar market — grocery stores are among the biggest employers in the world. They are all being digitized, and selling more online now given the pandemic, so maybe this next stage of digital innovation for retail will now move even faster,” he said.

Taiwanese Supercomputing Center Advances Real-Time Rendering from the Cloud with NVIDIA RTX Server and Quadro vDWS

As the stunning visual effects in movies and television advance, so do audience expectations for ever more spectacular and realistic imagery.

The National Center for High-performance Computing, home to Taiwan’s most powerful AI supercomputer, is helping video artists keep up with increasing industry demands.

NCHC delivers computing and networking platforms for filmmakers, content creators and artists. To provide them with high-quality, accelerated rendering and simulation services, the center needed some serious GPU power.

So it chose the NVIDIA RTX Server, including Quadro RTX 8000 and RTX 6000 GPUs and NVIDIA Quadro Virtual Data Center Workstation (Quadro vDWS) software, to bring accelerated rendering performance and real-time ray tracing to its customers.

NVIDIA GPUs and VDI: Driving Force Behind the Scenes

One of NCHC’s products, Render Farm, is built on NVIDIA Quadro RTX GPUs with Quadro vDWS software. It provides users with real-time rendering for high-resolution image processing.

A cloud computing platform, Render Farm enables users to rapidly render large 3D models. Its efficiency is stunning: it can reduce the time needed for opening files from nearly three hours to only three minutes.

“Last year, a team from Hollywood that reached out to us for visual effects production anticipated spending three days working on scenes,” said Chia-Chen Kuo, director of the Arts Technology Computing Division at NCHC. “But with the Render Farm computing platform, it only took one night to finish the work. That was far beyond their expectations.”

NCHC also aims to create a powerful cloud computing environment that can be accessed by anyone around the world. Quadro vDWS technology plays an important role in allowing teams to collaborate in this environment and makes its HPC resources widely available to the public.

With the rapid growth of data, physical hardware systems can’t keep up with data size and complexity. But Quadro vDWS technology makes it easy and convenient for anyone to securely access data and applications from anywhere, on any device.

Using virtual desktop infrastructure, NCHC’s Render Farm can provide up to 100 virtual workstations so users can do image processing at the same time. They only need a Wi-Fi or 4G connection to access the platform.

VMware vSphere and Horizon technology is integrated into Render Farm to provide on-demand virtual remote computing platform services. This virtualizes the HPC environment through NVIDIA virtual GPU technology and reduces by 10x the time required for redeploying the rendering environment. It also allows flexible switching between Windows and Linux operating systems.

High-Caliber Performance for High-Caliber Performers 

Over 200 video works have already been produced with NCHC’s technology services.

NCHC recently collaborated with acclaimed Taiwanese theater artist Huang Yi for one of his most popular productions, Huang Yi and KUKA. The project, which combined modern dance with visual arts and technology, has been performed in over 70 locations worldwide, including the Cloud Gate Theater in northwest Taipei, the Ars Electronica Festival in Austria and the TED Conference in Vancouver.

During the performance, Huang danced in coordination with his robot companion KUKA, whose arm carried a camera to capture the dance movements. Those images were sent to the NCHC Render Farm in Taichung, 170 km away, to be processed in real time before being projected back to the robot on stage — with less than one second of end-to-end latency.

“I wanted to thoroughly immerse audiences in the performance so they can sense the flow of emotions. This requires strong and stable computing power,” said Huang. “NCHC’s Render Farm, powered by NVIDIA GPUs and NVIDIA virtualization technology, provides everything we need to animate the robot: exceptional computing power, extremely low latency and the remote access that you can use whenever and wherever you are.”

LeaderTek, a 3D scanning and measurement company, also uses NCHC services for image processing. With 3D and cloud rendering technology, LeaderTek is helping the Taiwan government archive historic monuments through creating advanced digital spatial models.

“Adopting Render Farm’s cloud computing platform helps us take a huge leap forward in improving our workflows,” said Hank Huang, general manager at LeaderTek. “The robust computing capabilities with NVIDIA vGPU for Quadro Virtual Workstations is also crucial for us to deliver high-quality images in a timely manner and get things done efficiently.”

Watch Huang Yi’s performance with KUKA below. And learn more about NVIDIA Quadro RTX and NVIDIA vGPU.

The post Taiwanese Supercomputing Center Advances Real-Time Rendering from the Cloud with NVIDIA RTX Server and Quadro vDWS appeared first on The Official NVIDIA Blog.

Top Content Creation Applications Turn ‘RTX On’ for Faster Performance

Whether tackling complex visualization challenges or creating Hollywood-caliber visual effects, artists and designers require powerful hardware to create their best work.

The latest application releases from Foundry, Chaos Group and Redshift by Maxon provide advanced features powered by NVIDIA RTX so creators can experience faster ray tracing and accelerated performance to elevate any design workflow.

Foundry Delivers New Features in Modo and Nuke

Foundry recently hosted Foundry LIVE, a series of virtual events where they announced the latest enhancements to their leading content creation applications, including NVIDIA OptiX 7.1 support in Modo.

Modo is Foundry’s powerful and flexible 3D modeling, texturing and rendering toolset. By upgrading to OptiX 7.1 in the mPath renderer, Version 14.1 delivers faster rendering, denoising and real-time feedback with up to 2x the memory savings on the GPU for greater flexibility when working with complex scenes.

Earlier this week, the team announced Nuke 12.2, the latest version of Foundry’s compositing, editorial and review tools. Introduced in Nuke 12.1, the NukeX Cara VR toolset for working with 360-degree video, along with Nuke’s SphericalTransform and Bilateral nodes, takes advantage of new GPU-caching functionality to deliver significant improvements in viewer processing and rendering. The GPU-caching architecture is also available to developers creating custom GPU-accelerated tools using BlinkScript.

“Moving mPath to OptiX 7.1 dramatically reduces render times and memory usage, but the feature I’m particularly excited by is the addition of linear curves support, which now allows mPath to accelerate hair and fur rendering on the GPU,” said Allen Hastings, head of rendering at Foundry.

Image courtesy of Foundry, model supplied by Aaron Sims Creative

NVIDIA Quadro RTX GPUs combined with Dell Precision workstations provide the performance, scalability and reliability to help artists and designers boost productivity and create amazing content faster than before. Learn more about how Foundry members in the U.S. can receive exclusive discounts and save on all Dell desktops, notebooks, servers, electronics and accessories.

Chaos Group Releases V-Ray 5 for Autodesk Maya

Chaos Group will soon release V-Ray 5 for Autodesk Maya, with a host of new GPU-accelerated features for lighting and materials.

Using LightMix in the new V-Ray Frame Buffer allows artists to freely experiment with lighting changes after they render, save out permutations and push back improvements in scenes. The new Layer Compositor allows users to fine-tune and finish images directly in the V-Ray frame buffer — without the need for a separate post-processing app.
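LightMix-style relighting after the render is possible because rendering is linear in light intensity: the beauty image is a weighted sum of per-light render elements, so lights can be re-weighted without re-rendering. A minimal sketch of that recombination, using invented image data rather than V-Ray’s actual API:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical per-light render elements (one HDR image per light source).
key_light = rng.random((8, 8, 3))
fill_light = rng.random((8, 8, 3)) * 0.3
sky_light = rng.random((8, 8, 3)) * 0.1

def remix(weights, elements):
    """Recombine per-light render elements with new intensity multipliers."""
    return sum(w * e for w, e in zip(weights, elements))

# The original beauty pass is just all elements at weight 1.0.
original = remix([1.0, 1.0, 1.0], [key_light, fill_light, sky_light])

# Dim the fill light and double the sky — no re-render needed.
adjusted = remix([1.0, 0.5, 2.0], [key_light, fill_light, sky_light])
print(adjusted.shape)  # (8, 8, 3)
```

Because the recombination is a per-pixel weighted sum, it runs at interactive rates, which is what makes experimenting with lighting in the frame buffer practical.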

“V-Ray 5 for Maya brings tremendous advancements for Maya artists wanting to improve their efficiency,” said Phillip Miller, vice president of product management at Chaos Group. “In addition, every new feature is supported equally by V-Ray GPU which can utilize RTX acceleration.”

V-Ray 5 for Maya image for the Nissan GTR. Image courtesy of Millergo CG.

V-Ray 5 also adds support for out-of-core geometry when rendering with NVIDIA CUDA, improving performance for artists and designers working with scenes too large to fit into GPU memory.

V-Ray 5 for Autodesk Maya will be generally available in early August.

Redshift Brings Faster Ray Tracing, Bigger Memory

Maxon hosted The 3D and Motion Design Show this week, where they demonstrated Redshift 3.0 with OptiX 7 ray-tracing acceleration and NVLink for both geometry and textures.

Additional features of Redshift 3.0 include:

  • General performance improvements of 30 percent or more
  • Automatic sampling so users no longer need to manually tweak sampling settings
  • Maxon shader noises for all supported 3D apps
  • Hydra/Solaris support
  • Deeper traces and nested shader blending for even more visually compelling shaders

“Redshift 3.0 incorporates NVIDIA technologies such as OptiX 7 and NVLink. OptiX 7 enables hardware ray tracing so our users can now render their scenes faster than ever. And NVLink allows the rendering of larger scenes with less or no out-of-core memory access — which also means faster render times,” said Panos Zompolas, CTO at Redshift Rendering Technologies. “The introduction of Hydra and Blender support means more artists can join the ever growing Redshift family and render their projects at an incredible speed and quality.”

Redshift 3.0, which will soon add OSL and Blender support, is currently available to licensed customers, with general availability coming soon.

All registered participants of The 3D and Motion Design Show will be automatically entered for a chance to win an NVIDIA Quadro RTX GPU. See all prizes here.

Check out other RTX-accelerated applications that help professionals transform design workflows. And learn more about how RTX GPUs are powering high-performance NVIDIA Studio systems built to handle the most demanding creative workflows.

For developers looking to get the most out of RTX GPUs, learn more about integrating OptiX 7 into applications.


Featured blog image courtesy of Foundry.


Floating on Creativity: SuperBlimp Speeds Rendering Workflows with NVIDIA RTX GPUs

Rendering is a critical part of the design workflow. But as audiences and clients expect ever higher-quality graphics, agencies and studios must tap into the latest technology to keep up with rendering needs.

SuperBlimp, a creative production studio based just outside of London, knew there had to be a better way to achieve the highest levels of quality in the least amount of time. They’re leaving CPU rendering behind and moving to NVIDIA RTX GPUs, bringing significant acceleration to the rendering workflows for their unique productions.

After migrating to full GPU rendering, SuperBlimp experienced accelerated render times, making it easier to complete more iterations on their projects and develop creative visuals faster than before.

Blimping Ahead of Rendering With RTX

Because SuperBlimp is a small production studio, they needed the best performance at a low cost, so they turned to NVIDIA GeForce RTX 2080 Ti GPUs.

SuperBlimp had been using NVIDIA GPUs for the past few years, so they were already familiar with the power and performance of GPU acceleration. But they always had one foot in the CPU camp and needed to constantly switch between CPU and GPU rendering.

However, CPU render farms required too much storage space and took too much time. When SuperBlimp finally embraced full GPU rendering, they found RTX GPUs delivered the level of computing power they needed to create 3D graphics and animations on their laptops at a much quicker rate.

Powered by NVIDIA Turing, the most advanced GPU architecture for creators, RTX GPUs provide dedicated ray-tracing cores to help users speed up rendering performance and produce stunning visuals with photorealistic details.

And with NVIDIA Studio Drivers, the artists at SuperBlimp are achieving the best performance on their creative applications. NVIDIA Studio Drivers undergo extensive testing against multi-app creator workflows and multiple revisions of top creative applications, including Adobe Creative Cloud, Autodesk and more.

For one of their recent projects, an award-winning short film titled Playgrounds, SuperBlimp used Autodesk Maya for 3D modeling and Chaos Group’s V-Ray GPU software for rendering. V-Ray enabled the artists to create details that helped produce realistic surfaces, from metallic finishes to plastic materials.

“With NVIDIA GPUs, we saw render times reduce from 3 hours to 15 minutes. This puts us in a great position to create compelling work,” said Antonio Milo, director at SuperBlimp. “GPU rendering opened the door for a tiny studio like us to design and produce even more eye-catching content than before.”

Image courtesy of SuperBlimp.

Now, SuperBlimp renders their projects using NVIDIA GeForce RTX 2080 Ti and GTX 1080 Ti GPUs to bring incredible speeds for rendering, so their artists can complete creative projects with the powerful, flexible and high-quality performance they need.

Learn how NVIDIA GPUs are powering the future of creativity.


Productivity in Full Display: Quadro View Helps Professionals Optimize Desktop Workspaces

It’s time to put your comfort zone on display.

NVIDIA Quadro View, available now, enhances workspaces and boosts productivity by enabling professionals to manage their displays and arrange their workspaces in a way that best suits them.

Workspaces are vital to productivity because they set the tone for how people work. More and more, professionals across industries are tackling demanding workflows that require working simultaneously across multiple applications and windows.

They’re using more ultra-widescreen monitors, curved displays and multi-monitor setups to see all they need in one view. They’re also increasingly personalizing their workspace layout — from their desk space to the windows on screen — for easy access to the things they need.

The latest application in the NVIDIA Quadro Experience platform, Quadro View helps streamline workflows with a suite of desktop management tools to deliver maximum flexibility and control over displays.

With Quadro View, included with Quadro drivers or downloadable as a standalone app, users can easily customize their desktop layout, gaining full control over their displays so they can work more efficiently.

A View Designed to Fit the Way You Work

Quadro Experience provides users a range of productivity tools to choose from, including screen capture and desktop recording, so they can simplify time-consuming tasks.

Quadro View, easily launched from Quadro Experience, lets multitaskers streamline productivity even further through features like tailored workspaces with easy navigation, compatibility with top software applications, and powerful window management and deployment tools for a personalized desktop experience.

“Adjusting and organizing windows in my screen is a boring and often time-consuming daily task that takes time away from my work, especially when something changes in my workflow,” said Luis Paulo F. Mesquita, global macro financial portfolio manager at Venturestar Capital Management. “Quadro View’s simple drag-and-drop process saves precious minutes in my morning so I can get up and running in no time.”

Quadro View provides advanced capabilities that allow users to:

  • Divide workspaces using display gridlines and arrange applications into regions on monitors, also known as window snapping.
  • Save profiles based on personal workflows to deploy preset desktop and application configurations.
  • Specify how windows operate on desktops or display devices with advanced windows management.
  • Set up hotkeys to trigger actions and quickly access common functions.
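The window snapping the first bullet describes boils down to rounding a window’s position to the nearest grid cell. A toy sketch of that behavior, with a hypothetical grid model rather than Quadro View’s actual implementation:

```python
def snap_to_grid(window, screen, cols, rows):
    """Snap a window rect (x, y, w, h) to the nearest cell of a cols x rows grid."""
    sw, sh = screen
    cell_w, cell_h = sw / cols, sh / rows
    x, y, w, h = window
    # Snap the window's origin to the nearest grid line...
    col = min(max(round(x / cell_w), 0), cols - 1)
    row = min(max(round(y / cell_h), 0), rows - 1)
    # ...and resize it to fill that cell.
    return (int(col * cell_w), int(row * cell_h), int(cell_w), int(cell_h))

# A window dragged near the top-right cell of a 2x2 grid on a 1920x1080 screen:
print(snap_to_grid((1000, 30, 640, 480), (1920, 1080), 2, 2))  # (960, 0, 960, 540)
```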

Quadro View is available to download as a standalone application, and as a part of the latest Quadro Optimal Driver for Enterprise Release 450 U1, which includes advanced Quadro display features, added support for Windows 10 and new studio application updates.


NASA’s Day in the Sun: Space Agency Speeds Analysis of Solar Images by 150x Using Data Science Workstations

NASA is using Quadro RTX GPUs to shine some sunlight on data analytics.

The U.S. space agency’s Solar Dynamics Observatory collects images of the sun to help scientists and researchers gain insight into the different types of solar variations and how they affect life on Earth.

This data is a valuable asset for the research community, but with more than 18 petabytes of images collected, analyzing this information is a massive challenge.

With Quadro RTX-powered Z by HP data science workstations, however, the NASA team can easily sort through the data and analyze images up to 150x faster than on CPUs.

NASA’s Big Data Challenge

The observatory collects data by taking images of the sun every 1.3 seconds. Researchers have developed an algorithm that removes errors from the images, such as bad pixels, then places them into an archive that’s growing every day.

The algorithm is highly accurate, but with nearly 20 petabytes of images, billions of pixels have been misclassified as errors. So the NASA team needed to comb through 150 million error files, containing about 100 billion individual detections in all, and find a way to sort and label the good pixels versus the bad ones.
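At its core, this triage is a vectorized filter over billions of records. A toy NumPy sketch with invented fields and criteria (NASA’s actual rules aren’t public); swapping NumPy for the API-compatible CuPy is how a filter like this moves onto the GPU:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # small stand-in for the ~100 billion detections in the archive

# Each flagged detection: its intensity and how often that pixel location recurs.
intensity = rng.normal(loc=1.0, scale=0.3, size=n)
recurrence = rng.integers(1, 100, size=n)

# A hypothetical rule: a flag is a *real* bad pixel if it recurs frequently
# or its intensity is a strong outlier; otherwise it was likely misclassified.
is_bad = (recurrence > 50) | (np.abs(intensity - 1.0) > 0.9)
good = int(np.count_nonzero(~is_bad))

print(f"{good} of {n} flagged detections reclassified as good")
```

Because every element is tested independently, the work parallelizes trivially, which is why the GPU version runs orders of magnitude faster than a threaded CPU loop.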

With conventional computing, it was nearly impossible — using a CPU would take up to a few years to see any results. Even with the best multi-threaded CPU algorithm they could create, it would still take about a year to compute and analyze all the data.

“For scientists, a year still wouldn’t be enough time because we like to explore and iterate the results we find,” said Raphael Attie, solar astronomer at NASA’s Goddard Space Flight Center. “Even with one year of computation, it would still take us up to 10 years to find concrete results.”

Needing results in a much shorter time frame, the NASA team started looking at the parallel processing capabilities of NVIDIA GPUs.

Big Data Gets a Bigger Solution

Supercomputing resources at NASA are heavily restricted: researchers must specify how much computing capacity they require and for how long. That becomes challenging when a team doesn’t yet know how much capacity it will need to experiment with massive amounts of data.

But with the Z by HP data science workstations powered by two Quadro RTX 8000 GPUs, the NASA researchers were able to get supercomputing resources right at their desks. They started to explore the project using big data analytics techniques and using NVIDIA’s accelerated computing libraries to fully unlock the power of NVIDIA GPUs.

The data science workstations allowed the team to analyze the images and achieve results in less than a week.

“The data science workstations completely changed the field of possibility for us,” said Michael Kirk, research astrophysicist at NASA. “These computations that previously weren’t imaginable, we can now do 10-150x faster than we thought possible.”

The NASA team conducts a broad range of work, leveraging AI, machine learning and data analytics to learn the sun’s secrets. Most of their data science workflows are based in Python, using TensorFlow, Dask, CuPy and other libraries for heavy data processing; pandas, RAPIDS and cuDF for statistical exploration; and a variety of 2D and 3D visualization tools.
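As a minimal illustration of that stack’s pattern, here is a hypothetical pandas exploration over per-detection metadata (the columns are invented). cuDF mirrors the pandas API, so code like this ports to the GPU nearly line for line:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Invented per-detection records: which file each came from, its intensity,
# and whether the error-removal algorithm flagged it.
detections = pd.DataFrame({
    "file_id": rng.integers(0, 100, size=10_000),
    "intensity": rng.normal(1.0, 0.3, size=10_000),
    "flagged": rng.random(10_000) < 0.02,
})

# Group-wise statistics over millions of rows is exactly where GPU
# dataframes such as cuDF shine.
summary = (detections.groupby("file_id")
                     .agg(mean_intensity=("intensity", "mean"),
                          flagged_count=("flagged", "sum")))
print(summary.head(3))
```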

With the data science workstations, the team can utilize the power of GPUs to enhance their analytics workflows, allowing the researchers to explore and iterate calculations to get quicker results.

Once the NASA team finishes filtering and analyzing the current data, their next step is to verify the pixels initially marked as good, validating the entire dataset.

A Change of Space for GPUs

In AI and big data analytics, projects can be severely impacted by non-responsive workflows in cloud environments. These interruptions break momentum, productivity and motivation in the long run. That’s why Attie recommends a local GPU-powered workstation or laptop with enough memory to hold a subset of the data for comfortable prototyping.

“I find that a necessary condition for a responsive workflow is to have the input data rapidly accessible by your GPU devices,” said Attie. “If it’s not possible to have the data locally in the same machine as the GPU device, the network needs to be very fast and resilient, as AI applications often need fast access to the data.”

The results of Attie and Kirk’s projects get shared through publications and specialized journals. During seminars and conferences, they’ll have discussions with colleagues and deliver presentations on how they obtain data with specific frameworks or customized codes. And as more people are working from home, the NASA team is getting more familiar with remote tools to connect with others and share findings from their latest projects.

Dive deeper into this work in the webinar with NASA here.

Learn more about NVIDIA data science workstations.

Featured image, courtesy of NASA, is of the sun from SDO on May 16, 2012.


NVIDIA RTX Accelerates New AI Capabilities in Adobe Substance Alchemist and Blender

The creative workflows of artists and designers everywhere are getting a serious boost from AI and NVIDIA RTX GPUs.

The latest releases of Substance Alchemist and Blender are introducing AI-powered features, like denoising and material creation, bringing the power of AI to millions of content creators around the world.

Substance Alchemist Uses Images to Create Realistic Materials

Substance Alchemist, Adobe’s material creation tool, received an AI upgrade to its Image to Material feature today. The feature allows artists to capture a photo of real-world surfaces and create textures for 3D that can be used in content creation. It’s a powerful, faster and easier way to create realistic materials.

Image to Material is accelerated by NVIDIA RTX GPUs to run an AI algorithm that recognizes shapes and objects in a photograph. It automatically generates a higher quality, accurate texture map that includes base color, height and normal, allowing artists to focus on creativity instead of time-consuming map refinement.
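One of the steps such a tool automates is deriving a tangent-space normal map from a height map, which can be done with finite-difference gradients. A minimal sketch of just that geometric step (the AI feature does far more, such as de-lighting and shape recognition):

```python
import numpy as np

def height_to_normals(height, strength=1.0):
    """height: 2D array in [0, 1] -> (H, W, 3) unit normals."""
    # Finite-difference slope of the surface in each direction.
    dy, dx = np.gradient(height.astype(np.float64))
    # The normal is perpendicular to the surface gradient.
    nx, ny, nz = -dx * strength, -dy * strength, np.ones_like(height, dtype=np.float64)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    return np.stack([nx / length, ny / length, nz / length], axis=-1)

# A flat height map yields normals pointing straight up, (0, 0, 1), everywhere.
flat = np.zeros((4, 4))
print(height_to_normals(flat)[0, 0])
```

In a real pipeline the result would be remapped from [-1, 1] to [0, 255] and saved as the blue-tinted normal-map texture artists are used to seeing.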

The end result is a more precise material, created with the click of a button.

“The power of AI now allows artists to generate high-quality digital materials in minutes, from a single photograph,” said Baptiste Manteau, product manager at Adobe. “This is an exciting next step for Substance Alchemist, bringing the physical world into the digital material creation process.”

The AI-powered Image to Material feature replaces Bitmap 2 Material (B2M) for importing photographs into Alchemist. Without the power of AI, B2M struggled to properly identify shapes in the image, resulting in texture maps that required artists to spend additional time manually refining, tweaking and de-lighting.

Quality comparison between “bitmap 2 material” and AI-powered “image to material.”

“Designers are always working against the clock,” said Daniel Margunato, co-owner and ArchViz artist at Oneblock City. “With Alchemist’s Image to Material running on a Quadro RTX 5000, I am able to load, analyze and render incredibly detailed images nearly instantaneously. I can dedicate the enormous time savings to fine tuning and elevating my designs – it’s a huge luxury.”

Blender Accelerates Viewport Interactivity with AI Denoising

With Blender 2.83, released earlier this month, artists can incorporate AI-powered denoising — a process that predicts final images from partially rendered results — while they’re designing in the viewport. This allows users to explore new ideas and quickly iterate on design choices with full confidence in how it will turn out.
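The intuition behind viewport denoising can be sketched numerically: a path-traced pixel is a Monte Carlo average, so an early, low-sample render is noisy, and a denoiser predicts the converged image from that partial result. In this toy sketch a crude 3x3 box filter stands in for the OptiX AI denoiser:

```python
import numpy as np

rng = np.random.default_rng(1)
truth = np.full((32, 32), 0.5)  # the fully converged image

def partial_render(samples):
    # Each path-traced sample is the true radiance plus shot noise;
    # averaging more samples converges slowly (error ~ 1/sqrt(samples)).
    noise = rng.normal(0.0, 0.2, size=(samples,) + truth.shape)
    return (truth + noise).mean(axis=0)

def box_denoise(img):
    # Average each pixel with its 8 neighbors (edges handled by padding).
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    return sum(padded[i:i+h, j:j+w] for i in range(3) for j in range(3)) / 9

def rmse(img):
    return float(np.sqrt(np.mean((img - truth) ** 2)))

noisy = partial_render(samples=4)  # an early, interactive preview
print(f"noisy RMSE: {rmse(noisy):.4f}, denoised RMSE: {rmse(box_denoise(noisy)):.4f}")
```

A learned denoiser preserves edges and detail far better than this blur, but the payoff is the same: a usable preview from a fraction of the samples a clean render would need.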

Image created by Efflam Mercier.

AI denoising in Blender is based on the NVIDIA OptiX SDK and accelerated by the AI capabilities of RTX GPUs. It builds on the RTX-accelerated ray tracing introduced in Blender Cycles 2.81, and the improved speed of final frame rendering found in version 2.82.

This blistering rate of updates underscores the enthusiasm of the Blender open source development community for the game-changing advancements that RTX acceleration brings to 3D artists and designers.

“We’ve worked closely with NVIDIA to continually boost the rendering performance of Blender Cycles,” said Dalai Felinto, development manager at Blender. “Together we completed an entirely new backend for Cycles, integrating NVIDIA OptiX to achieve optimal ray-tracing performance using NVIDIA RTX GPUs. This will result in huge time savings and greater creative freedom for our users.”

Watch how the powerful combination of RTX-accelerated ray tracing and AI denoising in the viewport makes 3D creation interactive, so artists can focus their attention on exploring new ideas to get to that perfect shot.

Here is what users are saying about OptiX denoising in Blender:

“If you need a single stylized shot or clean production-quality rendering, OptiX denoising is a huge time-saver,” said Nkoro Anselem, aka askNK, a CGI artist and instructor. “You can play with light and movement and get real-time feedback. It will change how people work with Blender.”

“NVIDIA’s OptiX upgrade for Blender/Cycles is a fantastic speed boost for viewport and final render times,” said Ben Mauro, senior concept designer at 343 Industries. “The AI-based denoising has been an incredible asset and time-saver in my work, where every second counts to meet deadlines.”

The latest Blender and Alchemist releases join a growing list of the most popular creative applications that incorporate AI-powered features into the content creation workflow. Others include Adobe Premiere Pro and Lightroom, Autodesk VRED, Blackmagic Design DaVinci Resolve, Chaos Group V-Ray, CorelDRAW Graphics Suite and Luxion Keyshot. These apps are part of a paradigm shift and create big time savings for content creators.

To get the optimal experience with the latest applications releases, use the latest NVIDIA Studio Driver or Quadro ODE Driver. They can be downloaded through the GeForce Experience or Quadro Experience apps, or from the driver download page.

Learn more about other NVIDIA RTX-enabled applications.


Filmmaker Achieves Razer-Sharp Graphics and Real-Time Rendering with NVIDIA Studio Laptop

Film studios worldwide have embraced real-time technology to create stunning graphics faster than before.

Now independent creative professionals are giving some love to these same capabilities that allow them to design amazing visuals wherever they go, thanks to powerful NVIDIA RTX Studio laptops and mobile workstations.

Filmmaker Hasraf “HaZ” Dulull writes, directs and produces TV, film and animation projects. He’s worked on titles such as indie sci-fi film The Beyond on Netflix, and directed episodes for Disney’s Fast Layne miniseries. More recently, he’s been working on Battlesuit, a pilot episode for an animated series based on the graphic novel Theory.

When I saw the quality of real-time visuals, I thought to myself: with a little love, real-time ray tracing, and a powerful RTX GPU, you can make really cool animated content at a cinematic level.
—  HaZ Dulull, filmmaker

With only four months and a small team to put this project together, Dulull used a Quadro RTX-powered Razer Blade 15 Studio Edition laptop to harness the power of real-time ray tracing and produce the entire animated episode in Unreal Engine.

Quadro RTX Brings the Mobile Power

Dulull’s main workspace is his home studio, which he also uses to connect to his team remotely. But for the pilot, he needed something compact that still delivered impressive graphics power and allowed him to use virtual production filmmaking techniques for his animations.

The Quadro RTX 5000 graphics card built into the svelte Razer Blade 15 Studio Edition laptop provided Dulull with the flexibility, speed and power to design and test shot ideas quickly and easily.

One scene required a gripping action sequence, so Dulull used DragonFly by Glassbox as a virtual camera. Built for Unreal Engine, DragonFly allowed Dulull to take the animations he receives from his team and quickly shoot different takes to achieve the final result.

DragonFly connected directly via Wi-Fi to his Razer laptop, where the animations played back in real time and streamed to his camera viewport.

Dulull constantly pushed the laptop with scenes that included demanding amounts of geometry, effects and lighting. A standard VFX shot runs about 100 frames, but some of Dulull’s scenes were up to 700 frames long. With the Quadro RTX, he employed real-time rendering and produced final graphics in Unreal Engine.

“Traditional CGI animation can be expensive because it includes multiple workflow steps in the process, including rendering several passes and compositing them afterwards,” said Dulull. “But with the Quadro RTX, I was able to use real-time rendering and create every single shot with final pixels in Unreal Engine. There was no need for compositing or post-processing, apart from color grading, because what you see is what you get.”

More Time for Style and Iterations

When working on a proof of concept or a pilot like Battlesuit, filmmakers need to get the animations and graphics to a level that is acceptable for networks and studios. Traditionally, teams will create storyboards or mockups to get an idea of what the style of the project will look like.

But with GPU ray tracing, filmmakers can find the style in real time. For Battlesuit, Dulull went into Unreal Engine and explored different styles on his own. He didn’t need to rely on other artists to create shaders or mood boards, or to send notes or comments back and forth.

“While we’re in the film, we were able to play around with style and find the look we wanted to go for,” said Dulull. “It was easy to explore and change the look in real time, and this is only possible through NVIDIA GPUs.”

The ability to make changes quickly also allowed Dulull to iterate as much as he wanted, since he didn’t have to wait for scenes or images to render. He could make revisions and tweak shots up until the final product was delivered.

With the power of real-time ray tracing at his fingertips, Dulull pushed the graphics quality as far as he wanted and quickly achieved the animated film he envisioned, without making any creative compromises.

Battlesuit is available to watch here.

Learn more about Quadro RTX and other NVIDIA solutions that are helping professionals work from home.

The post Filmmaker Achieves Razer-Sharp Graphics and Real-Time Rendering with NVIDIA Studio Laptop appeared first on The Official NVIDIA Blog.

COVID Caught on Camera: Startup’s Sensors Keep Hospitals Safe

Andrew Gostine’s startup aims to make hospitals more efficient, but when the coronavirus hit Chicago he pivoted to keeping them safer, too.

Gostine is a critical-care anesthesiologist at Northwestern Medicine’s 105-bed Lake Forest hospital, caring for 60 COVID-19 patients. He’s also the CEO of Whiteboard Coordinator Inc., a startup that had a network of 400 cameras and other sensors deployed across Northwestern’s 10 hospitals before the pandemic.

After the virus arrived, “the hospital said it was having a hard time screening people coming in for COVID-19 using conventional temperature probes, and asked if we could help,” he said.

Ten days later, the startup had thermal cameras linked to its network installed at 31 entrances to the hospitals. They detect about a dozen cases of fever in the 6,000 people coming through the doors each day.

The approach reduced the lines waiting to get in. It also cut the number of people the hospital needed to post at each door from four to one.

Digital Window Protects Caregivers

About the same time, Northwestern asked Whiteboard for “a digital window” into COVID-19 rooms. The hospital wanted to limit nurses’ exposure to the virus and reduce the need for the protective gear that’s now in high demand.

Whiteboard’s HIPAA-compliant thermal camera system can measure temperature to within +/- 0.3 °C on up to 36 people per video frame at a distance of nine meters.
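That ±0.3 °C tolerance matters for screening: a reading close to the fever cutoff could be a false positive or a false negative. Below is a minimal Python sketch of tolerance-aware flagging — the 38.0 °C fever threshold is an assumption for illustration (the article doesn’t state the cutoff Whiteboard uses), and the code is not Whiteboard’s actual implementation:

```python
FEVER_CUTOFF_C = 38.0   # assumed threshold; not stated in the article
TOLERANCE_C = 0.3       # camera accuracy from Whiteboard's spec

def classify(reading_c: float) -> str:
    """Classify one thermal-camera reading, accounting for sensor tolerance."""
    if reading_c - TOLERANCE_C >= FEVER_CUTOFF_C:
        return "fever"    # even the lowest plausible true value is feverish
    if reading_c + TOLERANCE_C < FEVER_CUTOFF_C:
        return "clear"    # even the highest plausible true value is below cutoff
    return "recheck"      # within tolerance of the cutoff: confirm with a probe

# One video frame can hold readings for up to 36 people.
frame = [36.8, 37.9, 38.6]
print([classify(t) for t in frame])  # → ['clear', 'recheck', 'fever']
```

Readings inside the ±0.3 °C band get a “recheck” rather than a hard pass or fail, which is one way a screening station could route borderline cases to a conventional probe.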

So, the startup deployed another 400 cameras sporting night vision and microphones across the 10 hospitals. They use Whiteboard’s network of NVIDIA GPUs to transcode the video streams so they can be viewed securely on any hospital display.

“Nurses tell us the remote viewing is phenomenal. They report going into rooms less and consumption of protective gear is down. Our next challenge is using our computer-vision capabilities to track inventory of protective gear in real time,” he said.

The thermal cameras and patient monitors link to 36 NVIDIA RTX 2080 Ti GPUs. They handle transcoding and other algorithms to deliver low-latency feeds at 20 frames/second.
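The article doesn’t say how Whiteboard maps its camera streams onto those 36 GPUs. A simple round-robin assignment, sketched below in Python, is one plausible way to keep transcoding load balanced (the function and numbers are illustrative, not Whiteboard’s actual code):

```python
def assign_streams(num_streams: int, num_gpus: int) -> dict[int, list[int]]:
    """Round-robin camera streams across GPUs so transcode load stays balanced."""
    assignment: dict[int, list[int]] = {gpu: [] for gpu in range(num_gpus)}
    for stream in range(num_streams):
        assignment[stream % num_gpus].append(stream)
    return assignment

# Roughly 800 cameras (two deployments of 400) across 36 GPUs:
plan = assign_streams(800, 36)
# No GPU carries more than one stream above any other.
print(min(len(v) for v in plan.values()), max(len(v) for v in plan.values()))  # → 22 23
```

A production system would likely weight the assignment by resolution and frame rate rather than stream count alone, but round-robin illustrates the basic fan-out.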

The current COVID-19 uses don’t require AI, but deep learning is a core part of Whiteboard’s system. “Eighty percent of what we do is computer vision, but we can integrate different sensors for different problems,” Gostine said.

A Sensory-Friendly Guardian for Hospitals

Whiteboard’s system also supports Bluetooth and RFID sensors for a range of patient monitoring, inventory tracking, resource scheduling and security apps. One hospital increased the use of its operating rooms 27 percent while reducing its costs, thanks to the startup’s OR scheduling system. It currently runs on an NVIDIA Jetson TX2 and is being upgraded to Quadro 4000 GPUs.

For use cases such as fever and mask detection, Whiteboard also plans to adopt NVIDIA Clara Guardian, an application framework that simplifies deploying smart sensors with multimodal AI in hospitals. Whiteboard is among 18 companies currently supporting Clara Guardian, software that runs on the NVIDIA EGX platform for AI computing on edge servers and embedded devices.

The pandemic spawned orders from 100 hospitals for Whiteboard’s thermal cameras. The startup currently has systems installed in 22 hospitals.

“Our biggest problem is sourcing cameras and other hardware we need because supply chains are in disarray,” Gostine said.

Seeking Better Surgery Outcomes with AI

Once the pandemic passes, the startup aims to employ AI to improve outcomes of surgical techniques used in the operating room. Long term, Whiteboard’s value will come from its expanding AI algorithms and datasets, trained on NVIDIA V100 Tensor Core GPUs in Microsoft’s Azure service, he said.

It’s a big opportunity. Accenture predicts that by 2026 the top 10 AI healthcare apps will generate a $150 billion market, spanning areas such as robotic surgery, virtual nursing assistants and automated workflows.

The startup’s mission was born of Gostine’s personal passion for making hospitals more modern and efficient.

“When I got to med school, I was frustrated because it seemed we lagged behind the internet era I grew up in,” he said.

From Faxes to the Future

After graduating, he got an MBA and spent some time consulting with healthcare startups before his internship. Work with more than a dozen companies led to a position with a VC firm during his medical residency.

“It was like night and day. The venture world was thinking 10 years ahead, and I realized healthcare was really behind — we’re still using pagers and fax machines,” he said.

“There’s so much paperwork to get through every day, just so we can think about our patients. What we are doing at Whiteboard really stems from the frustrations I felt in my practice,” he added.

A series of chance encounters led him to three AI, software and medical experts who formed Whiteboard, a member of NVIDIA’s Inception program, which gives startups access to new technologies and other resources.

Whiteboard’s first product aimed to streamline OR scheduling, then it expanded into patient monitoring. Now the coronavirus has taken its networks all the way to the hospital’s front door.

The post COVID Caught on Camera: Startup’s Sensors Keep Hospitals Safe appeared first on The Official NVIDIA Blog.

Create at the Speed of Imagination with New Thin and Light Devices from Dell, HP and Microsoft

Creative workflows are getting more demanding. Project timelines continue to shrink. And with many people working remotely, a fast and reliable computer is more important than ever.

New NVIDIA-powered laptops and mobile workstations from Dell, HP and Microsoft give creators amazing choices to turn their imagination into actual creations.

These new systems launch just shy of the one-year anniversary of our introduction of NVIDIA Studio, a platform featuring dedicated drivers, performance-enhancing software development kits, and thin and light RTX Studio laptops purpose-built for creators.

Since then, we’ve worked with every major system manufacturer to expand the RTX Studio lineup and provide a wide range of choices that feature NVIDIA Quadro and GeForce RTX GPUs. In total, there are now 78 RTX Studio systems.

Dell-ightful RTX Studio Laptops and Mobile Workstations

Designed to be Dell’s most powerful XPS laptop ever, the XPS 17 muscles through intensive creative projects and gaming alike, with up to NVIDIA GeForce RTX 2060 graphics. Thanks to a thin bezel design, it’s the smallest 17-inch laptop, with similar dimensions to a typical 15-inch one.

Dell XPS 17 laptop
Dell XPS 17

Packing this much performance into an RTX Studio laptop of this size requires a little engineering ingenuity to keep the system performing smoothly. Under the hood is a new proprietary thermal design that provides more overall airflow and higher sustained performance to fuel the most demanding projects.

Today, Dell announced it has reengineered its Precision workstation portfolio. These RTX Studio mobile workstations are designed to handle demanding workloads like 8K editing, 3D rendering, data analysis and CAD modeling. The Precision 5750 and 7000 series mobile workstations are ISV certified and feature Quadro RTX graphics.

Dell Precision 5750 mobile workstation
Dell Precision 5750

The Dell Precision 5750 allows creators and engineers to see and do more with up to Quadro RTX 3000 graphics and a 16:10, four-sided InfinityEdge (up to HDR 400) display. For editors, engineers and scientists running intensive workloads, the Dell Precision 7550 and Dell Precision 7750 are available with up to Quadro RTX 5000 GPUs. They’ve been reengineered with more power and intelligent performance in an even smaller, lighter footprint.

A Laptop to ENVY

HP’s new Create Ecosystem is empowering creators of all types. That starts with the HP ENVY 15 that’ll be available later this month on HP.com.

HP ENVY 15 laptop
HP ENVY 15

The ENVY 15 is an RTX Studio laptop that can be configured with up to a GeForce RTX 2060 GPU for the ultimate in creator performance. It also features an all-aluminum chassis with an 82.8 percent screen-to-body ratio, up to a 4K OLED VESA DisplayHDR 400 True Black touch display, 10th Gen Intel processors, and gaming-class thermals.

Creative pros will want to keep an eye out for HP’s ZBook Studio and ZBook Create. Shipping later this year, they provide true mobility without compromise, thanks to Quadro and GeForce RTX GPUs, respectively. These systems also feature an 87 percent screen-to-body ratio and bring creators the first DreamColor display with all-day battery life.

Inside the Surface

Microsoft recently announced its most powerful laptop ever, the Surface Book 3, powered by Quadro RTX and GeForce GTX graphics. The Surface Book 3 combines the power of a desktop, the versatility of a tablet and the freedom of a thin and light laptop in one beautifully designed device.

Microsoft Surface Book 3
Microsoft Surface Book 3

In addition to NVIDIA graphics, the Surface Book 3 can be configured with 10th Generation Intel Core processors, up to 32GB of RAM, the fastest SSD Microsoft has ever shipped, a beautifully crisp, high-DPI PixelSense display, and up to 17.5 hours of battery life.

Surface Book 3 starts at $1,599 and will be available starting May 21.

GPU-Accelerated Exports in Adobe Premiere Pro

These new laptops will take advantage of over 200 NVIDIA GPU-accelerated creative and design applications, including one major addition released just yesterday.

Adobe Premiere Pro is helping content creators go from concept to completion faster with new GPU-accelerated exports. With NVIDIA encoder acceleration in Adobe Premiere Pro, editors can export high-resolution videos up to five times faster than with the CPU alone.
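As a back-of-the-envelope illustration of what “up to five times quicker” means in practice — the 5x figure is the headline number, and real speedups vary by codec, resolution and project — a quick Python estimate:

```python
def gpu_export_minutes(cpu_minutes: float, speedup: float = 5.0) -> float:
    """Estimate NVENC-accelerated export time from a known CPU-only export time."""
    return cpu_minutes / speedup

# A 25-minute CPU-only export would take about 5 minutes at a 5x speedup.
print(gpu_export_minutes(25.0))  # → 5.0
```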

Adobe Premiere Pro
Adobe Premiere Pro, now with NVENC support

Creators can still take advantage of a limited-time offer: purchase a new RTX Studio laptop or desktop, and new and existing Adobe users alike get a free three-month subscription to Adobe Creative Cloud.

These new laptops, along with 10 recently announced RTX Studio laptops powered by new GeForce RTX SUPER GPUs, are powering the creative intersection between imagination and innovation.

Learn more about NVIDIA Studio and RTX Studio systems. And stay tuned for exciting announcements as the platforms continue to grow.

The post Create at the Speed of Imagination with New Thin and Light Devices from Dell, HP and Microsoft appeared first on The Official NVIDIA Blog.