Pulling the Plug on GPP, Leaning into GeForce

A lot has been said recently about our GeForce Partner Program. The rumors, conjecture and mistruths go far beyond its intent. Rather than battling misinformation, we have decided to cancel the program.

GPP had a simple goal – ensuring that gamers know what they are buying and can make a clear choice.

NVIDIA creates cutting-edge technologies for gamers. We have dedicated our lives to it. We do our work at a crazy intense level – investing billions to invent the future and ensure that amazing NVIDIA tech keeps coming. We do this work because we know gamers love it and appreciate it. Gamers want the best GPU tech. GPP was about making sure gamers who want NVIDIA tech get NVIDIA tech.

With GPP, we asked our partners to brand their products in a way that would be crystal clear. The choice of GPU greatly defines a gaming platform. So, the GPU brand should be clearly transparent – no substitute GPUs hidden behind a pile of techno-jargon.

Most partners agreed. They own their brands and GPP didn’t change that. They decide how they want to convey their product promise to gamers. Still, today we are pulling the plug on GPP to avoid any distraction from the super exciting work we’re doing to bring amazing advances to PC gaming.

This is a great time to be a GeForce partner and be part of the fastest growing gaming platform in the world. The GeForce gaming platform is rich with the most advanced technology. And with GeForce Experience, it is “the way it’s meant to be played.”

The post Pulling the Plug on GPP, Leaning into GeForce appeared first on The Official NVIDIA Blog.

Epic Games ‘Reflections’ GDC Demo Offers Peek at Gaming’s Cinematic Future

None of us have been to a galaxy far, far away. But all of us can relate to the discomfort experienced when you say something awkward on the job — even if none of us work for someone who can send us to the icy planet Hoth.

Epic Games on Wednesday, in collaboration with ILMxLAB and NVIDIA, offered a sneak peek at gaming’s cinematic future with a stunning, witty demo featuring a pair of bungling stormtroopers who wind up out in the cold, literally.

The brief demo, titled “Reflections,” was shown during an Epic event held across the street from the Game Developers Conference in San Francisco, stunning gamers, press and industry insiders.

Running on our just announced NVIDIA RTX ray-tracing technology and NVIDIA GPUs, the demo offered a glimpse at gaming graphics that, to the untrained eye, are indistinguishable from movies. NVIDIA RTX brings real-time, cinematic-quality rendering to content creators and game developers.

“Now is a really exciting time for gamers and game developers,” Epic Games founder and CEO Tim Sweeney said in advance of the demo.

The demo showcased the graphics technique called ray tracing. Long used to generate cinematic special effects for Hollywood blockbusters, the technique relies on tracing the path of rays of light as they bounce off surfaces inside a scene. (See “What’s the Difference Between Ray Tracing and Rasterization?”)

The result is incredibly realistic graphics rendering that has, until now, been too computationally intensive for real-time use. That’s changing, though.

“There’s no other way to achieve the realism we need,” Mohen Leo, director of content and platform strategy at ILMxLAB, said of ray tracing just before launching the “Reflections” demo.

Epic’s team along with ILMxLAB and NVIDIA began work on the demo in early December using our NVIDIA DGX Station, equipped with four Tesla V100 GPUs, Epic’s Unreal Engine and our NVIDIA RTX ray-tracing technology.

NVIDIA has also announced that the GameWorks SDK will add a ray-tracing denoiser module, helping game developers take advantage of new capabilities. This updated SDK, which is coming soon, includes support for ray-traced area light shadows, glossy reflections and ambient occlusion.

The GDC demonstration — which showcases techniques such as ray-traced reflections, ray-traced area light shadows and ray-traced ambient occlusion for characters — is just one piece of the technology story around ray tracing that will be unfolding this week at the conference.

For more from GDC, follow our coverage on Facebook, Twitter and GeForce.com.

The post Epic Games ‘Reflections’ GDC Demo Offers Peek at Gaming’s Cinematic Future appeared first on The Official NVIDIA Blog.

What’s the Difference Between Ray Tracing and Rasterization?

You’ve seen this movie before. Literally.

There may not be many people outside of computer graphics who know what ray tracing is, but there aren’t many people on the planet who haven’t seen it.

Just go to your nearest multiplex, plunk down a twenty and pick up some popcorn.

Ray tracing is the technique modern movies rely on to generate or enhance special effects. Think realistic reflections, refractions and shadows. Getting these right makes starfighters in sci-fi epics scream. It makes fast cars look furious. It makes the fire, smoke and explosions of war films look real.

Ray tracing produces images that can be indistinguishable from those captured by a camera. Live-action movies blend computer-generated effects and images captured in the real world seamlessly, while animated feature films cloak digitally generated scenes in light and shadow as expressive as anything shot by a cameraman.

The easiest way to think of ray tracing is to look around you, right now. The objects you’re seeing are illuminated by beams of light. Now turn that around and follow the path of those beams backwards from your eye to the objects that light interacts with. That’s ray tracing.

Literally cinematic: if you’ve been to the movies lately, you’ve seen ray tracing in action.

Historically, though, computer hardware hasn’t been fast enough to use these techniques in real time, such as in video games. Moviemakers can take as long as they like to render a single frame, so they do it offline in render farms. Video games have only a fraction of a second. As a result, most real-time graphics rely on another technique, rasterization.

What Is Rasterization?

Real-time computer graphics have long used a technique called “rasterization” to display three-dimensional objects on a two-dimensional screen. It’s fast. And the results have gotten very good, even if they’re still not always as good as what ray tracing can do.

With rasterization, objects on the screen are created from a mesh of virtual triangles, or polygons, that create 3D models of objects. In this virtual mesh, the corners of each triangle — known as vertices — intersect with the vertices of other triangles of different sizes and shapes. A lot of information is associated with each vertex, including its position in space, as well as information about color, texture and its “normal,” which is used to determine the way the surface of an object is facing.

Computers then convert the triangles of the 3D models into pixels, or dots, on a 2D screen. Each pixel can be assigned an initial color value from the data stored in the triangle vertices.

Further pixel processing, or “shading,” such as changing a pixel’s color based on how lights in the scene hit it and applying one or more textures, generates the final color of each pixel.

This is computationally intensive. There can be millions of polygons used for all the object models in a scene, and roughly 8 million pixels in a 4K display. And the display is typically refreshed 30 to 90 times each second.

Additionally, memory buffers, a bit of temporary space set aside to speed things along, are used to render upcoming frames in advance before they’re displayed on screen. A depth or “z-buffer” is also used to store pixel depth information to ensure front-most objects at a pixel’s x-y screen location are displayed on-screen, and objects behind the front-most object remain hidden.
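The pipeline described above can be condensed into a toy software rasterizer. The following is a hypothetical minimal sketch in Python, not any real graphics API: edge functions decide which pixels a triangle covers, barycentric weights interpolate a per-vertex value (here a single grayscale intensity standing in for color), and a z-buffer keeps the front-most surface visible. The 8×8 “display” and all names are illustrative.

```python
WIDTH, HEIGHT = 8, 8

def edge(ax, ay, bx, by, px, py):
    """Signed area of triangle (a, b, p): positive if p is left of edge a->b."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(framebuffer, zbuffer, v0, v1, v2):
    """Each vertex is (x, y, z, intensity). Fills covered pixels; nearest z wins."""
    (x0, y0, z0, c0), (x1, y1, z1, c1), (x2, y2, z2, c2) = v0, v1, v2
    area = edge(x0, y0, x1, y1, x2, y2)
    if area == 0:
        return  # degenerate triangle covers no pixels
    for py in range(HEIGHT):
        for px in range(WIDTH):
            # Barycentric weights: how much each vertex influences this pixel
            w0 = edge(x1, y1, x2, y2, px + 0.5, py + 0.5) / area
            w1 = edge(x2, y2, x0, y0, px + 0.5, py + 0.5) / area
            w2 = edge(x0, y0, x1, y1, px + 0.5, py + 0.5) / area
            if w0 < 0 or w1 < 0 or w2 < 0:
                continue  # pixel center falls outside the triangle
            z = w0 * z0 + w1 * z1 + w2 * z2
            if z < zbuffer[py][px]:  # depth test: closer surface wins
                zbuffer[py][px] = z
                framebuffer[py][px] = w0 * c0 + w1 * c1 + w2 * c2

fb = [[0.0] * WIDTH for _ in range(HEIGHT)]
zb = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
rasterize_triangle(fb, zb, (1, 1, 0.5, 1.0), (7, 1, 0.5, 0.5), (1, 7, 0.5, 0.0))
```

A real rasterizer does this same per-pixel work for millions of triangles per frame, in parallel, in dedicated hardware.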

This is why modern, graphically rich computer games rely on powerful GPUs.

What Is Ray Tracing?

Ray tracing is different. In the real-world, the 3D objects we see are illuminated by light sources, and photons can bounce from one object to another before reaching the viewer’s eyes.

Light may be blocked by some objects, creating shadows. Or light may reflect from one object to another, such as when we see the images of one object reflected in the surface of another. And then there are refractions — when light changes as it passes through transparent or semi-transparent objects, like glass or water.

Ray tracing captures those effects by working back from our eye (or view camera) — a technique that was first described by IBM’s Arthur Appel, in 1969, in “Some Techniques for Shading Machine Renderings of Solids.” It traces the path of a light ray through each pixel on a 2D viewing surface out into a 3D model of the scene.
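Appel’s idea of sending one ray through each pixel can be sketched in a few lines. In this hypothetical Python example (the 90-degree field of view and tiny 4×4 image are arbitrary choices), each ray is a unit direction vector from an eye at the origin through the center of one pixel on the image plane.

```python
import math

def primary_ray(px, py, width, height, fov_degrees=90.0):
    """Unit direction from an eye at the origin through pixel (px, py)."""
    aspect = width / height
    scale = math.tan(math.radians(fov_degrees) / 2)
    # Map the pixel center into [-1, 1] camera space; y is flipped so up is +y
    x = (2 * (px + 0.5) / width - 1) * aspect * scale
    y = (1 - 2 * (py + 0.5) / height) * scale
    length = math.sqrt(x * x + y * y + 1)
    return (x / length, y / length, 1 / length)  # camera looks down +z

# One primary ray per pixel of a tiny 4x4 image
rays = [primary_ray(px, py, 4, 4) for py in range(4) for px in range(4)]
```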

The next major breakthrough came a decade later. In a 1979 paper, “An Improved Illumination Model for Shaded Display,” Turner Whitted, now with NVIDIA Research, showed how to capture reflection, shadows and refraction.

Turner Whitted’s 1979 paper jump-started a ray tracing renaissance that has remade movies.

With Whitted’s technique, when a ray encounters an object in the scene, the color and lighting information at the point of impact on the object’s surface contributes to the pixel color and illumination level. If the ray bounces off or travels through the surfaces of different objects before reaching the light source, the color and lighting information from all those objects can contribute to the final pixel color.
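Whitted’s recursive step can be sketched as follows. In this hypothetical minimal Python example, a ray that hits a sphere takes most of its color from that surface and the rest from a recursively traced mirror-reflection ray. The scene, the 70/30 blend and the two-bounce recursion limit are all illustrative choices; a full renderer adds shadow and refraction rays as well.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Distance t to the nearest hit along a unit-length ray, or None on a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2  # nearest root only
    return t if t > 1e-6 else None  # small epsilon avoids self-intersection

def trace(origin, direction, spheres, depth=0):
    """Follow one ray; blend the surface color with one recursive reflection."""
    hit = None
    for center, radius, color in spheres:
        t = intersect_sphere(origin, direction, center, radius)
        if t is not None and (hit is None or t < hit[0]):
            hit = (t, center, radius, color)
    if hit is None:
        return (0.1, 0.1, 0.1)  # background color
    t, center, radius, color = hit
    if depth >= 2:
        return color  # stop recursing after two bounces
    point = tuple(origin[i] + t * direction[i] for i in range(3))
    normal = tuple((point[i] - center[i]) / radius for i in range(3))
    d_dot_n = sum(direction[i] * normal[i] for i in range(3))
    reflected = tuple(direction[i] - 2 * d_dot_n * normal[i] for i in range(3))
    bounce = trace(point, reflected, spheres, depth + 1)
    return tuple(0.7 * color[i] + 0.3 * bounce[i] for i in range(3))

# One red sphere straight ahead of an eye at the origin
scene = [((0.0, 0.0, 3.0), 1.0, (1.0, 0.0, 0.0))]
pixel = trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene)
```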

Another pair of papers in the 1980s laid the rest of the intellectual foundation for the computer graphics revolution that upended the way movies are made.

In 1984, Lucasfilm’s Robert Cook, Thomas Porter and Loren Carpenter detailed how ray tracing could incorporate a number of common filmmaking techniques — including motion blur, depth of field, penumbras, translucency and fuzzy reflections — that could, until then, only be created with cameras.

Two years later, Caltech professor Jim Kajiya’s paper, “The Rendering Equation,” finished the job of mapping the way computer graphics were generated to physics to better represent the way light scatters throughout a scene.

Combine this research with modern GPUs, and the results are computer-generated images that capture shadows, reflections and refractions in ways that can be indistinguishable from photographs or video of the real world. That realism is why ray tracing has gone on to conquer modern moviemaking.

Light, shadow, reflection: This computer-generated image, created by Enrico Cerica using OctaneRender, shows ray-traced glass distortion in the light fixture, diffuse lighting in the window, frosted glass in the lantern on the floor, and reflections on the framed picture.

It’s also very computationally intensive. That’s why movie makers rely on vast numbers of servers, or rendering farms. And it can take days, even weeks, to render complex special effects.

To be sure, many factors contribute to the overall graphics quality and performance of ray tracing. In fact, because ray tracing is so computationally intensive, it’s often used for rendering those areas or objects in a scene that benefit the most in visual quality and realism from the technique, while the rest of the scene is rendered using rasterization. Rasterization can still deliver excellent graphics quality.

What’s Next for Ray Tracing?

As GPUs continue to grow more powerful, putting ray tracing to work for ever more people is the next logical step. For example, armed with ray-tracing tools such as Arnold from Autodesk, V-Ray from Chaos Group or Pixar’s Renderman — and powerful GPUs — product designers and architects use ray tracing to generate photorealistic mockups of their products in seconds, letting them collaborate better and skip expensive prototyping.

Ray tracing has proven itself to architects and lighting designers, who are using its capabilities to model how light interacts with their designs.

As GPUs offer ever more computing power, video games are the next frontier for this technology. On Monday, NVIDIA announced NVIDIA RTX, a ray-tracing technology that brings real-time, movie-quality rendering to game developers. It’s the result of a decade of work in computer graphics algorithms and GPU architectures.

It consists of a ray-tracing engine running on NVIDIA Volta architecture GPUs. It’s designed to support ray tracing through a variety of interfaces. NVIDIA partnered with Microsoft to enable full RTX support via Microsoft’s new DirectX Raytracing (DXR) API.

And to help game developers take advantage of these capabilities, NVIDIA also announced the GameWorks SDK will add a ray tracing denoiser module. The updated GameWorks SDK, coming soon, includes ray-traced area shadows and ray-traced glossy reflections.

All of this will give game developers, and others, the ability to bring ray-tracing techniques to their work to create more realistic reflections, shadows and refractions. As a result, the games you enjoy at home will get more of the cinematic qualities of a Hollywood blockbuster.

The downside: You’ll have to make your own popcorn.

Check out “Physically Based Rendering: From Theory to Implementation,” by Matt Pharr, Wenzel Jakob and Greg Humphreys. It offers both mathematical theories and practical techniques for putting modern photorealistic rendering to work.

Want to know what this means for gamers? See “NVIDIA RTX Technology: Making Real-Time Ray Tracing A Reality for Games,” on GeForce.com. 

The post What’s the Difference Between Ray Tracing and Rasterization? appeared first on The Official NVIDIA Blog.

Half a Billion Videos Can’t Be Wrong: NVIDIA Highlights Comes to Five New Games

“Screenshots or it didn’t happen.” If you’re a gamer, you’ve heard this challenge before. With NVIDIA Highlights (aka ShadowPlay Highlights), we’re working to ensure you’ll never have to worry about answering this challenge again.

We’re taking Highlights — a GeForce Experience feature that automatically captures your greatest gaming achievements in video or screenshot — to the next level at the Games Developers Conference this week in San Francisco.

Call of Duty WWII and Tekken 7 have become the latest games to add support for Highlights. In addition, Dying Light: Bad Blood and Escape from Tarkov will also soon be adding Highlights support.

Fortnite Battle Royale and PLAYERUNKNOWN’S BATTLEGROUNDS are the two hottest games on the planet. Something else they share: support for NVIDIA Highlights. Gamers have already recorded over half a billion videos using the feature.

NVIDIA Publicly Releases Highlights SDK at GDC

The roster of games with the feature is set to explode with today’s public release of the NVIDIA Highlights Software Development Kit (SDK) 1.0. Available for download now, the Highlights SDK 1.0 is a set of tools that enables developers to easily add support for Highlights to their games.

To further ease integration, we’ve also released Highlights plugins for two of the top game engines: Unreal Engine and Unity.

Highlights integrates directly with games to know precisely when that magic moment will happen, whether a boss fight or killing spree. Then it automatically records it using NVIDIA ShadowPlay technology to maintain the highest level of performance. When your session is over, Highlights presents a highlight reel so you can relive your victories and share them with friends on social media.  

Highlight or It Didn’t Happen…

Developers see the potential for Highlights to extend their social media impact. It brings the best of what their games can do to Facebook, YouTube, Weibo or Imgur.

Highlights is about gamers sharing with gamers, so at the behest of our army of GeForce gamers, we’ve also introduced the ability to share Highlights in GIF format. With the click of a button, gamers can save a Highlight video as a GIF and share them on Facebook, Google or Weibo.

Gamers can access Highlights through GeForce Experience, available on GeForce GTX GPUs. Because if you don’t have it captured by NVIDIA Highlights, then maybe it didn’t happen.

The post Half a Billion Videos Can’t Be Wrong: NVIDIA Highlights Comes to Five New Games appeared first on The Official NVIDIA Blog.

Intel Extreme Masters Katowice Starts March 2 with a Total Prize Pool of $950,000

Intel Extreme Masters returns to Katowice, Poland, on March 2-4 to conclude one of the biggest seasons yet in the world’s longest-running global esports tournament. (Credit: ESL | Helena Kristiansson)

Intel® Extreme Masters (IEM) returns to Katowice, Poland, on March 2-4 to conclude one of the biggest seasons yet in the world’s longest-running global esports tournament. At last year’s IEM Katowice, 173,000 fans packed Spodek Arena, while 46 million tuned in online – making it the most-watched broadcast in ESL’s history.

This year, an even larger audience will witness the world champions crowned for “Counter-Strike: Global Offensive” (“CS:GO”) and “StarCraft II.” In addition, the IEM Expo will feature a women’s “CS:GO” invitational, as well as the VR Challenger League Grand Finals, which concludes a groundbreaking season for the first virtual reality esports league.

To catch all the action, tune in starting Friday, March 2, on ESL TV.

Event details:

Prizes: The total prize pool for the three-day esports event is $950,000:

  • $500,000 total prize pool for “CS:GO”
  • $400,000 total prize pool for “StarCraft II”

Intel Grand Slam: The top 16 “CS:GO” teams will take the seventh step in the Intel Grand Slam series, which will award a $1 million bonus prize to the first “CS:GO” team that wins four of the last 10 “CS:GO” ESL and DreamHack events.

IEM Expo: The fourth edition of the IEM Expo takes place at the International Conference Centre, adjacent to the Spodek Arena, with a massive interactive floorshow highlighting the latest in gaming and esports technology, as well as exciting tournaments:

“Intel Extreme Masters Katowice and IEM Expo will be the center of the global esports world with over 100,000 fans cheering on their favorite teams alongside the millions watching at home,” said John Bonini, vice president and general manager of the VR, Gaming and Esports Group at Intel Corporation. “Following a groundbreaking tournament at IEM PyeongChang, Intel is proud to continue fostering esports on a global stage alongside our partners.”

About Intel® Extreme Masters

Intel® Extreme Masters is the longest-running global pro gaming tour in the world. Started in 2006 by ESL, the competition features the world’s best gamers in multiple esports titles. With IEM having over a decade of history, it is considered one of the most prestigious and traditional events in the world. Official website: www.intelextrememasters.com

The post Intel Extreme Masters Katowice Starts March 2 with a Total Prize Pool of $950,000 appeared first on Intel Newsroom.

Stream Smarter: Google Assistant Now on SHIELD TV

The most advanced streamer just got smarter. SHIELD TV now offers the Google Assistant.

SHIELD Experience Upgrade 6.0, available for download now, brings the power of the Google Assistant to your living room. And like SHIELD, your Google Assistant will continue to improve.

Ask it questions. Tell it to do things. Your Google Assistant is ready to help whenever you need it. Using natural language, many of the life-simplifying capabilities of the Google Assistant are now on your TV.

And coming soon, when paired with a SmartThings Link, SHIELD will work as a SmartThings Hub.

Meet Your Google Assistant

The Google Assistant is enhanced for the TV experience. It can search apps on SHIELD and control your media with voice commands like “pause” or “play Game of Thrones.” Plus, by taking advantage of the TV screen, you get an enhanced visual response to your questions.

This show-don’t-tell experience in the living room means you can ask “who plays Daenerys Targaryen” and quickly discover that Emilia Clarke was also Sarah Connor in Terminator Genisys.

Visual context also means that when you ask how your favorite team is doing, you’ll get the latest score, plus divisional standings, onscreen. Other visual enhancements include: seeing items on a menu, and being shown different purchase options when shopping.

The Google Assistant on SHIELD makes your living room a command center. With your Google Assistant, you can browse movies and TV shows, but you can also ask it questions, or ask to see your Google Photos: sharing photos from your trip overseas or the latest hiking adventure is simple and fun to relive on the big screen. You can also ask your Google Assistant what’s on your agenda for the day, and rather than a long-winded audio response, integration with Calendar means you’ll see what’s on your schedule at a glance.

The Google Assistant on SHIELD works with supported Assistant devices. This comes in handy in a number of ways, like when you’re watching a movie and want to dim the lights.

Look, Ma! No Hands!

There are two ways to trigger your Google Assistant – by pressing the microphone button on the remote (the SHIELD button on the controller), or with a hands-free command: you can just say, “Ok Google,” from within shouting distance of the SHIELD controller.

SHIELD controller’s built-in microphone and low-power ambient listening capabilities activate your Google Assistant with just your voice. With SHIELD’s hands-free capability, you can leave your remote on the table and use speech to enjoy TV.

So whether you need to rewind the movie you’re watching on HBO Now while it’s too cold to come out from the blankets — or you see “Winner, Winner, Chicken Dinner” at the end of a session of PlayerUnknown’s Battlegrounds, and realize you could go for some Domino’s — you’re always just an “Ok Google” away.

Your Home, Made Smarter with SmartThings

SHIELD becomes the hub for your total AI home control by incorporating SmartThings’ hub core technology. When paired with a SmartThings Link, SHIELD works as a SmartThings Hub, integrating with hundreds of Works With SmartThings certified partner products on the market. Using SHIELD, you’ll be able to perform tasks like turning on your lights, adjusting the thermostat and much more.

It also means Routines can be set up in your smart home, letting you instantly trigger customizable actions at different times you choose, like when you’re asleep, awake, on vacation, at the office, out shopping or back home. SmartThings will remember all the actions you’ve personalized and perform them exactly when you say so, automatically, with the SmartThings app or with a single voice command.

Three Months Free of Ad-Free Music on YouTube

To celebrate launching the Google Assistant on SHIELD, new and existing SHIELD owners will receive three months of YouTube Red for free. YouTube Red is everything you already love about YouTube, with no interruptions getting in the way. No ads, no breaks, no worries, no matter where you are. It’s YouTube uninterrupted. Visit https://www.youtube.com/red for more details.

SmartThings Link, the key to unlocking SHIELD as a SmartThings Hub, is coming soon. Priced at $39.99 regularly, SmartThings Link will launch at a promotional price of $14.99.

For more information, or to order the world’s most advanced streamer, visit shield.nvidia.com.

 

The post Stream Smarter: Google Assistant Now on SHIELD TV appeared first on The Official NVIDIA Blog.

Winner Winner, Schnitzel Dinner: Gamers Get Peek at What’s Next at Gamescom

Some will quaff the crisp Kölsch. Others will sample the savory pork knuckle. But all of the 350,000 gamers crowding into Cologne this week for the world’s largest gamer gathering will feast on the latest that gaming has to offer.

With PC gaming thriving like never before, we couldn’t wait for Gamescom 2017 to begin — so we didn’t.

At a pre-show event this weekend, attendees heard from the developers of Destiny 2, PlayerUnknown’s Battlegrounds, Shadow of War and FINAL FANTASY XV WINDOWS EDITION — announced with a stunning 4K trailer revealing the game will launch early next year and be packed with NVIDIA GameWorks technology.

Gamers at the event got hands-on with all these titles, as well as Pro Evolution Soccer 2018, Forza Motorsport 7, Need For Speed Payback, Lawbreakers, Project Cars 2 and more.

PlayerUnknown’s Battlegrounds Gets NVIDIA ShadowPlay Highlights

PlayerUnknown’s Battlegrounds has caused a worldwide sensation since its release in March on Steam’s early access platform. It’s made history as one of only three titles to break 600,000 concurrent players on Steam.

With every update now eagerly anticipated, Gamescom was the perfect event at which to announce that NVIDIA ShadowPlay Highlights is now in the multiplayer online battle royale automatically capturing video and screenshots of your greatest achievements.

FINAL FANTASY XV

You only need to hear a snippet of Final Fantasy’s signature title screen music to know you’re in for another epic journey. And with over 135 million units sold worldwide, a lot of fans share this feeling. Now PC gamers can join the adventure with FINAL FANTASY XV.

With full support for GeForce Experience, and the combination of Square Enix’s cutting-edge Luminous Engine and NVIDIA GameWorks, this is the definitive version of the game. With native support for 4K and 8K resolutions, as well as HDR10, the world of Eos will look stunning. And with NVIDIA Ansel support, sharing your adventures is easy.

New Games, New Feature for Ansel

You’ll also be able to flex your creative muscles with support for Ansel, our in-game photo capture tool, in Monolith’s Middle-Earth: Shadow of War and Konami’s Pro Evolution Soccer 2018. Whether you’re capturing a glimpse of Shelob or framing your 40-yard screamer at the Nou Camp for posterity, we can’t wait to see your screenshots.

And Ansel will keep on getting better with the introduction of AI Style Transfer. Once you’ve framed the perfect shot, just use this new feature to pick a painting from your favorite artist. GeForce AI then transforms your snapshot into a masterpiece in the unique style of the artist you’ve selected.

Get Ready for Destiny 2 with GeForce

Let’s not forget one of the most eagerly awaited titles of the year: Destiny 2. Bungie’s seminal shooter kicks off its PC beta next week (starting August 28 for those with Early Access). We’ve also announced that HDR and SLI support is coming, together with the latest Destiny 2 bundle. Buyers of select GeForce GTX 1080 Ti and 1080 GPUs, systems and laptops will get the game at PC launch.

We’ll also be giving some lucky GeForce Experience users early access to the beta. So, make sure you’re registered and keep your fingers crossed. And everyone trying the beta should make sure they download the Destiny 2 Beta Game Ready Driver on August 24.

Follow Us for the Latest from Gamescom

This is just a snapshot of what’s already come out of Gamescom 2017. Stay connected to NVIDIA through Facebook, Twitter and Instagram to see more behind-the-scenes stories throughout the week. You can also find in-depth articles on the above games, technologies and more at www.geforce.com.

The post Winner Winner, Schnitzel Dinner: Gamers Get Peek at What’s Next at Gamescom appeared first on The Official NVIDIA Blog.

World’s Largest ‘CS:GO’ League Selects NVIDIA GeForce

The stakes are high. The crowds — online and off — are enormous. So the gear better be great.

That’s why the world’s top Counter-Strike: Global Offensive teams will battle it out in the sixth season of the ESL Pro League for a $1 million prize pool on rigs equipped with our GeForce GTX GPUs.

ESL Pro League trophy
The world’s top CS:GO teams will battle on GeForce GTX rigs in the sixth season of the ESL Pro League for a $1 million prize pool and a cool trophy.

ESL’s events fill stadiums, attract elite players with huge followings and draw online audiences measured in the millions. This is the third year we’re working with ESL, the world’s largest esports company. They are, by any measure, one of the world’s foremost sporting organizations.

We couldn’t think of a better venue for our technology — both on and off the sport’s electronic playing field. ESL requires gaming performance for its pro players and gear that can reliably serve up content to vast audiences online.

“NVIDIA GeForce is synonymous with PC gaming, which is the number one platform for esports,” said Ulrich Schulze, senior vice president of products at ESL. “Pro players competing in our tournaments expect to play on NVIDIA GPUs, because GeForce GTX provides the performance and reliability they demand to excel at their games.”

GeForce GTX GPUs will also support some of the biggest events in the ESL One series, including ESL One New York and ESL One Hamburg in 2017. Our technology will be put to work powering top tier competition in both Dota 2 and CS:GO. In addition, GeForce GTX is the official graphics platform for The International DotA 2 Championships/Majors, as well as at Blizzcon.

ESL Pro League Season 5 was the most watched yet. With the inclusion of YouTube streaming, its online audience grew more than 17 percent compared to the previous season.

We think this year’s competition will be the biggest — and best — yet.

The post World’s Largest ‘CS:GO’ League Selects NVIDIA GeForce appeared first on The Official NVIDIA Blog.

The World’s Biggest Esports Tournament Relies on GeForce GTX and G-SYNC

Eighteen teams of world-class competitors. Rabid fans. One of the biggest prize pools in all of sports.

We love DotA 2. And everything we love about DotA 2 — the competitive pressure, the teamwork, the action, the community — will be on display at The International DotA 2 Championships on Aug. 7-12, in Seattle.

The prize pool alone, more than $23.1 million and growing, is the largest in all of esports and ensures this will be an event to remember.

That’s why we’re going all out to support an event — better known as simply “The International” — that, in less than a decade, has achieved fabled status.

Professional gamers demand the best performing and most reliable gaming solutions possible. That is why, for the third year running, GeForce GTX and NVIDIA G-SYNC are the official graphics platform of The International. Pro gamers will compete and warm up on 120 systems equipped with GeForce GTX 1080 GPUs and Acer G-SYNC Predator displays.

Meeting the Elite

Our involvement goes beyond powering The International. We partner with teams to help them prepare for the biggest tournament of the year.

As part of our work supporting pro teams, we brought in Team Secret — ranked No. 10 in the world and No. 4 in Europe — to our Silicon Valley headquarters’ esports bootcamp to prepare for this month’s tournament. Here, Team Secret trained in a facility stocked with the most advanced esports gear, a mirror image of the onstage TI systems.

“Every year we train on GeForce GTX and G-SYNC,” said Matthew Bailey, manager of Team Secret, “because it is the de facto platform of choice for The International and all the majors in between.”

Tune In

The finals will take place at Seattle’s Key Arena. If you love DotA 2 like we do, don’t miss it. And click here to learn more about the official graphics platform of the world’s best DotA teams and The International, and to participate in our I <3 DOTA 2 giveaway.

 

The post The World’s Biggest Esports Tournament Relies on GeForce GTX and G-SYNC appeared first on The Official NVIDIA Blog.