Nvidia and Microsoft's new real-time lighting tech feels like the future of high-end gaming visuals
There's a lot of crazy and impressive technology on display at the Game Developers Conference - that's just the nature of the show. Several times as I explored the show floor I had to stop in my tracks to take a closer look at something or other: cool game ideas, control technologies, infrastructure that can handle 1,000-player multiplayer. But the biggest wow of the show came from the graphics gurus at Nvidia.
Nvidia's big push at the show this year is ray tracing, a technology the company believes will provide the next big graphical leap. As bumps in resolution and polygon counts offer diminishing returns, attention is turning to lighting as the next big driver of visual fidelity.
When it's used right, ray tracing can look absolutely insane. GDC had a number of demos based around the tech, but the stand-out was a Star Wars tech demo from Epic featuring Stormtroopers and Captain Phasma. Before you read any further, have a watch:
The demo is a proper holy s**t moment, right? Epic used actual assets from Lucasfilm and CGI house ILM here, but that shouldn't detract from just how insane this looks. A graphics card did this - and in real time.
While a few small details reveal it as computer-generated footage, it's astonishingly photo-realistic and film-like. In a private demo, a grinning Nvidia representative brings up the console and tweaks the settings mid-demo to prove it's being rendered live. It's still a little difficult to believe.
So, what is ray tracing? In the simplest terms, it's an advanced way of rendering lighting and shadows, and it's not entirely new: it's been used in TV and film for a while now to ensure that added CG imagery blends properly with real-life shots. It has previously required enormous server farms and lengthy render times, but the new DirectX 12 ray tracing tech from Microsoft, built in partnership with Nvidia, aims to make real-time ray tracing genuinely achievable.
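For the developers in the audience, the plumbing lives in Direct3D 12 itself. As a rough, hypothetical sketch - assuming a Windows SDK recent enough to include the DXR additions, since the API is still bedding in and exact names may differ - checking whether a GPU and driver actually expose ray tracing support looks something like this:

```cpp
// Hedged sketch, not production code: ask a Direct3D 12 device whether it
// exposes the DirectX Raytracing (DXR) feature tier. Assumes a Windows 10 SDK
// new enough to ship the raytracing additions in d3d12.h.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at feature level 12.0.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No Direct3D 12 device available.");
        return 1;
    }

    // OPTIONS5 carries the RaytracingTier capability field added alongside DXR.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR ray tracing is supported by this GPU and driver.");
    } else {
        std::puts("No DXR support - fall back to rasterised lighting.");
    }
    return 0;
}
```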
The actual act of ray tracing is pretty much exactly what you'd expect - an algorithm tracks (or, hey, traces) the path of individual rays of light and simulates how each ray interacts with the environment as it bounces around. Light affects pretty much everything about the picture you see - colours, reflections, shadows and beyond - so this simulation can have a properly transformative effect compared with rasterization, the shader-based approximation games typically use for lighting today. A deliberately tiny sketch of that idea follows below.
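Here's the textbook version of the idea in miniature, written in C++: one primary ray per pixel, a single sphere over a flat floor, and a secondary shadow ray fired towards the light to decide whether each point is lit. Everything in it - the scene, the names, the numbers - is invented for illustration, and it bears no relation to Nvidia's or Microsoft's actual implementation, but the ingredients are the same.

```cpp
// Toy ray tracer: one ray per pixel, diffuse shading, and a shadow ray.
// Writes a plain-text PPM image to standard output.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { return a * (1.0 / std::sqrt(dot(a, a))); }

const Vec sphereCentre{0, 1, -4};            // a unit sphere above the floor
const double radius = 1.0;
const Vec toLight = norm({-0.6, 1.0, 0.4});  // direction towards a distant light

// Ray-sphere intersection: distance along the ray to the nearest hit, or -1.
double hitSphere(Vec o, Vec d) {
    Vec oc = o - sphereCentre;
    double b = 2.0 * dot(oc, d);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * c;
    if (disc < 0) return -1.0;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return t > 1e-4 ? t : -1.0;
}

// Ray-plane intersection with the floor at y = 0.
double hitFloor(Vec o, Vec d) {
    if (std::fabs(d.y) < 1e-6) return -1.0;
    double t = -o.y / d.y;
    return t > 1e-4 ? t : -1.0;
}

int main() {
    const int W = 256, H = 256;
    const Vec eye{0, 1, 0};                     // pinhole camera
    std::printf("P3\n%d %d\n255\n", W, H);
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // One primary ray per pixel, through a simple view plane.
            Vec dir = norm({(x - W / 2.0) / W, (H / 2.0 - y) / H, -1.0});
            double tS = hitSphere(eye, dir), tF = hitFloor(eye, dir);
            double brightness = 0.15;           // sky / ambient fallback
            bool hitS = tS > 0 && (tF < 0 || tS < tF);
            bool hitF = tF > 0 && !hitS;
            if (hitS || hitF) {
                double t = hitS ? tS : tF;
                Vec p = eye + dir * t;                           // hit point
                Vec n = hitS ? norm(p - sphereCentre) : Vec{0, 1, 0};
                double diffuse = std::fmax(0.0, dot(n, toLight));
                // Shadow ray: is the path towards the light blocked by the sphere?
                bool shadowed = hitSphere(p + n * 1e-3, toLight) > 0;
                brightness = 0.15 + (shadowed ? 0.0 : 0.8 * diffuse);
            }
            int v = static_cast<int>(brightness * 255);
            std::printf("%d %d %d\n", v, v, v);
        }
    }
    return 0;
}
```

Compile it with any modern C++ compiler, pipe the output into a .ppm file, and you'll see the sphere, its diffuse shading and the shadow it casts on the floor - the same three ingredients, at a comically smaller scale, that the Star Wars demo computes millions of times per frame.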
Nvidia's solution performs the ray tracing in a noisy, lossy manner by drawing far fewer individual rays - perhaps as few as one per pixel - and then applying carefully developed, computationally light de-noising tech over the top of the image. The end result is a picture that is, in theory, indistinguishable from one where every ray had been fully traced, as in those server-farm-driven film uses of the technology.
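To make the "few rays plus clean-up" principle concrete, here's a toy C++ sketch - entirely hypothetical and nothing like Nvidia's real denoiser, which is temporal and AI-assisted. Each pixel's true value is a smooth soft-shadow gradient; a single random shadow ray per pixel only gives a noisy zero-or-one answer, and a crude neighbourhood average already pulls the error right down.

```cpp
// Toy illustration of one-ray-per-pixel sampling followed by denoising.
#include <cstdio>
#include <cmath>
#include <random>
#include <vector>

int main() {
    const int W = 64;
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);

    std::vector<double> truth(W), noisy(W), denoised(W);
    for (int x = 0; x < W; ++x) {
        truth[x] = x / double(W - 1);                 // true visibility, 0..1
        noisy[x] = uni(rng) < truth[x] ? 1.0 : 0.0;   // one shadow ray: hit or miss
    }

    // "Denoise": a crude box filter over neighbouring pixels.
    const int R = 4;
    for (int x = 0; x < W; ++x) {
        double sum = 0; int n = 0;
        for (int k = -R; k <= R; ++k) {
            int i = x + k;
            if (i >= 0 && i < W) { sum += noisy[i]; ++n; }
        }
        denoised[x] = sum / n;
    }

    // Compare both estimates against the ground truth.
    auto rmse = [&](const std::vector<double>& img) {
        double e = 0;
        for (int x = 0; x < W; ++x) e += (img[x] - truth[x]) * (img[x] - truth[x]);
        return std::sqrt(e / W);
    };
    std::printf("RMSE with 1 ray per pixel: %.3f\n", rmse(noisy));
    std::printf("RMSE after box filter    : %.3f\n", rmse(denoised));
    return 0;
}
```

Run it and the filtered error comes out at a fraction of the raw one-ray error - which is the whole bet Nvidia is making, just with vastly smarter filters.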
Unlike HDR or a resolution bump, where your monitor or TV dictates how much benefit you get, the difference provided by ray tracing should be pretty obvious from the get-go on any HD screen.
In theory it should also be relatively easy for many large developers to embrace, since games recruitment has pulled in plenty of artists from Hollywood who have already worked with the tech. At the show there are demos from Microsoft, EA and even a smaller studio in Remedy Entertainment, the creators of Alan Wake and Quantum Break. While nothing is absolutely certain, Nvidia expects the first games with some form of DirectX ray tracing support to arrive this year.
For now it's all about experimentation, and Nvidia is leading the charge in terms of ensuring developers have the know-how to begin implementing ray tracing in their engines.
EA's Frostbite engine and the experimental engine from its SEED research division are both already compatible, for instance, as is Unreal. There's also an impressive Metro Exodus demo that makes use of the ray tracing technology.
Initially it may only be used on the development side; I'm also shown a demo in which a lighting render that previously could have taken upwards of forty-five minutes completes in around three with this new gear. The end goal, however, is for the technology to become a staple at the consumer level, and Nvidia reps were quick to point out that, given the speed at which graphics tech has been accelerating, that could arrive a whole lot sooner than many might realize.
While this is a DirectX 12 feature, there's no way it's going to work on the existing DX12 graphics cards on the market - or at least not while also doing the other complex rendering work that goes hand-in-hand with a game. The Star Wars demo ran on a single, hugely expensive workstation with multiple Titan V cards running in sync, yes, but it ran in real time, and as the tech marches on, the chances of it trickling down to consumers in full only grow. More importantly, lesser, watered-down versions of the tech might appear sooner - still offering an improvement, just on weaker hardware.
A future where this technology is ready for the rest of us feels tantalizingly close, in fact. The first wave will probably come in the form of DX12's built-in multi-GPU support shouldering the ray tracing workload, but Nvidia is also promising full support for the feature in its next-generation Volta GPUs.
That's probably the big headline, in fact: this feels like the future. In this job you get a lot of fancy new technology paraded in front of you, all of it sold as a game changer. But looking at this, real-time ray tracing really does feel like the next proper visual leap for video games. It's a technology I'll be watching eagerly - especially to see whether the next generation of consoles will be capable of managing it.