Epic bets on Unreal Engine 5 transforming next-gen game development - here's what developers think
We recently got our first glimpse of what video games could look like on PS5 thanks to a tech demo for Unreal Engine 5.
The footage shows an adventurer trekking through a sandy tomb, shale sliding down rocky outcrops and light shafts breaking through a crack in the cave’s surface. We see complex mechanisms and hundreds of detail-rich statues lined up in large halls. Then we’re flying through the sky as explosions rock the surroundings and terrain crumbles - our character a high-speed blur surrounded by scenery that somehow remains in sharp focus.
To many players, it’s a noticeable but uninspiring step forward - the same kind of games we can play today, albeit with prettier rocks. But for developers, Unreal Engine 5 is a game-changer.
“A number of features in Unreal Engine 5 have been designed to reduce the complexity of building games,” Epic Games VP of engineering Nick Penwarden explains. “Nanite, for example, allows artists to directly import their high-poly models and let the engine handle the technical details, adding more creativity to the art pipeline. Similarly, Lumen allows for realistic global illumination without having to author lightmap UVs or wait for a long, offline baking process to complete.”
It’s all in aid of creating a more streamlined process for developers, who can now do the same things as before but with fewer steps. This could lead to shorter development times, or it could free teams to make bigger, more complex games in the time it takes to make one today. The way lighting is handled in Lumen could even lead to the creation of something completely new, according to Epic.
“Fully dynamic global illumination with Lumen will change how current games are experienced, and even enable entirely new types of games to be made,” Penwarden says. “Developers can craft dynamic and changing worlds where lighting is an integral component of gameplay.”
Originally, the demo was planned to be shown at GDC, an industry-facing conference with an audience of game developers. Because of COVID-19, GDC was cancelled, and that meant the trailer became a part of the next-generation hype machine as players hungrily consumed every scrap of information about next-gen consoles and games. Despite that, the features shown off in the demo were very deliberately chosen to highlight their use for developers, not players.
Baking lightmaps, generating lighting, and object optimisation are integral parts of triple-A game development. They’re also some of the most time-consuming aspects of the process. Unreal Engine 5 aims to almost eradicate them.
“With regards to Lumen, traditionally lighting for a high-end game is handled by ‘baking’ a lightmap - basically a texture that contains all of the lighting information for a scene - to produce high quality lighting,” Champion’s Ascent developer Husban Siddiqi tells me. “However, the disadvantage is that lighting can only be done with objects that are static and you have to reprocess a scene every time the lighting is redone.”
Before Unreal Engine 5, developers had to wait for lightmap baking to complete before continuing work, every time they tweaked lighting in a scene. With Lumen, they can make these changes in real time and iterate without those lengthy waits.
“Light baking is necessary because in current generation game engines calculating dynamic light uses a lot of resources,” explains Tobias Graff, CEO of Mooneye Studios, developer of Lost Ember. “Whenever there’s a situation where the light doesn’t really change - for example, when coming from a static lamp or if the game doesn’t have a moving sun - the lighting information is actually just saved into a texture and applied to the world.
“This of course means that things like moving characters or even leaves moving in the wind wouldn’t cast moving shadows, so often a mix of different lighting solutions is used. Fine-tuning when and at which distances to use static or dynamic lighting is another step that of course takes time and could be skipped completely with Unreal Engine 5. Not to mention that baking light information itself is something that can take forever.”
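The baked-versus-dynamic split Graff describes can be shown in miniature. The Python below is a toy sketch, not engine code, and every name in it is illustrative: lighting for static surfaces is computed once offline and stored (the "lightmap"), while a dynamic light repeats the same maths every frame.

```python
# Toy illustration of lightmap baking vs dynamic lighting (not engine code).

def lambert(normal, light_dir):
    """Clamped dot product: the core of diffuse lighting."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(d, 0.0)

# Three surface points with different orientations, lit by a static light.
surface_normals = [(0.0, 1.0, 0.0), (0.707, 0.707, 0.0), (1.0, 0.0, 0.0)]
static_light_dir = (0.0, 1.0, 0.0)  # never moves, so its result can be baked

# Offline "bake": evaluate the lighting once and store it, like a texture.
lightmap = [lambert(n, static_light_dir) for n in surface_normals]

def shade_baked(i):
    # At runtime a baked surface is just a lookup - no per-frame maths.
    return lightmap[i]

def shade_dynamic(i, current_light_dir):
    # A moving light (or object) forces the full calculation every frame.
    return lambert(surface_normals[i], current_light_dir)
```

The appeal of Lumen is that developers no longer have to decide up front which surfaces take the cheap baked path and which take the expensive dynamic one.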
Then there’s Nanite, which will do for a 3D artist what Lumen does for lighting. Ordinarily, 3D artists would make an asset in their chosen software, then they would have to create a lower fidelity, optimised version of the same asset in-game, using texture maps and normal maps to add more detail. Nanite will allow creators to import their assets directly into the game, in full fidelity, with no smoke and mirrors.
“This can mean things like 3D scanned assets or heavily stylised work can have a much easier time going into games without significant overheads in optimisation,” independent developer (Earthlight) Emre Deniz explains. “The key component to take notice here is that the workflow for game art development is opaque to many people. High fidelity assets have a huge overhead in making sure they are optimised and using everything as efficiently as possible - and often this is best done manually or with great forward investment in tools, experience and planning.
“The ability for developers, from small teams to multi-studio productions, to be able to quickly take these higher fidelity assets into the engine will make production not only faster but also more accessible for a variety of different teams. I would personally want to stress that higher fidelity does not necessarily mean realistic. Stylized, experimental or riskier art directions will be more attainable with these breakthroughs.”
Widely used art tools such as ZBrush allow artists to sculpt their creations as if from clay. These sculpts are built purely for aesthetics, rather than performance. That means a single sculpture can run to millions of tris, which isn’t feasible on current-generation hardware if you want your game to actually run.
“So, you'd take that high-poly ZBrush model as a visual target, then make a lower-poly version of that same model and ‘bake’ the high poly one on top of it as a normal map (which is a bit like putting a lion mask on a labrador and calling it Mufasa),” game animator Tommy Millar explains. “From what I could glean from the demo, this middle step might be either lessened or removed entirely, making the optimisation phase a lot easier overall.
“A good metaphor to use is this - you're going on holiday, you're at the airport, but the baggage handlers say you need to put your liquids in wee clear bags, and your electronics in a bucket, and you're 10kg over your clothing allowance. So you do all that nonsense, leave some stuff behind, give security your laptop battery, and eventually, you can head on through, where you inevitably have to try to rebuy approximations of the stuff in duty-free that you lost (even though they don't have exactly what you had before).
“If what Unreal is proposing is true, this would be like showing up at customs with five times your usual luggage and them saying ‘okay’... and that's that. You go through, immediately - no luggage conversion, no luggage optimisation, no waiting around for checks and suitability rating, and no trying to recoup/fix what you lost on the other side. Just straight through.”
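The bake step Millar describes can be sketched in one dimension. This is a hypothetical toy, not a real pipeline: a bumpy high-poly height profile is "baked" into a per-sample normal texture, which a flat low-poly surface would then sample at lighting time to fake the detail it no longer carries.

```python
# Toy 1-D "normal map bake": capture high-poly surface slope into a texture
# that a flat low-poly surface can sample to fake the missing detail.
import math

high_poly_heights = [0.0, 0.2, 0.5, 0.2, 0.0]  # detailed, bumpy profile

def bake_normals(heights, spacing=1.0):
    """Store the surface slope at each sample into a 'normal map'.
    The low-poly mesh stays flat but is lit as if it had the bumps."""
    normals = []
    for i in range(len(heights)):
        left = heights[max(i - 1, 0)]
        right = heights[min(i + 1, len(heights) - 1)]
        slope = (right - left) / (2 * spacing)
        # Normal is perpendicular to the slope, normalised to unit length.
        length = math.hypot(-slope, 1.0)
        normals.append((-slope / length, 1.0 / length))
    return normals

normal_map = bake_normals(high_poly_heights)
```

Nanite's promise is that this bake, and the lower-poly stand-in it exists to serve, may no longer be needed at all.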
This will also be the same for level of detail (LOD). If you’ve ever stood in the Vinewood hills and watched traffic snake through Los Santos in GTA 5, you might have noticed the cars are replaced with headlights that give the illusion of cars. That’s because there’s not enough memory to actually keep Los Santos filled with high-detail vehicle models as far as the eye can see. Game developers use LOD to create scenes over vast distances, with each model in the game actually a handful of different models that the game swaps out depending on how far away the player is. The further away, the lower the detail.
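A minimal sketch of that swap might look like the following - the tiers and distance thresholds here are made up for illustration; real engines tune them per asset and per platform.

```python
# Toy LOD selector: pick which authored version of a model to draw
# based on camera distance. Thresholds are illustrative only.

def select_lod(distance: float) -> str:
    if distance < 50.0:
        return "lod0_high_poly"    # full-detail mesh
    if distance < 200.0:
        return "lod1_medium"       # reduced mesh plus baked normal map
    if distance < 1000.0:
        return "lod2_low"          # a handful of triangles
    return "impostor"              # e.g. just headlights in the distance

print(select_lod(30))    # lod0_high_poly
print(select_lod(5000))  # impostor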
“Just as with the normal maps, this step can likely be skipped completely for a lot of objects as well,” Graff says. “The SSDs are probably the true heroes in all this, because all of the other features likely wouldn’t be possible if it weren’t possible to load these huge models and information sets fast enough.
“With all of these changes to the workflow of developing games, developers can spend more time on the creative part of the process, on finding and fixing bugs, or just be able to work with a lower budget than it was possible before. This is especially important for smaller studios that often have half the team working on nothing other than optimizing performance and applying all these techniques mentioned above for months.”
Epic Games says much of this is possible due to the unique architecture of the PS5, which is why the engine was shown off running on Sony’s machine.
“The PlayStation 5 provides a huge leap in both computing and graphics performance, but its storage architecture is also truly special,” Penwarden says. “The ability to stream in content at extreme speeds enables developers to create denser and more detailed environments, changing how we think about streaming content. It’s so impactful that we’ve rewritten our core I/O subsystems for Unreal Engine with the PlayStation 5 in mind.”
It’s not just video games that will benefit from Unreal Engine 5, however. Recently, Star Wars spin-off The Mandalorian used Unreal to render the landscapes of its alien planets, and we can expect to see these kinds of uses become even more common thanks to the features available in Unreal Engine 5.
“Unreal Engine is already heavily used in enterprise applications, such as film production or simulations, and I can see this being a significant leap in how these industries leverage game engines,” Deniz says.
3D artist Shahriar Abdullah has been working with Unreal as part of a team that leverages the tech for movies and television, most recently creating imaginary places for the BBC’s adaptation of His Dark Materials.
“So the big takeaway for me was that it finally looked like UE5 had closed the gap between the pipeline needed for creating game assets in real time and the one more traditionally found in, say, VFX or film - made all the more impressive because it was running in real time,” Abdullah says. “Seeing things with extremely high poly counts go straight in, with 8K textures and full dynamic lighting, was mind-blowing.
“What this means is that there will (hopefully) be a great amount of freedom in how we work when creating scenes. We are constantly fighting a battle of cost and time, while also trying to push what the engine can do and stay stable. Now it feels as though those restrictions are going to be a thing of the past, creatively opening up what can be put in and rendered, and almost giving us the freedom to experiment without the time-consuming steps of optimisation. It's hard to predict how massive an impact it will have on the gaming industry. One thing I do predict is that it's going to have a huge impact on the film and VFX industry.”
The big question is whether this new level of detail will be viable over the course of a 30-40 hour game. A traditional asset creation pipeline results in reduced file sizes thanks to robust optimisation, but we could see games balloon if those steps are skipped.
“Those megascans assets are huge, and if we were only to use the non-crunched-down versions, I can't even fathom how much hard drive space that would take up,” Abdullah tells me. “Though hopefully there is a solution to that too, which we will see in time.”
“Are 30 hour games of that visual fidelity really possible?” Graff asks. “To that I would say no, at least not really. The main bottleneck and problem I see at the moment is memory. Storing all these ultra-high-poly models is likely not possible in a game that shouldn’t take up hundreds of GB of space. But of course all the techniques mentioned are still valid, and developers are often able to perfect them to a degree where it’s impossible to see the difference between the low and high poly models. So I’d say they’ll still play a role in next-generation game development, just not as important a one as they do now, and games that look like the tech demo could very well be possible - just not without all the little tricks and cheats game developers always come up with.”
With a preview available for developers in early 2021, we’ll have to wait a few years to see.