
Above: Epic Games shows off intricate details amid shadowy lighting.
GamesBeat: I know it’s a long way from shipping, but is there a similar business model for you going from Unreal 4 to Unreal 5? Is it different in some way?
Sweeney: The Unreal Engine business model is the same. It’s a royalty-based model where we succeed when you succeed. It’s 5 percent of gross revenue from games. With this generation, retroactive to January of this year, we’re exempting every game’s first million dollars in revenue from royalties, so it’s a bit easier for indie developers to get started and not have to worry about the cost of the engine until they’re successful.
GamesBeat: Obviously the whole thing is scalable to current mobile phones and older machines, but on the higher end, as seen in the demo for the PS5, is there something in particular about the architecture that makes these features possible now, when they weren’t available on the PlayStation 4?
Nicholas Penwarden: Some of it is just advances in the speed of the hardware, the CPU and GPU, as well as the new I/O capabilities of the next generation of platforms, allowing us to stream in a lot more data on the fly. One of the things that Nanite does is keep in memory only the triangles you actually need around you. When you have scenes with tens of billions of triangles, it wouldn’t be practical to have all that data in memory at once. You need to be able to stream it in dynamically as the viewpoint moves around.
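To make that idea concrete, here is a minimal, purely illustrative sketch of view-dependent streaming. It is not Epic’s actual Nanite code, and every name in it (Cluster, ScreenSpaceError, UpdateResidency) is hypothetical: it keeps a chunk of geometry resident only while skipping it would produce a visible error on screen at the current camera position.

```cpp
// Illustrative sketch only -- not Nanite itself. It shows the general idea
// Penwarden describes: keep resident only the geometry the current viewpoint
// needs, and stream clusters in and out as the camera moves.
#include <cmath>
#include <vector>

struct Cluster {             // hypothetical unit of streamable geometry
    float boundsCenter[3];
    float geometricError;    // world-space error if this cluster is skipped
    bool  resident;          // currently in GPU memory?
};

// Project a cluster's error to screen space: how many pixels of error would
// the viewer see if this cluster were missing at the current distance?
float ScreenSpaceError(const Cluster& c, const float camPos[3],
                       float screenHeightPx, float fovY) {
    float dx = c.boundsCenter[0] - camPos[0];
    float dy = c.boundsCenter[1] - camPos[1];
    float dz = c.boundsCenter[2] - camPos[2];
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    float pxPerWorldUnit =
        screenHeightPx / (2.0f * dist * std::tan(fovY * 0.5f));
    return c.geometricError * pxPerWorldUnit;
}

// Called as the viewpoint moves: stream in clusters whose absence would be
// visible, evict the rest so the whole scene never has to fit in memory.
void UpdateResidency(std::vector<Cluster>& clusters, const float camPos[3],
                     float screenHeightPx, float fovY, float thresholdPx) {
    for (Cluster& c : clusters) {
        bool needed =
            ScreenSpaceError(c, camPos, screenHeightPx, fovY) > thresholdPx;
        if (needed && !c.resident)  { /* enqueue SSD -> GPU upload */ c.resident = true;  }
        if (!needed && c.resident)  { /* evict to free memory      */ c.resident = false; }
    }
}
```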
GamesBeat: Is that because of things like the SSD? Or is there a lot more cloud technology at work here making a difference?
Penwarden: In this case it’s the SSD and other hardware features in the next-gen consoles that enable that.

Above: Kim Libreri is CTO of Epic Games.
GamesBeat: Is there a cloud-based element to the technology? Could somebody use this remotely right now and be as productive as they would be if they were in the office with a bunch of additional hardware?
Libreri: Generally, the engine scales pretty well to cloud-deployed GPUs. We’re having to deal with a bunch of that ourselves. Not everyone has a workstation at home. That general framework that Amazon and the cloud providers have is great.
What we’ll see as people start to get their heads around Nanite is the data streaming — not just a primary and secondary cache, but a tertiary cache that’s cloud-based. We’ll see where people take it. But we’re definitely designing the next generation of the engine so that storage doesn’t have to live on the local hardware.
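The tiered-cache idea Libreri is speculating about (primary in memory, secondary on local SSD, tertiary in the cloud) can be illustrated with a toy page cache. This is an assumption-laden sketch, not an Unreal Engine API; all of the names are invented, and a real system would do the I/O asynchronously so rendering never blocks on the network.

```cpp
// Toy three-tier streaming cache: RAM -> local SSD -> cloud.
// Every name here is hypothetical; this just illustrates the lookup order.
#include <cstdint>
#include <optional>
#include <unordered_map>
#include <vector>

using PageId   = std::uint64_t;
using PageData = std::vector<std::uint8_t>;

class TieredPageCache {
public:
    PageData Fetch(PageId id) {
        if (auto hit = ram_.find(id); hit != ram_.end())
            return hit->second;                    // primary: already in memory
        if (auto page = ReadFromSsd(id)) {         // secondary: local SSD
            ram_[id] = *page;
            return *page;
        }
        PageData page = DownloadFromCloud(id);     // tertiary: cloud store
        WriteToSsd(id, page);                      // populate the lower tiers
        ram_[id] = page;
        return page;
    }

private:
    std::optional<PageData> ReadFromSsd(PageId id) {
        auto it = ssd_.find(id);
        if (it == ssd_.end()) return std::nullopt;
        return it->second;
    }
    PageData DownloadFromCloud(PageId id) {
        // Stand-in for a network fetch; returns a dummy 64 KiB page here.
        return PageData(64 * 1024, static_cast<std::uint8_t>(id & 0xFF));
    }
    void WriteToSsd(PageId id, const PageData& page) { ssd_[id] = page; }

    std::unordered_map<PageId, PageData> ram_;  // primary cache
    std::unordered_map<PageId, PageData> ssd_;  // stands in for the local SSD
};
```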
Sweeney: Sony’s new PlayStation 5 is a remarkably balanced device, not only in GPU power but also in storage bandwidth, which is an order of magnitude higher. That makes it possible not just to render this kind of detail, but to stream it in dynamically as the player moves through the world. That’s going to be critical to rendering this kind of detail in bigger open-world games. It’s one thing to render everything that can fit in memory, but another thing to have a world that might be tens of gigabytes in size.
Libreri: That’s our goal with Unreal Engine 5. Huge, complex, large-scale worlds can be streamed into the machine with incredible detail, and without noticing things popping in the traditional way you’d see.
The other thing of note is that Niagara is now in its polished, releasable, usable state, and there are tons of cool new features in there. You can look in the pod where you see all these fluid dynamics we’re doing, all sorts of crazy stuff — the bugs react to the flashlight as it moves around. That was a good capability for us. It made it a lot easier to make the demo feel a bit more next-gen. We’re very happy that Niagara is in a great state. We expect all next-generation Unreal games to be using Niagara in time.

Above: This isn’t a picture. An artist using Unreal Engine 5 made it.
GamesBeat: Is that a plug-in?
Libreri: It’s the visual effects system. Whenever you see particles or fluids or any of this cool simulation stuff, it’s a combination of Niagara on the particle side and then Chaos, our new physics system, which handles all the rigid-body collisions and destruction.
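As a rough illustration of the flashlight behavior Libreri mentions, here is a toy, engine-agnostic version of the per-particle logic. Niagara would author this as a node graph rather than C++, and none of these names (Bug, FleeFlashlight) come from the engine; it just shows particles scattering when the beam sweeps over them.

```cpp
// Toy per-particle behavior: bugs accelerate away from a flashlight cone.
// Purely illustrative; in Niagara this would be a module in an emitter graph.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  Add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  Scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }
static float Dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Len(Vec3 v)            { return std::sqrt(Dot(v, v)); }

struct Bug { Vec3 position, velocity; };

// lightDir must be unit length; cosConeAngle is cos() of the beam half-angle.
void FleeFlashlight(std::vector<Bug>& bugs, Vec3 lightPos, Vec3 lightDir,
                    float cosConeAngle, float fleeAccel, float dt) {
    for (Bug& bug : bugs) {
        Vec3 toBug = Sub(bug.position, lightPos);
        float dist = Len(toBug);
        if (dist < 1e-4f) continue;                 // degenerate: on the light
        Vec3 dir = Scale(toBug, 1.0f / dist);
        if (Dot(dir, lightDir) > cosConeAngle) {    // inside the beam?
            bug.velocity = Add(bug.velocity, Scale(dir, fleeAccel * dt));
        }
        bug.position = Add(bug.position, Scale(bug.velocity, dt));
    }
}
```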
GamesBeat: As far as global illumination goes, this is reminding me of a Pixar movie or something like that.
Libreri: We wanted to get to a point where we felt that things started to look very real and very photographic. You can’t do that without dynamic global illumination. For years we’ve had a system in the engine we call Lightmass that allows you to bake GI, but the problem is that in the next generation, you’ll want everything moving and animating and destructible. You need a live GI solution.
We’ve tried before. Even in Unreal Engine 4 there were a few pieces we did that helped with live GI, like the screen-space GI work and other things. But this was the first time we really nailed it, where at console performance levels in a huge scene you were able to get dynamic lighting. It’s awesome. When the roof opens in that big area where the statues are stored, we’re not keyframing anything. It’s just the wall opening, and it illuminates the space.
Penwarden: This is another area where it enables new levels of visual fidelity, but it’s also transformative on the workflow side. Artists don’t have to author lightmap UVs. They don’t need to manually place probes throughout the environment. They don’t need to go through a long, multi-hour baking process to build lighting. They can just build the environment and see lighting update as they’re building it in the editor, and it all just works on the console when they deploy it.
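To see why dropping the bake step matters, consider a bare-bones sketch of per-frame irradiance gathering. This is not how Lumen works internally; it is just a minimal illustration, with hypothetical Probe and Surfel types, of lighting that responds to scene changes, such as a roof opening, without any precomputation.

```cpp
// Crude live-GI illustration: probes re-gather light from the scene as it
// currently is, every frame, so there is nothing to bake and nothing stale.
#include <vector>

struct Probe  { float position[3]; float irradiance[3]; };
struct Surfel { float position[3]; float radiance[3]; bool occluded; };

// Hypothetical visibility test; a real engine would trace rays here.
bool Visible(const Probe&, const Surfel& s) { return !s.occluded; }

// Called every editor/game frame. Contrast with an offline bake, which would
// have to be redone from scratch whenever the scene changes.
void GatherIrradiance(std::vector<Probe>& probes,
                      const std::vector<Surfel>& surfels) {
    for (Probe& p : probes) {
        p.irradiance[0] = p.irradiance[1] = p.irradiance[2] = 0.0f;
        for (const Surfel& s : surfels) {
            if (!Visible(p, s)) continue;   // e.g. the roof is still closed
            for (int c = 0; c < 3; ++c)     // crude uniform-weight average
                p.irradiance[c] += s.radiance[c] / float(surfels.size());
        }
    }
}
```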
Libreri: One of the coolest parts of making this demo: I’m in an office right next to where all our artists sit, and a bunch of them have their monitors turned toward me. It’s great to see them making the world, literally picking up mountains, moving them and dropping them, changing the lighting direction, and it all looks real. It’s this very surreal experience, like looking into the future. I don’t see them ever wanting to go back to the old way of working.
GamesBeat: Would you agree it feels kind of rare to have a system that improves quality of life for developers while also improving the end product for the consumer? It often seems that when there’s a big step in game development, new graphics and new effects, it means a hell of a lot more work behind the scenes. It seems rare that something improves things on both ends.
Libreri: It never used to be like that. But honestly, the power of the hardware is so amazing nowadays that we’re really — we’re getting to a point where we can improve the quality on both sides. You can even do simulations in the editor while you’re authoring as a developer that are crazy complicated. You can play that back in real time on a console. There’s lots of flexibility.
What makes great games is iteration, time spent on iteration. It’s important for us to make sure we have accessible tools that allow developers to concentrate on the important stuff, like great gameplay. That’s one of the reasons Quixel is part of Epic. We don’t want developers spending half their time making rocks and trees and bits of grass. They shouldn’t have to do that. We can do that to a high level of quality that will last forever.
One of the coolest parts of this demo was the fact that we’re using movie assets, the ones that could normally only be rendered by ILM or WETA or other big visual effects companies. That’s what went into this demo. It’s awesome.

Above: Shadows and lighting are the specialty of Unreal Engine 5.
GamesBeat: Is this seven years after Unreal Engine 4? Does that tell us a lot about how complicated this is? Is it hundreds of people working for seven years? What was the scale of this project like?
Sweeney: The first Unreal Engine 5 work began a couple of years ago. Brian Karis, a young graphics programmer who was in the video, began experimenting with the Nanite technology, as well as Lumen.
Penwarden: Yeah, a couple of years ago, we started brainstorming what would be the key features for a next-generation engine, especially next-generation graphics. We kicked off a number of R&D projects to start working through that and figure out what we could do and where we could really make a generational leap, both in terms of fidelity and workflow. That’s where it got started, with those R&D projects that we built up over time. We expanded the team over time to build up to the demo you saw.
Libreri: Brian himself is an amazing engineer — he gets wound up when we call him part tech artist, but he has a real empathy for art and visuals. I’ve been at Epic for six years, and he was talking about doing this the whole time I’ve been there. A couple of years ago we said, “Go on, do some research.” And then finally last year, around this time, we looked at the early results and said, “That’s amazing. It’s going to work.” We dogpiled the whole demo team on it and built this demo.
It’s always a bit scary, because things don’t — until you actually kick the tires and make something with the tech you’re building, it can all be for naught. It was good. It was exciting. We’re very proud of the results.
GamesBeat: Is there a background to the scene at all? Does it represent any particular game?
Libreri: I wouldn’t say there’s any — we’ve done a bunch of cinematic demos in the past, and then the past couple of years we’ve made stuff that was more gameplay. We wanted to do that again. We wanted to make something that genuinely could be a game. We had the idea that we’d have an explorer — we all like Uncharted and the Lara Croft games. We’ll make an explorer character that’s going to go through an environment. Quixel was around, they were our friends, and they said, “Well, we’ve got some awesome new rocks and all that, some caves that we’ve built.” I’d gone to Malaysia last summer, and I went to see the Batu caves. They look like this. I brought back some photographs and said, “Check this out. Have you got anything like this?” We just riffed on that.
We knew we wanted something that felt extremely natural. Stuff that we could look at photographic reference for. Trying to build an engine that does photoreal without actually looking at photographs and studying the real world is hard to do. It just evolved from there. Tim played a role. The whole section at the end with the city and the big crumbling stuff, that was Tim asking for a bit more variety. We have an awesome team. Honestly, making stuff in real time is so much fun that you can’t help yourself. You just roll with it.
GamesBeat: If I could anticipate the internet at all, I would guess your pushback here might be, “Oh, we’re never going to see games like this.”
Libreri: This is running on a PlayStation 5. It’s running on the hardware.
Sweeney: It’s all real. The important thing is that this was not a vast new content development effort. These assets came straight from Quixel. We put it together pretty quickly into a scene. That’s the point of the technology: to enable any creator to build this kind of high-quality scene without having to create each piece manually.
Libreri: I’m pretty certain that next-generation games on Unreal Engine can look like that. This is not a smoke and mirrors act. We’re genuinely trying to build technology that we enjoy using ourselves, and then have a game team build the demo, and then get it into customers’ hands. Once the Nanite stuff is in customers’ hands, we’re excited to see what they discover, how they want to evolve it. We’re going through a bunch of the workflows to make sure they’re efficient for all studios, but it’s exciting. Something that looks as good as that, it can’t help but bring joy. We’re excited. The next year is going to be amazing.