Epic Games took the wraps off its Unreal Engine 5 today, showing in a demo video how the game engine behind Fortnite will be able to generate strikingly realistic graphics in real time on next-generation consoles like the PlayStation 5, on high-end PCs, and even on mobile devices.
Unreal Engine 5 debuts in 2021, and it will be one of the tools that enables the entire game industry to take a step upward in graphics quality, Epic Games CEO Tim Sweeney said in an interview with GamesBeat.
“It’s a real generational leap in new features. Even though it doesn’t break things that came previously, Unreal Engine 5 will be a straightforward upgrade for anyone working with Unreal Engine 4,” Sweeney said. “It’ll be like going through a few minor version updates. But it has major new graphical features targeted at a new generation of hardware, defined by PlayStation 5. These capabilities are also coming to PC and elsewhere.”
Sweeney emphasized that Epic Games worked closely with Sony on the PlayStation 5 so that games can take full advantage of the next-generation hardware. But he also said that Unreal Engine 5 is designed to make any game run anywhere. Game designers can use it to create their games, and Epic does the hard work of translating it so that it can run on everything, from mobile devices to next-generation consoles to high-end PCs.
The guinea pig for Unreal Engine 5 is Fortnite, which has reached more than 350 million players across seven platforms. This year, Fortnite will debut on the PS5 and Xbox Series X based on Unreal Engine 4 technology, but it will eventually migrate to Unreal Engine 5. And Sweeney said the rest of the industry can piggyback on that pioneering work, paying the same fees for an Unreal Engine 5 license as they do now for Unreal Engine 4: a 5% royalty on gross revenue for games that earn more than $1 million.
I interviewed Sweeney and other members of the Epic team — chief technology officer Kim Libreri and Unreal architect Nicholas Penwarden — over a Zoom call, with another writer present. Here’s an edited transcript of our conversation.
GamesBeat: What’s happening?

Above: Tim Sweeney, CEO of Epic Games, and Kim Libreri, CTO.
Tim Sweeney: This is a first glimpse of Epic's next-generation lineup of tools and technologies for game developers. The graphics speak for themselves. Epic has always pushed the leading edge of what's possible on 3D hardware. In this generation we're pushing geometry to new levels with the Nanite technology, along with the Quixel megascans library, which provides film-quality assets scanned from the real world to make content creation much more practical, and the Lumen dynamic lighting technology.
But our goal isn’t just to bring more features to developers — it’s to help solve the hardest problem in game development right now. Building high-quality content takes enormous time and cost. We want to make it productive for people to build games at this quality level. Nanite frees developers from having to worry about individual polygons. You just build your highest-quality assets, and the rest of it is the engine’s problem, sorting it out and scaling to each platform. It ties into the Quixel megascans library, where we’ve made available a vast and rapidly growing collection of assets to everyone for free use in Unreal Engine games. You don’t have to create yet another chair or mountain or rock for your game. The Lumen technology frees developers from having to wait for lighting builds and from designing their games around the limitations of static lighting. We want to make developers’ lives easier and more productive so they can build more effective businesses.
This extends to our online services. Our goal since the very early days has been to connect all the players across all the platforms. We pioneered this in Fortnite, which was the first to connect Sony, Microsoft, Nintendo, Apple, Google devices, every device, and enable everyone to play together. We’ve taken that entire stack of online technologies and we’re opening it up to all developers, including the nuts and bolts game services like matchmaking and data storage. But also the account system and the friends graph we built up for Fortnite, with more than 350 million players across seven platforms, and more than 2.2 billion social connections.
That’s now open for everybody. You can piggyback on all of Epic’s efforts to build up this multiplatform audience and then contribute back to it by using it in your game, having your players add their friends to it. Everybody benefits together by building up this non-walled garden version of things that have existed on each platform and on Steam in a locked-down way in the past.

Above: Epic Games is launching Unreal Engine 5 in 2021.
We’re opening all of that up to all developers for free. That’s the spirit of all our efforts in the next generation. We’re working to serve all developers and help them achieve what we’ve achieved with our games, and to help them do that productively and effectively.
Kim Libreri: One of the challenges with making content for any game or any interactive experience is the massive effort that goes into making it. I second what Tim said about the Quixel megascans library, but also Nanite, this super-dense geometry system we’ve built — it means that now all industries that use our engine don’t have to worry about the traditional authoring process. You can load in a movie-quality asset and it just works in the engine. The engine does all the work behind the scenes. Even if ultimately your target’s going to also cover mobile, the engine will make clever LODs (level of detail) for that platform without the usual drudgery associated with making game assets.
For the demo, the environment team was half classic Epic environment artists, and a couple of other people who came straight from a movie VFX company. They said, “Wow, this is crazy. It’s like authoring in the metaverse. I just grab a rock. It looks like a rock. I can move it and scale it and light it and adjust the bounds around it and still get results.” It’s a massive quantum leap in artist-friendliness and the visual resources you get. That video really is — it’s basically an HDMI capture device plugged into the back of a PlayStation. Those are the pixels the engine generated. Quality in an easy way, that’s what we’re aiming for.
GamesBeat: The video looks pretty amazing. Do you want to set an expectation for how games will look relative to that on things like, say, mobile, all the way up to the new consoles and high-end PCs?
Sweeney: The aim here is that you can build your content at the highest level of quality possible, and the engine will scale it down to every platform automatically, so you don’t have to worry about texture maps and polygon LODs yourself. You can rely on the technology to do that.
The demo is demonstrating the highest level of quality, which is available on PlayStation 5 and other next-generation hardware. Every other platform that doesn’t have these capabilities will go through a more traditional rendering pipeline, in which we’ll take these assets you’ve built and scale them down to more traditional LODs, rendering them so you can — there will be a version of this demo you could run on Android devices from three years ago. It will have much lower polygon detail, but it’ll be the same scene and you can build the same game.
It has to be this way, because as we’ve said, we’re launching Fortnite on next-generation consoles this year on UE4. We’re moving it to UE5 over the course of next year. Fortnite will continue to support the seven platforms it supports now, plus the two new ones that have been announced. We have to support the game on all hardware, and we have to do that without game developers ever having to build any asset, any content twice.
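To make the scaling idea concrete, here is a minimal sketch, under stated assumptions and in no way Epic's actual implementation, of how an engine might pick a per-platform level of detail by comparing a mesh's projected screen-space error against a platform's pixel-error budget. Every name, threshold, and the simple pinhole-camera model are hypothetical illustrations.

```cpp
// Minimal sketch (not Epic's code): pick a level of detail for one mesh by
// comparing its projected screen-space error against a per-platform budget.
#include <cmath>
#include <vector>

struct LodLevel {
    int   triangleCount;     // triangles at this level of detail
    float geometricErrorM;   // max simplification error in meters (grows as detail drops)
};

// Coarser platforms tolerate more error per pixel, so they pick cheaper LODs.
struct PlatformBudget {
    float maxScreenErrorPx;  // e.g. 0.5 on a next-gen console, 4.0 on an older phone
};

// Project a world-space error to pixels for an object at `distanceM`,
// assuming a simple pinhole camera with a vertical FOV and viewport height.
float ScreenSpaceErrorPx(float geometricErrorM, float distanceM,
                         float fovYRadians, float viewportHeightPx) {
    float pixelsPerMeter =
        viewportHeightPx / (2.0f * distanceM * std::tan(fovYRadians * 0.5f));
    return geometricErrorM * pixelsPerMeter;
}

// Walk from the most detailed LOD to the coarsest and return the cheapest
// one whose on-screen error still fits the platform's budget.
int SelectLod(const std::vector<LodLevel>& lods,  // sorted fine -> coarse
              const PlatformBudget& budget,
              float distanceM, float fovYRadians, float viewportHeightPx) {
    int chosen = 0;
    for (int i = 0; i < static_cast<int>(lods.size()); ++i) {
        float errPx = ScreenSpaceErrorPx(lods[i].geometricErrorM, distanceM,
                                         fovYRadians, viewportHeightPx);
        if (errPx <= budget.maxScreenErrorPx) {
            chosen = i;  // coarser levels are cheaper; keep taking them while they fit
        } else {
            break;       // this level would look visibly wrong; stop here
        }
    }
    return chosen;
}
```

The same source asset feeds every platform; only the error budget (and therefore the chosen level) changes, which is the "build once, scale everywhere" idea Sweeney describes.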
GamesBeat: In terms of calling it Unreal Engine 5, what mattered in that particular decision? What’s the distinction that you think elevates it to being Unreal Engine 5?
Sweeney: It’s a real generational leap in new features. Even though it doesn’t break things that came previously, Unreal Engine 5 will be a straightforward upgrade for anyone working with Unreal Engine 4. It’ll be like going through a few minor version updates. But it has major new graphical features targeted at a new generation of hardware, defined by PlayStation 5. These capabilities are also coming to PC and elsewhere.
We’re enabling a new paradigm for game development in which each generation introduces a new set of problems you have to worry about, and hopefully removes an old generation of problems so you don’t have to worry about them anymore. We’re trying to remove the content scalability problem from this generation and get developers thinking about a new way of building games. It’s building on some of the things we’ve learned with Fortnite. You can build a high-end console game, it can look fantastic, and you can also make it work on smartphones. You can build an audience that’s far bigger than just the hardcore gaming audience by shipping it on more platforms.
The technology can enable that and make it more productive. The online tool set can do the same. We want to help the whole game industry get to this better spot.

Above: Epic Games shows off intricate details amid shadowy lighting.
GamesBeat: I know it’s a long way from shipping, but is there a similar business model for you going from Unreal 4 to Unreal 5? Is it different in some way?
Sweeney: The Unreal Engine business model is the same. It’s a royalty-based model where we succeed when you succeed. It’s 5 percent of gross revenue from games. With this generation, retroactive to January of this year, we’re exempting every game’s first million dollars in revenue from royalties, so it’s a bit easier for indie developers to get started and not have to worry about the cost of the engine until they’re successful.
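For the arithmetic behind those terms (5% of gross revenue, with the first $1 million exempt), here is a quick sketch. It only illustrates the math as described above; it is not licensing guidance, and the function name is made up.

```cpp
// Illustration of the royalty terms described in the article:
// 5% of a game's gross revenue, with the first $1 million exempt.
#include <algorithm>
#include <cstdio>

double RoyaltyOwedUSD(double lifetimeGrossRevenueUSD) {
    const double exemptionUSD = 1'000'000.0;  // first $1M is royalty-free
    const double royaltyRate  = 0.05;         // 5% above that
    return royaltyRate * std::max(0.0, lifetimeGrossRevenueUSD - exemptionUSD);
}

int main() {
    // A game that grosses $800k owes nothing; one that grosses $3M owes 5% of $2M = $100k.
    std::printf("$800k -> $%.0f\n", RoyaltyOwedUSD(800'000.0));
    std::printf("$3M   -> $%.0f\n", RoyaltyOwedUSD(3'000'000.0));
}
```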
GamesBeat: Obviously the whole thing is scalable to current mobile phones and older machines, but on the higher end, as seen in the demo for the PS5, is there something in particular about the architecture that makes these features possible now that weren’t available in the past on the PlayStation 4?
Nicholas Penwarden: Some of it is just advances in the speed of the hardware, the CPU and GPU, as well as the new I/O capabilities of the next generation of platforms, which allow us to stream in a lot more data on the fly. One of the things that Nanite does is keep in memory only the triangles you actually need around you. When you have scenes with tens of billions of triangles, it wouldn’t be practical to have all that data in memory at once. You need to be able to stream that in dynamically as the viewpoint moves around.
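A heavily simplified sketch of that idea, keeping only nearby geometry resident and streaming clusters in and out as the camera moves, might look like the following. This is not Nanite's actual data structure or code; every type, name, and threshold is invented for illustration, and a real system would work on a cluster hierarchy with visibility and error metrics rather than a plain distance test.

```cpp
// Simplified sketch (not Nanite): stream geometry clusters in near the camera
// and evict them when the camera moves away, so memory stays bounded.
#include <cmath>
#include <cstdint>
#include <unordered_set>
#include <vector>

struct Float3 { float x, y, z; };

struct GeometryCluster {
    uint64_t id;
    Float3   centroid;  // where this patch of triangles lives in the world
    // ... compressed triangle data would be attached once streamed in
};

float Distance(const Float3& a, const Float3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

class ResidentGeometry {
public:
    // Called every frame with the camera position. Clusters inside
    // `streamInRadius` get load requests; clusters beyond `evictRadius`
    // are released so only the data around the viewer stays in memory.
    void Update(const Float3& camera,
                const std::vector<GeometryCluster>& allClusters,
                float streamInRadius, float evictRadius) {
        for (const auto& c : allClusters) {
            float d = Distance(camera, c.centroid);
            bool resident = resident_.count(c.id) != 0;
            if (!resident && d < streamInRadius) {
                RequestLoadFromDisk(c);          // would be an async SSD read
                resident_.insert(c.id);
            } else if (resident && d > evictRadius) {
                ReleaseClusterMemory(c.id);
                resident_.erase(c.id);
            }
        }
    }

private:
    void RequestLoadFromDisk(const GeometryCluster&) { /* issue async I/O */ }
    void ReleaseClusterMemory(uint64_t)              { /* free CPU/GPU memory */ }

    std::unordered_set<uint64_t> resident_;
};
```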
GamesBeat: Is that because of things like the SSD? Or is there a lot more cloud technology at work here making a difference?
Penwarden: In this case it’s the SSD and other hardware features in the next-gen consoles that enable that.

Above: Kim Libreri is CTO of Epic Games.
GamesBeat: Is there a cloud-based element to the technology? Could somebody use this remotely right now and be as productive as they would be if they were in the office with a bunch of additional hardware?
Libreri: Generally, the engine scales pretty well to cloud-deployed GPUs. We’re having to deal with a bunch of that ourselves. Not everyone has a workstation at home. That general framework that Amazon and the cloud providers have is great.
What we’ll see as people start to get their heads around Nanite is the data streaming — not just a primary and secondary cache, but a tertiary cache that’s cloud-based. We’ll see where people take it. But we’re definitely designing the next generation of the engine so that storage can live off the local hardware.
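As a rough illustration of that primary/secondary/tertiary idea, a tiered asset cache could check RAM first, then the local SSD, then a cloud store, promoting data back up the tiers on a hit. The interface below is entirely hypothetical (synchronous and with stubbed I/O); it only shows the lookup order Libreri is gesturing at.

```cpp
// Hypothetical tiered asset cache: RAM -> local SSD -> cloud, with promotion.
#include <cstdint>
#include <memory>
#include <optional>
#include <string>
#include <unordered_map>
#include <vector>

using AssetBlob = std::vector<uint8_t>;

class TieredAssetCache {
public:
    std::shared_ptr<AssetBlob> Get(const std::string& assetId) {
        // 1. Primary cache: RAM.
        if (auto it = ram_.find(assetId); it != ram_.end())
            return it->second;

        // 2. Secondary cache: local SSD.
        if (auto blob = ReadFromLocalDisk(assetId)) {
            auto shared = std::make_shared<AssetBlob>(std::move(*blob));
            ram_[assetId] = shared;                  // promote to RAM
            return shared;
        }

        // 3. Tertiary cache: cloud storage.
        if (auto blob = FetchFromCloud(assetId)) {
            WriteToLocalDisk(assetId, *blob);        // promote to SSD
            auto shared = std::make_shared<AssetBlob>(std::move(*blob));
            ram_[assetId] = shared;                  // and to RAM
            return shared;
        }
        return nullptr;                              // asset not found anywhere
    }

private:
    // Stubs for real I/O; a production version would be asynchronous and
    // would enforce size limits and eviction on each tier.
    std::optional<AssetBlob> ReadFromLocalDisk(const std::string&) { return std::nullopt; }
    std::optional<AssetBlob> FetchFromCloud(const std::string&)    { return std::nullopt; }
    void WriteToLocalDisk(const std::string&, const AssetBlob&)    {}

    std::unordered_map<std::string, std::shared_ptr<AssetBlob>> ram_;
};
```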
Sweeney: Sony’s new PlayStation 5 is a remarkably balanced device, not only the GPU power, but also an order of magnitude increase in storage bandwidth, which makes it possible to not just render this kind of detail, but stream it in dynamically as the player is moving through the world. That’s going to be critical to rendering the kind of detail in bigger open-world games. It’s one thing to render everything that can fit in memory, but another thing to have a world that might be tens of gigabytes in size.
Libreri: That’s our goal with Unreal Engine 5. Huge, complex, large-scale worlds can be streamed into the machine with incredible detail, and without noticing things popping in the traditional way you’d see.
The other thing of note is that Niagara is now in its polished, releasable, usable state, and there are tons of cool new features in there. You can look in the pod where you see all these fluid dynamics we’re doing, all sorts of crazy — the bugs react to the flashlight as it moves around. That was a good capability for us. It made it a lot easier for us to make the demo feel a bit more next-gen. We’re very happy that Niagara is in a great state. We expect all next-generation Unreal games to be using Niagara.

Above: This isn’t a picture. An artist using Unreal Engine 5 made it.
GamesBeat: Is that a plug-in?
Libreri: It’s the visual effects system. Whenever you see particles or fluids or any of this cool signaling stuff, it’s a combination of Niagara on the local side and then Chaos, our new physics system. All the rigid-body collisions and destruction and stuff.
GamesBeat: As far as global illumination goes, this is reminding me of a Pixar movie or something like that.
Libreri: We wanted to get to a point where we felt that things started to look very real and very photographic. You can’t do that without dynamic global illumination. For years we’ve had a system in the engine we call Lightmass that allows you to bake GI, but the problem with that is that in the next-generation, you’ll want everything moving and animating and destructible. You need a live GI solution.
We’ve tried. Even in Unreal Engine 4 there were a few pieces we did that helped with live GI, the screen-space GI thing we did and other things. But this was the first time that we really nailed it, where at console performance levels in a huge scene you were able to get dynamic lighting. It’s awesome. When the roof opens in that big area where the statues are stored, that light is — we’re not keyframing anything. It’s just the wall opening and it illuminates the space.
Penwarden: This is another area where it enables new levels of visual fidelity, but is also transformative on the workflow side. Artists don’t have to author lightmap UVs. They don’t need to manually place probes throughout the environment. They don’t need to go through a long, multi-hour baking process to build lighting. They can just build the environment and see lighting update as they’re building it in the editor. It all just works on the console when they deploy it.
Libreri: One of the coolest parts of making this demo, I’m in an office right next to where all our artists sit, and a bunch of them have got their monitors toward me. It’s great to see them making the world and literally picking up mountains and just moving them and dropping them, changing lighting direction, and it all looks real. It’s this very surreal experience. Looking into the future. I don’t see them ever wanting to go back to the old way of working.
GamesBeat: Would you agree it feels kind of rare that you have a system that improves the quality of life for developers, while still improving the end product for the consumer? It seems to be often that when there’s a big step in game development, new graphics and new effects, it means a hell of a lot more work behind the scenes. It seems rare that this improves things on both ends.
Libreri: It never used to be like that. But honestly, the power of the hardware is so amazing nowadays that we’re really — we’re getting to a point where we can improve the quality on both sides. You can even do simulations in the editor while you’re authoring as a developer that are crazy complicated. You can play that back in real time on a console. There’s lots of flexibility.
What makes great games is iteration, time spent on iteration. It’s important for us to make sure we have accessible tools that allow developers to concentrate on the important stuff, like great gameplay. That’s one of the reasons Quixel is part of Epic. We don’t want developers spending half their time making rocks and trees and bits of grass. They shouldn’t have to do that. We can do that to a high level of quality that will last forever.
One of the coolest parts of this demo was the fact that we’re using movie assets, the ones that could normally only be rendered by ILM or WETA or other big visual effects companies. That’s what went into this demo. It’s awesome.

Above: Shadows and lighting are the specialty of Unreal Engine 5.
GamesBeat: Is this seven years after Unreal Engine 4? Does that tell us a lot about how complicated this is? Is it hundreds of people working for seven years? What was the scale of this project like?
Sweeney: The first Unreal Engine 5 work began a couple of years ago. Brian Karis, a young graphics programmer who was in the video, began experimenting with the Nanite technology, as well as Lumen.
Penwarden: Yeah, a couple of years ago, we started brainstorming what would be the key features for a next-generation engine, especially next-generation graphics. We kicked off a number of R&D projects to start working through that and figure out what we could do and where we could really make a generational leap, both in terms of fidelity and workflow. That’s where it got started, with those R&D projects that we built up over time. We expanded the team over time to build up to the demo you saw.
Libreri: Brian himself is an amazing engineer, but — he gets wound up when we call him part-tech artist, but he has a real empathy for art and visuals. I’ve been at Epic for six years. He was talking about doing this all the time I’ve been there. A couple of years ago we said, “Go on, do some research.” And then finally last year, around this time, we looked at the early results and said, “That’s amazing. It’s going to work.” We dogpiled the whole demo team on it and built this demo.
It’s always a bit scary, because things don’t — until you actually kick the tires and make something with the tech you’re building, it can all be for naught. It was good. It was exciting. We’re very proud of the results.
GamesBeat: Is there a background to the scene at all? Does it represent any particular game?
Libreri: I wouldn’t say there’s any — we’ve done a bunch of cinematic demos in the past, and then the past couple of years we’ve made stuff that was more gameplay. We wanted to do that again. We wanted to make something that genuinely could be a game. We had the idea that we’d have an explorer — we all like Uncharted and the Lara Croft games. We’ll make an explorer character that’s going to go through an environment. Quixel was around, they were our friends, and they said, “Well, we’ve got some awesome new rocks and all that, some caves that we’ve built.” I’d gone to Malaysia last summer, and I went to see the Batu caves. They look like this. I brought back some photographs and said, “Check this out. Have you got anything like this?” We just riffed on that.
We knew we wanted something that felt extremely natural. Stuff that we could look at photographic reference for. Trying to build an engine that does photoreal without actually looking at photographs and studying the real world is hard to do. It just evolved from there. Tim played a role. The whole section at the end with the city and the big crumbling stuff, that was Tim asking for a bit more variety. We have an awesome team. Honestly, making stuff in real time is so much fun that you can’t help yourself. You just roll with it.
GamesBeat: If I could anticipate the internet at all, I would guess your pushback here might be, “Oh, we’re never going to see games like this.”
Libreri: This is running on a PlayStation 5. It’s running on the hardware.
Sweeney: It’s all real. The important thing is that this was not a vast new content development effort. These assets came straight from Quixel. We put it together pretty quickly into a scene. That’s the point of the technology, to enable any creator to build this kind of high quality scene without having to create each piece by themselves manually.
Libreri: I’m pretty certain that next-generation games on Unreal Engine can look like that. This is not a smoke and mirrors act. We’re genuinely trying to build technology that we enjoy using ourselves, and then have a game team build the demo, and then get it into customers’ hands. Once the Nanite stuff is in customers’ hands, we’re excited to see what they discover, how they want to evolve it. We’re going through a bunch of the workflows to make sure they’re efficient for all studios, but it’s exciting. Something that looks as good as that, it can’t help but bring joy. We’re excited. The next year is going to be amazing.

Above: Unreal Engine 5 has the same business model as Unreal Engine 4.
GamesBeat: As far as timing goes, 2021, it would be great to have this now, but it does seem a little late for people who are doing not just launch titles, but titles that are maybe in the second wave. Do you anticipate this would be used more for the middle of the wave of next-generation games?
Sweeney: Fortnite’s going to be on this in 2021. I think you’ll see adoption pretty quickly. A developer can move from Unreal Engine 4 to Unreal Engine 5 with just an upgrade, and not rebuilding their game. This timing is typical. Any game that’s going to ship at launch on the new consoles has been in development for at least the last two years, and probably three. This is the lead-up, what’s necessary to get fully next-generation technology up and running. If you look at the signature games of each previous generation, in the Xbox 360 generation it wasn’t until Gears and similar games began shipping in year two that the full capabilities of the platform were demonstrated.
GamesBeat: How quick is it to move from Unreal Engine 4 to Unreal Engine 5? Is it just as easy as upgrading and then you’re done?
Penwarden: We like to think of it as moving up a couple of minor revisions of the engine. Most developers who stay up to date with Unreal might move from 4.4 to 4.5 and 4.5 to 4.6. This will be about the amount of work it takes to move three or four versions of Unreal Engine 4.
GamesBeat: Is there any reason you haven’t mentioned Xbox Series X yet? Is PlayStation 5 the lead horse for this in some way?
Sweeney: We’ve been working super-closely with Sony for quite a long time on the storage architecture and other elements. It’s been our primary focus. But Unreal Engine 5 will be on all next-generation platforms, and so will Fortnite.
Sony has done an awesome job of architecting a great system here. It’s not just a great GPU, and they didn’t just take the latest PC hardware and upgrade to it, following the path of least resistance. The storage architecture in PlayStation 5 is far ahead of anything that you can buy in any PC for any amount of money right now. It’s great to see that sort of innovation. It’s going to help drive future PCs. They’ll see this thing ship and realize, “Wow, with two SSDs, we’ll have to catch up.”
GamesBeat: So feeding data into this processing beast is a problem they attacked? Is that something this demonstrates?
Sweeney: Right. If you look at previous generations, you had to deal with magnetic disks, the lowest common denominator. You couldn’t count on a lot of bandwidth supporting scenes like this. You had a beautiful scene and a long loading time, and then another beautiful scene. That disrupted the game experience. Our aim for the next generation is nothing but seamless, continuous worlds, and to enable all developers to achieve that. You can have this degree of fidelity going on for as many kilometers and gigabytes as you want.

Above: Feels like a movie. But it’s Unreal Engine 5.
GamesBeat: Would that still apply in something like a purely cloud game, like an MMO, a persistent universe, as opposed to games that reside on your hardware?
Sweeney: There’s instancing in this geometry. If you have the same rock used a million times, you only need to load it once. There’s really no limit to the scale of the worlds you can build. Even if you say that your game can’t be larger than some number of tens of gigabytes, you can still build something enormously expansive. You see a lot of growth around the genre of continuous open world games, whether they’re online games or single-player experiences.
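Sweeney's instancing point can be sketched in a few lines: the heavy mesh data is loaded once and shared, while each placement stores only a reference and a transform. The types and names below are illustrative, not engine API.

```cpp
// Sketch of mesh instancing: load a mesh once, reference it many times.
#include <memory>
#include <string>
#include <unordered_map>
#include <vector>

struct MeshData {              // the heavy, load-once part
    std::vector<float>    vertices;
    std::vector<unsigned> indices;
};

struct Transform { float position[3], rotation[4], scale[3]; };

struct MeshInstance {          // the cheap, per-placement part
    std::shared_ptr<const MeshData> mesh;  // shared across every copy
    Transform transform;
};

class MeshLibrary {
public:
    std::shared_ptr<const MeshData> Load(const std::string& assetId) {
        if (auto it = cache_.find(assetId); it != cache_.end())
            if (auto existing = it->second.lock())
                return existing;               // already resident: reuse it
        auto mesh = LoadFromDisk(assetId);     // pay the I/O and memory cost once
        cache_[assetId] = mesh;
        return mesh;
    }

private:
    std::shared_ptr<MeshData> LoadFromDisk(const std::string&) {
        return std::make_shared<MeshData>();   // stub for real streaming I/O
    }
    std::unordered_map<std::string, std::weak_ptr<MeshData>> cache_;
};

// Usage: a million rocks cost one MeshData plus a million small transforms.
// MeshLibrary lib;
// auto rock = lib.Load("rock_granite_01");   // hypothetical asset name
// std::vector<MeshInstance> rocks(1'000'000, MeshInstance{rock, Transform{}});
```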
Every time the hardware improves by an order of magnitude, you see new types of games take off. Battle royale only took off in this generation because you finally had enough processing power and cloud infrastructure to support 100-player game sessions with a massive amount of action. I think we’ll see new genres emerge, single-player and multiplayer, as a result of this technology being made available.
GamesBeat: What’s your best marketing pitch on this one?
Dana Cowley: You’ll be able to create worlds at an unprecedented level of detail and interactivity and scale, and more efficiently than ever. It’s just that simple.
Libreri: The next generation of consoles is going to give developers and consumers a quantum leap in their gaming experience. Unreal Engine 5 is another leap on top of that. It feels like two generations of improvement in quality, because of this new technology we’ve been able to bring to life. The future is very bright for gamers, and anybody using our engine for any application. I’m pretty sure our friend Jon Favreau [executive producer for The Mandalorian, which uses Unreal Engine], when he sees this demo, is going to be asking if he can have it on his movie sets.
Sweeney: It’s pretty fundamental. What you can do with unlimited geometry and vast bandwidth for streaming data, it really uncaps games. You can build anything you want at this point. It’s just a matter of budget and scale and development team. There’s no artificial limit.