
Epic Games stunned everyone a couple of years ago with the realistic digital human character Senua, from the video game Hellblade. And today, the maker of the Unreal Engine game tools showed another astounding demo, dubbed Siren, with even more realistic graphics.

CEO Tim Sweeney said technologies for creating digital humans — from partners such as Cubic Motion and 3Lateral — are racing ahead to the point where we won’t be able to tell the real from the artificial in video games and other real-time content.

I caught up with Sweeney and Kim Libreri, chief technology officer at Epic Games, in an interview during a preview session this week at the GDC. We talked about digital humans, Epic’s demos, and the success of its Fortnite battle royale game.

Here’s an edited transcript of our interview.



Above: Kim Libreri (left, CTO) and Tim Sweeney (CEO) of Epic Games.

Image Credit: Dean Takahashi

GamesBeat: I heard a rumor that you will rebrand Epic Games as Fortnite Games, and you will rename the Unreal Engine as the Fortnite Engine. (Laughs). What would you say about Fortnite’s success?

Tim Sweeney: No spreadsheet could have predicted this, right? It’s becoming a social hub. It’s awesome. It’s part of an exciting trend in the industry right now. Mobile gaming was dominated by highly casual games for a long time. That started turning in Korea, where, beginning two years ago, the market began to be dominated by serious games for gamers, both in revenue and in play time. We had some Unreal Engine 4 games.

Now that’s happening on mobile in North America and Europe. Fortnite is one of the games leading the way. It isn’t a game that was just eventually ported to mobile. It’s the same game across iOS, Android, PC, Mac, PlayStation, Xbox. You can play together across most of these platform families. Everything works in a single unified game.

Above: Epic Games’ Siren demo

Image Credit: Epic Games

GamesBeat: I still have to give the mobile version a try.

Sweeney: It’s cool. Ark is on mobile. PUBG mobile just came out. You have this broad set of games that are becoming universal game experiences that run everywhere. That’s going to be a trend shaping the game industry for the next few years. At the end of this year, the number one category on mobile will be serious games for gamers – number one in play time, number one in revenue.

That’ll be a much larger segment, ultimately, than casual games, especially the ones that are ad-supported or have really greedy monetization. We’ll see the advent of generous games with traditional console-like business models. It’s awesome.

GamesBeat: With Fortnite, do you notice that women are a big part of the audience?

Sweeney: Yeah, there’s much higher adoption than other hardcore games. That’s the great thing about games as social experiences. You play with all your friends across social groups. You see young girls as well as young boys playing. These are kids in school, people in offices, in pubs, all having fun together. We played it on the plane flying over here. Because of its visual style, Fortnite is widely acceptable to just about everyone. It’s open to a much wider audience than a realistic military-style simulation.

GamesBeat: Did you guys hit No. 1 on iOS yet? I saw No. 2 this morning.

Sweeney: We’re No. 1 in the U.S., despite launching with just an invitation event.

GamesBeat: I don’t think that’s been done before, PC and console and mobile–

Sweeney: Yeah, and it’s the same game experience everywhere. About a month ago we rolled out 60fps across all the console platforms. That was just the result of the optimization work we were doing to get Fortnite running on iOS and Android. We made the rendering and the level-of-detail support so efficient that the mobile effort benefited console. All these things we’re doing with Fortnite are going straight into the engine to benefit all of our licensees, too. They’re getting a bigger set of improvements there, more sweeping than ever before.

GamesBeat: What’s Wednesday going to be all about?

Sweeney: Let’s show you the Fortnite replay stuff. We’ve been building this Unreal Engine replay system and proving it out through Fortnite. Games have become a social phenomenon. The number one Fortnite streamer, Ninja, had a big rapper join him in a game session and they had 600,000 simultaneous viewers on Twitch. Far bigger than any Twitch stream before. That’s more viewers than a lot of TV shows.

Kim Libreri: We wanted to give everyone tools. All these YouTubers that make awesome videos of our games, or any game running on Unreal Engine, we wanted to give them better tools.

Above: Fortnite in action on iOS.

Image Credit: Epic Games

As you probably know, we had a replay system in Paragon that was pretty cool, but not as fully featured as we wanted. For Fortnite, we really wanted to push it and do something amazing. We contacted one of the more famous YouTubers out there who loves Fortnite, and this is what resulted. We got together with him, played a few matches, and then invited him to go back into the games. You can rewind them, restart them, place cameras, follow the action. We came up with a piece of awesome machinima in almost no time.

Also, you can think about the applications for live viewing of tournaments. The fact that you can go and follow the action, frame the coolest moments, go from player to player—you’ll be able to do, in the game, what only ESPN can do for real sports. There’s great potential for this.

The other big thing for us, you may have seen the Microsoft announcements about their new raytracing capabilities in DirectX, DXR. We’ve partnered with Nvidia, who have the new RTX raytracing system, and we thought about how to show the world what a game could look like in the future once raytracing is added to the core capabilities of a PC, or maybe even a console one day. We teamed up with Nvidia and our friends at LucasFilm, the ILM X-Lab, to make a short film that demonstrates the core capabilities of raytracing in Unreal Engine. It’s an experimental piece, but it shows the kind of features we’ll add to the engine over the next year or so.

Above: Epic Games’ Star Wars demo shows off real-time ray tracing.

Image Credit: Epic Games

We’ve added support for what we call textured area lights, which is the same way we would light movies. You can see multiple reflections. You can see on the character, when she’s carrying her gun, the reflection of the back of the gun in her chest plate. It’s running on an Nvidia DGX-1, which is a four-GPU graphics computer they make. But as you know, hardware gets better every year. Hopefully one day there’s a machine that can do this for gamers as well as high-end professionals. It’s beginning to blur the line between what a movie looks like and what a game can look like. We think there’s an exciting time ahead.

One thing we’ve been interested in over the years is digital humans. Two years ago we showed Senua, the Hellblade character. To this day, that’s pretty much state of the art. But we wanted to see if we could get closer to crossing the uncanny valley. She was great, but you could see that the facial animation wasn’t quite there. The details in the skin and the hair—it was still a fair way from crossing the uncanny valley.

We got together again with Cubic Motion, the facial-tracking company, and 3Lateral, along with Vicon and our friends at Tencent Next, the research lab at Tencent, to see if we could do something amazing. We put together this project to build a digital human for a presentation in China with Tencent.

This is the actress who was our subject; we made a digital clone of her. She’s the actress we’re going to use at the Vicon booth to drive the character’s digital body. She’s from Manchester, close to Cubic Motion. She’s able to drive this completely different character, all live in real time. You’ll be able to go by the Vicon booth and see her (Alexis is her name), ask her questions, take a camera and film her, and it’ll be a completely different person talking to you. This all runs at 60fps on an Nvidia card; I think we’re using a 1080 Ti.

Above: Hellblade: Senua’s Sacrifice is a chilling tale of madness and faith.

Image Credit: Ninja Theory

GamesBeat: How would you say the new character is an improvement on Senua from Hellblade, in a demo a couple of years ago?

Libreri: There’s more detail in terms of the resolution of the face. Women actually have a tiny bit of peach fuzz, little hairs all over their face, which was impossible to render two years ago, but now the GPUs are fast enough that we can do that. There are 300,000 little tiny hairs on her ears, on her nose. We also improved the shading technology. The skin shading now supports two layers of specular reflection. It also supports back scatter. If you put a light behind her ear, it glows red like it would in the real world.

We added another thing called screen space irradiance, which is the ability to bounce light off her face into her eye sockets. It’s surprising. It seems like a subtle thing, but it makes a big difference as far as believing what’s happening in your eyes. It’s a pretty significant improvement compared to where we got with Senua. We’re happy with the results. It’s still lacking some detail in terms of how the flesh moves.

Above: Andy Serkis gets fully digitized.

Image Credit: Epic Games

Vladimir, our partner at 3Lateral, has been developing a new face scanner that allows us to not only capture key shapes for different expressions, but to get every single frame of nuance out of her performance. We wanted a test subject that had a lot of dynamic range in their acting ability, and we invited Andy Serkis, of Gollum fame, to go to 3Lateral five weeks ago and get scanned in this new device. Then we were able to get all that data into our engine. We’ll show you that as well.

The thing to bear in mind in this demo is that there is no hand-keyed animation, no keyframing. No human animator worked on this. This is algorithmically extracted by taking—you saw the gray image with the sparkly representation of Andy’s face. What 3Lateral are able to do is take the key components of Andy’s digital face and fit them to what’s happening in that 4D capture, extracting animation data from there. If there’s a mismatch and it doesn’t quite hit the pose, then they’re able to work out what’s missing and feed that back into the animation system to get even more detail.

Above: This is an alien version of actor Andy Serkis.

Image Credit: Epic Games

You’ll see creases and details that you’ve never seen in a digital character that hasn’t been held and sculpted over months and months for a movie. This is all algorithmically extracted, produced through computer science and mathematics.

The cool thing is that when we present it, it’ll run in real time. It runs on an Nvidia Titan V. We’ll have an iPad that we can use as a virtual camera. Thanks to ARKit, you can use the iPad as a cinematography device. We’re able to walk around and film the virtual Andy to present all the details. What do his teeth look like? What does his eye look like? All because the ARKit gives us camera tracking capabilities that we can feed into a high-end PC to render graphics.

Another nice thing is, because of the way 3Lateral build their animation rigs, they’re standardized. You can take that performance and apply it to a different character. Here’s the same performance, with no tweaking or intervention. It’s just moving the numbers between two characters, and you get something different.

GamesBeat: Are we done with digital humans now? Or do you think it’s not perfect yet?

Libreri: That system is one where you sit an actor inside a capture volume. That’s not really practical. An actor has to be given space to move around and express themselves. The next phase for us will be taking that, adapting some machine learning technology, and being able to take a simple head-mounted camera to get the same fidelity by having a big database of shapes like that. But I think from a rendering perspective, if you take this and add the raytracing stuff we’ve just done, we’re this close to crossing the uncanny valley from the rendering perspective.

Sweeney: The interesting question is what that does for everybody across all of these industries. Game developers can now have digital humans that are much more realistic and can be captured more economically. A big part of this is cost. A triple-A game might spend a quarter million dollars per actor to capture their data. This greatly economizes that process. What could it do for TV and movies? It’ll greatly accelerate the movement of all of those pipelines to real time, which is already underway.

Libreri: TV shows are being made on our engine. There’s a kids’ TV show, Safari, where they make an episode a week, all done in real time with Unreal Engine. There’s going to be a real-time revolution across all industries, not just gaming.

For games, the challenge now is that we’re getting to a level of fidelity where it looks super real. We have to start thinking about the brain. If it’s a game character, it has to behave like a real human as well. It’s one thing to have a performance recorded. We have to think about how you add logic and reaction capabilities that don’t look like just a state machine, which is the way games do it now. There’s still tons of research to do.

Above: Epic Games Siren demo

Image Credit: Epic Games

GamesBeat: For the real time facial animation, are you guys working on that technology, or is it mainly your partners?

Libreri: It’s our partners. We work very closely. All these collaborations—every day we meet together on a video conference to look at the latest results. Cubic Motion provides the real time capture technology for Siren, the girl in the red dress. We actively help drive them to come up with better solutions. We like the idea that lots of people are working in this field.

There’s even capability now in the iPhone, where you can record a piece of facial animation and drive it through Unreal Engine. Some time later in the year we’ll probably release something that allows people to do that. If you want to do an emote in a game, just film yourself with your iPhone and you’ve got it. We have facial animation rigs for the Fortnite characters. It wouldn’t be that hard for us to hook up iPhone X face recording to the game. We have to prioritize what customers want, but I’d love to do that. To be able to record your own funny emotes, that would be awesome.

GamesBeat: When do you think we’ll see games that start using this technology?

Libreri: The super high fidelity digital humans, probably not this year. More likely the year after. But I can’t really say more than that.

GamesBeat: Something like OctaneRender, does it fit in as far as something Nvidia is doing?

Sweeney: The OctaneRender is a good quality GPU-accelerated raytracer, but it’s not built for interactive media. It has that progressive—the image refines over a few frames or a few seconds. It’s a bit of a different use case. I’ve heard that Jules has gotten integration into Unreal Engine for enterprise customers, but I just don’t think it’s practical for fully interactive experiences. It’s pretty cool, though.

GamesBeat: Is this ahead of what Unity is doing right now? How is that competition going, from your perspective?

Sweeney: We don’t see a lot of overlap among customers, other than some flow of Unity developers who are moving to Unreal to develop higher-end games. They remain the engine of choice for indie and mobile developers, especially those building simpler games. We remain the engine of choice for high-end PC and console.

The trend of mobile games moving toward high-end games for gamers is going to be interesting for the engine, in that you’ll see a lot of Unreal adopters. A lot of the leading titles pushing high-end mobile in Korea, and now in China, North America, and Europe, are all Unreal-powered. There’s ARK. There’s PUBG mobile, which just came out. Fortnite is leading the way. In Korea there are a lot of amazing console-quality mobile games powered by Unreal.

We can always look at what’s happening in Korea as a leading indicator of the market. Free-to-play was big there for several years before it came here. High-end mobile was big there starting two years ago, and now it’s taking hold here. The future is becoming more about the high end, and therefore it’s a more Unreal-powered future across all platforms. We think that’s a great thing. The more we can make it possible to play all of your games across all platforms, taking your player, your progress, your inventory, and anything you’ve paid for with you, the more it enables players to connect with all of their friends.

At that point gaming becomes a phenomenon that’s almost like a social network. Gamers connecting together and having shared experiences. It’s not divided up by platforms so much as just groups of real-world friends. We’ve been happy to be able to work with Sony and Microsoft to have the first game that honors everyone’s purchases across iOS, Android, PC, Mac, and the console platforms. Of the 36 combinations of platforms that could theoretically play together, 35 are supported right now.

