Connect with top gaming leaders in Los Angeles at GamesBeat Summit 2023 this May 22-23. Register here.

Epic Games and 3Lateral showed off a digital version of actor Andy Serkis, who starred in The Lord of the Rings and Planet of the Apes films.

The demonstration, shown at the Game Developers Conference in San Francisco, is part of a continuing push to create digital humans: believable real-time renderings of people in which you can't tell the difference between the real and the artificial.

Unreal Engine's real-time rendering, combined with the volumetric capture, reconstruction, and compression technology of 3Lateral's Meta Human Framework, brought Serkis' digital human performance to life.

The volumetric data was generated by capturing a series of high-quality, high-frame-rate (HFR) images of Andy Serkis from multiple angles under controlled lighting. 3Lateral's process involved various capture scenarios: some focused on geometry, some on appearance, and others on motion. All of these inputs were used to generate a digital representation of Serkis and to extract universal facial semantics, representations of the muscular contractions that make the performance so lifelike.



In the resulting real-time cinematic, a high-fidelity digital replica of Andy Serkis recites lines from Shakespeare's Macbeth with video and performance quality nearly indistinguishable from his real-life acting. The companies then converted Serkis' face into an alien's.

Above: This is an alien version of actor Andy Serkis.

Image Credit: Epic Games

The Macbeth performance data was also used to drive 3Lateral’s fictional digital creature, Osiris Black, to demonstrate how the same capture can drive two vastly different characters.

3Lateral's semantic compression reduces data sets while preserving their integrity, making it possible to retarget a performance onto a digital character while easily altering gaze and subtle performance nuances.
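As a rough illustration of the idea (not 3Lateral's actual pipeline, and with all channel and rig names invented for the example), storing a performance as rig-agnostic facial semantics means a single captured frame can drive any character whose rig maps those semantic channels to its own blendshapes:

```python
# Hypothetical sketch: one frame of a performance stored as universal,
# rig-agnostic facial semantics (normalized muscle activations, 0..1).
frame_semantics = {"jaw_open": 0.6, "brow_raise": 0.3, "lip_corner_pull": 0.1}

# Each character maps semantic channels to its own blendshapes, with a
# per-channel gain (a creature's jaw might travel farther, for example).
serkis_rig = {"jaw_open": ("JawDrop", 1.0),
              "brow_raise": ("BrowUp", 1.0),
              "lip_corner_pull": ("Smile", 1.0)}
osiris_rig = {"jaw_open": ("AlienJaw", 1.4),
              "brow_raise": ("RidgeFlex", 0.8),
              "lip_corner_pull": ("MandibleFlare", 1.2)}

def retarget(semantics, rig):
    """Convert rig-agnostic semantics into per-character blendshape weights."""
    return {shape: round(weight * gain, 3)
            for channel, weight in semantics.items()
            for shape, gain in [rig[channel]]}

# The same captured frame drives both the digital Serkis and the creature.
print(retarget(frame_semantics, serkis_rig))
print(retarget(frame_semantics, osiris_rig))
```

The same loop over the semantic channels produces weights for either rig, which is the sense in which one capture can drive two vastly different characters.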

This high-fidelity capture is pre-processed offline to a data set that can be loaded into Unreal Engine to enable real-time volumetric performances.

While this tech will remain in the realm of professional visual effects for now, Epic CEO Tim Sweeney believes that photorealistic digital humans will someday be used in interactive entertainment, simulations, research, nonverbal human-machine interfaces, artificial intelligence, and mixed reality applications.

Kim Libreri, chief technology officer at Epic Games, said it took about five weeks of work to pull together the demo with Serkis. The demo got a resounding round of applause at the GDC event.
