Epic Games, visual creation company The Mill, and carmaker Chevrolet have created a film that uses the Unreal game engine to blend an animated car in real time into video that will be part of a television advertising campaign.
The companies collaborated on “The Human Race,” a real-time short film and augmented reality presentation that uses Unreal Engine 4, virtual production technology, and Chevrolet’s cars. They showed off the film, and how they made it, at the Game Developers Conference in San Francisco.
The film makes use of The Mill’s special adjustable car, dubbed the Blackbird. The race car carries QR codes and tracking markers that let it serve as a template for digital animations, and it mounts a 360-degree camera. Filmmakers shot video of a real car and the Blackbird racing on a mountain road. Then, in real time, the animators covered the Blackbird with digital animations that transformed it into another sleek race car. As they shot the Blackbird, the filmmakers could see through a viewfinder what the finished animated car would look like in the scene.
Epic Games CEO Tim Sweeney (who will be a speaker at our GamesBeat Summit 2017 event in May) said in an interview with GamesBeat that the creators used an advanced implementation of Epic’s Unreal Engine with The Mill’s proprietary virtual production toolkit, Mill Cyclops, to create a film that merges real-time visual effects and live-action storytelling.
“This is a way to bring down the costs of animation,” Sweeney said. “Brands can’t always have a car on the set. Sometimes that’s due to security reasons. We had to find a way to visualize a car that wasn’t there. So we created a virtual car using the Blackbird. We used the Blackbird in shoots and then skinned it with a computer-generated car.”
In the ad campaign, “The Human Race” pits a real-life driver against an artificial intelligence that drives a car without human control. The film features the 2017 Chevrolet Camaro ZL1 in a heated race with the Chevrolet FNR autonomous concept car.
“This sounded crazy, but we figured out how to do this in real time,” said Kim Libreri, CTO of Epic Games, in an interview. “It’s real-time visual effects. We generate cars, composite them, and put them in a video. This looks like it went through months of post-production processing. In reality, it is rendered in real time.”
Sweeney said, “The question here is can your game engine produce pixels that match the real world.”
The only physical vehicle filmed for “The Human Race” was the Mill Blackbird, a fully adjustable rig that enables filmmakers to insert any car model into any filmed environment. Until now, computer-animated cars were added by visual effects artists in post-production, requiring days of rendering to produce high-quality imagery. During this shoot, however, live video feeds as well as positional data from the Arraiy tracking system were fed directly into Unreal Engine. The Camaro was then rendered and composited seamlessly into the footage in real-time AR, allowing the directors to instantly see the final look and composition of each shot.
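The article doesn’t spell out the math, but the core step it describes — layering a rendered CG car, complete with transparency, over a live video frame — is the standard “over” compositing operator. A minimal sketch in Python (the pixel values and function name here are illustrative, not from the production pipeline):

```python
def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over' operator: blend a premultiplied-alpha
    foreground pixel (the rendered CG car) onto a background pixel
    (the live video feed)."""
    return tuple(f + (1.0 - fg_alpha) * b for f, b in zip(fg_rgb, bg_rgb))

# A fully opaque car pixel replaces the background entirely.
assert over((1.0, 0.0, 0.0), 1.0, (0.2, 0.3, 0.4)) == (1.0, 0.0, 0.0)

# A half-transparent edge pixel blends the car with the road behind it.
edge = over((0.5, 0.0, 0.0), 0.5, (0.2, 0.4, 0.6))
print(edge)
```

Doing this per pixel, per frame, at video rates — with the camera’s position from the tracking system driving the render — is what lets the directors see the finished composite live instead of waiting for post-production.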
The same real-time technology was used to create, alter and produce the short film, blurring the lines between production and post. The ability to create ‘final pixels’ in real time will ultimately change the way filmmakers create content and make critical decisions, Sweeney said.
Alistair Thompson, executive vice president at The Mill, said at Epic’s event that his company created Blackbird because it couldn’t get access to a car that it needed to shoot a commercial. Chevy’s commercial creators, however, need to be able to see the animated car as they shoot a scene through a viewfinder. On stage, Epic, Chevy, and The Mill showed how they could present an animated car on screen simultaneously as the Blackbird was being filmed in front of the crowd.
“‘The Human Race’ blends cinematic storytelling and real-time visual effects to define a new era of narrative possibilities,” said Angus Kneale, chief creative officer at The Mill in New York, in a statement. “This is a pivotal moment for film visual effects and the coming era of augmented reality production. Using Unreal’s cutting-edge game engine technology, filmmakers are able to see their photoreal digital assets on location in real time. It also means the audience can effect change in films in ways previously unimagined, giving interactive control over vehicles, characters and environments within a live action cinema experience. With Mill Cyclops, The Mill’s proprietary virtual production toolkit, we are able to render and integrate digital assets into the real world to a level never seen before.”
“Find New Roads is more than just a tagline at Chevrolet. We embrace that mission in the advanced engineering of our cars, and also in the way we service and communicate with our customers,” said Sam Russell, general director of global Chevrolet marketing, in a statement. “‘The Human Race’ brings a lot of those ideals together. The technology involved in producing this film provides a glimpse into the future of customer engagement and could play a unique role in how we showcase car model options with interactive technologies like AR and VR.”
The Mill and Chevrolet conceived the film to kick off a multi-platform campaign marking the 50th anniversary of the Camaro.
Sweeney said, “This technology is changing the economics of entire industries.”
In another show of the Unreal Engine’s quality, Sweeney invited John Knoll, chief creative officer at Lucasfilm’s Industrial Light & Magic, on stage to talk about Rogue One: A Star Wars Story and how Unreal Engine 4 was used to produce the final pixels for scenes featuring the droid K-2SO.
Knoll said that Unreal renders were worked into scenes with the droid, and viewers can’t tell that the finished shots blend animated and live-action footage.
“You can imagine all these forms of digital content coming together in the Metaverse,” Sweeney said, in a reference to the virtual world envisioned by sci-fi author Neal Stephenson.
“These are examples of real-time motion capture that bring to life things that weren’t possible before,” Sweeney said.
Epic also showed another striking visual in a scene from Finding Dory. Pixar has created tools, such as its Universal Scene Description (USD) format, that allow digital assets to be moved from games to movies and vice versa. The Unreal artists used that tool to take a scene from Finding Dory and turn it into an animated 3D scene that you could view from any angle. Now it’s a place you can visit in VR.
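USD describes a scene as a hierarchy of “prims” in layered files that any compatible tool — a film renderer or a game engine — can open. For a sense of what that interchange format looks like, here is a tiny hand-written example of USD’s ASCII form (purely illustrative — not an actual Finding Dory asset):

```
#usda 1.0

def Xform "Reef"
{
    def Sphere "Fish"
    {
        double radius = 0.5
    }
}
```

Because the scene graph, not a tool-specific project file, is the unit of exchange, the same assets can flow from Pixar’s pipeline into Unreal without manual reconstruction.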
Sweeney also showed how you could use virtual reality to design a game, from inside the game. The Unreal Engine VR editor is a full editing tool that lets you build an entire level of a game from inside virtual reality. It’s a bit like building a full game scene by snapping together Lego bricks and then twisting them into the right shape with your hands, the way a real artist does.