Unity Technologies unveiled a new mode of viewing games and esports dubbed CineCast Mode, which allows casters, spectators, and gamers to control the camera angle that they want to watch.
The new smart camera system was revealed at the Unite LA event’s keynote talk today in Los Angeles. It is meant to change the way that people watch games being played in esports or livestream arenas, said Adam Myhill, creative director at Unity, in an interview with GamesBeat.
Unity did this because players want a better viewing experience when watching games like Overwatch. The Overwatch League created such tools for viewing its game events, but those tools aren’t available to everyone. By releasing the tool, Unity hopes to bolster the size of the viewing market for games.
“The reason we built it is because in games and computer-generated animation, doing good camera work is really hard,” Myhill said. “In the real world, it’s really easy. You can control the camera and the lens. But in animation, there is no relationship between the camera and the lens. Cameras are stupid.”
So, via Cinemachine, Unity created “camera robots,” and it gave them an understanding of cinematic scenes, creating, in effect, synthetic camera operators. And now Unity is building on top of that to deliver CineCast Mode, which is more like having a synthetic movie director who can deliver the best shots to view.
“People want to watch directed experiences, and the problem is how do you make a movie when you don’t know what is going to happen next in something like a game,” Myhill said.
CineCast Mode is powered by Cinemachine, Unity’s camera system designed to blend seamlessly between any two cameras in a scene. And now Unity will put it into the hands of casters, viewers, and gamers, starting in 2019.
It sounds similar to what Genvid Technologies has been talking about for some time with its cloud streaming technology. CineCast is a dynamic camera system that enables movie-like cinematic sequences from variable gameplay in real time. It powers in-game replays, it enables footage for trailers, it’s a streaming casting tool, and it makes esports more watchable and exciting.
It’s like procedural cinematography, Myhill said. Developers can use it to make cool game trailers more easily. And it can be used for things like instant replays of a car crash in a game, which gamers or casters can then share with friends.
“You evaluate shots and make really intelligent decisions about what the best shot is,” he said. “Our hope is it will change how we view 3D content. The viewer doesn’t have to watch this passively.”
CineCast works by dynamically generating cameras that consider multiple subjects and how best to compose and track them. It then evaluates all the shots based on the current situation or story and dynamically edits between them with AI that understands core cinematography rules. CineCast can be used in “full auto” mode or given direction on what subjects and types of shots are desired. The result is trailer-like footage from games while they’re being played in real time.
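The evaluate-and-cut loop described above — generate candidate cameras, score each shot against the current situation, then cut to the best one — can be sketched roughly in Python. This is purely an illustrative assumption about how such an auto-director might work; the shot attributes, weights, and scoring rule here are made up and are not Unity’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Shot:
    name: str
    subject_visibility: float  # 0..1, how well the subjects are framed (assumed metric)
    action_relevance: float    # 0..1, how close the shot is to the current action (assumed metric)
    time_on_screen: float      # seconds this shot has already been live

def score(shot: Shot) -> float:
    # Weigh composition against relevance, and penalize stale shots
    # so the "director" keeps cutting. Weights are illustrative only.
    return (0.5 * shot.subject_visibility
            + 0.5 * shot.action_relevance
            - 0.1 * shot.time_on_screen)

def pick_shot(candidates: list[Shot]) -> Shot:
    # The auto-director simply cuts to the highest-scoring candidate.
    return max(candidates, key=score)

shots = [
    Shot("wide_establishing", 0.9, 0.3, 4.0),
    Shot("over_shoulder",     0.7, 0.9, 0.0),
    Shot("close_up",          0.8, 0.6, 1.0),
]
print(pick_shot(shots).name)  # over_shoulder: fresh and closest to the action
```

A “full auto” mode corresponds to running this loop untouched, while giving direction (as the article describes) would amount to biasing the scoring toward chosen subjects or shot types.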
Myhill said the replay camera systems on the market now are hard to use and take years of support from a dev team to make them decent. Capturing good game footage for a trailer with a marketing camera is tedious and slows the entire process down. And for esports, the footage is often jarring, making it hard for the audience to understand what’s happening.
The first game to implement CineCast is GTFO, a four-player co-op game for hardcore shooter players, from 10 Chambers Collective.