Unity Technologies has been known among developers as the maker of a “game engine” since the tool debuted in 2005. But these days, “creation engine” might be a better name for it.
That’s because experiences created with Unity are more often than not films, interactive movies, advertisements, or augmented reality and virtual reality content. The game engine has grown up beyond games, and it is being used to expand the definition of entertainment, according to Isabelle Riva, head of the Made with Unity developer program at the San Francisco-based company.
In advance of the Siggraph computer graphics show in Vancouver, Canada, I spoke with Riva. Disney Television Animation said last week that it is making Baymax Dreams, a trio of animated shorts set in the Big Hero 6 world, using the Unity engine. You can see various films embedded in this post that were all built with Unity. These short films have become showcases for how Unity will democratize not only games, with universally accessible tools, but also film, Riva said.
Here’s an edited transcript of our interview.
GamesBeat: I saw the four films that were attached. The Baymax one is the only Disney one. Is that right?
Isabelle Riva: Right. It’s a partnership with the Big Hero 6 TV series. Disney produced those shorts, the Baymax Dreams, with Unity’s support. Also, if you saw, Disney is releasing their first VR film, Cycles, which will be at Siggraph 2018. That project was made in Unity by Disney. Those were our most recent Disney partnerships.
GamesBeat: Unity-made films are catching some momentum, then?
Riva: Absolutely. These projects validate what people have been working toward for many years, which is having a real-time render engine as part of a mainstream animation pipeline — for all the benefits it brings. It’s definitely catching on now.
GamesBeat: What’s the common thread among the films and newer projects here, as far as how they’re using Unity?
Riva: The Big Hero 6 episodes were created differently than the ones that Neill Blomkamp did with ADAM. These were entirely made in Unity. We had the benefit of this amazing direct link to Autodesk, who produces the Maya software used for keyframing animation. All of Disney’s animation is keyframed in Maya. Apart from that, everything else is made in Unity. We have this connection between animation and engine going, which is really efficient.
We got rid of storyboards. We went straight from scripts into pre-vis. Once all the assets and modeling and texturing were done, once the characters were in there, the director was able to play and decide where the camera goes, what time of day it is. All those changes could be made in real time. It was very empowering as a pipeline for the storytellers.
The director would call for performances from the animators and say, “Hey, can you do one of these but with that goofy sneaking around on top of a fence?” The animators could provide a performance, and it would appear directly in Unity. Then, the director could set their cameras up, almost as if they were right there on set. That’s how it evolved. It was kind of like game development, where you start rough with pre-vis and gray boxes, and the more you go, the more it becomes refined and polished, the closer you get to your target.
It was a very non-linear way to make an animated story. The look was set continually by all of the departments, including lighting and everything. It was instant compositing. Normally, the director would be in an edit suite, making judgment calls on timing and camera positions. Now, they can change the character’s color, the lights, everything in real time. That really shrank our team. It sped up our production time. We basically got a new workflow into a steady state. Disney was very pleased.
GamesBeat: How often are film creators doing what you guys have demoed on stage in VR, walking around inside their creations and working from inside?
Riva: This one wasn’t VR. These were made for broadcast. We didn’t have to finish in a stereoscopic environment and test it like you would in a VR setting. We were producing what ended up being a high-res QuickTime movie. But there was a VR element to the early concept art. Our concept artist came from a VR background and, instead of drawing on paper, he painted in Quill. With that, we were able to transfer those files directly into the engine, which meant that we had 3D concept art from the beginning.
Unity is being used by Disney in a number of different ways. Not just for broadcast animation but, obviously, like you saw with Cycles, they’re using it for VR storytelling as well. They’re using it as part of big live-action movies. Certain films use virtual production tools when they’re filming green screen, and a lot of those tools are built on the Unity foundation.
GamesBeat: What advantage would you say Unity has in comparison to something like Unreal for this work? Unreal also pushes the cinematic quality of their engine. How do you compete in this area?
Riva: With respect to film — and in this specific case of episodic animation — our advantage is the direct link to Autodesk. Unity and Autodesk have a collaboration that allows for a greater link between their media and entertainment tools and ours. For example, we have access to source code for their FBX file format, which delivers a much more streamlined process for sharing assets in the modeling and texturing of a character or a prop and how it’s used in the engine. The artists are able to improve how they use 3ds Max and Maya thanks to the real-time power of Unity. I feel like that’s a big difference. That gives an advantage to filmmakers in this particular case.
GamesBeat: Is there any real difference in costs between Unity and Unreal? Or, at the very least, is it less expensive than traditional ways of doing this?
Riva: The traditional approach to episodic animation employs so many people because every person is rendering out in the traditional way. They don’t have instant access to their work. They have to — in some cases lighters and compositors have to set up their render for the day, go home, and come back to see the results. That requires a lot more labor and often requires a render farm. You have stacks and stacks of [graphics processing units] crunching data. That goes away with a real-time platform like Unity. Every station has the power to render instantly.
Everybody is actually working on the same scene. You have multiple artists working on the same beats, but they don’t step on each other’s work. They can see the context of other people’s work when they’re finishing their own. An animator will see the light almost finished in the scene and think, “Oh, the sun is right there. Let me squint this character’s eyes as he passes through it.” You get a much higher quality scene because they’re working together.
You wouldn’t get that in the traditional setting. You’d have to go through a stepped, waterfall kind of process before you got to that idea. Then, you’d have to go back four steps to get it made. Working in real time makes for a much more efficient process.
GamesBeat: That ability to work concurrently, I assume you could do that remotely as well? Has that been around for a while, or is that relatively recent?
Riva: Game development has been doing that for a long time because people are able to share a build and work in that same build. That technology is at play here. For animators and filmmakers, it’s newer. The team that Disney put together with our help was distributed from Montreal to Singapore, across many time zones, and they all worked together from their respective cities on the same project.
GamesBeat: I remember that was part of MaxPlay’s pitch from a few years ago, that you could work concurrently in the same file with somebody else.
Riva: Right. That’s definitely happening now but across the whole pipeline. It’s not just between two animators in Max. It’s happening between lighters and animators and editors and music and sound.
GamesBeat: Are you going to show a lot of stuff at Siggraph? What do you have lined up?
Riva: We have some great talks at Siggraph. It’s really exciting. Two of the three episodes are in the can and approved for broadcast, so we’re using those at Siggraph to demonstrate what we learned through the whole experience with Disney. One of the talks is almost an unplugged kind of demonstration, where the director is going to take a scene from one of the episodes in Unity, and they’re going to completely change the scene into another story using the same elements within minutes. That will show the audience, live, just how powerful the engine is.
We have another talk dedicated to tooling, which explains how we were able to do this concurrent, parallel work and roundtrip with Maya and Autodesk really well. We have a talk that focuses more on the lighting and effects we were able to achieve in-engine for the animated episodes. It looks like film quality, beyond broadcast. That’s really exciting to show as well because for animators, especially episodic producers who are making really high-volume animation, changing the color of a character’s costume on the fly is really useful when you want to have asset re-use from episode to episode. You can do that in Unity.
GamesBeat: Book of the Dead. I recall you showed that at GDC. Was there something new in the latest trailer?
Riva: Book of the Dead is going to make an appearance at Real Time Live at Siggraph. It’s being used by the team at MPC, the Moving Picture Company, who are showing their virtual production tools that are built on Unity. We also have ADAM 2, the Blomkamp ADAM 2. That’ll be at Real Time Live as well. We got into the Electronic Theater animation festival with ADAM 2.
We have some other films, like Sonder, which is a gorgeous short film made by an animator at Pixar. We have a lot of episodes coming from France that were built with Unity. Of course, we all know Monsieur Carton. That will be on display at the booth.