One of the coolest new tools for making animated movies is a visualization system that shows what an animated shot will look like in real time, before a film animator goes through the expensive process of creating a fully animated scene.
I had a chance to view one of these systems on a visit to the Redwood City, Calif., campus of DreamWorks Animation, which used the rig to make How to Train Your Dragon 2, the big summer movie that has generated more than $300 million in worldwide box office revenue. Its real value is in giving a film director an idea of what a scene will look like before committing the resources to create it.
Inside that campus is a camera-capture room that resembles a motion-capture studio, with ten state-of-the-art MX-F40 motion-capture cameras from Vicon. Those cameras emit an infrared light signal. The light hits markers on people or objects within the room and bounces back to the cameras. Tracking software on a Windows machine then computes the scene and the positions of all the markers within it. The software can insert the marker locations into an animated world from the film, said Pete Upson, final layout artist and one of the capture-room experts at DreamWorks Animation, in an interview with VentureBeat.
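The core tracking step Upson describes, recovering each marker's 3D position from several cameras' views, can be sketched as a least-squares triangulation. This is a minimal illustration under assumed inputs, not DreamWorks' or Vicon's actual software: it supposes each camera contributes a ray (its position plus a direction toward the marker), and the function name and data layout here are invented for the example.

```python
import numpy as np

def triangulate_marker(origins, directions):
    """Least-squares estimate of a marker's 3D position from camera rays.

    origins:    (N, 3) array of camera positions
    directions: (N, 3) array of ray directions from each camera toward the marker
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        # Each ray constrains the point within the plane perpendicular to it:
        # (I - d d^T) p = (I - d d^T) o
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ o
    # Solve the stacked normal equations for the point nearest all rays.
    return np.linalg.solve(A, b)

# Three hypothetical cameras sighting a marker at (1, 2, 3):
cams = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
marker = np.array([1.0, 2.0, 3.0])
rays = marker - cams  # in a real rig these come from each camera's 2D detections
print(triangulate_marker(cams, rays))  # approximately [1., 2., 3.]
```

With noiseless rays the solver recovers the marker exactly; with real cameras the same least-squares form averages out measurement noise, as long as the rays are not all parallel.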
The function of the room goes beyond capturing images of actors that artists can use as the foundation for 3D-animated characters. The cameras also capture the positions of people and props, which allowed director Dean DeBlois to show both himself and the film crew what a scene would really look like.
“This gives the director the idea of blocking out the scenes in real-time,” Upson said. “We can work real-time with the director in the room and remove the back-and-forth process.”
I played around inside the room as a faux director. I had a camera on my shoulders that transferred the images in real time over a cable to a big computing rig. That machine transformed the images I was shooting into animated figures I could see on a display. Other visitors played characters from the film, like Astrid and Hiccup. They moved around, and I could see exactly where they were and how much of the screen they took up at any given time. If I didn’t like where someone was standing, I could ask them to move to another spot.
“This is what it allows the artists to do,” said Katie Swanborg, director of technical communication and strategic alliances at DreamWorks Animation, in an interview during a tour of the animation studio. “In the moment, I can provide creative feedback.”
Director James Cameron built a similar visualization system for Avatar so he could see what his live-action actors would look like once converted into the film's ten-foot-tall, computer-animated warriors.
In the past, rendering for animation was so expensive and slow that there was no way to see what something looked like ahead of time. An artist would create a scene and send it off for rendering. After a day or so, it would come back, and the artist could finally see it. If the director wanted changes, the loop started all over again. The process was like taking pictures with Kodak film: you shot a roll, sent it off to the store, and only then found out if the pictures were good. If they weren't, you did it all again.
“We are harnessing the power of the hardware and software to put filmmaking back into our artists’ hands,” Swanborg said.
With DreamWorks Animation’s Apollo system, the visualization and animation are now integrated.
The storytellers still do a lot of their work ahead of time using animated storyboards, which are like frames from a comic book. For Dragon, that work took up about two of the five years it took to make the film. But the visualization tool saved a lot of time.
“Five years ago, the process seemed nuts,” Swanborg said. “We sit on top of a computational infrastructure here that can access a tremendous amount of compute power if we want to. Why can’t we use that to make our filmmaking much more like our consumer lives of taking pictures with our cell phones? We are using this to turn cinematography on its ear.”