Making human faces believable has been a goal of video game developers for a long time. So it may not surprise you that the makers of the new Call of Duty: Advanced Warfare — a game in a franchise that has virtually unlimited development budgets — have tried to deliver on this promise in the latest installment of the multibillion-dollar modern combat series.
Sledgehammer Games, the developer of the new Call of Duty that debuts Nov. 4, wants you to do a double-take when you look at the human faces. The studio tried to do this by pushing technologies such as high dynamic range, physically based shading, wrinkle maps, performance capture, and physically based lighting. All of these techniques add subtle features that make animated human skin look and move in a more realistic way.
On a recent visit to Sledgehammer’s headquarters, I sat down with art director Joe Salud and Aaron Halon, the director of product development, to talk about the realism of the faces as well as other features of the game, like real-world environment lighting. The team went out, for instance, to the area around the Golden Gate Bridge in San Francisco to capture the natural lighting of the greenery, the ocean, and the sky.
“It took time for our team [to do this],” Halon said. “It was a big shift.”
Here’s an edited transcript of our interview.
GamesBeat: You’ve noted before that people take a look at your characters and do a double-take. They wonder if they’re real or animated. Is it real, or is it a game? Is that part of the vision for PlayStation 4 and Xbox One development?
Joe Salud: Yeah, that is. We have an approach that we like to call “physically based” here. We’ve done tons of research on how to achieve that. There are three basic pillars that make that a reality. You have to have HDR, you have to have this thing called physically based shading, and you have to have something else called physically based lighting.
To make a long story short, everything is based on physical values. We let the renderer do all the calculations as to how shiny something is, how matte something is. They’re all related to each other. It’s less the individual artist’s eye deciding on the granular details and more the engine doing a lot of that procedurally.
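To make the idea concrete, here is an illustrative sketch (not Sledgehammer's actual shader code) of the standard math a physically based shader runs: the GGX and Fresnel-Schlick terms let a single roughness value, rather than a painted highlight, decide how shiny or matte a surface looks.

```python
import math

def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation of Fresnel reflectance: every
    surface becomes more mirror-like at grazing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def ggx_ndf(n_dot_h, roughness):
    """GGX (Trowbridge-Reitz) normal distribution function.
    One physical roughness value controls the highlight shape."""
    a2 = roughness ** 4  # common remapping: alpha = roughness squared
    d = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

# A smooth surface (roughness 0.1) concentrates energy in a tall,
# narrow peak; a rough one (0.9) spreads the same energy out.
peak_smooth = ggx_ndf(1.00, 0.1)   # very bright at the mirror angle
peak_rough  = ggx_ndf(1.00, 0.9)   # much dimmer at the mirror angle
side_smooth = ggx_ndf(0.95, 0.1)   # the smooth lobe dies off fast
side_rough  = ggx_ndf(0.95, 0.9)   # while the rough lobe stays broad
```

Nothing here says "this material is shiny"; shininess falls out of a measured roughness number, which is why artists working from the same measured values produce surfaces that agree under the same light.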
GamesBeat: Was there something in your background that put you on this path toward photorealism, or whatever you’d call the endgame here?
Salud: There are two approaches I’ve seen in games. They go more stylized or more real. What we’re trying to do from an art direction standpoint is to sell this crazy world. We’re trying to cross the line of what’s believable and realistic. We’re trying to immerse you more. One of the visual tools for doing that is fooling your eye into thinking you’re there. Giving you at least that impression. If that impression comes through, you’re not going to be concentrating on the graphics so much. You’ll just enjoy the experience. You’ll feel like you’re really there.
It helps you buy some of what we’re trying to sell here. These advanced, futuristic vehicles, what could make them more believable than to see them in a world where they look realistic?
GamesBeat: How does full performance capture play a role in this, as opposed to just face capture or motion capture?
Salud: To touch on it from a high level, performance capture’s been huge. We’ve done a couple of layers that are different from previous games. You capture the performance, obviously, from the face, and then on top of that we have what we call wrinkle maps. Depending on the musculature or the muscle movement, the skin also moves on top of that. It’s not just geometry moving. It’s geometry simulating muscle performance and the skin on top of that. We also have a skin shader that receives the light on a scientific level.
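As a rough illustration of the wrinkle-map idea (a simplified sketch, not the studio's implementation): the face carries a neutral normal map, and extra wrinkle maps are blended in per pixel, weighted by how strongly each muscle region is firing in the captured performance.

```python
import math

def blend_wrinkles(neutral, wrinkle_layers, weights):
    """Blend tangent-space normals: start from the neutral skin
    normal, lean toward each wrinkle map by its muscle-activation
    weight, then renormalize. (Production shaders use fancier
    reoriented-normal blending; this is the simple version.)"""
    blended = list(neutral)
    for layer, weight in zip(wrinkle_layers, weights):
        for i in range(3):
            blended[i] += weight * (layer[i] - neutral[i])
    length = math.sqrt(sum(c * c for c in blended)) or 1.0
    return tuple(c / length for c in blended)

flat = (0.0, 0.0, 1.0)           # unwrinkled skin: normal points straight out
brow_crease = (0.3, 0.0, 0.95)   # a furrow tilts the normal sideways
relaxed = blend_wrinkles(flat, [brow_crease], [0.0])   # muscle at rest
furrowed = blend_wrinkles(flat, [brow_crease], [1.0])  # brow fully raised
```

At weight 0.0 the skin stays flat; at 1.0 the crease's perturbation takes over, so the same geometry catches light differently as the captured performance drives the weights frame by frame.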
Aaron Halon: One of the interesting things about realism and environment art from a production standpoint, which was new for me in overseeing the team of artists and designers—you talk about going out and capturing facial performance. But even for the environment, we were capturing real-world lighting with real-world materials. That was a big part of this physically based approach.
For Collapse, it’s obviously right there in our backyard. We had the art team go out to the Golden Gate Bridge and do a lot of color grading tests. For other, more exotic locations, we would find proxies. We’d go out on a ship, for instance, for the later part of the San Francisco levels.
Salud: It was very important for us not to just say, “Hey, to our eye, what color is this?” We have tools that can take the materials and measure their physical response to light. We would measure and capture the sky and say, “At these points in the sky, we have these physical properties.” We would do that with everything, and we would re-create it.
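Measuring instead of eyeballing implies doing the math in linear, physical units. As one small example of that discipline (the standard sRGB transfer function, not anything specific to Sledgehammer's tools), here is how an 8-bit texture value is decoded into linear reflectance before any lighting math touches it:

```python
def srgb_to_linear(value_8bit):
    """Decode an 8-bit sRGB channel (0-255) into linear reflectance
    (0.0-1.0) using the standard sRGB transfer function. Physically
    based pipelines do lighting in this linear space, not display space."""
    c = value_8bit / 255.0
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

mid_gray = srgb_to_linear(127)  # roughly 0.21: middle gray on screen
                                # reflects only about 21% of the light
```

A texture value of 127 looks like middle gray on a monitor but represents only about 21 percent physical reflectance; measuring materials keeps the display encoding and the physical quantity from getting conflated.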
Halon: It took time for our team. It was a big shift. Imagine you hire a lot of senior artists who’ve been in the games industry for a long time, and they’re used to building this in a certain way, painting textures to get things to look realistic. We said, “Yeah, we want you to do this same thing, but no more painting and tweaking textures.” For a while they said, “Wait, what do you mean? That’s what we’ve always done.” No, we’re going to let the lighting engine dictate that now and control it in a different way.
Some people were not quite comfortable with that for a while. Eventually, though, they started seeing the results and got more comfortable with how to build for that.
GamesBeat: Would it be similar to what the Pixar animation studio did with Monsters University? They brought up global illumination a lot — where the sun is, that’s your light source, and you’re not going to paint shadows on things.
Salud: It is very similar to that, yes. Global illumination is just one part of it. There’s also how the global illumination responds to the surfaces. Before, what would happen is you had 10 artists, and they’d all work on making this room here. One artist would work on the chair. One would work on this or that. They all see the world a little differently. Then, when you turn the lighting on, the room looks almost right, but not quite. It’s because each artist has their own interpretation. If you let the renderer do that, you eliminate all that interpretation.
Halon: There are efficiencies later on that are exciting. When you figure out how much time we used to spend with material artists, you realize we can put that time into other details in the environment. It took a while to get there, but now we’re figuring out how to make that more efficient.
Salud: The efficiency that came out of it—before, when we would make this object, we would tune the texture to work in this very specific lighting. But now, since it’s physically based, it doesn’t matter if it’s in this room or the other room or a room with green lighting. It should all work.
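The reusability Salud describes is easy to see in the simplest shading model there is. In this hedged sketch (a plain Lambertian diffuse term standing in for a full renderer), the material is authored once as a physical albedo and never retuned per room; the light's color and angle supply everything else:

```python
def shade_diffuse(albedo, light_color, n_dot_l):
    """Lambertian diffuse: reflected color = albedo x light x angle.
    The material contributes only its albedo; lighting does the rest."""
    n_dot_l = max(0.0, n_dot_l)  # surfaces facing away receive no light
    return tuple(a * l * n_dot_l for a, l in zip(albedo, light_color))

chair_red = (0.60, 0.08, 0.06)  # one authored material, used everywhere

white_room = shade_diffuse(chair_red, (1.0, 1.0, 1.0), 0.8)
green_room = shade_diffuse(chair_red, (0.2, 1.0, 0.2), 0.8)
in_shadow  = shade_diffuse(chair_red, (1.0, 1.0, 1.0), -0.3)
```

Under green light the red chair correctly goes dark with no artist retouching the texture, which is the "it should all work" property: the interpretation lives in the renderer, not in per-room texture tweaks.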
GamesBeat: What was it like joining Sledgehammer and being part of this new studio?
Halon: It’s definitely exciting for us. Call of Duty, for me personally, was something I was really into. I’d been working in games for a while before that, but playing a lot of Call of Duty. That opportunity jumped it right to the top. And also the opportunity to work with Mike and Glen, who I’d worked with in the past.
GamesBeat: You worked on Dead Space?
Salud: Yeah, we were both on Dead Space. It was also just exciting to develop the studio from the ground up. We were in a very small conference room, just coming up with ideas. We were used to EA before, with offices and cubicles. We didn’t have that back then. We started very small, just an agile little team. It was fun.