The fact that game developer Valve employs someone like Mike Ambinder shows how seriously the company thinks about the future of games. He’s an experimental psychologist, not a game developer. And Valve has him working on a new way to make game development respond better to player feedback.
The company behind games like Left 4 Dead and Half-Life is exploring new ways to innovate in the medium, such as biofeedback that tells the developer how a player is physiologically reacting to a game, Ambinder said in a session on emotion in games at the NeuroGaming Conference and Expo last week.
“One thing we are very interested in is the notion of biofeedback and how it can be applied to game design,” he said. “There is potential on both sides of the equation, both for using physiological signals to quantify an emotional state while people are playing the game.
“The more interesting side of the equation is what you can do when you incorporate physiological signals into the gameplay itself.”
With today’s games, you map player intent into onscreen behavior using a game controller or mouse. That results in novel experiences. But the game developer doesn’t know whether the player is enjoying the game or what their emotional state is.
“If we could start tapping into that, we could tap into a whole wealth of data,” he said.
He said Valve has conducted experiments in which it measured players’ sweat and correlated that to their level of arousal while playing. It then fed that data into Left 4 Dead and tried to modify the play experience so it was more fun. Valve also ran an experiment in which the player had four minutes to shoot 100 enemies. If they were calm, the game would progress normally. If they got aroused or nervous, the game would move more quickly, and they would have less time to shoot the enemies.
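The experiment above amounts to a simple feedback loop: a measured arousal signal scales the time remaining. Here is a minimal sketch of that idea; the function name, the normalized sensor input, and the scaling factor are all assumptions for illustration, not Valve’s actual implementation.

```python
# Hypothetical sketch of arousal-driven pacing, as in the
# "100 enemies in four minutes" experiment: a calm player keeps
# the full time limit, while a highly aroused player loses time.

BASE_TIME_SECONDS = 4 * 60  # four minutes at baseline

def adjusted_time_limit(arousal: float) -> float:
    """Shrink the time limit as arousal rises.

    arousal: normalized skin-conductance reading in [0.0, 1.0],
    where 0.0 is fully calm and 1.0 is highly aroused.
    """
    arousal = max(0.0, min(1.0, arousal))  # clamp noisy sensor input
    # Fully calm keeps all 240 seconds; full arousal halves the time.
    return BASE_TIME_SECONDS * (1.0 - 0.5 * arousal)
```

A calm player (arousal 0.0) gets the full 240 seconds, while a maximally aroused player gets 120; any real system would also smooth the sensor signal over time rather than react to instantaneous readings.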
He also said Valve did research on eye-tracking, since you can move your eyes much faster than your hands. It created a version of Portal 2 that you could control with your eyes. The eye controls worked well, and decoupling aiming from the viewpoint improved the experience. That’s like tracking the difference between where your eyes are looking and where your head is facing.
“It’s still experimental, but it worked pretty well, and we were pleased with that,” he said.
The physiological signals can convey information about whether the gamer is angry, afraid, energetic, engaged, jubilant, happy, sad, bored, fatigued, passive, relaxed, or content. Beyond sweat and eye-tracking, you can measure heart rate, facial expression, brain waves (electroencephalography, or EEG), pupil dilation, and body temperature, among other signals.
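Emotion labels like the ones listed above are often organized along two axes, arousal (energy) and valence (pleasantness), in what psychologists call the circumplex model. The following sketch shows how measured signals, once reduced to those two axes, could be bucketed into coarse states; the thresholds and label groupings are illustrative assumptions, not Valve’s classifier.

```python
# Hypothetical sketch: bucket an emotional state from two normalized
# axes, following the common arousal/valence (circumplex) model.

def classify_emotion(arousal: float, valence: float) -> str:
    """Both inputs are normalized to [-1.0, 1.0].

    arousal: low = calm/fatigued, high = energetic/agitated
    valence: negative = unpleasant, positive = pleasant
    """
    if arousal >= 0 and valence >= 0:
        return "engaged/jubilant"   # high energy, pleasant
    if arousal >= 0 and valence < 0:
        return "angry/afraid"       # high energy, unpleasant
    if arousal < 0 and valence >= 0:
        return "relaxed/content"    # low energy, pleasant
    return "bored/sad"              # low energy, unpleasant
```

In practice the hard part is upstream: turning raw sensor data (skin conductance, heart rate, EEG) into reliable arousal and valence estimates in the first place.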
Once developers get this kind of feedback, they can proceed to better matchmaking in multiplayer games or better profiling of gamers. Players can spectate competitive matches in a more entertaining way, and playtesting can become easier. The data is much more exact than asking someone in a postgame interview how they felt while playing. Game creators can also do a better job creating peaks and valleys in games.
Here are Ambinder’s opening comments from his panel at the NeuroGaming Conference and Expo.