The information conveyed in those panels, however, demonstrated how quickly VR developers are coming to grips with the hardware.
Even the best VR devs are still learning
Games like Fantastic Contraption, Job Simulator, and The Lab are some of the highest-profile examples of successful, fun VR design we could point to. One might imagine that the developers of these games have therefore attained some level of mastery over VR. It might be more accurate, however, to say that successful VR developers are the ones who continue to ask the right questions about how the technology works at a psychological level.
Jesse Schell is the founder of Schell Games, the developer behind one of the earliest high-profile VR games, I Expect You To Die. Schell and his team had an advantage: Schell has been working with VR for 25 years, going back to his time creating VR experiences for Disney theme parks as an Imagineer. Schell also teaches a course at Carnegie Mellon University called “Building Virtual Worlds,” where every two weeks, a cross-disciplinary group of designers and programmers builds something unique. This often involves VR.
“So, another thing that I learned out of [Building Virtual Worlds] is the notion of getting up close, in VR,” said Schell at his panel entitled “Lessons Learned from a Thousand Virtual Worlds.” “A lot of people don’t realize that the brain has special nuclei for dealing with situations where physical objects come within arm’s reach. There are certain parts of your brain that turn on when that happens. Other media are not able to turn that on, but VR is able to turn that on.”
Schell delivered a prime example of how developers are figuring out what makes for compelling VR experiences: by understanding the psychology that powers those experiences. Comprehending how VR does what it does to the human brain is not just an interesting aside for developers. It may be a prerequisite for success. “We made strong use of that in I Expect You To Die,” said Schell. “We’re making use of it in all of our VR games.”
Proprioception, or the mind’s ability to sense the relative placement of parts of the human body (like being able to tell whether your arm is held straight up over your head or extended out away from your body), is another sense that VR developers are learning how to manipulate. If you’re playing a VR game with a control pad, your hands stay in roughly the same position for the entire experience, so proprioception isn’t involved.
When hand units like SteamVR controllers or Oculus Touch are being used, and the player is changing the shape of their arms as they reach around the simulation, proprioception becomes a factor. As long as the position of the player’s body parts in VR matches the relative position of those parts in the real world, the brain accepts the illusion of reality.
When the vestibular system that governs our balance doesn’t match up with what a player is seeing in VR, that’s a surefire way to make the user motion sick, which is why developers have learned not to tilt the horizon in VR experiences. But the human brain is not always a liability that VR developers have to account for. Sometimes, the human brain is a tool that VR developers can use to create smoother, better experiences.
Consider the “snap turn,” for instance: a practically instantaneous camera rotation to a new angle, triggered by pressing a control stick left or right. It’s a form of movement that may be foreign to the brain, but it works because the brain doesn’t have time to process the weirdness of a snap turn. It happens too fast.
Chris Pruett, head of developer relations at Oculus, explained how the trick works at his GDC 2017 panel “Lessons Learned from the Front Lines.” “Snap turns are a form of a sort of more general rule which is called change blindness,” Pruett said. “Change blindness is the idea that if you don’t show the brain something, and it changes … I didn’t see it, so I buy it.”
This is why snap turns, and the popular point-and-teleport movement option in VR, most often referred to by developers as “blinking,” don’t make users motion sick. The movement happens too quickly for the brain to register the change, hence the brain accepts the user’s new position in the virtual space.
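The mechanic Pruett describes is simple to sketch in code. The snippet below is an illustrative example, not tied to any real VR SDK (the names `SNAP_ANGLE`, `DEADZONE`, and `stick_x` are assumptions for this sketch); the key point is that the rotation is applied in a single frame, so no intermediate angles are ever rendered for the eye to perceive.

```python
# Illustrative snap-turn sketch. SDK-agnostic; names are invented for
# this example. The rotation happens in one frame -- there is no
# animated turn for the vestibular system to object to.

SNAP_ANGLE = 30.0   # degrees rotated per snap
DEADZONE = 0.7      # stick deflection required before a snap fires

class SnapTurner:
    def __init__(self):
        self.yaw = 0.0      # player's current facing, in degrees
        self.armed = True   # stick must recenter between snaps

    def update(self, stick_x):
        """Call once per frame with the thumbstick x axis (-1.0 to 1.0)."""
        if self.armed and abs(stick_x) >= DEADZONE:
            # Apply the full rotation instantly, in a single frame.
            self.yaw = (self.yaw + SNAP_ANGLE * (1 if stick_x > 0 else -1)) % 360
            self.armed = False  # holding the stick doesn't spin continuously
        elif abs(stick_x) < DEADZONE:
            self.armed = True   # stick recentered; allow the next snap
        return self.yaw

turner = SnapTurner()
turner.update(1.0)   # snap right to 30 degrees
turner.update(1.0)   # stick still held: no further turn
turner.update(0.0)   # stick recenters
turner.update(-1.0)  # snap left, back to 0 degrees
```

Teleport “blinking” relies on the same principle: rather than animating the player toward a destination, the camera position is simply set to the new point (often behind a one-frame fade), so there is never any perceived motion to conflict with the inner ear.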
These insights are the basis for successful VR software development. If we did not hear game developers discussing the fundamental rules of successful VR design one year after the release of the Rift and the Vive, anyone who cares about the future of VR would have cause to worry. That game developers are grasping the rules and limitations of VR so clearly, and so quickly, is a ray of hope for anyone who wants to see VR succeed.
One possible explanation for why VR panels at GDC 2017 were less full than last year was the Virtual Reality Developers Conference held as a standalone event in November of 2016. I asked Alex Schwartz, founder of Owlchemy Labs, and a member of the VRDC advisory board, whether the event in November may have stolen some of the thunder from GDC 2017.
“VR is still so incredibly new and there is so much to discuss, try, share, and learn,” Schwartz told GamesBeat via email. “There are a surprising amount of VR conferences already, and I think developers are still trying to figure out the best format for sharing information. GDC every year is fantastic, but we’re finding that smaller more specific conferences can be useful to get deeper into certain topics. We’re working with a new technology so there is a TON of ground still to cover.”