I attended the Experiential Technology Conference (XTech) in San Francisco this week to hear talks about creating virtual experiences that truly immerse you in another world. It showed me that virtual reality has only scratched the surface. I liked the event, curated by Zack Lynch of Jazz Venture Partners, because it reminded me of the long view of VR and how large a role research and development still has to play in it.
VR has made our eyes feel like we’ve gone someplace else, but that’s only one of our senses. We have 3D sound as well, but we need more than that to achieve real immersion, said Tal Blevins, head of media at UploadVR, in a panel at XTech.
The long road ahead
The illusions won’t seem absolutely real until all of our senses are immersed in an artificial reality. Joe Michaels, chief revenue officer at AxonVR, a maker of touch technology for VR, and Tom Carter, chief technology officer of Ultrahaptics, said that touch will deliver an extra sense of realism.
“One of the fundamental aspects of realism in VR is touch,” said Michaels, whose company has spent four years developing that sense of touch. “One of the things you feel when you come out of the womb into the world is touch.”
David Edwards, cofounder of Onotes, is making digital scent technology that is like a speaker for your sense of smell.
“It’s not just visual and sound. It’s a bit unfortunate. The last few years, VR has been about putting on a head-mounted display and looking around,” Edwards said. “It should include hitting all the senses that we can to make you feel immersed in that environment.”
“When your senses agree with what you are seeing, it completely transforms the experience,” said Brent Bushnell, CEO of Two Bit Circus.
Carter of Ultrahaptics said that you’ll need other senses more than you realize when you try to grab something with your fingers.
I saw that myself when I tried out Qualcomm’s new wireless stand-alone VR headset prototype at the Game Developers Conference a couple of weeks ago. The headset used sensors to detect my fingers, using software from Leap Motion. It was cool that I didn’t have to use a touch controller. But as soon as I reached out to grab something in VR, it didn’t feel real. That was because when my fingers closed around an object, I didn’t feel anything. The headset didn’t incorporate any touch, or haptics, technology.
And that’s the way it is going to be with VR. One improvement begs for another. As soon as we get wired VR headsets, we want wireless ones. As soon as we get touch controls, we want finger detection. As soon as we get finger detection, we want haptics. That’s why Edwards is working on smell.
Then it kind of hit me. This is an enormous undertaking. It’s why long-term investors like Tipatat Chennavasin, cofounder of the Venture Reality Fund, say that it’s going to take decades before the full impact of VR, augmented reality, and mixed reality is realized.
It reminded me of a speech that Mike Abrash, chief scientist at Oculus, gave last fall.
“Everyone in this room has jumped in to make VR happen, and our reward is we are on the leading edge of one of the most important technological revolutions of our lifetime,” Abrash said. “Thanks to all of our efforts, VR is going to leap ahead in the next five years….The reason we are all working on VR now is because of our vision of what VR will become.”
What this means for startups
And yet VR skeptics are worried about what’s going to happen in 2017. We saw 40 percent more VR startups created in 2016, but the headsets that debuted during the year generated only about 6.3 million unit sales, according to SuperData Research. That’s not going to produce enough software revenue to sustain many of those VR developers.
“There’s going to be a reckoning,” said Margaret Wallace, CEO of Playmatics, on my own panel on VR entertainment experiences.
And so we agreed that VR still needs patient investors, brand advocates, and passionate platform owners — Oculus and Facebook, Intel, Qualcomm, Apple, Google, and Microsoft — to sustain the investment in VR.
“From what I have a sense of, there’s still a lot of investment pouring in from the platform holders,” said Shiraz Akmal, CEO of VR startup Spaces. “Seven-figure deals are happening. But the difference, at least from my perspective, is that the early days were more of a spray-and-pray kind of situation.”
He added, “Now it’s more targeted. ‘Hey, we’ve invested billions in this platform, and now we need a title that can help us sell the numbers that everyone was projecting a year ago.’ The competition for those dollars is more fierce. There are bigger stakes in the development community, especially those studios that have bet on VR. Consumer adoption is — it’s adopting, but not as fast as we’d all like.”
Fortunately, other fields such as healthcare, enterprise, and defense will help to drive it forward. Mike Wikan, creative director at Booz Allen Hamilton, said his company has 150 developers working on high-end VR experiences for those who will pay for it today: the military. The Department of Defense spends up to $7 billion a year on training, and if you can train people better in VR, that saves money.
R&D still to come
The XTech event was also impressive in showing off the breadth of research going on in the space.
Adam Gazzaley showed some fascinating research about what he called a “closed loop” system. In it, we would play a game in VR, and it would produce an effect in our brain and cause us to react. Gazzaley’s brainwave sensors would capture data on the part of the brain that was stimulated. Then that data would serve as feedback for the game developers, who could refine the game to produce a better effect. The game could also adjust itself on the fly to become more difficult, as needed, for the player.
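The on-the-fly adjustment in a closed-loop system amounts to a simple control rule. Here is a minimal sketch in Python, assuming a hypothetical 0-to-1 attention reading from the brainwave sensors; the function name, target, and step size are illustrative, not Akili's actual system:

```python
def adjust_difficulty(difficulty, attention, target=0.7, rate=0.05):
    """Nudge game difficulty (0-1) toward the level that keeps the
    player's measured attention near the target value.

    `attention` is a hypothetical 0-1 engagement reading derived from
    brainwave sensors -- an assumption for this sketch.
    """
    if attention > target:
        # Player is engaged and coping: push a little harder.
        return min(1.0, difficulty + rate)
    # Player is drifting or overloaded: ease off.
    return max(0.0, difficulty - rate)
```

Each frame (or level) the game would read the latest attention estimate and call this rule, so difficulty drifts up or down until the player settles near the target engagement level.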
Gazzaley, whose startup is Akili, is trying to use VR to help people with attention disorders. David Eagleman, a neuroscientist at Stanford University, also showed a cool vest that produces tactile feedback on your torso. He showed how a deaf person could “hear” by feeling the haptic feedback on the torso. Eagleman spoke a word into a microphone, which produced touch sensations on the vest. Then the deaf person wearing the vest spelled out the word that was spoken.
The research talks also made me think of another talk by Abrash at Facebook’s F8 event last year. He said the brain doesn’t see the raw data of reality. Rather, it absorbs what comes in from our senses and processes it. It discards data that it doesn’t need and presents something that we can grasp. In other words, our eyes and senses and brain are interpreting reality for us, not presenting it.
That suggests a solution to the challenges of VR. We don’t have to reproduce reality. We simply have to trick the brain into thinking it’s reality. That means we don’t have to use as much computing power and other technology as we might think to achieve the aim of immersion.
Think about it this way. Bad VR gives us motion sickness. There’s a mismatch between what our eyes see and what our other senses are telling us, Bushnell said.
VR is such a strong medium that it can produce a physiological reaction in our bodies. But if we trick our bodies and our brain, we can get a desired effect from VR. That’s why one of the XTech talks about magic, or misdirecting the brain, made a lot of sense. Stephen Macknick, a professor in the department of neurology at SUNY Downstate Medical Center, said that our brain can’t focus on everything, so it focuses on what it thinks matters. And that gives illusions a chance to make an impression on us.
“Everything is a function of your perception and perspective and the contrast with the world around you,” Macknick said. “You get to decide the way you want to see the world.”
Macknick’s talk made an impression on Noah Falstein, chief game designer at Google.
“I love that he puts up a diagram showing how you’re focused on something, and the neurons receiving that image have other neurons that suppress the input from other areas,” said Falstein. “It’s not only that your brain highlights the thing you’re focused on, but it’s also turning off everything else. That’s why you don’t see things happening around you when you concentrate on one thing very intently.”
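The suppression Falstein describes can be sketched as a toy lateral-inhibition pass, where each unit subtracts a fraction of its neighbors' activity so that the strongest input stands out and its surroundings are damped. The values and inhibition factor below are made up for illustration, not a model of real neurons:

```python
def lateral_inhibition(activity, k=0.3):
    """Toy lateral inhibition: each unit loses a fraction `k` of its
    neighbors' activity, sharpening whatever the focus is on.
    Illustrative only."""
    out = []
    for i, a in enumerate(activity):
        left = activity[i - 1] if i > 0 else 0.0
        right = activity[i + 1] if i < len(activity) - 1 else 0.0
        out.append(max(0.0, a - k * (left + right)))
    return out

# A peak of attention gets sharper: the strong input survives almost
# intact, while the weaker inputs beside it are suppressed to zero.
print(lateral_inhibition([0.1, 0.2, 1.0, 0.2, 0.1]))
```

Running this, the center value stays close to 1.0 while its immediate neighbors drop out entirely, which is the "turning off everything else" effect Falstein points to.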
If we trick the brain into thinking that what it is seeing is real, then we don’t get sick. The illusion seems realistic and more engaging.
But what if we want to trick our body in a physiological way? One VR app, The Walk VR, makes you feel like you are walking on a tightrope between the Twin Towers. When I did that walk, I felt like I was going to fall.
Two Bit Circus did something similar at the event. Its VR app and motion platform took you up the side of a skyscraper on a window-washing platform. You get a sense of vertigo as you look over the edge. The motion platform shakes, and you feel like you are falling. Your body gives a physiological response, and everyone around you laughs at the experience, Bushnell said. The guy in the picture above freaked out when his friend grabbed him and shook him on the platform.
Theresa Duringer, meanwhile, used VR to trigger the opposite kind of physiological response. She has a fear of flying, and she created a VR app, Ascension VR, as a distraction to use on the flight. She wanted to suppress a physiological reaction, the fear of flying, and used VR to try to do that.
The scale of digital reality’s road ahead
All of the research and experimentation made me feel like VR is at the start of a huge undertaking. Jon Favreau, a Hollywood director, actor, and VR enthusiast, reminded us that we have to find the “humanity in the technology.”
“So much of the time people are reading about technologies, like they’re at a race track seeing what horse will win,” he said. “You have to realize it’s a one-way street. You don’t know how the river is going to flow, but it is flowing in that direction. But if the only people who are involved with it are people who are unconcerned with the human impact of it, it’s going to shape the path. What’s the opportunity to humanize it?”
And as Abrash pointed out in his talks, this is kind of like the Manhattan Project of our age. A lot of bright minds are working on it and debating the ramifications of this new technology, and the Brave New World that it will create.