The problem with virtual reality is that it’s never quite immersive enough. While the visuals and sound make you feel like you’re wandering in a 3D space, that feeling of immersion disappears when you try to touch something. That’s why Ultrahaptics is developing ultrasound “haptics” technology so you can feel things in VR using “mid-air touch.”
The goal of VR is to make you feel like you’re someplace else. The closer the technology gets to that sense of immersion, the more likely the industry will grow, as forecast by tech advisor Digi-Capital, to $30 billion a year by 2020.
Ultrahaptics, based in Bristol, England, is the brainchild of Tom Carter, the company’s chief technology officer. He began working on the technology during his university days in 2009 and turned it into a doctoral project (which he finally handed in a few weeks ago). After meeting Steve Cliffe, the two started the company in November 2013. Ultrahaptics has a demo of the technology that lets you feel things via vibrations sent through the air to your fingers.
“We have a small collection of speakers that are emitting ultrasound,” said Carter, in an interview with GamesBeat. “We focus the speakers on the skin. There’s enough force there to slightly displace the surface of the skin. We control that to vibrate the skin at different frequencies to create different sensations. We can also sculpt and form the ultrasound so that it feels like different shapes.”
He added, “This is the next challenge in augmented reality and VR. You have visuals and head tracking. But you need to get your hands into it. The next area of immersion where VR breaks down is touching something.”
At its smallest, the sensation feels like a fingertip-size impression. At larger scales, it can create a force field that feels like a barrier in the air.
“We can create any type of vibration the hand is capable of feeling, from soft and gentle to a continuous smooth sensation to a Force-like lightning from Star Wars, with a crackling sensation of electricity in your hands,” Carter said. “You can feel dry raindrops battering on your hands to a squishy foam.”
The company raised $872,000 early on, and last year it raised $12.5 million. Investors include the IP Group in the United Kingdom, and Woodford Investment Management.
Now it has 43 employees and is in the midst of bringing its product to the market. In 2014, the company generated about $60,000 in revenue. That turned into $500,000 in 2015 and an estimated $2.5 million this year, said Cliffe, CEO of the company, in an interview.
Ultrahaptics has been selling evaluation kits to developers for about $20,000 each. The buyers have been across the board, from virtual reality game companies to consumer electronics firms and location-based entertainment. One of the surprising markets is automotive, where car designers are trying to enable drivers to control functions on the dashboard without taking their eyes off the road.
“They really moved quickly because they have worked on gesture controls for a long time,” Cliffe said. “It’s like having a button come to your hand so you can turn your air conditioning down.”
Roughly 80 percent of car makers have an Ultrahaptics development kit. The company has lots of patents and no direct competitors at the moment. Tactical Haptics uses a different kind of touch feedback in controllers, based on vibration motors. OmniWear is creating devices you wear around your neck that give you haptic feedback from a game.
Carter said, “I had always been interested in how you communicate with computers. I was annoyed at bad interface design. At the university, I wanted to do something in human computer interaction. I found a professor who did cool stuff in that area. We discussed the trends. It was around when the iPhone came out and a lot of it was about how you couldn’t feel the keys when you were typing. You could type faster on a BlackBerry. People felt how you lost haptics when you moved from buttons to glass. Kinect also came out and took gestures to waving hands in the air. But you couldn’t feel anything that way.”
Carter started working on the technology and eventually created a company. He said that the ultrasound isn’t harmful in part because 100 percent of it is reflected off the surface of your skin. The speakers are similar to the ultrasound emitters used in other sensors, like the parking sensors in cars that detect how close you are to objects, and the sensors that open automatic doors.
“We had to go back to fundamentals to get the ultrasound to work,” Carter said.
At the moment, Ultrahaptics can’t stop your hand from pushing through an object, so it can’t simulate something solid like a table — your hand will always pass through. But Carter said the technology can produce more nuanced sensations, re-creating various textures. Cliffe said it can create the sensation of spiders running down your hand. Right now, the range of the technology is about a meter from the speakers.
One of the first products under way is a pro gaming product, though no details have been released yet. Over time, the company hopes to get the cost of the technology down to the tens of dollars.
I could see Ultrahaptics partnering with a company like Leap Motion, which lets you use your hands in VR but currently without any sensation of touch. Another possible application is hospital elevators, whose buttons harbor a lot of germs from human hands. If you could make the elevator move without pressing a physical button, that would curb the spread of germs. You could also use the technology at an ATM, as someone watching over your shoulder couldn’t figure out your PIN if you were just pressing buttons in the air.
Of course, VR porn makers would love to reproduce the sense of touch. But that’s not on the horizon just yet.