Even if you’ve never used Tobii’s new eye-tracking computer, it still feels like you have.
Seconds into using the device, the whole experience comes together: Glance at an onscreen object, click, and it opens. The entire process is so surprisingly fluid that you barely realize you’re not using a mouse.
Carl Korobkin, Tobii’s business development vice president, says that this experience exposes one of the fundamental realities about how we interact with devices today: It’s all really inefficient.
“With smartphones now, you’re touching the screen, but you’re already touching the screen with your eyes. Why reach out and touch something if I’ve already looked at it? You really don’t need that mechanical process anymore,” he said earlier today.
It’s tough to argue with that logic. Like touch input and voice, eye-tracking breaks down the abstraction of interfaces between you and your devices. Why use a mouse — or even a touchscreen — when you can just look at what you want to interact with? Why type your search queries when you can say them?
(Tobii, however, isn’t ditching the idea of touch entirely: Its laptop prototype features a pressure-sensitive touchpad developed by Synaptics, which replaces the kind of clickable touchpads found in current laptops.)
While eye-tracking has been around for years, its underlying technology is finally getting small enough that manufacturers can implement it in smaller devices — including laptops, tablets, and, yes, smartphones. Two years ago, none of this was feasible; within the next year or so, it could start showing up across consumer hardware.
And the possibilities are exciting. Imagine if Amazon created a version of its Kindle app that followed your vision, highlighting words as you read along, or even a horror game designed to make monsters pop up on the screen based on where you’re looking. In a more subtle example, imagine if you could simply dismiss dialog boxes with a look instead of a click.
“Our goal is to make the entire computing process 10 percent faster and better — and that’s huge. Our number one application is everything,” Korobkin said.
A good example of how designers are implementing eye-tracking comes from Samsung, which developed a feature called Smart stay alongside the Galaxy S III. Using what Korobkin calls a “rudimentary” form of eye-tracking, Smart stay detects whether the user is looking at the screen and uses that to decide when to dim the display.
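Samsung hasn’t published how Smart stay works internally, but the core idea, dimming the display based on gaze rather than a fixed inactivity timer, can be sketched in a few lines (the timeout value and function names below are assumptions for illustration):

```python
DIM_AFTER_SECONDS = 10.0  # assumed grace period after gaze leaves the screen

def should_dim(last_gaze_on_screen: float, now: float) -> bool:
    """Return True when the display should dim.

    Unlike a keyboard/mouse inactivity timer, the clock here resets
    whenever the eye tracker sees the user looking at the screen, so
    the display stays bright for as long as someone is actually reading it.
    """
    return (now - last_gaze_on_screen) >= DIM_AFTER_SECONDS
```

A real implementation would feed `last_gaze_on_screen` from the front camera’s gaze detector; the point is that attention, not input events, drives power management.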
Korobkin, though, wants to take that idea further. “We know very accurately if you’re looking at the screen, so it’s the best screen management you can have,” he said.
While Tobii’s eye-tracking laptop is still a prototype, Korobkin says that computer manufacturers are already looking for ways to add the technology to their devices. Moreover, Korobkin argues that the rise of eye-tracking is going to make current touchscreen-equipped laptops look crude in comparison.
“Microsoft is trying to bridge the gap, but it turns out that eye-tracking really is the ideal form of touch,” Korobkin said.
Like touchscreens before it, eye-tracking is in many senses an inevitable step toward a future where the division between humans and computers is blurred into nonexistence. For most of computing history, we’ve been reading our devices. Now our devices are finally reading us.