

A Canadian company called AdHawk Microsystems is announcing that it has created small motion-tracking sensors that could be a boon for augmented reality glasses and virtual reality headsets.

Current AR and VR products are oversized for consumers, and bulky camera-based sensors are a big part of that problem. But AdHawk has created eye-tracking sensors that are small chips built with microelectromechanical systems (MEMS), the same technology commonly used in gyroscope chips.

Above: AdHawk sensors are tiny MEMS chips.

Image Credit: AdHawk

The Kitchener, Canada-based company has raised $4.6 million in a funding round led by Intel. AdHawk Microsystems said its smaller, faster, more power-efficient motion-tracking solutions will render camera-based eye tracking obsolete and pave the way for a new generation of highly immersive AR/VR experiences.

So far, most eye-tracking systems, such as Tobii's products, have relied on cameras. Unlike camera-based eye trackers, which need to be tethered to a computer, the AdHawk system can be embedded in AR/VR headsets or glasses and worn comfortably all day.


AdHawk's sensors can capture thousands of data points per second, enabling a system based on the chips to predict where a user will look next, leading to more immersive AR/VR experiences.

The potential uses are widespread. In gaming, the AdHawk system is fast enough to anticipate where the user is going to look next, allowing games to heighten the element of surprise by rendering content in anticipation of the user's next move.
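AdHawk has not published how its prediction works. As a rough illustration only, a renderer could linearly extrapolate recent high-rate gaze samples to estimate where the eye will be a short time ahead; the class below is a minimal sketch of that idea, and all names, the 50-millisecond horizon, and the constant-velocity model are assumptions, not AdHawk's method.

```python
from collections import deque


class GazePredictor:
    """Illustrative gaze prediction by linear extrapolation of
    recent high-rate (t, x, y) gaze samples."""

    def __init__(self, horizon_s=0.05, window=5):
        self.horizon_s = horizon_s           # how far ahead to predict (50 ms)
        self.samples = deque(maxlen=window)  # recent (t, x, y) samples

    def add_sample(self, t, x, y):
        self.samples.append((t, x, y))

    def predict(self):
        """Extrapolate the latest sample forward using the average
        velocity across the sample window; None if underdetermined."""
        if len(self.samples) < 2:
            return None
        t0, x0, y0 = self.samples[0]
        t1, x1, y1 = self.samples[-1]
        dt = t1 - t0
        if dt <= 0:
            return None
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        return (x1 + vx * self.horizon_s, y1 + vy * self.horizon_s)
```

In practice a game engine would feed this predictor at the sensor's sample rate and render the predicted region at full detail a frame or two early.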

Above: AdHawk sensors can be put in compact AR glasses.

In health care, by measuring the smallest movements in the eye, the AdHawk system could be used for early detection of conditions such as Parkinson’s disease or to understand the emotional state of the user.

And in training, observers can understand when you’re tired and not taking in information effectively by tracking blink frequency and eye movement. This is especially relevant in use cases like pilot or driver training.
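The article does not describe how blink frequency would be scored; as a hedged sketch, a trainer's dashboard might map a pilot's recent blink rate onto a simple fatigue scale. The function below is illustrative only; the window size and blink-rate thresholds are placeholder assumptions, not AdHawk's calibration.

```python
def fatigue_score(blink_times, now, window_s=60.0,
                  baseline_bpm=15.0, fatigued_bpm=25.0):
    """Map recent blink frequency to a 0..1 fatigue score.

    blink_times: timestamps (seconds) of detected blinks.
    Rates at or below baseline_bpm score 0; rates at or above
    fatigued_bpm score 1; in between is linearly interpolated.
    """
    recent = [t for t in blink_times if now - window_s <= t <= now]
    blinks_per_minute = len(recent) * (60.0 / window_s)
    score = (blinks_per_minute - baseline_bpm) / (fatigued_bpm - baseline_bpm)
    return max(0.0, min(1.0, score))
```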

The technology is in a research state now, and the funding will be used to help bring products to market for consumer VR and AR.

Other investors in the round include Brightspark Ventures and the founders of AdHawk.

As mentioned, current VR/AR headsets equipped with eye-tracking systems rely on cameras to keep track of where the user is looking, and it takes a ton of computing power to process the hundreds of images per second the cameras capture. As a result, these headsets need to be tethered to a power supply and a high-end computer.

AdHawk's eye tracker replaces the cameras with ultra-compact microelectromechanical systems (MEMS) that are so small they can't be seen by the naked eye. These MEMS chips eliminate power-hungry image processing altogether, resulting in big improvements in speed, efficiency, and size.

The AdHawk system can predict where an eye will move up to 50 milliseconds (0.05 seconds) in advance.

“Creating a sense of total immersion, through an untethered, responsive and unobtrusive headset, is the ultimate goal of the VR/AR world,” said Neil Sarkar, CEO of AdHawk, in a statement. “We believe our technology will go a long way to enabling headset makers to deliver that experience to their users.”

Above: AdHawk enables much smaller VR headsets.

Image Credit: AdHawk

Replacing cameras with AdHawk’s tiny, low-power devices, which can operate for a full day on a coin-cell battery, could enable headset manufacturers to provide eye-tracking without the need for tethering.

AdHawk’s device is currently available for purchase as an eye-tracking evaluation kit. The company has already gleaned worthwhile information from user testing, Sarkar said.

“We have discovered that when we take thousands of eye-position measurements per second to capture the dynamics of eye movements within saccades [the eye’s rapid, abrupt movements between fixation points], we get valuable insight into the state of the user — are they tired, interested, confused, anxious? Where exactly will they look next? This information can be fed back into the VR/AR experience to greatly enhance immersion,” Sarkar said.
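Sarkar's description of measuring eye dynamics within saccades maps onto a standard technique in eye-tracking research: classifying samples as saccades when angular velocity crosses a threshold (the I-VT method). The sketch below shows that general approach on dense samples like those AdHawk describes; the 300 deg/s threshold and data layout are illustrative assumptions, not AdHawk's implementation.

```python
import math


def detect_saccades(samples, velocity_threshold=300.0):
    """Velocity-threshold (I-VT) saccade detection.

    samples: list of (t_seconds, x_deg, y_deg) gaze points.
    Returns (start_index, end_index) pairs, one per saccade, where
    inter-sample speed meets or exceeds velocity_threshold (deg/s).
    """
    in_saccade = False
    start = None
    events = []
    for i in range(1, len(samples)):
        t0, x0, y0 = samples[i - 1]
        t1, x1, y1 = samples[i]
        dt = t1 - t0
        if dt <= 0:
            continue  # skip duplicate or out-of-order timestamps
        speed = math.hypot(x1 - x0, y1 - y0) / dt  # deg/s
        if speed >= velocity_threshold and not in_saccade:
            in_saccade, start = True, i - 1
        elif speed < velocity_threshold and in_saccade:
            in_saccade = False
            events.append((start, i))
    if in_saccade:  # saccade still in progress at end of data
        events.append((start, len(samples) - 1))
    return events
```

With thousands of samples per second, each saccade spans many points, which is what makes within-saccade dynamics (and the user-state inferences Sarkar describes) measurable at all.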
