Hawkeye makes use of the TrueDepth camera system on Apple’s latest iPhone 11 and iPad products, which pairs an infrared dot projector with an infrared camera to sense depth. That depth data allows software to accurately track the movement of your eyes, letting researchers and designers understand how a user experiences a product, from what catches their attention to where they get confused.
But Hawkeye founder Matt Moss said in an email to VentureBeat that existing research tools leave a lot to be desired.
Traditional event-based analytics and always-on screen recorders show which pages the user visits and what actions they perform, but not much more. Narrated user studies, on the other hand, only let you test with a tiny number of people and are incredibly tedious to watch.
Moss started Hawkeye Labs in 2018 after attending Apple’s Worldwide Developers Conference (WWDC) as a student scholarship winner. After learning about ARKit’s advanced 3D face-tracking features, he wondered whether it would be possible to determine where a person is looking on their screen. Once he got a demo working, he posted it to Twitter, where it has been viewed over 350,000 times to date.
Since then, Moss has been working on Hawkeye Labs (at least when he’s not in class at the University of California, Santa Barbara). The self-funded company has three employees and is based in Santa Barbara and San Francisco.