Ask any Oculus Quest or Rift S user to name the headset’s most impressive new feature, and there’s a good chance they’ll point to Insight, the tracking system that accurately senses your head and body movements without external cameras. In blog posts today, Facebook is revealing some of Insight’s little-known technical details while making an interesting prediction: “the future for this technology is in all-day wearable AR glasses that are spatially aware.”
Engineers already knew that Insight uses computer vision and visual-inertial simultaneous localization and mapping (SLAM) technologies for tracking, relying on multiple wide-angle cameras to track controller locations and your relative position in a room. But making Insight run on a mobile processor for VR purposes was no simple feat, due in equal parts to the heavy computational requirements of real-time 3D space mapping and a headset wearer’s intolerance for tracking imperfections, such as jitter, swimminess, and inaccuracy, that might not be noticeable on mobile displays. Oculus engineers decided that Insight’s “spatial AI” system needed to work to a sub-millimeter level of accuracy, and do so without eating the Quest’s entire processing budget.
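To see why visual-inertial SLAM is computationally demanding yet necessary, consider the core fusion problem in one dimension. The toy sketch below is a generic complementary filter, not Oculus’s actual Insight pipeline: high-rate IMU integration is cheap but drifts over time, so a lower-rate, camera-derived position estimate periodically pulls the estimate back toward reality.

```python
def fuse(imu_accels, camera_positions, dt=0.001, camera_every=10, alpha=0.98):
    """Toy 1D visual-inertial fusion: dead-reckon position from IMU
    accelerations, blending in periodic camera-based position fixes.
    Illustrative only; names and parameters are hypothetical."""
    pos, vel = 0.0, 0.0
    for i, accel in enumerate(imu_accels):
        vel += accel * dt   # integrate acceleration into velocity (drift-prone)
        pos += vel * dt     # integrate velocity into position
        # Every camera_every IMU samples, a slower camera fix arrives
        if i % camera_every == 0 and i // camera_every < len(camera_positions):
            cam = camera_positions[i // camera_every]
            # Complementary filter: nudge the drifting estimate toward the fix
            pos = alpha * pos + (1 - alpha) * cam
    return pos
```

In a real headset the state is a full 6-degree-of-freedom pose and the fusion is far more sophisticated, but the trade-off is the same: the IMU supplies smooth, low-latency updates while the cameras anchor the estimate to the room.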
Several tricks brought Insight to life, including efficient multithreading. The Oculus team reserved Quest’s more power-hungry GPU and 2.45GHz “gold” CPU cores entirely for apps and games, while giving its power-efficient 1.9GHz “silver” CPU cores responsibility for tracking and OS functions. Optimizations enabled the silver cores to operate predictively and semi-independently in maintaining Insight’s room maps and positional data, while the gold cores and the GPU generated visuals specific to the user’s current viewing angle.
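The split described above, with tracking running semi-independently of rendering, can be sketched as a simple producer-consumer pattern: a background thread (standing in for the silver cores) continuously refreshes the latest pose, while the render path (standing in for the gold cores and GPU) only ever reads the most recent value. This is an illustrative sketch with hypothetical names, not Oculus’s API.

```python
import threading
import time

class PoseTracker:
    """Toy sketch of decoupling tracking from rendering. A background
    'tracking' thread keeps the latest head pose current; the render
    path reads the newest value without waiting on tracking work."""

    def __init__(self):
        self._lock = threading.Lock()
        self._pose = (0.0, 0.0, 0.0)
        self._thread = None

    def _track_loop(self, samples):
        # Stand-in for a stream of camera/IMU-derived pose updates
        for pose in samples:
            with self._lock:
                self._pose = pose
            time.sleep(0.001)  # tracking runs at its own cadence

    def start(self, samples):
        self._thread = threading.Thread(
            target=self._track_loop, args=(samples,), daemon=True)
        self._thread.start()

    def latest_pose(self):
        # Render thread grabs the newest pose; it never blocks on tracking
        with self._lock:
            return self._pose

    def join(self):
        self._thread.join()
```

On a real device this decoupling would be combined with CPU affinity so the tracking threads stay on the efficiency cores, but the essential point is that rendering never waits on map maintenance.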
To reduce the load on the silver cores while improving accuracy, the team cross-referenced external motion capture data from high-end OptiTrack cameras against internal data captured from real Quest and Rift S users in “hundreds of environments,” including studios and employees’ homes with varied real-world lighting conditions and obstructions. The resulting models captured users’ positions precisely enough to serve as ground truth for Insight’s estimates, letting the team trim unnecessary processing and eliminate inaccuracies. Low-latency sensor data and kinematic prediction also helped considerably.
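Cross-referencing a tracker against motion-capture ground truth typically reduces to computing a positional error metric over matched trajectories. The sketch below is a generic evaluation helper (the actual Oculus tooling is not public) that reports root-mean-square error in millimeters, the scale at which a sub-millimeter accuracy target would be judged.

```python
import math

def rmse_mm(ground_truth, tracked):
    """Root-mean-square positional error, in millimeters, between a
    motion-capture trajectory (ground truth) and the headset's own
    estimates. Both inputs are equal-length lists of (x, y, z) tuples
    in meters. A generic evaluation sketch, not Oculus's tooling."""
    assert len(ground_truth) == len(tracked)
    squared_errors = [
        sum((g - t) ** 2 for g, t in zip(gt, tr))
        for gt, tr in zip(ground_truth, tracked)
    ]
    return math.sqrt(sum(squared_errors) / len(squared_errors)) * 1000.0
```

A tracker hitting the sub-millimeter target described above would keep this figure below 1.0 across those hundreds of recorded environments.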
Insight’s next big step will apparently be from VR to AR. Going forward, Facebook plans to further reduce sensor latencies and cut “power consumption down to as little as 2 percent of what’s needed for SLAM” on a VR headset, enabling Insight to power tracking for AR glasses capable of being worn all day. The company is already suggesting that they’ll be “lightweight” and “stylish,” at least eventually.
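Latency matters because rendering has to target where the head will be when the frame actually reaches the display, and the standard mitigation is to extrapolate the tracked pose forward by the expected latency. The function below is a generic constant-acceleration kinematics sketch of that idea, not Insight’s actual predictor.

```python
def predict_position(pos, vel, accel, latency_s):
    """Extrapolate a tracked (x, y, z) position forward by latency_s
    seconds using constant-acceleration kinematics. A generic sketch
    of kinematic prediction; names are hypothetical."""
    return tuple(
        p + v * latency_s + 0.5 * a * latency_s ** 2
        for p, v, a in zip(pos, vel, accel)
    )
```

Even a few milliseconds of well-predicted motion can hide latency that a wearer would otherwise perceive as swimminess, which is part of why lowering sensor latency pays off so directly.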
Despite Facebook’s continued work on AR software, it doesn’t sound like the company is ready to bring Insight to AR hardware right away. Facebook suggests that chips will need to keep evolving — which they will, thanks to manufacturing innovations — and that it will need to develop new AI to “further optimize the process of synthesizing multiple sensor inputs.”
Whether the first pair of all-day wearable AR glasses with Insight are ultimately Facebook-branded or co-created with another company remains to be seen. But it’s clear that Insight already represents enough of a technology milestone for VR that large hardware companies might want to license it for their AR solutions; if they don’t, it’s possible that your first pair of AR glasses will have a Facebook or Oculus logo on their temples.