Connect with top gaming leaders in Los Angeles at GamesBeat Summit 2023 this May 22-23. Register here.
While AI-powered cameras often track objects or people, the same underlying computer vision technology can also digitize the positions of a person's limbs, an application that is becoming increasingly important for mixed reality, where a user's natural hand motions can serve as an alternative to a trackpad, keyboard, or game controller. Using technology acquired from Leap Motion, Ultraleap has become a leader in hand tracking, with solutions for PCs and standalone VR headsets. Now it's offering developers a preview of Gemini, its fifth-generation tracking software, which it says will unlock enterprise use cases for mixed reality headsets by accurately tracking two hands in real time.
Gemini uses computer vision to create virtual hands with fingertips, knuckles, and palms, enabling developers to sense everything from pinches to taps, twists, and other gestures. Ultraleap's wireframe skeletons can be rendered to the viewer as realistic or surrealistic hands, depending on the app. Under the hood, the system analyzes hand motion data and converts it into intent, a process that until now has been solid enough for certain types of games but not precise or reliable enough for many enterprise applications. For added precision, researchers have continued to work on wearables such as rings and gloves, which require additional steps to put on, take off, and charge.
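To illustrate how a tracked hand skeleton can be converted into intent, the sketch below detects a pinch from fingertip coordinates. The data format, threshold, and function names are illustrative assumptions for this article, not Ultraleap's actual API; real code would read fingertip positions from the tracking SDK each frame.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points (tuples of coordinates)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold_mm=25.0):
    """Treat thumb and index fingertips closer than the threshold as a pinch.

    The 25 mm threshold is an illustrative assumption; production code
    would tune it per user and add hysteresis so the gesture doesn't
    flicker on and off near the boundary.
    """
    return distance(thumb_tip, index_tip) < threshold_mm

# Hypothetical fingertip positions in millimeters:
print(is_pinching((0, 0, 0), (10, 0, 0)))  # True  (10 mm apart)
print(is_pinching((0, 0, 0), (80, 0, 0)))  # False (80 mm apart)
```

In practice a gesture layer like this sits between the raw skeleton data and the app, so developers respond to "pinch" or "tap" events rather than joint coordinates.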
The Gemini preview is significant for technical decision-makers because it will enable frictionless use of mixed reality headsets. Simply place goggles or glasses with Ultraleap's integrated software on your head, and you'll be able to start interacting with digital objects as if they were real, using nothing more than your bare fingers. Ultraleap's fourth-generation solution, Orion, was very good at tracking a single hand but struggled to simultaneously track two, particularly when one hand's fingers overlapped and occluded the other's. Gemini improves tracking across both arms, which means an enterprise could equip a field worker with an AR/VR watch-style computer that is "worn" on one wrist and controlled with the other hand, among many potential applications.
Compared with Orion, Gemini promises improved hand smoothness, pose fidelity, and robustness, as well as better hand initialization, collectively enabling virtual hands and arms to appear quickly, move accurately, and not disappear or jitter as much while moving. The base tracking system has been substantially rewritten to dramatically bolster two-hand tracking and interactions, such that one hand’s physical occlusion of the other hand doesn’t cause tracking to break.
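The "smoothness" and reduced jitter described above are the kind of result a temporal filter over successive joint positions produces. The sketch below uses a simple exponential moving average as an illustration; the filter and its coefficient are assumptions for explanatory purposes, not Gemini's actual algorithm.

```python
def smooth_positions(samples, alpha=0.3):
    """Exponentially smooth a stream of 3D joint positions.

    alpha controls responsiveness: lower values suppress jitter more
    but lag further behind fast motion. The value 0.3 is an
    illustrative choice, not a parameter from Gemini.
    """
    smoothed = []
    state = None
    for p in samples:
        if state is None:
            state = p  # first sample passes through unchanged
        else:
            state = tuple(alpha * c + (1 - alpha) * s
                          for c, s in zip(p, state))
        smoothed.append(state)
    return smoothed

# Noisy samples jittering around x = 100 mm settle near the true position.
noisy = [(100, 0, 0), (103, 0, 0), (97, 0, 0), (102, 0, 0), (98, 0, 0)]
print(smooth_positions(noisy)[-1])
```

The trade-off between jitter suppression and latency is why "smoothness" and "pose fidelity" are listed as separate improvements: a filter that only smooths more would make tracked hands lag visibly.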
The developer preview is currently available only for Windows 10 computers using the Leap Motion Controller or the newer Stereo IR 170 camera, with support for other platforms coming later. To assist PC developers, Ultraleap has also added new "Screentop" modes that enable the tracking camera to be mounted above an interactive screen, as an alternative to the prior headset and desk-surface mounting locations.
Mixed reality headsets based on Qualcomm’s Snapdragon XR2 chip will also take advantage of the new Gemini technologies, as Qualcomm notes that it has been working to integrate Ultraleap’s hand-tracking software directly into the XR2 5G platform. This means it would be available for developers regardless of whether a given headset maker opts to use XR2 reference design cameras or Ultraleap’s Stereo IR 170. Varjo is also integrating Ultraleap’s most recent tracking technologies directly into its latest XR-3 and VR-3 headsets.