It’s been roughly two years since Google launched ARCore, a software development kit for augmented reality apps on Android, iOS, and Chrome OS. The Mountain View company says that to date, it’s been used to create thousands of titles across hundreds of millions of devices. And this week, it’s set to improve on a number of fronts.
First on the list of enhancements is the Augmented Faces API, which rolled out earlier this year as a part of ARCore 1.7. The creation process has been streamlined with a new face effects template, and the API now plays nicely with iOS, which Google points out means developers can create effects for more than a billion users.
As you might recall, the Augmented Faces API works with front-facing Android cameras, detecting a face and overlaying a 468-point 3D mesh that can be used to add mask-like textures, objects, and even facial retouching. Even without a depth sensor, it tracks the mesh automatically as the person moves.
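For a sense of what that looks like in practice, here is a minimal Kotlin sketch of reading the face mesh from an ARCore frame. It assumes a `com.google.ar.core.Session` already configured for the front-facing camera inside an Android app; it is illustrative only and won't run outside that environment.

```kotlin
import com.google.ar.core.AugmentedFace
import com.google.ar.core.Frame
import com.google.ar.core.TrackingState

// Called once per camera frame by the app's render loop (assumed).
fun onFrame(frame: Frame) {
    // Each tracked face exposes the 468-vertex mesh plus named region poses.
    for (face in frame.getUpdatedTrackables(AugmentedFace::class.java)) {
        if (face.trackingState != TrackingState.TRACKING) continue
        val vertices = face.meshVertices  // FloatBuffer of x, y, z triples
        val nosePose = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP)
        // Render mask textures or attach objects relative to these poses here.
    }
}
```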
Next up, the Cloud Anchors API, which lets multiple devices tap the cloud to share information about objects within a real-world scene, can more efficiently host and resolve anchors thanks to improved cloud anchor generation and visual processing. Now, when creating an anchor, more angles across larger areas in the scene can be captured for a more robust 3D feature map. (Google says that once the map is created, visual data is deleted and only anchor IDs are shared so that tracked objects display correctly from each perspective.) Moreover, multiple anchors in the scene can now be resolved simultaneously, reducing the time required to launch a shared AR experience.
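The host-and-resolve flow described above can be sketched roughly as follows in Kotlin. This assumes an active ARCore `Session` on each device and some out-of-band channel (your own backend, assumed here) for sharing the anchor ID; in a real app, hosting is asynchronous and the anchor's state should be polled across frames rather than checked once.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session

// Enable cloud anchors on the session before hosting or resolving.
fun configureCloudAnchors(session: Session) {
    val config = Config(session)
    config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED
    session.configure(config)
}

// Host: visual data around the local anchor is uploaded to build the
// 3D feature map; once it succeeds, only the cloud anchor ID is shared.
fun host(session: Session, localAnchor: Anchor): String? {
    val hosted = session.hostCloudAnchor(localAnchor)
    return if (hosted.cloudAnchorState == Anchor.CloudAnchorState.SUCCESS)
        hosted.cloudAnchorId
    else null  // still TASK_IN_PROGRESS; keep polling in later frames
}

// Resolve: a second device recreates the anchor from the shared ID, so
// tracked objects display correctly from its own perspective.
fun resolve(session: Session, cloudAnchorId: String): Anchor =
    session.resolveCloudAnchor(cloudAnchorId)
```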
Lastly, Google says it’s working to expand the use of shared AR experiences with persistent Cloud Anchors, or anchors that last indefinitely regardless of surface or distance. They’re already powering Mark AR, a social app developed by Sybo and iDreamSky that lets people create AR art in real-world locations that can be viewed or modified over an extended period of time.
Persistent Cloud Anchors are available in private preview starting today; interested developers can sign up for early access.
“We see this as enabling a ‘save button’ for AR, so that digital information overlaid on top of the real world can be experienced at any time. Imagine working together on a redesign of your home throughout the year, leaving AR notes for your friends around an amusement park, or hiding AR objects at specific places around the world to be discovered by others,” wrote Google augmented reality product manager Christina Tong. “By enabling a ‘save button’ for AR, we’re taking an important step toward bridging the digital and physical worlds to expand the ways AR can be useful in our day-to-day lives.”
Today’s ARCore developments follow on the heels of improvements to the Augmented Images API, which allows users to point their cameras at 2D images (like posters or packaging) and bring them to life, and the Light Estimation API, which provides a single ambient light intensity to mimic real-world lighting in digital scenes. As of May, developers can simultaneously track images and use a new mode — Environmental HDR — that leverages machine learning to “understand high dynamic range illumination” in 360 degrees, taking in available light data and extending the light into a scene.
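Switching a session from the single-intensity mode to Environmental HDR is a configuration change, sketched below in Kotlin. It assumes an existing ARCore `Session` in an Android app; once enabled, each frame's light estimate carries directional main-light, ambient spherical harmonics, and HDR cubemap data instead of one ambient value.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session

// Opt in to Environmental HDR rather than the default AMBIENT_INTENSITY mode.
fun enableHdrLighting(session: Session) {
    val config = Config(session)
    config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    session.configure(config)
}

// Per frame, read the richer estimate to light virtual objects realistically.
fun applyLighting(frame: Frame) {
    val estimate = frame.lightEstimate
    val mainLightDirection = estimate.environmentalHdrMainLightDirection
    val mainLightIntensity = estimate.environmentalHdrMainLightIntensity
    // Feed these into the renderer's directional light (app-specific).
}
```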