Google today announced changes to ML Kit, its toolset for developers to infuse apps with AI, designed to make it easier to use offline. While the original ML Kit was tightly integrated with Firebase, Google's app development platform, the refreshed ML Kit makes its on-device APIs available in a standalone SDK that doesn't require a Firebase project.
Google notes that more than 25,000 applications on Android and iOS now use ML Kit's features, up from just a handful at its introduction in May 2018. Much like Apple's Core ML, ML Kit is built to tackle challenges in the vision and natural language domains, including text recognition and translation, barcode scanning, and object classification and tracking.
With the transition from ML Kit for Firebase’s on-device APIs to the ML Kit SDK, a face contours model — which can detect over 100 points in and around a user’s face and overlay masks and beautification elements atop them — has been added to the list of APIs shipped through Google Play Services, Google’s background service and API package for Android devices. (This should result in smaller app footprints and enable the model to be reused between apps.) Beyond this, Android Jetpack Lifecycle support has been added to all ML Kit APIs, making integration with the CameraX support library ostensibly easier than before.
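To illustrate, here is a minimal sketch of how an Android app might request face contours through the standalone SDK's face detection API. It assumes the `com.google.mlkit:face-detection` Gradle dependency and a `Bitmap` already in hand; error handling and UI overlay logic are elided.

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceContour
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

fun detectFaceContours(bitmap: Bitmap) {
    // Request full contour output rather than just landmarks.
    val options = FaceDetectorOptions.Builder()
        .setContourMode(FaceDetectorOptions.CONTOUR_MODE_ALL)
        .build()
    val detector = FaceDetection.getClient(options)
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)

    detector.process(image)
        .addOnSuccessListener { faces ->
            for (face in faces) {
                // Each contour is a list of points an app can use to
                // anchor masks or beautification effects.
                val outline = face.getContour(FaceContour.FACE)?.points
            }
        }
        .addOnFailureListener { e -> /* handle detection failure */ }
}
```

Because the contours model ships through Google Play Services, the app does not need to bundle it, which is where the smaller footprint Google mentions comes from.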
Lastly, two new APIs are now available as part of the ML Kit early access program: entity extraction and pose detection. Entity extraction detects entities in text, such as phone numbers, addresses, payment card numbers, tracking numbers, and dates and times, and makes them actionable. Pose detection tracks 33 skeletal points, including the hands and feet.
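A hedged sketch of what calling the early-access entity extraction API looks like, assuming the `com.google.mlkit:entity-extraction` dependency; since the model downloads on demand, the sketch chains the download before annotating:

```kotlin
import com.google.mlkit.nl.entityextraction.EntityExtraction
import com.google.mlkit.nl.entityextraction.EntityExtractorOptions

fun extractEntities(text: String) {
    val extractor = EntityExtraction.getClient(
        EntityExtractorOptions.Builder(EntityExtractorOptions.ENGLISH).build()
    )
    // Make sure the on-device model is present, then annotate the text.
    extractor.downloadModelIfNeeded()
        .onSuccessTask { extractor.annotate(text) }
        .addOnSuccessListener { annotations ->
            for (annotation in annotations) {
                // annotation.entities lists what was found in this span
                // (phone number, address, date/time, and so on),
                // which the app can turn into tappable actions.
            }
        }
        .addOnFailureListener { e -> /* handle failure */ }
}
```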
In a related development, ML Kit now supports custom TensorFlow Lite image labeling, object detection, and object tracking models. Support will expand to additional types of models in the future, Google says.
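The custom-model path looks roughly like the following sketch, which points the image labeling API at a bundled TensorFlow Lite model. The asset path `model.tflite` and the threshold values are illustrative, not prescribed by the SDK.

```kotlin
import com.google.mlkit.common.model.LocalModel
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.custom.CustomImageLabelerOptions

fun labelWithCustomModel(image: InputImage) {
    // Wrap a TensorFlow Lite model bundled in the app's assets.
    val localModel = LocalModel.Builder()
        .setAssetFilePath("model.tflite")
        .build()
    val options = CustomImageLabelerOptions.Builder(localModel)
        .setConfidenceThreshold(0.5f)
        .setMaxResultCount(5)
        .build()
    val labeler = ImageLabeling.getClient(options)

    labeler.process(image)
        .addOnSuccessListener { labels ->
            for (label in labels) {
                // label.text and label.confidence come from the custom model.
            }
        }
        .addOnFailureListener { e -> /* handle failure */ }
}
```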
The updates come after Google added new natural language processing services for ML Kit, including Smart Reply, last year. (Smart Reply suggests text responses based on the last 10 exchanged messages and runs entirely on-device, and it’s been incorporated into Gmail, Google Chat, and Google Assistant on smart displays and smartphones.) Last year during Google’s I/O 2019 developer conference, ML Kit gained three new capabilities in beta, starting with a translation API supporting 58 languages and a pair of APIs that let apps locate and track objects of interest in a live camera feed in real time.
Google says that for the time being, ML Kit will continue to work with Firebase features like A/B testing and Cloud Firestore and that the cloud-based APIs and model deployment will remain available through Firebase Machine Learning. However, developers who wish to take advantage of the new features and updates will have to switch to the SDK.