Google today is talking for the first time about the system that powers its Smart Reply feature for responding to messages on Android Wear 2.0.
First, on wearables running Android Wear 2.0, you’ll be able to respond to messages from any messaging app, not just a Google app.
Second, this is Google’s first completely on-device machine-learning technology, as Google Research staff research scientist Sujith Ravi explains in a blog post. Google typically runs its machine learning models on powerful computers in remote cloud data centers, but not here — with a watch, you want things to work right away.
And a watch has less computing and memory capacity than even a smartphone, so Google has devised a clever system to deal with these constraints. This lines up well with how Google shrank an artificial neural network and its associated training set to fit on a smartphone for instant translations in Google Translate on Android and iOS. (Meanwhile, Facebook has been working on similar capabilities, albeit in a very different context: style transfer for photos.) But again, with a watch, everything must be smaller.
With watchOS 3, the Apple Watch does offer canned message replies, but it’s not clear what technology Apple uses for them. Importantly, Ravi points out in today’s blog post that the model powering suggested message replies can adapt “to cater to the user’s writing style and individual preferences to provide a personalized experience” — sort of like Google’s Gboard virtual keyboard, that is. Frankly, the sophistication that’s gone into these predictions might be enough to make some people — including me! — give Android Wear 2.0 a go.
To train the model to predict applicable replies, Google relies on, among other things, semi-supervised graph learning, which was first documented in a 2015 paper coauthored by Ravi, as well as a method called locality-sensitive hashing. Google will describe the entire system in a forthcoming paper, Ravi writes.
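To give a flavor of locality-sensitive hashing, here is a minimal sketch of one classic variant, random-hyperplane LSH, which hashes vectors by the side of each hyperplane they fall on, so similar vectors tend to collide into the same bucket. This is a generic illustration, not Google’s actual implementation; the 2-D vectors and the fixed hyperplanes are invented for the example (real systems draw the hyperplanes at random and work in much higher dimensions).

```python
import numpy as np

# Random-hyperplane LSH: each hyperplane contributes one sign bit,
# so vectors pointing in similar directions tend to share signatures.
def lsh_signature(vec, hyperplanes):
    return tuple(int(np.dot(h, vec) >= 0) for h in hyperplanes)

# Fixed hyperplanes for a deterministic illustration (in practice
# they would be drawn at random, e.g. from a Gaussian).
hyperplanes = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]])

a = np.array([1.0, 0.1])    # two nearly parallel vectors...
b = np.array([1.0, 0.2])
c = np.array([-1.0, -1.0])  # ...and one pointing the opposite way

sig_a = lsh_signature(a, hyperplanes)
sig_b = lsh_signature(b, hyperplanes)
sig_c = lsh_signature(c, hyperplanes)

# Hamming distance between signatures approximates angular distance.
def hamming(s, t):
    return sum(x != y for x, y in zip(s, t))

print(hamming(sig_a, sig_b))  # 0 -- similar vectors collide
print(hamming(sig_a, sig_c))  # 3 -- dissimilar vectors land apart
```

Because the hash is just a handful of sign bits, it is cheap enough to compute on a low-power device — one plausible reason a technique like this suits a watch.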
See the full blog post for much more detail.