Google is trying to convince Android smartphone app developers that integrating with Google Assistant can spark re-engagement among users. That pitch was made last week in multiple sessions at Google’s annual I/O developer conference.
Engagement may be a more urgent matter now, as last week Google began inviting Android users to delete unused apps.
App Actions create deep links between Android smartphone apps and Google Assistant. They launched in four categories: Health & Fitness, Finance and Banking, Ridesharing, and Food Ordering.
“Why extend your Android app to the Assistant? What’s in it for you? Well we know that there are challenges for re-engagement for mobile apps, and we think that part of that is because there’s friction in the app experience today,” Google developer advocate Daniel Myers told developers onstage at I/O last week.
Using your voice to get something done with an Android app may cut the number of touches and actions needed from eight down to three, Myers said.
“It will unlock a much faster and deeper Android experience than ever, thanks to Google’s advances with on-device AI, but we know that mobile apps will continue to be the foundation of how users get things done on Android, and so we envision a seamless connection with the mobile ecosystem to be a key feature of the next generation assistant, and we’re excited to work with you to make that happen via App Actions,” he said.
App Actions transfer intents and commands from Google Assistant directly to an app and will let users do things like order Dunkin’ Donuts, buy stock with Etrade, send money with PayPal, or start to track an exercise with the Nike Run Club app.
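Under the hood, developers declare these capabilities inside their APK rather than building a separate voice experience. As a rough sketch, an App Actions declaration for a food-ordering deep link might look something like the following actions.xml; the built-in intent name reflects Google’s naming convention, but the URL template and parameter names here are illustrative placeholders:

```xml
<?xml version="1.0" encoding="utf-8"?>
<actions>
    <!-- Map Google Assistant's built-in food-ordering intent to a deep link in the app -->
    <action intentName="actions.intent.ORDER_MENU_ITEM">
        <!-- Assistant fills {itemName} from the user's spoken request -->
        <fulfillment urlTemplate="example-food-app://order{?itemName}">
            <parameter-mapping
                intentParameter="menuItem.name"
                urlParameter="itemName" />
        </fulfillment>
    </action>
</actions>
```

When a user says something like “order a glazed donut,” Assistant matches the built-in intent, gathers the parameter from the utterance, and launches the app directly at the fulfillment deep link, which is how the touch count drops from eight to three.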
“I think this is in some ways an evolution of that idea: can we just gather the parameters in the first invocation and pass it directly to the app? We also see that it’s easier for developers, because all the work it takes to do this is all in Android Studio, all in their APK; they don’t have to build a separate experience,” Google group product manager Brad Abrams told VentureBeat in an interview.
Google showcased App Actions for developers for the first time at I/O last year, while Apple’s Siri Shortcuts, a similar service for app developers, made its debut last fall.
App Actions then launched in developer preview last week. A public launch is planned in the coming months, Abrams said. Google initially began to integrate Android apps and Google Assistant last year as a way to complete experiences that start with voice apps or conversational actions.
Claims of app engagement increases with Google Assistant connections seem premature, since App Actions won’t be made publicly available until this summer. No data has been collected that demonstrates Google Assistant’s effectiveness as a re-engagement tool, Abrams said.
“We do see that users are using fewer and fewer apps, and some of it could be that there are just so many icons on the phone and how do people really know what they want, and we think being able to say with their voice what they want will be an easier way for users to connect with their apps,” he said.
Though it’s not yet clear how Google Assistant may impact engagement with Android apps, a major focus of the conference last week seemed to be bringing richer media to Google search and Google Assistant, continuing a trend toward a more visual Google Assistant and Google search that started last fall.
With Lens, Google wants to use computer vision to index the physical world like the web, letting people’s cameras translate text they see or highlight top items on a restaurant menu. With Google Assistant, people should be able to do things about as fast as they can think of an action, and with tools like Duplex for the Web, people should be able to use language-understanding AI to book restaurant reservations or rent a car.
Augmented reality, video, and mobile and voice apps with both Google Assistant and search were a main theme of the conference last week. Also introduced last week: How-to markup, a schema.org markup that helps Google index websites, apps, and even YouTube videos to answer how-to questions asked with Google Assistant.
How-to markup will enrich results seen on smart displays like Nest Home Hub and Nest Hub Max with numbered step-by-step lists, combined with text, images, and video.
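To give a sense of the format, How-to markup is expressed as schema.org structured data embedded in a page. A minimal JSON-LD sketch, with placeholder names and URLs, might look like this:

```json
{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "How to change a flat tire",
  "step": [
    {
      "@type": "HowToStep",
      "name": "Loosen the lug nuts",
      "text": "Loosen the lug nuts slightly before jacking up the car.",
      "image": "https://example.com/images/step1.jpg"
    },
    {
      "@type": "HowToStep",
      "name": "Raise the vehicle",
      "text": "Place the jack under the frame and raise the car until the tire clears the ground."
    }
  ]
}
```

Each HowToStep can carry its own text, image, or video, which is what lets a smart display render a numbered, media-rich walkthrough rather than a plain text answer.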
More than 50% of respondents to a 2018 PricewaterhouseCoopers study said asking questions was a common thing they do with a voice assistant on a monthly basis.
Answering questions is one of the most popular use cases for Google Assistant, according to the Adobe Voice Report. Analysis last year found that Google Assistant was the best AI assistant for answering questions, followed by Siri, Alexa, and Cortana.
Perhaps in reaction to Amazon’s Alexa Presentation Language, Google introduced Interactive Canvas for voice apps, starting with games.
App Slices were introduced last year to serve up content and data from apps. So you can ask the Nike Run Club app “How many miles did I run today?” and the response can be served up without ever leaving Google Assistant.
Beyond App Actions, Google also shared last week that it’s testing Mini apps to allow a brand to add an action button to Google Assistant or Google search results.
Google’s ubiquitous and comprehensive voice strategy, rooted in removing friction from purchases, was a main theme at the conference and central to its pitch to thousands of developers. While it’s unclear to what extent Google Assistant integration will impact Android app download rates or re-engagement with already downloaded apps, it seems clear Google doesn’t plan to limit Google Assistant to voice apps.
Instead, it’s fusing together an amalgamation of content and services, making new experiences by leveraging its Android and search engine ecosystems.