Starting today, Google Assistant can quickly open, search, and interact with some of the most popular Android apps on the Google Play Store. To kick off its Google Assistant Developer Day conference, Google this morning announced new Assistant shortcuts across fitness, social media, payment, ride-hailing, and other categories of apps for actions like finding a ride, ordering food, and playing music. Beyond this, it detailed improvements headed to Assistant-powered smart displays like the Nest Hub.
The pandemic appears to have supercharged voice app usage, which was already on an upswing. According to a study by NPR and Edison Research, the percentage of voice-enabled device owners who use voice commands at least once a day rose between the beginning of 2020 and the start of April. Just over a third of smart speaker owners say they listen to more music, entertainment, and news from their devices than they did before, and owners report requesting an average of 10.8 tasks per week from their assistant this year, compared with 9.4 tasks in 2019. And according to a new report from Juniper Research, consumers will interact with voice assistants on 8.4 billion devices by 2024.
Expanded App Actions
The Assistant integrations were made possible in part by App Actions, a developer service that creates deep links between Android smartphone apps and Assistant. App Actions, which Google showcased for the first time at its I/O developer conference in 2018, initially launched in four categories — health and fitness, finance and banking, ride-sharing, and food ordering — when it entered public preview last year. (It now supports 10 verticals in total, including social, games, travel and local, productivity, shopping, and communications.) App Actions complement App Slices, which were introduced in 2019 to serve up content and data from apps. Both aim to spark reengagement among users without forcing developers to build separate experiences for Assistant.
App Actions behave like shortcuts to parts of Android apps. They build on top of existing functionality in apps, and the development process is similar for each App Action developers choose to implement. Basically, App Actions take users to content within apps via deep link URLs, which developers specify. By transferring intents and commands from Assistant to an app, App Actions enable users to do things like order Dunkin’ Donuts, buy stock with Etrade, and send money with PayPal. As for App Slices, they let users ask things like “How many miles did I run today?” and receive responses from apps such as Nike Run Club without leaving Assistant.
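In practice, developers declare these intent-to-deep-link mappings in an XML configuration file bundled with the app. The sketch below shows what such a declaration might look like for a food-ordering action; the exact schema, intent name, and URL template are illustrative, and real apps would use intents from Google's built-in intents catalog along with their own deep link URLs.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- actions.xml: a hedged sketch of an App Actions declaration.
     Placed in the app's XML resources and referenced from the manifest. -->
<actions>
    <!-- Map a built-in "order menu item" intent to the app's deep link. -->
    <action intentName="actions.intent.ORDER_MENU_ITEM">
        <fulfillment urlTemplate="https://example.com/order{?item}">
            <!-- Bind the item the user spoke to the {item} URL parameter. -->
            <parameter-mapping
                intentParameter="menuItem.name"
                urlParameter="item" />
        </fulfillment>
    </action>
</actions>
```

When Assistant matches a user's request to the declared intent, it fills the URL template with the extracted parameters and opens the app at that deep link, so the app handles the request with its existing screens rather than a separate voice experience.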
Among the over 30 new apps supported are Nike Adapt, Nike Run Club, Spotify, Postmates, MyFitnessPal, Mint, Discord, Walmart, Etsy, Snapchat, Twitter, Citi, Dunkin, PayPal, Wayfair, Wish, Uber, and Yahoo! Mail. Assistant now recognizes and acts on commands like:
- “Hey Google, send a message to Rachel on Discord”
- “Hey Google, search for candles on Etsy”
- “Hey Google, log a berry smoothie on MyFitnessPal”
- “Hey Google, check my accounts on Mint”
- “Hey Google, tighten my shoes with Nike Adapt”
- “Hey Google, start my run with Nike Run Club”
- “Hey Google, order a smoothie on Postmates”
- “Hey Google, send snap with Cartoon Lens”
- “Hey Google, find Motivation Mix on Spotify”
- “Hey Google, check news on Twitter”
- “Hey Google, when is my Walmart order arriving?”
In a related change, Google says that Assistant will begin showing relevant App Actions even when users don’t mention an app explicitly by name. For example, if they say “Hey Google, show me Taylor Swift,” Assistant might highlight a suggestion chip that will guide them to the search results page in Twitter. Assistant will also suggest apps proactively depending on individual usage patterns.
Alongside these integrations and recommendations, Google is introducing the ability to create custom shortcut phrases for specific tasks, a feature first exposed in March by the code sleuths at 9to5Google. Instead of saying “Hey Google, tighten my shoes with Nike Adapt,” users can create a shortcut like “Hey Google, lace it.” Alternatively, they can explore suggested shortcuts by saying “Hey Google, show my shortcuts.”
Google says that all phones running Android 5 and higher should support the new app integrations and shortcuts. (Assistant on Android Go doesn’t support App Actions.) Additional apps and expanded device support are expected to arrive at a future date. “Whether you want a faster way to get into your apps, or create custom shortcuts for your most common tasks, we’re excited to keep making Android and your apps even more useful and convenient, giving you time back to enjoy what matters most,” Assistant product director Baris Gultekin wrote in a blog post.
Notably, Amazon months ago launched its answer to App Actions in Alexa for Apps, which integrates iOS and Android apps' content and functionality with Alexa. Through deep linking, developers can assign tasks like opening an app's home page or rendering search results to Alexa app voice commands.
New Smart Display features
On the smart display side, Google announced two new English voices that take advantage of an improved prosody model to make Assistant sound more natural and fluent. They're available now, and developers can leverage them in existing Actions.
In addition to the new voices, Google is expanding Interactive Canvas, an API that lets developers build Assistant experiences that can be controlled via touch and voice, to education and storytelling verticals. Soon, education and storytelling intents will be open for public registration, enabling users to say things like “Hey Google, teach me something new” or “Hey Google, tell me a story” to be presented with learning or story collections of apps.
In an effort to improve sharing and transactions, Google says it's introducing household authentication tokens that allow users in a home to share games, apps, and more. In the future, users on one smart display will be able to start a puzzle, for example, and let other users on another device pick up where they left off. As for transactions, smart displays will support Voice Match as an option for payment authorization, ahead of on-display CVC entry arriving next year.
Lastly, Google is launching two features in beta — Link with Google and App Flip — for improved account linking flows and reintroducing its Action links discovery tool as Assistant links. Link with Google enables anyone with a logged-in Android or iOS app to complete the linking flow on a smart display with a few taps, while App Flip allows users to link their accounts to Google without having to re-enter their credentials. Meanwhile, Assistant links enable developer partners to deliver Assistant experiences on websites as well as deep links to Assistant integrations from anywhere on the web.