Spurred on by big events like Mobile World Congress and competition with rivals like Alexa and Siri, Google Assistant has rolled out some major new features in recent weeks. Here are four of the most interesting and worthwhile new ways to interact with Google’s AI assistant.
Location-based reminders

Location-based reminders may seem like a simple enough addition, but they’re a game changer, one that gives Assistant and its Google Home speakers a healthy dose of practical value.
It means you can tell Assistant to remind you to pick up milk when you get to the store, or to speak with your kid’s teacher when you pick them up from school.
Home and work locations can be preset in the Google Assistant app, so you can just say “Remind me when I get to work to print out the presentation” or “Remind me when I get home to take out the trash.” Other locations, like a nearby store, may need more information, such as a street name, to be understood.
The ability to preset other commonly visited places, like the home of a family member, your kid’s school, or your place of worship, would be a helpful addition in the future.
Siri got the ability to give location-based reminders with iOS 11, and Microsoft’s Cortana is also able to do this.
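The article doesn’t say how Assistant decides you’ve arrived, but location-based reminders are typically built on geofencing: each reminder stores a target coordinate, and the reminder fires when the device’s position comes within some radius of it. Below is a minimal, hypothetical sketch in Python; the coordinates, the 150-meter radius, and the `due_reminders` helper are illustrative assumptions, not Google’s implementation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def due_reminders(position, reminders, radius_m=150):
    """Return the text of reminders whose geofence contains the current position."""
    lat, lon = position
    return [rem["text"] for rem in reminders
            if haversine_m(lat, lon, rem["lat"], rem["lon"]) <= radius_m]

# Hypothetical preset locations ("work" and "home") with attached reminders
reminders = [
    {"text": "print out the presentation", "lat": 37.4220, "lon": -122.0841},  # work
    {"text": "take out the trash", "lat": 37.3861, "lon": -122.0839},          # home
]
print(due_reminders((37.4221, -122.0841), reminders))  # → ['print out the presentation']
```

A real implementation would register geofences with the phone’s OS rather than poll coordinates, but the trigger logic is the same idea.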
Routines

Scheduled or customized commands that carry out multiple tasks were extended to Alexa users in the fall, and with the Routines feature introduced this week, Google Assistant users got the ability to do the same.
Each routine can be configured to your liking in the Google Home app. Routines can be set to do different things, such as control multiple smart home devices, share your calendar info, or play streaming music.
Before Routines were introduced, My Day, which was activated by saying “Tell me about my day,” was the only action Google Assistant could take that made multiple things happen with a single utterance. With the introduction of Routines, My Day has been renamed the Good Morning routine.
As was the case before, the Good Morning routine will allow you to hear the weather, your calendar, your work commute, and the news — or the radio or a podcast or an audiobook. A Good Morning routine can also take your phone off silent, turn on the lights in your room, raise the blinds, and start your coffee.
Say “OK Google, good night,” and you can set your phone to silent, hear tomorrow’s weather, turn out the lights, and have Assistant ask when to set your alarm.
After that, Google Home speakers can be preset to play music, like a favorite nighttime playlist, or sleep sounds like a babbling brook, the ocean, or country night. Both your sleepytime playlist and sleep sounds can be set to turn off after a certain amount of time.
There are also routines for when you enter and leave the home, and for the commute to and from work.
Tell Google Assistant you’re ready to commute home and it might adjust the temperature in your living room, resume your podcast, and tell you about traffic. Or you may hear your reminders read to you each time you walk in the door and say “Hey Google, I’m home.”
Good morning and good night routines can take up to eight actions at once, while routines for entering and leaving the home are limited to turning smart home appliances on or off and adjusting thermostat temperature.
Routines are, at their heart, an essential part of the next stage of voice computing. It’s helpful to be able to issue “Hey Google” commands one at a time for things you need to get done. It’s downright transformative to be able to say “Hey Google, turn up the temperature 5 degrees, shut the blinds, resume my podcast, and send me ideas for dinner.” Chances are you won’t always need this sort of feature, but it expands how much you can rely on an assistant to quickly get things done.
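Conceptually, a routine is just a voice trigger mapped to an ordered list of actions, with a cap on how many actions can run. The toy Python model below sketches that idea; the `Routine` class and its action names are hypothetical illustrations, not Google’s API.

```python
class Routine:
    """A voice trigger mapped to an ordered list of actions (toy model)."""

    def __init__(self, trigger, max_actions=8):
        # The article notes morning/night routines can take up to eight actions.
        self.trigger = trigger
        self.max_actions = max_actions
        self.actions = []

    def add(self, name, fn):
        if len(self.actions) >= self.max_actions:
            raise ValueError(f"'{self.trigger}' is limited to {self.max_actions} actions")
        self.actions.append((name, fn))
        return self  # allow chaining

    def run(self):
        # Execute actions in order; return the names of those that succeeded.
        return [name for name, fn in self.actions if fn()]

good_morning = Routine("good morning")
good_morning.add("take phone off silent", lambda: True) \
            .add("read weather", lambda: True) \
            .add("turn on lights", lambda: True) \
            .add("start coffee", lambda: True)
print(good_morning.run())
```

In a real assistant each action would call out to a smart home device or media service; the key property is simply that one trigger fans out to many ordered actions.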
Business card scans
Google’s Lens computer vision AI was introduced last fall and is spreading fast. In the past week, Lens has started to roll out for Google’s Photos app on Android and iOS, and as part of the native Google Assistant on Android smartphones. This means that with a hard press of the home button and tap of the Lens icon, you can accomplish a variety of tasks like identify a business, scan a barcode, or even recognize famous people.
Last week, Google added business card scanning for Lens. Point Lens at a card and it will draw out the name, company, title, phone number, email address, and other pertinent information, then give you the option to save a new contact.
Using business card scans with Lens in Google Assistant on Android is really easy to do, and like the Now Playing feature for automatic song identification, feels like something that should have been an option all along.
Photos of business cards scanned by Lens are saved in the My Activity area of Google Assistant, which lets you see your past query and search history. Unfortunately, Google Assistant does not yet appear to provide a designated place to view business cards. Let’s hope this changes in the future.
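Under the hood, card scanning boils down to OCR followed by entity extraction. As a rough illustration of the second step only, the toy parser below pulls a name, phone number, and email address out of already-OCR’d text; the regex heuristics and the sample card are assumptions for demonstration, not how Lens actually works.

```python
import re

def parse_card(text):
    """Pull contact fields from OCR'd business-card text (toy heuristic)."""
    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", text)
    phone = re.search(r"\+?\d[\d\s().-]{7,}\d", text)
    lines = [line.strip() for line in text.splitlines() if line.strip()]
    return {
        "name": lines[0] if lines else None,  # assume the name is the first line
        "email": email.group(0) if email else None,
        "phone": phone.group(0) if phone else None,
    }

# Hypothetical OCR output from a scanned card
card = """Ada Lovelace
Analytical Engines Inc.
+1 (555) 010-2368
ada@example.com"""
print(parse_card(card))
```

A production system would use layout and language models rather than regexes, but the output is the same kind of structured contact record Lens offers to save.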
Lens identification of plants and animals
This is a fun way to bring out information about your surroundings and, combined with the bird sound identification action, can tell you a lot about the world.
Playing with Lens can generate some strange results at times. A picture of a distant cat, for example, was mistaken for a caterpillar, and a photo of tall grass made Lens suggest it was looking at tall trees.
As Lens’ object identification skills grow, it will be interesting to see if Lens mistakes, say, a poisonous mushroom or venomous snake for a benign species, and what other misidentifications take place.