Amazon Alexa is everywhere, and that might be a problem.
At CES 2017, the voice assistant kept popping up in unusual places. At a booth for smart home integrator Legrand, there were outlets for plugging in lamps and appliances that work with Alexa (in addition to the lights and appliances themselves). At the Dish Network booth, a demo let me use Alexa to quickly change channels by voice and find Tom Hanks movies. A watch called the Martian mVoice lets you push a button to activate Alexa. With the Genesis G90 sedan, you can ask Alexa to set the temperature and lock the car. There’s even this weird lamp. All good so far, right?
And yet, is this the future we really wanted?
The problem is that Alexa is not aware of any of the other gadgets. My assistant doesn’t really know about the watch or the car or the television. It’s not context-aware at all, so Alexa doesn’t know to activate only the car functions in the car or deal with TV options on the couch.
The issue is most obvious if you have Alexa near your television at home. In a few cases, the speaker will activate when a commercial comes on and mentions Alexa. (The same thing happens with the Google Home speaker and the Google Assistant.)
It gets worse. Let’s say I have 35 gadgets in my home that all support Alexa. This is entirely possible in 2017. OK, so how is it an advantage to keep saying Alexa, Alexa, Alexa all day? I talk to the fridge, then my car, then my television. I’m talking to bots all day. You can’t issue global commands, like “I’m going away, turn everything off.” You can’t say you want to watch TV and have Alexa dim the lights. Each individual gadget needs individual attention, all day long.
Here’s the solution. AI needs to improve to the point where we don’t have to do all of this talking. It’s more than just knowing I’m in the living room and don’t want to talk to the stove. It’s knowing I like the Golden State Warriors and turning on that channel or reminding me to watch it live. It’s dimming the lights when I get home because that’s my preference. It’s locking the car in my driveway because an AI knows I’m home at 9 p.m., and that’s when my day ends.
We’ve moved from clicking tiny icons in apps to talking to bots. Seeing the Amazon logo everywhere at CES made me question the logic of this mass voicebot rollout. Far worse than this is a future scenario where we talk to Google, Siri, Alexa, Cortana, and a dozen other bots that are competing for our vocal cords. It’s crazy! Now is the time to move quickly to make a more advanced AI that can make things easier for us, not make us figure out which bot does what.
Another option? Improve Alexa to a much greater degree. For the watch I mentioned, I don’t want a button. I want Alexa to know I have the watch and to focus on those functions if that’s what is important to me — say, if I’m out for a run. If I ask a question about the weather, for example, Alexa should know it’s because I’m going running — make it local. If I’m in the car, I want to know about traffic problems due to weather. In the end, Alexa needs a higher IQ.