Amazon wants Alexa to someday be just like the conversational computer on Star Trek, said Amazon’s SVP for devices, David Limp, who repeatedly referred to Star Trek on stage today at The Wired Conference in New York City. The Star Trek computer, he said, is Alexa’s “north star.”
The Amazon Echo came to market in early 2015, and the Apple HomePod, Google Home, and the like are now trying to catch up.
Amazon created the first popular smart speaker, Limp said, by encouraging engineers and creative minds inside the company to think of what can be accomplished with a combination of machine learning and the cloud. He said this led the team to work on a device that could understand conversational speech and do things for you. Today, he said, members of the team building Alexa are encouraged to add functionality based on what the Star Trek computer could accomplish.
“The bright light, the shining light that’s still many years away, many decades away, is to recreate the Star Trek computer. That computer, you could be anywhere on the Starship Enterprise and you could say the word ‘computer’ and it would wake up and answer any question, and that’s our goal,” he said.
Earlier this year, Amazon even made it possible for people to use “computer” as a wake word instead of using “Alexa” or “Echo.”
In service of this goal to be more like Star Trek, Amazon has made software development kits available for developers and product creators. These kits let them go beyond smart speakers and put the intelligent assistant into products like cars or home robots or a can of Pringles. Google and Microsoft have done the same.
In response to a question about what new features people should expect to see from Alexa, Limp said Amazon is working to improve Alexa’s understanding of follow-up questions and to “invent the concept of short-term and long-term memory.”
“Your brain is incredibly good at recalling that short-term memory and doing the context switch. We need to teach the cloud, the Alexa cloud in this case, how to understand that context,” he said.
In one of the assistant’s early steps toward a better memory, Alexa first gained the ability to respond to follow-up questions last December.
Discovery of Alexa skills is also a challenge Amazon hopes to tackle. Last November, Amazon launched the Alexa Skills Store for customers to shop for skills, and, in recent months, Alexa gained the ability to make a skill recommendation. If a user says “Alexa, order me a car,” for example, Alexa may recommend enabling the Lyft or Uber skill.
“I think for us the really interesting problems are — in a world where we just passed 13K of these skills — [how to] imagine a world in the not-so-distant future where there’s a million people in garages and universities and hackers and hobbyists writing skills for an AI like Alexa. So how do you make it so that’s discoverable and easily manageable? It’s a very hard problem, and even modern app stores haven’t solved it. Finding the long tail of an app is a very hard thing to do,” Limp acknowledged.