Under codename Project Vesta, Amazon’s Lab126 is reportedly making a home robot. Tests in employees’ homes could begin in the coming months, and sales of an Amazon robot may begin as early as 2019.
Anonymous sources speaking with Bloomberg today provided no details about what the domestic robot will look like or how it will function. But Amazon has made a series of big investments in AI in recent years, and if that same toolbox is being used for Project Vesta, there are certain features and forms of machine intelligence we can expect from an Amazon robot.
In late 2017, Alexa chief scientist Rohit Prasad spoke with VentureBeat and other news outlets about the evolution of Alexa and the company’s future plans.
At that time, Prasad talked about how Amazon is experimenting with the detection of a user’s emotional state based on analysis of voice recordings collected each time you have an exchange with an Alexa-enabled device. Amazon will begin with identification of frustration so Alexa knows whether or not she succeeded in completing a task, but will later branch out into detection of other emotions.
Today, you can tell Alexa “I feel sad,” and you will get an automated response back. In the future, Alexa will sense a deviation from the baseline of what your voice typically sounds like, then use this to inform your experience with Alexa-enabled devices.
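The "deviation from a baseline" idea can be sketched in a few lines. This is an illustrative assumption, not Amazon's actual method: the feature (mean pitch in Hz), the sample values, and the 2-sigma threshold are all made up for the example.

```python
from statistics import mean, stdev

def deviation_score(baseline_samples, current, threshold=2.0):
    """Compare a feature from the current utterance (e.g., mean pitch)
    against a user's historical baseline. Returns the z-score and
    whether it crosses the flagging threshold."""
    mu = mean(baseline_samples)
    sigma = stdev(baseline_samples)
    z = (current - mu) / sigma if sigma else 0.0
    return z, abs(z) >= threshold

# Hypothetical pitch readings (Hz) from one user's past exchanges
baseline_pitch = [118, 121, 119, 122, 120, 117, 123]

# A noticeably higher-pitched utterance might hint at stress or agitation
z, flagged = deviation_score(baseline_pitch, 135)
print(round(z, 1), flagged)
```

In practice a system like this would track many acoustic features per user and feed them into a trained classifier, but the core signal is the same: how far today's voice sits from that user's own normal.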
In a robot, this knowledge could express itself in tone of voice, gestures, movement, or facial expression.
For example, Alexa in your car or bedroom could pick up hints of stress or agitation in your voice, and in response, your Amazon robot could have a more welcoming or comforting expression on its face when it looks at you the rest of the day. The same signal could be used to recommend certain types of music or other activity based on your actions the last time you were stressed and agitated.
This intelligence is important not just because it can inform advertisers or lead to personalized experiences, but because understanding how an exchange with a human went will help transform interactions from simple commands into exchanges that feel more natural, potentially with the back-and-forth volley humans call conversation.
As Affectiva CEO Rana el Kaliouby says, robots and assistants like Alexa can tell you a joke today, but they don’t know if you found it funny. Emotional intelligence that uses voice or face analysis to verify your reaction will allow Alexa to understand if you laughed at her joke, then respond by telling you another joke.
Any robot Amazon brings to market will most likely have the ability to recognize users’ faces and deploy the kinds of services found in Amazon Cloud Cam. Released last fall, Cloud Cam uses motion detection and facial recognition to send users alerts on their smartphones. When combined with two-way audio capabilities, the camera — and likely soon an Amazon robot — can send you an alert so you can yell at your dog to get off the couch or greet your kids when they return home from school. AWS also released real-time facial recognition last fall at AWS re:Invent.
Facial recognition can also be used to record photos and videos around the house, à la Google Clips or Mayfield Robotics’ Kuri. Facial recognition can be used to include or exclude certain members of the house from being recorded.
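How might a robot honor those include/exclude preferences? Face recognition systems typically compare embedding vectors and return a similarity score; services like Amazon Rekognition expose such scores through their face-matching APIs. The sketch below is a made-up illustration: the tiny embeddings, the household names, and the 0.8 similarity cutoff are all assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def should_record(face_embedding, enrolled, opted_out, cutoff=0.8):
    """Match a detected face against enrolled household members and
    honor anyone who opted out of being recorded."""
    best_name, best_sim = None, 0.0
    for name, emb in enrolled.items():
        sim = cosine(face_embedding, emb)
        if sim > best_sim:
            best_name, best_sim = name, sim
    if best_sim >= cutoff and best_name in opted_out:
        return False, best_name  # recognized and excluded from recording
    return True, best_name if best_sim >= cutoff else None

# Hypothetical enrolled household members and their face embeddings
enrolled = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.9, 0.3]}

# Alice has opted out, so a close match to her face is not recorded
print(should_record([0.88, 0.12, 0.22], enrolled, {"alice"}))
```

Unrecognized faces (a stranger, a delivery person) would fall below the cutoff and default to whatever household-wide policy applies.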
As deployments by companies like Face++ and Baidu in China make plain, your face could also be used to verify purchases or make payments.
Fashion tips and object detection
Beyond the use of cameras to scan your face, Amazon’s robot could also deploy computer vision to help you with shopping or style.
Last spring, Amazon released the invite-only Echo Look to make Alexa a fashionista. It uses computer vision to recognize clothes and recommend outfits.
In tandem with Amazon’s visual search or DeepLens, your robot may be able to do a lot more than identify the members of your household — picking up on clothing brands, works of art, books, and millions of products for sale.
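Turning detected objects into shopping suggestions could be as simple as filtering labels by confidence and looking them up in a catalog. Everything here is invented for illustration: the label/confidence pairs stand in for output from a detector such as Rekognition's label detection, and the catalog actions are hypothetical.

```python
# Hypothetical mapping from recognized objects to shopping actions
CATALOG = {
    "sneaker": "Shop running shoes",
    "coffee maker": "Reorder coffee filters",
    "paperback": "See similar books",
}

def suggest(detected_labels, min_confidence=0.7):
    """Drop low-confidence detections, then look up catalog actions
    for the labels that remain."""
    suggestions = []
    for label, confidence in detected_labels:
        if confidence >= min_confidence and label in CATALOG:
            suggestions.append((label, CATALOG[label]))
    return suggestions

# Simulated detector output: (label, confidence) pairs
detections = [("sneaker", 0.92), ("lamp", 0.88), ("paperback", 0.45)]
print(suggest(detections))
```

The confidence gate matters: a robot that pitched products off every shaky detection would quickly wear out its welcome.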
Fire TV and Echo Dot devices may be among the most popular items Amazon currently sells, but Amazon is still, at its core, a store that wants to sell you absolutely everything.
To that end, Amazon’s new robot could work with Amazon’s many delivery services, whether a drone or Amazon Key, the in-home delivery service launched last fall.
You might feel more comfortable about letting the plumber into your home or having a large item like a couch delivered if you can see it all go down, from the moment the person enters your home to the moment they leave. The combination of a smart lock and Amazon Cloud Cam is central to this service today, but a mobile Amazon robot could make this a lot easier.
More trust on this front could help Amazon expand into the sale of more home furniture or appliances, other items that require installation, or pieces that cannot be simply dropped off at your doorstep.
If investments Amazon has made are any indication, the robot could attempt to incorporate lessons learned from Embodied Robotics, a company currently running in stealth mode that received Alexa Fund backing in 2016.
Embodied Robotics was cofounded by University of Southern California professor Maja Mataric. As in her work as director of USC’s Robotics and Autonomous Systems Center, at Embodied Robotics she will focus on making machines that can socialize with humans and act as assistants.
In the past, Mataric has made robots, for example, to help rehabilitate a person who had a stroke and to interact with children on the autism spectrum. A robot named Maki was created to soothe children before they receive an injection at a hospital.
A robot made with a knowledge of how to effectively interact with, and augment, humans could, as Mataric has said in the past, become “a companion that fills the gap of what’s missing.”
Pair this with the personality Alexa attempts to demonstrate today — she’s a Seahawks fan, a feminist, and routinely does things like predict Oscar winners — and Amazon’s robot could be endowed with intelligence that makes it not just effective at getting things done, but also well versed in the challenges a machine can encounter when dealing with humans.
Years from now an Amazon robot might be able to do the range of things Mataric has focused on, but in the short term that probably just means being able to carry on with chit-chat, perhaps to combat a loneliness epidemic.
What would be really cool is if, once these robots are made available, users could pick their robot’s personality. Depending on your home, chipper Alexa may be fine, but a little sass like Rosie from The Jetsons or morbidity like Marvin from The Hitchhiker’s Guide to the Galaxy may also be in order.