Amazon’s Echo Look is an Alexa fashion assistant that combines human and machine intelligence to tell you how you look in an outfit, keeps track of what’s in your wardrobe, and recommends clothes to buy from Amazon.com.
Made generally available to the public in recent weeks, the Echo Look debuted in April 2017 but was available by invite only for more than a year, a first for Alexa-enabled devices. Over time, Amazon will pair the Echo Look with Prime Wardrobe, a program akin to modern fashion services like Stitch Fix and Trunk Club that lets users try on clothes and send back what they don't want to buy. Meanwhile, Amazon's facial recognition software Rekognition keeps making headlines for being used by U.S. law enforcement agencies and for misidentifying more than two dozen members of Congress as criminals.
Let’s examine why it can be a lot of fun to use the Echo Look, why it took Amazon a year to make the device generally available, and why its fashion assistant’s AI is inherently biased.
What Alexa’s fashion assistant has learned to do
It took more than a year to roll out the Echo Look, Amazon director of Echo product management Linda Ranz told VentureBeat in an interview, because the Echo Look is Amazon’s first venture into computer vision for consumers, and the company wanted to get it right.
Over the course of a year of testing, a number of features were added to the Echo Look smartphone app that accompanies the device, including collections, a feature that lets users organize their outfits into categories.
The Echo Look app can automatically create collections based on seasonality, your style, or the kinds of photos you upload each time you say “Alexa, take a picture.” The app also color-filters and recommends clothes to buy from the Amazon marketplace.
Echo Look recommendations are currently fueled by photos uploaded to the app, but it’s really easy to imagine these recommendations also being informed by shopping history and user activity within Amazon’s store.
Style Check is an original Echo Look feature that lets you compare two outfits to each other, and it’s the cornerstone of the Alexa fashion assistant experience.
One feature added during the closed beta is Style Check Reasons, which offers an explanation for why the AI model chooses one outfit over another. The AI may tell you that one outfit's colors look better together, or that one pairing of shirt and pants matches better.
The reasoning behind Reasons, Ranz said, is a combination of human and machine intelligence.
“It started out highly human and will become more and more machine, but as one might expect with fashion trends constantly changing, there will always be some human engagement in this keeping track of what styles are now in and what’s changing in fashion,” Ranz said.
The human gaze
Dozens of fashion specialists use fashion images to define Alexa’s fashion sense, a company spokesperson told VentureBeat. However, the people hired to give Alexa a sense of fashion appear to come from similar backgrounds.
Skimming LinkedIn for a dozen of the fashion specialists training Alexa’s fashion assistant AI shows most live in the Seattle area, appear to be young women, and have past job experience at companies like Nordstrom, Zulily, and J.Crew.
To broaden the definition of fashion for specialists training Alexa and to increase sensitivity to variations of body shape or ethnicity, Amazon fashion specialists periodically receive training, Ranz said. A company spokesperson declined to share details about the training process.
“We want to make sure that we’re not narrowly focused in one area or another, so it’s something that we do … both orientation as we bring someone on board and training,” she said. “Because if you’re a fashion stylist for Nordstrom’s, you can go to a very specific target segment. Amazon by nature will appeal to a much broader set of customers, and so it’s an ongoing set of work that we do with that group.”
Even if Amazon managed to hire an amazingly diverse group of individuals from a range of different backgrounds and developed great training, it would still be a challenge to create an AI fashion assistant that appeals to every person’s needs.
Like Project Debater from IBM Research or machine learning meant to work alongside musicians or dancers, it may be easy to look at the results and declare something great or terrible, but it can be tough to quantify art, to assign the numerical values an AI needs to deliver fashion recommendations.
Given how insecure people can be about how they look and present themselves to the world, the challenge of measuring for AI bias in a fashion assistant is fraught.
Context is key
One gigantic missing piece — perhaps the most important missing element of Echo Look recommendations today — is context.
If you’re going to a work function, then you dress accordingly. The same can be said for a PTA meeting or a dive bar. You might want to get a bit more edgy on date night but tone things down the first time you meet your spouse’s parents, for example.
The Echo Look app can quickly A/B test your style with Spark, an online community of humans who vote to pick the best outfit. When I ran tests on my own outfits, human votes on Spark were close in percentage to results from Alexa. But share a photo in Spark with context and you may receive a very different response.
For example, a Style Check that simply asked “Which one?” received 287 votes, with humans splitting 15/85 between two outfits compared to 27/73 from Alexa. The votes don’t match exactly, but they’re similar.
But when I asked “Which one looks best for Friday night?” I got the opposite: humans voted 29/71, while Alexa scored the outfits 69/31.
When toggling between Style Check results, I often saw one outfit score 70 or 75 percent out of 100.
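To make the divergence concrete, here is a small illustrative sketch (not Amazon code) that compares the human Spark vote splits and Alexa Style Check scores reported above, checking whether both picked the same winning outfit and how far apart the percentages are:

```python
# Vote splits as (outfit A %, outfit B %), taken from the two
# Style Checks described in this article.
style_checks = {
    "no context":      {"human": (15, 85), "alexa": (27, 73)},
    "Friday night":    {"human": (29, 71), "alexa": (69, 31)},
}

def agree_on_winner(human, alexa):
    """True if humans and Alexa favor the same outfit."""
    return (human[0] > human[1]) == (alexa[0] > alexa[1])

for prompt, scores in style_checks.items():
    gap = abs(scores["human"][0] - scores["alexa"][0])
    same = agree_on_winner(scores["human"], scores["alexa"])
    print(f"{prompt}: same winner = {same}, gap = {gap} points")
```

Without context, both pick outfit B and the scores sit 12 points apart; add context and the two disagree outright, with a 40-point gap.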
For people like myself who do not consider themselves to be fashionably inclined, Echo Look can be a lot of fun. It can help you do things like quickly choose the right outfit for the right occasion, but you should take the results with a grain of salt — and perhaps accept that humans who teach fashion to machines will always encounter certain challenges.
The future of style
The Echo Look will initially focus on fashion, and the lowest-hanging fruit in that category is Prime Wardrobe.
Ranz declined to talk about future plans for Prime Wardrobe and Echo Look, but of course the end game for technology like an Alexa fashion assistant is the sale of clothes directly from Amazon. “What we’ve done with those recommendations on the app is we’ve taken our entire clothing catalog and narrowed it to those brands and styles that we think will be most interesting to this set of customers,” she said.
It’s not tough to imagine potential next steps, such as letting Alexa offer you custom-fitted clothes or help you find a new style.
I would describe my own fashion sense as fairly business casual (dress shirts and slacks) with a smattering of dope t-shirts. The Echo Look is designed to help you find clothes that match your preferred style — but what if you don’t want to dress like you normally do? What if you want to add some grunge or elegance or flair that will still fit in with the rest of your wardrobe? That’s not available today, but it seems like the sort of feature that could arrive one day.
Taking this a step further, what if Amazon just made clothes for you? Researchers from Adobe and the University of California, San Diego designed an AI model to create new personalized clothes for you based on the outfits in your wardrobe.
Amazon’s trial balloon in your bedroom
Cameras are one of the leading sensors for gathering data in the AI world, and the Echo Look is a trial balloon for placing hardware with computer vision in your bedroom.
Besides a deeper integration with Prime Wardrobe, these same forms of AI could be incorporated into a mirror Amazon patented to dress you or the home robot reportedly due out from Amazon next year.
Should it gain the ability to provide more exact measurements, the Echo Look could combine with or supplement tech like the Naked 3D body scanner to enter areas like personal health and fitness.
Alexa isn’t alone in providing AI-driven fashion services.
Samsung’s Bixby Makeup helps you test things like shades of lipstick in AR, a major selling point of the Galaxy S9, and a Bixby smart speaker will reportedly make its debut in the coming weeks.
Google Assistant’s and Pinterest’s computer vision features, both named Lens, offer visual search and style-matching features to help users find similar fashion.
However Amazon chooses to deploy computer vision for consumers, it likely only begins with fashion. The Echo Look could be part of a long-term effort to put computer vision in consumers’ homes.