As part of a continued mission to make its Alexa intelligent assistant more accessible for everyone, Amazon today introduced a new feature called Show and Tell, which helps blind and low vision customers with an Echo Show smart display identify common pantry goods (like canned or boxed foods) that can be difficult to distinguish by touch.
Show and Tell is available starting today for Alexa customers in the U.S. on first- and second-generation Echo Show devices. To get started, say “Alexa, what am I holding?” or “Alexa, what’s in my hand?” This kicks off verbal and audio cues that guide you to place the item you’d like to identify in front of the Echo Show’s camera.
“The whole idea for Show and Tell came about from feedback from blind and low vision customers,” said Sarah Caplener, head of Amazon’s Alexa for Everyone team, in a statement. “We heard that product identification can be a challenge and something customers wanted Alexa’s help with. Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.”
Amazon says the feature was developed in close collaboration with blind Amazon employees, and that the development team gathered input and feedback from blind and low vision customers. Additionally, they worked with the Vista Center for the Blind in Santa Cruz and other organizations throughout the process.
“I get to help create products like Show and Tell, make our Fire tablets and TVs accessible and delightful, and constantly help to imagine new ways for our products and services to improve the lives of our customers, including those with disabilities,” said principal accessibility researcher Josh Miele, who is blind. He cited statistics from the World Health Organization estimating that 1.3 billion people live with some form of vision impairment, and that 15% of the population lives with some form of disability.
The debut of Show and Tell follows on the heels of accommodating features like an adjustable speech rate and Tap to Alexa, which lets Echo Show owners access weather forecasts, news headlines, timers, and more by tapping rearrangeable shortcuts on the Show’s screen. Amazon Captioning, which launched in the U.S. last year and internationally more recently, provides captions for Alexa’s responses on Echo devices with a screen and transcribes incoming voice messages.
Amazon isn’t the only tech giant investing in accessibility, of course.
Google unveiled three separate accessibility efforts targeting users with speech impairments, deafness, and limited mobility at its I/O 2019 developer conference earlier this year, and in July it detailed an open source AI tool, Parrotron, that aims to help those with atypical speech become better understood by voice assistants. Meanwhile, last September saw the unveiling of Microsoft’s Soundscape, a navigation app that uses binaural audio — sound recorded with two microphones arranged to create a 3D stereo sound sensation — to help visually impaired users build mental maps and make personal route choices in unfamiliar spaces.