Amazon is introducing a new feature to its Echo Show device designed to help blind and other low-vision customers identify common household pantry items by holding them up in front of Alexa’s camera and asking what they are. The feature uses a combination of computer vision and machine learning to recognize the objects the Echo Show sees.
The Echo Show is the version of the Alexa-powered smart speaker that tends to sit in customers’ kitchens, where it helps with other kitchen tasks, like setting timers, playing recipe videos, or providing a little music or TV while they cook.
But for blind users, the Show will now have a new duty: helping them identify household pantry items that are hard to distinguish by touch, like cans, boxed foods, or spices.
To use the feature, customers can simply say things like “Alexa, what am I holding?” or “Alexa, what’s in my hand?” Alexa will also give verbal and audio cues to help customers place the item in front of the device’s camera.
Amazon says the feature was developed in collaboration with blind Amazon employees, including its principal accessibility engineer Josh Miele, who gathered feedback from both blind and low-vision customers as part of the development process. The company also worked with the Vista Center for the Blind in Santa Cruz on early research, product development, and testing.
“We heard that product identification can be a challenge and something customers wanted Alexa’s help with,” explained Sarah Caplener, head of Amazon’s Alexa for Everyone team. “Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment,” she said.
Smart home devices and intelligent voice assistants like Alexa have made life easier for disabled individuals, as they allow them to do things like adjust the thermostat and lights, lock the doors, raise the blinds, and more. With “Show and Tell,” Amazon hopes to reach the wide market of blind and low-vision customers as well. According to the World Health Organization, Amazon notes, an estimated 1.3 billion people have some form of vision impairment.
That being said, Echo devices aren’t globally available — and even when they are offered in a particular country, the device may not support the local language. Plus, the feature itself is U.S.-only at launch.
Amazon isn’t alone in making accessibility a selling point for its smart speakers and screens. At Google’s I/O developer conference this year, the company introduced a range of accessibility projects, including Live Caption, which transcribes audio in real time; Live Relay, which helps deaf users make phone calls; Project Diva, which helps people who don’t speak use smart assistants; and Project Euphonia, which helps make voice recognition work for those with speech impairments.
Show and Tell is available now to Alexa users in the U.S. on first- and second-generation Echo Show devices.
Written by Sarah Perez
This news first appeared on https://techcrunch.com/2019/09/23/amazons-echo-show-can-now-identify-household-pantry-items-held-in-front-of-its-camera/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29 under the title “Amazon’s Echo Show can now identify household pantry items held in front of its camera”. Bolchha Nepal is not responsible for, or affiliated with, the opinions expressed in this news article.