By Sarah Perez
Amazon is introducing a new feature, called Show and Tell, to its Echo Show device designed to help blind and low-vision customers identify common household pantry items by holding them up in front of the device's camera and asking what they are. The feature uses a combination of computer vision and machine learning to recognize the objects the Echo Show sees.
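Amazon hasn't published details of its recognition pipeline, but the general technique — running a camera frame through a trained image classifier and speaking the top label back — is well established. Here is a minimal sketch in Python, assuming a pretrained ImageNet model as a stand-in for whatever grocery-specific model Amazon actually uses; the identify helper and the frame.jpg input are hypothetical.

```python
# Illustrative only: a camera frame is classified and the top label returned.
# The model, labels, and file names are stand-ins, not Amazon's implementation.
from PIL import Image
import torch
from torchvision import models

# Pretrained ImageNet classifier as a placeholder for a product-specific model.
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()

preprocess = weights.transforms()  # resize, crop, and normalize for the model

def identify(frame_path: str) -> str:
    """Return the most likely label for the object in a saved camera frame."""
    image = Image.open(frame_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # add a batch dimension
    with torch.no_grad():
        logits = model(batch)
    top = logits.softmax(dim=1).argmax().item()
    return weights.meta["categories"][top]

# Hypothetical usage: frame.jpg would be a frame captured from the camera.
print("This looks like " + identify("frame.jpg"))
```

A production system would likely pair a model trained on packaged goods (or barcode and label reading) with the verbal guidance cues described below, rather than relying on a generic classifier like this one.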
The Echo Show is the screen-equipped version of the Alexa-powered smart speaker, and it tends to sit in customers' kitchens because it helps with kitchen tasks like setting timers, playing recipe videos, or providing a little music or TV while customers cook.
But for blind users, the Show will now have a new duty: helping them identify household pantry items that are hard to distinguish by touch, such as cans, boxed foods, and spices.
To use the feature, customers simply say something like “Alexa, what am I holding?” or “Alexa, what’s in my hand?” Alexa will also give verbal and sound cues to help customers place the item in front of the device’s camera.
Amazon says the feature was developed in collaboration with blind Amazon employees, including its principal accessibility …
Source: TechCrunch Gadgets