New accessibility feature is coming to the first- and second-generation Echo Show
Amazon is set to introduce a Show and Tell feature for Echo Show devices to help blind and low-vision users identify items in their kitchen.
Using the front-facing camera on the first- and second-generation smart display, Amazon is leveraging computer vision and machine learning to help Alexa recognise things in the cupboard that may be hard to distinguish.
Spices, tins and boxed foods, for example, would be more easily identified through the new Show and Tell feature.
To use the feature, simply say something along the lines of, “Alexa, what am I holding?” The voice assistant will then respond with verbal cues to help the user position the item in the camera’s view, before identifying it.
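Amazon hasn’t detailed how Show and Tell works under the hood, but the guide-then-identify loop it describes might look something like the rough Python sketch below. Every name here (capture_frame, classify, speak) is a hypothetical stand-in for illustration, not a real Alexa API.

# Illustrative sketch of a Show and Tell-style interaction loop.
# All functions are hypothetical stand-ins; Amazon has not
# published its actual implementation.
import random

def capture_frame():
    """Stand-in for grabbing an image from the front-facing camera."""
    return "frame"

def classify(frame):
    """Stand-in for the vision model; returns (label, confidence, centred)."""
    return ("ground cinnamon", random.uniform(0.4, 0.99), random.random() > 0.3)

def speak(text):
    """Stand-in for Alexa's text-to-speech output."""
    print(f"Alexa: {text}")

def show_and_tell(max_attempts=5, threshold=0.8):
    """Guide the user until the item is visible and confidently identified."""
    for _ in range(max_attempts):
        label, confidence, centred = classify(capture_frame())
        if not centred:
            # Item isn't in view yet: give a positioning cue and retry
            speak("Move the item a little closer to the camera.")
        elif confidence >= threshold:
            # Model is confident enough to announce the result
            speak(f"It looks like {label}.")
            return label
        else:
            # In view but uncertain: ask the user to hold steady and retry
            speak("Hold the item still for a moment.")
    speak("Sorry, I couldn't identify that item.")
    return None

if __name__ == "__main__":
    show_and_tell()

In practice the positioning cues would presumably come from the vision model itself, but the overall shape, capture, check, cue, retry, matches the behaviour Amazon describes.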
Amazon says it developed the feature based on user feedback, working with the Vista Center for the Blind and Visually Impaired in California on research and development.
“Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment,” Amazon said.
Initially, at least, Show and Tell will only be available to Alexa users in the US. There’s no current timeframe for when the accessibility feature will roll out to other territories, or indeed to the other camera-equipped Echo devices, the Echo Show 5 and the ageing Echo Spot.