App is a high-tech seeing eye dog
Work & Lifestyle
The Seeing AI app, for visually impaired people, uses object recognition and natural language processing to describe their surroundings.
One of the most exciting applications of computer vision and image recognition is helping blind and visually impaired people understand the world around them. We have already seen BlindTool use image recognition technology to identify 3D objects and describe them verbally for blind users, and now the Seeing AI app, created by developers at Microsoft, could take these abilities even further by providing a real-time auditory description of the user's surroundings.
The Seeing AI app can be used via smartphones or Pivothead smart glasses. The user begins by holding up their phone's camera, or tapping their glasses, to prompt the device to 'look' at the environment for them. The app then provides an auditory description of what it sees, either through an earpiece or a small speaker. It uses natural language processing to describe the surroundings — everything from objects to text to the expressions on a companion's face.
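Microsoft has not published the app's internals, but the pipeline described above — capture an image, generate a natural-language caption, and read it aloud — can be sketched with off-the-shelf tools. The snippet below is a minimal illustration only, assuming Azure's public Computer Vision "describe" REST endpoint and the pyttsx3 text-to-speech library; the endpoint URL, key, and file name are placeholders, and none of this reflects Seeing AI's actual implementation.

```python
# Illustrative capture-to-speech sketch, similar in spirit to the pipeline
# described above. Endpoint, key, and response fields are assumptions based
# on Azure's public Computer Vision "describe" API, not the Seeing AI app.
import requests
import pyttsx3

AZURE_ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com/vision/v3.2/describe"  # placeholder
AZURE_KEY = "<your-subscription-key>"  # placeholder


def describe_image(image_path: str) -> str:
    """Send an image to the vision service and return its best caption."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = requests.post(
        AZURE_ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": AZURE_KEY,
            "Content-Type": "application/octet-stream",
        },
        params={"maxCandidates": "1"},
        data=image_bytes,
    )
    response.raise_for_status()
    captions = response.json()["description"]["captions"]
    return captions[0]["text"] if captions else "No description available."


def speak(text: str) -> None:
    """Read the description aloud, standing in for the app's earpiece output."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


if __name__ == "__main__":
    # Hypothetical snapshot taken with the phone camera.
    speak(describe_image("snapshot.jpg"))
```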
The app was recently unveiled at the Microsoft Build conference, but is still in development. Could these capabilities be used in education, or for language translation when traveling?
19th May 2016
Email: me@saqibshaikh.com
Website: www.saqibshaikh.com