Designed by Swiss AI company Eyra, Horus is a smart headset that uses bone conduction to audibly describe what is nearby, including faces and text.
Horus is a wearable personal assistant for people with visual impairment. Created by Swiss artificial intelligence company Eyra, the headset connects to a smartphone-sized control unit that users can keep in a pocket or bag. Cameras on the headset send images to this pocket computer, where machine learning algorithms translate them into real-time audible descriptions. From navigation to reading text and recognizing faces, Horus gets smarter with use.
Bone conduction allows the audio description to be heard only by the wearer, without interfering with normal hearing. If a wearer uses a hearing aid, Horus can connect to the hearing aid system. The device currently supports Italian, Japanese and English, and Eyra is working to expand its language capabilities. Potential buyers must sign up to a waiting list, and online sales are expected to begin in early 2017.
Combining AI with image recognition to help people with visual impairments is an exciting area of development. Earlier in the year, an app that describes the surrounding environment was unveiled, as was a smart swim cap that vibrates when a swimmer approaches the end of the pool. Might the next big challenge lie in connecting these technologies to help people with multiple disabilities?