The UAE's Hands Can Talk project has developed software using Microsoft Kinect technology that detects hand gestures and converts them into audible words for a non-deaf audience.
Sign language is a great tool to help the deaf and hearing-impaired communicate with those who can understand it. Unfortunately, most non-deaf people have little impetus to learn it unless they regularly speak with deaf people. Innovations such as the Ukraine-based Enable Talk have already attempted to tackle this problem by developing smart gloves that translate sign language into speech in real time. Approaching it from a different angle is Dubai startup KinTrans, whose Hands Can Talk project has developed software using Microsoft Kinect technology that detects hand gestures and converts them into audible words for a non-deaf audience.
Developed as part of the Turn8 accelerator program, the system is designed for formal situations such as conferences and lectures, where the speaker can be placed in front of the Kinect sensor. The software requires a short setup to help it locate the position of the user's hands and face. After that, the speaker can use sign language and the system will detect what they're saying and translate it into speech in real time. Users can set multiple languages, including English and Arabic, meaning that the software could help deaf speakers transcend the spoken-language barrier as well as national language barriers.
Watch the video below to see a demonstration of the service:
Although the system currently requires the Kinect hardware, could software soon be developed using the gesture-tracking technology of wearable tech such as Google Glass, enabling deaf people to seamlessly interact with the non-deaf in any situation?