Biomedical engineers are developing a wearable technology capable of sensing the muscle movements used during sign language and translating them into English speech. Ben Gruber has more.
STORY: The communication barrier between deaf people who use sign language and those who don't understand it may be coming to an end, thanks to a new wearable technology being developed at Texas A&M University. The device incorporates a system of sensors that records the motion of hand gestures, as well as the electromyography (EMG) signals produced by muscles in the wrist when signing.

(SOUNDBITE) (English) ROOZBEH JAFARI, ASSOCIATE PROFESSOR OF BIOMEDICAL ENGINEERING, TEXAS A&M UNIVERSITY, SAYING: "We decode the muscle activities we are capturing from the wrist. Some of it is coming from the fingers indirectly because if I happen to keep my fist like this versus this the muscle activation is going to be a little different."

It's those differences that present the researchers with their biggest challenge: fine-tuning the device to process and translate the different signals accurately, in real time, requires sophisticated algorithms. The other problem is that no two people sign exactly alike, which is why the team designed the system to learn from its user.

(SOUNDBITE) (English) ROOZBEH JAFARI, ASSOCIATE PROFESSOR OF BIOMEDICAL ENGINEERING, TEXAS A&M UNIVERSITY, SAYING: "When you wear the system for the first time the system operates with some level of accuracy. But as you start using the system more often, the system learns from your behavior and it will adapt its own learning models to fit you."

Going forward, the team hopes to miniaturize the device so it can be worn on a user's wrist like a watch, and to program it to decipher complete sentences rather than just individual words. The researchers also want to incorporate a synthetic voice speaker, an upgrade that could potentially give the 70 million deaf people around the world… a new voice.