MyVoice can translate sign language into spoken or written words.
According to a 2005 study by the Gallaudet Research Institute, between 500,000 and 1 million people living in the U.S. are deaf. Bridging the two-way communication gap between this large population and the hearing world is a complex problem. To assist deaf users as well as anyone who needs to communicate with them, a group of University of Houston students teamed up to develop the concept and prototype for MyVoice, a device that reads sign language and translates its motions into audible words.
The device earned first-place honors among student projects at the American Society for Engineering Education's Gulf Southwest Annual Conference.
MyVoice is a handheld tool that consists of a built-in microphone, speaker, soundboard, video camera, and monitor.
The hardware behind MyVoice.

The idea is to place the device on a surface where its camera can watch a person's sign language motions. MyVoice then processes these movements, translates them, and speaks their meaning out loud in an electronic voice.
“The biggest difficulty was sampling together a database of images of the sign languages. It involved 200 to 300 images per sign,” Seto said. “The team was ecstatic when the prototype came together.”
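The team has not published its actual algorithm, but the approach Seto describes, matching captured motions against a database of 200 to 300 reference images per sign, resembles nearest-neighbor classification. A minimal sketch of that idea, with hypothetical sign names and tiny made-up feature vectors standing in for real camera frames:

```python
# Hypothetical sketch of sign recognition via nearest-neighbor matching.
# Each sign has several reference feature vectors, a scaled-down stand-in
# for the team's database of 200-300 images per sign. All names and
# numbers here are illustrative, not from the MyVoice project.

import math

SIGN_DATABASE = {
    "hello":     [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1]],
    "thank you": [[0.1, 0.9, 0.3], [0.2, 0.8, 0.2]],
}

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(frame_features):
    """Return the sign whose reference images lie closest to the frame."""
    best_sign, best_dist = None, float("inf")
    for sign, references in SIGN_DATABASE.items():
        for ref in references:
            d = euclidean(frame_features, ref)
            if d < best_dist:
                best_sign, best_dist = sign, d
    return best_sign

# A frame close to the "hello" references is labeled "hello".
print(classify([0.85, 0.15, 0.15]))  # → hello
```

In a real system the feature vectors would come from processed camera frames rather than hand-written numbers, and a large reference set per sign (as the team collected) helps the matcher tolerate variation in how each sign is performed.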
This video shows a demonstration of MyVoice. It's particularly impressive how quickly the device responds to the user's motions:
Since MyVoice was created and the team brought home first-place honors at the ASEE conference, all of the students involved in the project have graduated. Aleman assures anyone concerned about the project's status that although the team is done with school, MyVoice isn't finished.
“We got it to work, but we hope to work with someone to implement this as a product,” Aleman said. “We want to prove to the community that this will work for the hearing impaired.”