Engineers at Texas A&M University are designing a wearable arm sensor that offers functionality similar to a bionic arm while measuring and detecting the electrical impulses generated by muscle movement. Although the project is still in development, the wearable sensor has the potential to act as a sign language interpreter, easing communication between deaf people and people who do not understand sign language.
The design places multiple sensors at distinct positions on the user's arm. One sensor, worn at the wrist, monitors the movements and gestures of the hands and fingers, since sign language relies on specific signs and combinations of gestures.
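The article does not describe the device's software, but the idea of combining wrist motion readings with muscle-activation (EMG) signals can be sketched as a simple feature-fusion step. Everything below is a hypothetical illustration: the window sizes, channel counts, and statistics are assumptions, not details of the Texas A&M system.

```python
import numpy as np

def extract_features(motion_window, emg_window):
    """Fuse a window of wrist-motion samples (e.g. 3-axis accelerometer)
    with a window of muscle-activation (EMG) samples into one feature
    vector. Both inputs are arrays shaped (samples, channels)."""
    feats = []
    for window in (motion_window, emg_window):
        feats.append(window.mean(axis=0))                           # average level per channel
        feats.append(window.std(axis=0))                            # variability per channel
        feats.append(np.abs(np.diff(window, axis=0)).mean(axis=0))  # sample-to-sample change
    return np.concatenate(feats)

# Fabricated data: 50 samples of 3-axis motion and 2-channel EMG
rng = np.random.default_rng(0)
motion = rng.normal(size=(50, 3))
emg = rng.normal(size=(50, 2))
fv = extract_features(motion, emg)
print(fv.shape)  # (15,): 3 stats x 3 motion channels + 3 stats x 2 EMG channels
```

A fused vector like this would then be passed to whatever gesture classifier the system uses.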
The current prototype can identify common words used in daily conversation, enabling basic communication between users. Before formal communication is possible, the device must first be trained: the user repeats and practices hand gestures so the system can better recognize the muscle movements detected by the arm sensors. As the program expands, the goal is to eliminate this training phase so that the device responds to the user automatically, improving its usefulness during a conversation, and to translate multiple signals at once, allowing communication in complete sentences.
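The training-by-repetition phase described above can be illustrated with a toy nearest-neighbour recognizer: each practice repetition stores a labelled feature vector, and recognition returns the label of the closest stored template. This is a minimal sketch of the general technique, not the researchers' actual algorithm; the class name, word labels, and feature values are invented for illustration.

```python
import numpy as np

class GestureRecognizer:
    """Toy nearest-neighbour gesture recognizer. The 'training phase'
    corresponds to recording labelled feature vectors through repetition."""

    def __init__(self):
        self.templates = []  # list of (label, feature_vector) pairs

    def train(self, label, feature_vector):
        # Each practiced repetition adds one labelled template.
        self.templates.append((label, np.asarray(feature_vector, dtype=float)))

    def recognize(self, feature_vector):
        # Return the label of the template nearest in Euclidean distance.
        fv = np.asarray(feature_vector, dtype=float)
        label, _ = min(self.templates,
                       key=lambda t: np.linalg.norm(t[1] - fv))
        return label

rec = GestureRecognizer()
rec.train("hello", [0.9, 0.1, 0.2])   # fabricated feature vectors
rec.train("thanks", [0.1, 0.8, 0.7])
print(rec.recognize([0.85, 0.15, 0.25]))  # prints "hello"
```

More repetitions per word would give more templates and more robust matches, which is why the current prototype requires practice before use.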
Lead researcher Roozbeh Jafari:
“I thought maybe we should look into combining motion sensors and muscle activation.”
“We need to build signal-processing techniques that would help us to identify and understand a complete sentence.”