NOV 18, 2015 05:11 AM PST

Engineering American Sign Language

American Sign Language, or ASL, is the third most widely used language in the United States, after English and Spanish. It has been in use in this country for roughly 200 years, and approximately 500,000 people communicate in ASL every day. A communication gap can open up, however, when a person who relies on ASL encounters someone who does not understand it, perhaps in a store, restaurant or other public place. It is not always possible to find an interpreter, and this can leave some ASL users unable to participate fully in their communities.
ASL is being re-engineered
New research from Texas might be the answer to this problem. Roozbeh Jafari is a scientist at the Center for Remote Technologies and Systems and an associate professor in the biomedical engineering department at Texas A&M. He is working on a wearable smart device that can translate sign language into words. He started the project at UT Dallas but has since moved his lab to Texas A&M.
 
Jafari’s prototype is a complex but fairly compact system of motion sensors and electrical-activity monitors that takes signals from the wearer and converts ASL gestures into words that appear on a screen.
 
Even though the device is still in the early stages of development, Jafari reported that it can already recognize 40 American Sign Language words with nearly 96 percent accuracy. He presented his research at the Institute of Electrical and Electronics Engineers (IEEE) 12th Annual Body Sensor Networks Conference this past June.
 
Jafari partnered with and was partially funded by Texas Instruments, which awarded the project second place in the TI Innovation Challenge. Jafari brought Texas A&M PhD student Jian Wu, along with computer engineering graduate students Lu Sun and Zhongjun Tian, onto the project, and when the team was recognized in the TI Innovation Challenge he told The Dallas Morning News, “I’m quite proud of what they’ve done.”

Jafari’s system is unlike other sign language recognition systems because it doesn’t use a camera. Relying on video can be problematic in low lighting, and it raises privacy concerns as well. In a press release from Texas A&M, Jafari stressed the importance of wearability, saying, "Wearables provide a very interesting opportunity in the sense of their tight coupling with the human body. Because they are attached to our body, they know quite a bit about us throughout the day, and they can provide us with valuable feedback at the right times. With this in mind, we wanted to develop a technology in the form factor of a watch.”

Because certain ASL signs involve similar gestures, Jafari used two kinds of sensors. One is a motion sensor that includes an accelerometer and a gyroscope and responds to the motions of the hands and arms.
 
The other component of the system is a surface electromyographic (sEMG) sensor that measures the electrical activity of the muscles in the fingers, hands and arms. The two kinds of sensors work together to recognize gestures and improve recognition accuracy. The sensors are placed on the right hand of the user and are connected to a laptop via Bluetooth. The computer uses software to decode the movements and displays the recognized words on the screen.
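The article doesn’t describe the decoding software in detail, but the general idea of fusing the two sensor streams can be sketched in a few lines. The snippet below is only an illustrative assumption, not the team’s actual algorithm: all names, the feature choices and the nearest-template matching are hypothetical. It summarizes a window of motion and muscle-activity samples into one fused feature vector and picks the closest stored word.

```python
import numpy as np

def extract_features(imu_window, semg_window):
    """Summarize one gesture window.
    imu_window: (N, 6) accelerometer + gyroscope samples.
    semg_window: (M, C) muscle-activity samples from C electrode channels."""
    imu_feats = np.concatenate([imu_window.mean(axis=0), imu_window.std(axis=0)])
    semg_feats = np.concatenate([np.abs(semg_window).mean(axis=0),  # mean absolute value
                                 semg_window.std(axis=0)])
    return np.concatenate([imu_feats, semg_feats])  # fused motion + muscle feature vector

def recognize(feature_vec, templates):
    """Return the word whose stored template is closest to the fused features.
    `templates` maps each ASL word to a reference feature vector."""
    return min(templates, key=lambda word: np.linalg.norm(feature_vec - templates[word]))

# Hypothetical usage: one window of data streamed over Bluetooth from the wrist unit.
rng = np.random.default_rng(0)
imu = rng.normal(size=(100, 6))    # 100 accel/gyro samples
semg = rng.normal(size=(200, 4))   # 200 samples from 4 sEMG channels
templates = {"hello": rng.normal(size=20), "thanks": rng.normal(size=20)}
print(recognize(extract_features(imu, semg), templates))
```

In the real system the recognizer would be trained on recorded examples of each of the 40 supported signs rather than compared against hand-built templates, but the division of labor is the same: the wearable captures motion and muscle signals, and the laptop turns them into words.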

Jafari aims to keep developing the prototype until it can be made much smaller, similar to a watch, and then to enhance the software so that it can send the written meaning of signs and gestures to another smart device used by the person the ASL user is communicating with.
 
Check out the video below to see more about the wearable ASL technology being developed in the lab.
 
About the Author
  • I'm a writer living in the Boston area. My interests include cancer research, cardiology and neuroscience. I want to be part of using the Internet and social media to educate professionals and patients in a collaborative environment.