Children diagnosed with autism often have a very difficult time reading social cues. They are not always able to tell what a person is feeling based on facial expressions. Most people can look at someone laughing and know they are happy, or look at someone frowning and know that person is angry or upset. Children on the autism spectrum do not have that innate ability, and decoding these cues is harder for them.
Naturally this interferes with their ability to form friendships and navigate school and social situations. Specialists who work with these children spend time reviewing drawings of facial expressions and explaining what each one means. Inclusion therapy in classrooms with autistic children includes role playing, storytelling, and other social exercises to help those on the spectrum be a part of their environment. Now it appears that some high-tech help could be on the way.
Google Glass, the futuristic computer-equipped glasses, are now being studied at the Stanford School of Medicine to see whether they can be used as an aid for children diagnosed with an autism spectrum disorder. The Autism Glass Project is now in its second phase of research. The first phase involved building software for Google Glass that could track facial expressions, read the emotion being expressed, and flash “happy,” “sad,” or “surprised” on the lens for the wearer to read. A computer science whiz and Stanford Innovator in Residence initially developed the software and tested it in the first phase; now the team at the Wall Lab at the Stanford University School of Medicine will be testing it on 100 children enrolled in the second phase.
Nick Haber, one of the lead scientists on the project, spoke to TechCrunch about the challenge of making sure the device helped children learn and didn’t become a “prosthesis.” Haber said “ensuring the device’s use led to measurable learning when the children were no longer on the device” was the reason the second phase involves a game the children play, ensuring they interact with their surroundings and don’t just rely on the Glass output to tell them what is going on. The team will use a game developed at MIT called “Capture the Smile,” and along with the game results, parental questionnaires, and video analyses, the team hopes to build a “phenotype” of autism for each child in the study.
The study will have the children wear the Google Glass devices for only three 20-minute periods per day, but the data collected in these sessions will help researchers analyze what the children are looking at and how their eyes track around their environment. Check out the video below for an interview with lab founder Dennis Wall, who explains how the device can be used to study eye tracking and assist children in decoding facial expressions and other social cues.