MAR 06, 2018 06:11 AM PST

Knowing When Language is Understood in the Brain

Understanding the spoken word is probably one of the most complex tasks the human brain can perform. In regular conversation, the average rate of speech is about 120 to 200 words per minute.

Not only does the speaker have to understand the language well enough to use the correct words to convey their thoughts, but the listener also has to understand each word at a rapid pace, including words that sound the same but carry different meanings depending on the context. New research from the University of Rochester has identified a mechanism in the brain that indicates when a person has understood what is being said. Knowing how this works could pave the way for healthcare professionals to assess patients who have an injury or condition that impacts cognition.

Edmund Lalor is an associate professor of biomedical engineering and neuroscience at the University of Rochester and Trinity College in Dublin. He explained in a press release, "That we can do this so easily is an amazing feat of the human brain – especially given that the meaning of words can vary greatly depending on the context. For example, ‘I saw a bat flying overhead last night' versus ‘the baseball player hit a home run with his favorite bat.'"

Research into this brain function used a well-established method: reading brain wave patterns with electroencephalography (EEG). Using electrodes placed on a participant's scalp, an EEG monitors electrical signals from the brain, and this is where the team discovered how to interpret EEG readings as a marker of language comprehension.

The research combined machine learning with the EEG recordings. The text of audiobooks was fed into a computer that was "trained" to recognize patterns in how words are used. Given the thousands of words contained in just one book, the software eventually learns which words tend to go together. Each word is essentially converted into a set of numbers, and the software can estimate the meaning of a word from how those numbers are assigned. When participants in the study listened to sections of the audiobooks, EEG recordings were made, and the data were then correlated with those numerical measures to show precisely where the brain indicated understanding.
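The "numbers assigned to words" described above are what machine-learning researchers call word embeddings: vectors positioned so that words used in similar contexts end up close together. The sketch below is only an illustration of that idea, using made-up three-dimensional vectors (real models learned from large text corpora use hundreds of dimensions); it is not the study's actual model.

```python
import numpy as np

# Toy "embeddings": each word becomes a numeric vector. In a real model
# these values are learned from thousands of words of text; the numbers
# here are invented purely to illustrate the idea.
embeddings = {
    "bat":   np.array([0.9, 0.1, 0.3]),
    "ball":  np.array([0.8, 0.2, 0.4]),
    "night": np.array([0.1, 0.9, 0.2]),
}

def cosine_similarity(u, v):
    """Similarity of two word vectors: near 1.0 = used in similar
    contexts, near 0.0 = largely unrelated."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words that appear in similar contexts get similar vectors,
# so "bat" scores closer to "ball" than to "night" in this toy example.
print(cosine_similarity(embeddings["bat"], embeddings["ball"]))
print(cosine_similarity(embeddings["bat"], embeddings["night"]))
```

In the study, a word-by-word measure derived from vectors like these was correlated with the EEG signal recorded while participants listened, which is how the researchers could pinpoint where the brain registered meaning.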

But how was it possible to verify that this signal meant comprehension was taking place? The team ran one experiment involving Hemingway's classic "The Old Man and the Sea." Participants listened to portions of the book, and the investigators noted the signals in the EEG readings that they believed showed understanding. Then the researchers changed it up. Lalor stated, "We could see brain signals telling us that people could understand what they were hearing. When we had the same people come back and hear the same audiobook played backward, the signal disappears entirely."

To further test the theory, they asked participants to listen to a recording of a speech made by President Barack Obama, but they added background noise that made it nearly impossible to catch more than a few words here and there. Some signals were detected that showed partial understanding, but they were weak. When the participants were able to view a video of the speech, the researchers noted that the signal "intensified dramatically," because visual cues in the video allowed the study volunteers to grasp what they couldn't quite hear.

The study is published in the journal Current Biology. The team is continuing to look into how the brain processes the meaning of words, sounds, and other stimuli. They hope the findings might be useful in assessing brain function in patients who are comatose or otherwise impaired, in children to track language development, and possibly in older adults to detect declining comprehension in daily conversation.

Sources: University of Rochester, Current Biology, Trinity College Dublin

About the Author
  • I'm a writer living in the Boston area. My interests include cancer research, cardiology and neuroscience. I want to be part of using the Internet and social media to educate professionals and patients in a collaborative environment.