MAR 06, 2018 06:11 AM PST

Knowing When Language is Understood in the Brain

Understanding the spoken word is probably one of the most complex tasks the human brain can perform. In regular conversation, the average rate of speech is about 120 to 200 words per minute.

Not only does the speaker have to know the language well enough to choose the words that convey their thoughts, but the listener also has to understand each word at that rapid pace, including words that sound the same but mean different things depending on context. New research from the University of Rochester has identified a signal in the brain that indicates when a person has understood what is being said. Knowing how this works could pave the way for healthcare professionals to assess patients with an injury or condition that impacts cognition.

Edmund Lalor is an associate professor of biomedical engineering and neuroscience at the University of Rochester and Trinity College in Dublin. He explained in a press release, "That we can do this so easily is an amazing feat of the human brain – especially given that the meaning of words can vary greatly depending on the context. For example, ‘I saw a bat flying overhead last night' versus ‘the baseball player hit a home run with his favorite bat.'"

Research into this brain function used a well-established method of reading brain wave patterns: electroencephalography (EEG). Using electrodes placed on a person's scalp, an EEG monitors electrical signals from the brain, and it was in these recordings that the team discovered how to detect language comprehension.

The research combined machine learning with the EEG recordings. The text of audiobooks was fed into software that was "trained" to recognize patterns in how words are used. Given the thousands of words contained in just one book, the software gradually learns which words tend to go together. Each word is essentially represented as a set of numbers, and the software can estimate the meaning of a word by how those numbers compare across words. EEG was recorded while participants in the study listened to sections of the audiobooks, and the recordings were then correlated with these numerical measures to show precisely when the brain indicated understanding.
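The idea of turning words into numbers and comparing them can be sketched very simply: each word becomes a vector, and words used in similar contexts end up with similar vectors. The toy three-dimensional vectors below are invented purely for illustration (real models such as word2vec, which the study's approach resembles, learn hundreds of dimensions from large text corpora); the dissimilarity score is what would then be lined up against the EEG signal.

```python
import numpy as np

def cosine_dissimilarity(v1, v2):
    # 1 minus cosine similarity: near 0 when two word vectors point
    # the same way (similar meaning), near 1 when they are unrelated.
    return 1.0 - np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))

# Hypothetical toy "embeddings" -- real models learn these from text.
vectors = {
    "bat":      np.array([0.9, 0.1, 0.2]),
    "flying":   np.array([0.8, 0.2, 0.3]),
    "baseball": np.array([0.1, 0.9, 0.4]),
}

# A word close in meaning to its context scores low...
print(cosine_dissimilarity(vectors["bat"], vectors["flying"]))
# ...while an unrelated pairing scores much higher.
print(cosine_dissimilarity(vectors["bat"], vectors["baseball"]))
```

In the study's framework, a score like this is computed for each word as it arrives in the audiobook, producing a numerical time series that can be correlated with the simultaneously recorded EEG.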

But how was it possible to verify that this signal meant comprehension was taking place? In one experiment, the team used Hemingway's classic "The Old Man and the Sea." Participants listened to portions of the book, and the investigators noted the signals in the EEG readings that they believed showed understanding. Then the researchers changed it up. Lalor stated, "We could see brain signals telling us that people could understand what they were hearing. When we had the same people come back and hear the same audiobook played backward, the signal disappears entirely."

To further test the theory, they asked participants to listen to a recording of a speech by President Barack Obama, but they added background noise that made it nearly impossible to catch more than a few words here and there. Some signals showing partial understanding were detected, but they were weak. When the participants were able to view a video of the speech, the researchers noted that the signal "intensified dramatically," because visual cues in the video allowed the study volunteers to piece together what they couldn't hear clearly.

The study is published in the journal Current Biology. The project continues to investigate how the brain processes the meaning of words, sounds, and other stimuli. The team hopes the findings could be useful in assessing brain function in patients who are comatose or otherwise impaired, in children to track language development, and possibly in older patients to detect declining comprehension in daily conversation.

Sources: University of Rochester, Current Biology, Trinity College Dublin

About the Author
  • I'm a writer living in the Boston area. My interests include cancer research, cardiology and neuroscience. I want to be part of using the Internet and social media to educate professionals and patients in a collaborative environment.