MAR 06, 2018 06:11 AM PST

Knowing When Language is Understood in the Brain

Understanding the spoken word is probably one of the most complex tasks the human brain can perform. In regular conversation, the average rate of speech is about 120 to 200 words per minute.

Not only does the speaker have to understand the language well enough to use the correct words to convey their thoughts, but the listener also has to understand each word at a rapid pace, including words that sound the same but have different meanings depending on the context. New research from the University of Rochester has found a mechanism in the brain that indicates when a person has understood what is being said. Knowing how this works could pave the way for healthcare professionals to assess patients who have an injury or condition that impacts cognition.

Edmund Lalor is an associate professor of biomedical engineering and neuroscience at the University of Rochester and Trinity College Dublin. He explained in a press release, "That we can do this so easily is an amazing feat of the human brain – especially given that the meaning of words can vary greatly depending on the context. For example, 'I saw a bat flying overhead last night' versus 'the baseball player hit a home run with his favorite bat.'"

Research into this brain function used a well-established method of reading brain wave patterns: electroencephalography (EEG). Using electrodes placed on a patient's scalp, an EEG monitors electrical signals from the brain, and this is where the team discovered how to interpret EEG readings as a measure of language comprehension.

The research combined machine learning with the EEG recordings. The text of audiobooks was fed into a computer that was "trained" to recognize patterns in how words are used. Given the thousands of words in just one book, the machine gradually learns which words tend to go together. Each word is essentially converted into a set of numbers, and the software can estimate the meaning of a word from how those numbers are assigned. While participants in the study listened to sections of the audiobooks, EEG recordings were made. The EEG data were then correlated with these numerical measures to show precisely when the brain indicated understanding.
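The idea of turning words into numbers can be sketched in a few lines of code. The toy vectors below are invented for illustration only – real systems learn them automatically from thousands of sentences – but they show the principle: words that appear in similar contexts end up with similar number patterns, which a program can compare.

```python
import math

# Toy word vectors (hypothetical values for illustration only;
# real systems learn these from large amounts of text).
embeddings = {
    "bat":      [0.9, 0.1, 0.3],
    "ball":     [0.8, 0.2, 0.4],
    "overhead": [0.2, 0.9, 0.1],
}

def cosine_similarity(a, b):
    """Score how closely two word vectors point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Because "bat" and "ball" were assigned similar numbers,
# they score as more related than "bat" and "overhead".
print(cosine_similarity(embeddings["bat"], embeddings["ball"]))
print(cosine_similarity(embeddings["bat"], embeddings["overhead"]))
```

In the study, measures like these were lined up against the EEG signal recorded while each word was heard, which is how the researchers could tell when the brain was tracking meaning rather than just sound.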

But how was it possible to verify that this signal meant that comprehension was taking place? The team ran one experiment using Hemingway's classic "The Old Man and the Sea." Participants listened to portions of the book, and the investigators noted the signals in the EEG readings that they believed showed understanding. Then the researchers changed it up. Lalor stated, "We could see brain signals telling us that people could understand what they were hearing. When we had the same people come back and hear the same audiobook played backward, the signal disappears entirely."

To further test the theory, they asked participants to listen to a recording of a speech by President Barack Obama, with background noise added that made it nearly impossible to catch more than a few words here and there. Some signals showing partial understanding were detected, but they were weak. When the participants were able to watch a video of the speech, the researchers noted that the signal "intensified dramatically," because visual cues in the video allowed the study volunteers to make out what they couldn't hear clearly.

The study is published in the journal Current Biology. The team is continuing to look into how the brain processes the meaning of words, sounds, and other stimuli. They hope the findings might be useful in assessing brain function in patients who are comatose or otherwise impaired, in children, to track language development, and possibly in older patients, to see whether comprehension in daily conversation is declining.

Sources: University of Rochester, Current Biology, Trinity College Dublin

About the Author
  • I'm a writer living in the Boston area. My interests include cancer research, cardiology and neuroscience. I want to be part of using the Internet and social media to educate professionals and patients in a collaborative environment.