When two people interact, their brain activity becomes synchronized, but it was unclear until now to what extent this “brain-to-brain coupling” is due to linguistic information or other factors, such as body language or tone of voice. Researchers report in the journal Neuron that brain-to-brain coupling during conversation can be modeled by considering the words used during that conversation, and the context in which they are used.
“We can see linguistic content emerge word-by-word in the speaker’s brain before they actually articulate what they’re trying to say, and the same linguistic content rapidly reemerges in the listener’s brain after they hear it,” says the first author.
To communicate verbally, we must agree on the definitions of different words, but these definitions can change depending on the context. For example, without context, it would be impossible to know whether the word “cold” refers to temperature, a personality trait, or a respiratory infection.
“The contextual meaning of words as they occur in a particular sentence, or in a particular conversation, is really important for the way that we understand each other,” says a co-senior author. “We wanted to test the importance of context in aligning brain activity between speaker and listener to try to quantify what is shared between brains during conversation.”
To examine the role of context in driving brain coupling, the team collected brain activity data and conversation transcripts from pairs of epilepsy patients during natural conversations. The patients were undergoing intracranial monitoring using electrocorticography for unrelated clinical purposes. Compared with less invasive methods such as fMRI, electrocorticography records brain activity at much higher resolution because the electrodes are placed in direct contact with the surface of the brain.
Next, the researchers used the large language model GPT-2 to extract the context surrounding each of the words used in the conversations, and then used this information to train a model to predict how brain activity changes as information flows from speaker to listener during conversation.
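The general shape of this kind of word-level encoding analysis can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: random vectors stand in for the per-word GPT-2 contextual embeddings (GPT-2's hidden states are 768-dimensional) and for the electrode recordings around each word's onset, and a simple ridge-regularized linear model is fit in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the real data: one contextual embedding per spoken word
# and the neural signal at each electrode around that word's onset.
n_words, embed_dim, n_electrodes = 600, 768, 64
X = rng.standard_normal((n_words, embed_dim))
true_w = rng.standard_normal((embed_dim, n_electrodes)) * 0.1
Y = X @ true_w + rng.standard_normal((n_words, n_electrodes))  # signal + noise

# Split words into training and held-out sets.
split = int(0.8 * n_words)
X_tr, X_te, Y_tr, Y_te = X[:split], X[split:], Y[:split], Y[split:]

# Ridge-regularized linear encoding model, fit in closed form:
# W = (X'X + alpha * I)^-1 X'Y
alpha = 100.0
W = np.linalg.solve(X_tr.T @ X_tr + alpha * np.eye(embed_dim), X_tr.T @ Y_tr)

# Per-electrode performance: correlation between predicted and actual
# activity on held-out words.
pred = X_te @ W
corrs = [np.corrcoef(pred[:, e], Y_te[:, e])[0, 1] for e in range(n_electrodes)]
print(f"mean held-out correlation: {np.mean(corrs):.2f}")
```

The key design point this sketch illustrates is that the model is evaluated on words it has never seen, so any predictive power must come from a shared mapping between contextual word meaning and neural activity rather than from memorization.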
Using the model, the researchers were able to observe brain activity associated with the context-specific meaning of words in the brains of both speaker and listener. They showed that word-specific brain activity peaked in the speaker’s brain around 250 ms before they spoke each word, and corresponding spikes in brain activity associated with the same words appeared in the listener’s brain approximately 250 ms after they heard them.
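The lag structure described above can be illustrated with a toy cross-correlation analysis. This is a simplified stand-in, not the study's method: the "listener" signal is simulated as a noisy copy of the "speaker" signal delayed by 500 ms (consistent with a speaker peak ~250 ms before word onset and a listener peak ~250 ms after), and the lag is recovered by correlating the two signals at every candidate offset.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated signals: the listener trace is the speaker trace shifted by
# 500 ms plus noise. Sampling at 100 Hz means 10 ms per sample.
fs = 100                          # samples per second
true_lag = 500 * fs // 1000      # 500 ms expressed in samples
n = 3000
speaker = rng.standard_normal(n)
listener = np.roll(speaker, true_lag) + 0.5 * rng.standard_normal(n)
listener[:true_lag] = rng.standard_normal(true_lag)  # remove wrap-around

# Correlate speaker and listener at every lag within +/- 1 second and
# pick the offset with the strongest correlation.
max_lag = fs
lags = list(range(-max_lag, max_lag + 1))
corrs = [np.corrcoef(speaker[max_lag:-max_lag],
                     listener[max_lag + k : n - max_lag + k])[0, 1]
         for k in lags]
best = lags[int(np.argmax(corrs))]
print(f"estimated speaker-to-listener lag: {best * 1000 // fs} ms")
```

A positive estimated lag here means listener activity trails speaker activity, mirroring the word-level result in the study: linguistic content appears in the speaker's brain before articulation and reappears in the listener's brain shortly after.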
Compared with previous work on speaker–listener brain coupling, the team’s context-based model better predicted shared patterns in brain activity.
“This shows just how important context is, because it best explains the brain data,” says the author. “Large language models take all these different elements of linguistics like syntax and semantics and represent them in a single high-dimensional vector. We show that this type of unified model is able to outperform other hand-engineered models from linguistics.”
In the future, the researchers plan to expand on their study by applying the model to other types of brain activity data, for example fMRI data, to investigate how parts of the brain not accessible with electrocorticography operate during conversations.
“There's a lot of exciting future work to be done looking at how different brain areas coordinate with each other at different timescales and with different kinds of content,” says the author.