Emotions amplify speaker–listener neural alignment

Research output: Contribution to journal › Article › Scientific › Peer-reviewed

Research units

  • University of Turku

Abstract

Individuals often align their emotional states during conversation. Here, we reveal how such emotional alignment is reflected in synchronization of brain activity across speakers and listeners. Two “speaker” subjects told emotional and neutral autobiographical stories while their hemodynamic brain activity was measured with functional magnetic resonance imaging (fMRI). The stories were recorded and played back to 16 “listener” subjects during fMRI. After scanning, both speakers and listeners rated the moment-to-moment valence and arousal of the stories. Time-varying similarity of the blood-oxygenation-level-dependent (BOLD) time series was quantified by intersubject phase synchronization (ISPS) between speaker–listener pairs. Telling and listening to the stories elicited similar emotions across speaker–listener pairs. Arousal was associated with increased speaker–listener neural synchronization in brain regions supporting attentional, auditory, somatosensory, and motor processing. Valence was associated with increased speaker–listener neural synchronization in brain regions involved in emotional processing, including amygdala, hippocampus, and temporal pole. Speaker–listener synchronization of subjective feelings of arousal was associated with increased neural synchronization in somatosensory and subcortical brain regions; synchronization of valence was associated with neural synchronization in parietal cortices and midline structures. We propose that emotion-dependent speaker–listener neural synchronization is associated with emotional contagion, thereby implying that listeners reproduce some aspects of the speaker's emotional state at the neural level.
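The intersubject phase synchronization (ISPS) measure described in the abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' analysis code: it assumes the common definition in which each BOLD time series is converted to an instantaneous phase via the analytic signal (Hilbert transform), and moment-to-moment synchronization of a speaker–listener pair is the modulus of the mean of the two unit phase vectors. Function names (`instantaneous_phase`, `pairwise_isps`) are hypothetical.

```python
import numpy as np

def instantaneous_phase(x):
    """Instantaneous phase of a real time series via the analytic signal
    (FFT-based Hilbert transform, equivalent to scipy.signal.hilbert)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    X = np.fft.fft(x)
    # Build the one-sided spectrum multiplier for the analytic signal.
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(X * h)
    return np.angle(analytic)

def pairwise_isps(x, y):
    """Time-varying phase synchronization of two time series.

    Returns a value in [0, 1] at each time point: 1 means the two
    signals have identical instantaneous phase at that moment.
    """
    phi_x = instantaneous_phase(x)
    phi_y = instantaneous_phase(y)
    # Modulus of the mean of the two unit phase vectors; for a pair this
    # equals |cos((phi_x - phi_y) / 2)|.
    return np.abs(np.exp(1j * phi_x) + np.exp(1j * phi_y)) / 2.0

# Illustration: two identical signals are perfectly synchronized.
t = np.linspace(0.0, 10.0, 500)
signal = np.sin(2.0 * np.pi * t)
isps = pairwise_isps(signal, signal.copy())
```

In the study's design, this pairwise time course would be computed voxel-wise for each speaker–listener pair (typically after band-pass filtering the BOLD signal) and then related to the moment-to-moment valence and arousal ratings.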

Details

Original language: English
Pages (from-to): 1-12
Journal: Human Brain Mapping
Publication status: Published - 1 Jan 2019
MoE publication type: A1 Journal article, refereed

Research areas

  • contagion, emotion, fMRI, speech, synchronization

ID: 36258908