Interacting parallel pathways associate sounds with visual identity in auditory cortices

Research output: Contribution to journal › Article › Scientific › peer-review

Research units

  • Harvard University
  • Massachusetts Institute of Technology
  • Athinoula A. Martinos Center for Biomedical Imaging

Abstract

Spatial and non-spatial information of sound events is presumably processed in parallel auditory cortex (AC) "what" and "where" streams, which are modulated by inputs from the respective visual-cortex subsystems. How these parallel processes are integrated into perceptual objects that remain stable across time and the source agent's movements is unknown. We recorded magneto- and electroencephalography (MEG/EEG) data while subjects viewed animated video clips featuring two audiovisual objects, a black cat and a gray cat. Adaptor-probe events were either linked to the same object (the black cat meowed twice in a row in the same location) or included a visually conveyed identity change (the black and then the gray cat meowed with identical voices in the same location). In addition to effects in visual (including fusiform and middle temporal, or MT, areas) and frontoparietal association areas, the visually conveyed object-identity change was associated with a release from adaptation of early (50-150 ms) activity in posterior ACs, spreading to left anterior ACs at 250-450 ms in our combined MEG/EEG source estimates. Repetition of events belonging to the same object resulted in increased theta-band (4-8 Hz) synchronization within the "what" and "where" pathways (e.g., between anterior AC and fusiform areas). In contrast, the visually conveyed identity changes resulted in distributed synchronization at higher frequencies (alpha and beta bands, 8-32 Hz) across different auditory, visual, and association areas. The results suggest that sound events become initially linked to perceptual objects in posterior AC, followed by modulations of representations in anterior AC. Hierarchical "what" and "where" pathways seem to operate in parallel after repeating audiovisual associations, whereas the resetting of such associations engages a distributed network across auditory, visual, and multisensory areas. (C) 2015 Elsevier Inc. All rights reserved.
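The band-limited synchronization reported above (e.g., theta-band coupling between anterior AC and fusiform areas) is typically quantified with a phase-synchrony measure between two area time courses. As a minimal illustrative sketch only (not the paper's actual pipeline, which operates on MEG/EEG source estimates), the commonly used phase-locking value (PLV) can be computed by band-pass filtering two signals, extracting instantaneous phase via the Hilbert transform, and averaging the phase-difference vectors; the function and variable names below are assumptions for the demo:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs, band=(4.0, 8.0)):
    """Phase-locking value of two signals within a frequency band (0..1).

    Band-pass filters both signals (zero-phase Butterworth), takes the
    instantaneous phase from the analytic signal, and returns the length
    of the mean phase-difference vector across time.
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# Synthetic demo: two channels sharing a 6 Hz theta rhythm plus
# independent noise should show high theta-band PLV.
fs = 250.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
theta = np.sin(2 * np.pi * 6.0 * t)
x = theta + 0.5 * rng.standard_normal(t.size)
y = theta + 0.5 * rng.standard_normal(t.size)
print(plv(x, y, fs))  # close to 1 for the shared rhythm
```

Two fully independent noise channels would instead yield a PLV near 0, which is the contrast such analyses exploit when comparing repeated versus identity-changing audiovisual events.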

Details

Original language: English
Pages (from-to): 858-868
Number of pages: 11
Journal: NeuroImage
Volume: 124
Publication status: Published - 1 Jan 2016
MoE publication type: A1 Journal article-refereed

Research areas

  • SUPERIOR TEMPORAL SULCUS, SHORT-TERM PLASTICITY, AUDIOVISUAL INTEGRATION, RHESUS-MONKEY, HUMAN BRAIN, MULTISENSORY INTERACTIONS, CROSSMODAL BINDING, SENSORY CORTICES, NEURAL SYNCHRONY, EEG-DATA

ID: 1500504