TY - JOUR
T1 - Interacting parallel pathways associate sounds with visual identity in auditory cortices
AU - Ahveninen, Jyrki
AU - Huang, Samantha
AU - Ahlfors, Seppo P.
AU - Hämäläinen, Matti
AU - Rossi, Stephanie
AU - Sams, Mikko
AU - Jääskeläinen, Iiro
PY - 2016/1/1
Y1 - 2016/1/1
N2 - Spatial and non-spatial information of sound events is presumably processed in parallel auditory cortex (AC) "what" and "where" streams, which are modulated by inputs from the respective visual-cortex subsystems. How these parallel processes are integrated into perceptual objects that remain stable across time and the source agent's movements is unknown. We recorded magneto- and electroencephalography (MEG/EEG) data while subjects viewed animated video clips featuring two audiovisual objects, a black cat and a gray cat. Adaptor-probe events were either linked to the same object (the black cat meowed twice in a row in the same location) or included a visually conveyed identity change (the black and then the gray cat meowed with identical voices in the same location). In addition to effects in visual (including fusiform and middle temporal, or MT, areas) and frontoparietal association areas, the visually conveyed object-identity change was associated with a release from adaptation of early (50-150 ms) activity in posterior ACs, spreading to left anterior ACs at 250-450 ms in our combined MEG/EEG source estimates. Repetition of events belonging to the same object resulted in increased theta-band (4-8 Hz) synchronization within the "what" and "where" pathways (e.g., between anterior AC and fusiform areas). In contrast, the visually conveyed identity changes resulted in distributed synchronization at higher frequencies (alpha and beta bands, 8-32 Hz) across different auditory, visual, and association areas. The results suggest that sound events become initially linked to perceptual objects in posterior AC, followed by modulations of representations in anterior AC. Hierarchical "what" and "where" pathways seem to operate in parallel after repeating audiovisual associations, whereas the resetting of such associations engages a distributed network across auditory, visual, and multisensory areas.
KW - SUPERIOR TEMPORAL SULCUS
KW - SHORT-TERM PLASTICITY
KW - AUDIOVISUAL INTEGRATION
KW - RHESUS-MONKEY
KW - HUMAN BRAIN
KW - MULTISENSORY INTERACTIONS
KW - CROSSMODAL BINDING
KW - SENSORY CORTICES
KW - NEURAL SYNCHRONY
KW - EEG-DATA
UR - http://dx.doi.org/10.1016/j.neuroimage.2015.09.044
DO - 10.1016/j.neuroimage.2015.09.044
M3 - Article
SN - 1053-8119
VL - 124
SP - 858
EP - 868
JO - NeuroImage
JF - NeuroImage
ER -