Background. Speech perception is supported by seeing the talker's articulatory movements. Visual information about articulation can also change what would have been perceived on the basis of the acoustic signal alone.
Methods. Eight experienced lip readers were studied. The stimuli were a natural acoustic syllable /pa/ and a moving female face articulating either /pa/ or /ka/. The stimuli were presented in different combinations and at different probabilities.
Results. Audiovisual stimuli elicited a long-latency "integration response", which was generated at or close to the auditory cortex.
Conclusions. We suggest that extensively processed visual information affects auditory processing at an anatomical site specialized in detecting complex, speech-specific features of the auditory stimulus.
Title of host publication: Brain Topography Today
Editors: Y. Koga, K. Nagata, K. Hirata
Publisher: Elsevier Science B.V.
Number of pages: 7
Publication status: Published - 1997
MoE publication type: A4 Article in a conference publication
Series: International Congress Series (Elsevier Science B.V.)
Event: Pan-Pacific Conference on Brain Topography III (BTOPPS III), Tokyo Bay, Japan, 1 Apr 1997 → 4 Apr 1997
Keywords:
- event-related potentials (ERP)
- magnetoencephalography (MEG)