Abstract
Background. Speech perception is supported by seeing the talker's articulatory movements. Visual information about articulation can also change what would have been perceived on the basis of the acoustic signal alone.
Methods. Eight experienced lip readers were studied. Stimuli were a natural acoustic syllable /pa/ and a moving female face articulating either /pa/ or /ka/. Stimuli were presented in different combinations and at different probabilities.
Results. Audiovisual stimuli elicited a long-latency "integration response", which was generated at or close to the auditory cortex.
Conclusions. We suggest that extensively processed visual information affects auditory processing at the anatomical site that is specialized in detecting complex, speech-specific features of the auditory stimulus.
Original language | English |
---|---|
Title of host publication | BRAIN TOPOGRAPHY TODAY |
Editors | Y Koga, K Nagata, K Hirata |
Publisher | Elsevier |
Pages | 47-53 |
Number of pages | 7 |
ISBN (Print) | 0-444-82778-1 |
Publication status | Published - 1997 |
MoE publication type | A4 Conference publication |
Event | Pan-Pacific Conference on Brain Topography - Tokyo Bay, Japan. Duration: 1 Apr 1997 → 4 Apr 1997. Conference number: III |
Publication series
Name | INTERNATIONAL CONGRESS SERIES |
---|---|
Publisher | ELSEVIER SCIENCE BV |
Volume | 1147 |
ISSN (Print) | 0531-5131 |
Conference
Conference | Pan-Pacific Conference on Brain Topography |
---|---|
Abbreviated title | BTOPPS III |
Country/Territory | Japan |
City | Tokyo Bay |
Period | 01/04/1997 → 04/04/1997 |
Keywords
- event-related potentials (ERP)
- magnetoencephalography (MEG)