Seeing speech: visual information from lip movements modifies activity in the human auditory cortex

Mikko Sams*, Reijo Aulanko, Matti Hämäläinen, Riitta Hari, Olli V. Lounasmaa, Sing Teh Lu, Juha Simola

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

283 Citations (Scopus)

Abstract

Neuromagnetic responses were recorded over the left hemisphere to determine in which cortical area heard and seen speech are integrated. Auditory stimuli were Finnish /pa/ syllables presented together with a videotaped face articulating either the concordant syllable /pa/ (84% of stimuli, V = A) or the discordant syllable /ka/ (16%, V ≠ A). In some subjects the probabilities were reversed. The subjects heard the V ≠ A stimuli as /ta/ or /ka/. The infrequent, discordant stimuli elicited a specific waveform in the magnetic response, which could be explained by activity in the supratemporal auditory cortex. The results show that visual information from articulatory movements has an entry into the auditory cortex.
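As a rough illustration of the stimulus design described in the abstract (not code from the study), the sketch below generates a trial sequence in which the auditory /pa/ is paired with a concordant visual /pa/ on 84% of trials and a discordant visual /ka/ on 16%. The trial count, random seed, and all function and variable names are assumptions made for this example only.

```python
import random

def make_trial_sequence(n_trials=500, p_discordant=0.16, seed=0):
    """Illustrative sketch: build an audiovisual sequence of syllable trials.

    Every trial presents the auditory syllable /pa/; the videotaped
    articulation is /pa/ (concordant, V = A) on most trials and /ka/
    (discordant, V != A) on the rest. Parameter values here are
    assumptions for illustration, not taken from the paper.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        visual = "ka" if rng.random() < p_discordant else "pa"
        trials.append({"audio": "pa", "visual": visual,
                       "concordant": visual == "pa"})
    return trials

if __name__ == "__main__":
    seq = make_trial_sequence()
    n_discordant = sum(not t["concordant"] for t in seq)
    print(f"{n_discordant}/{len(seq)} discordant (V != A) trials")
```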

Original language: English
Pages (from-to): 141-145
Number of pages: 5
Journal: Neuroscience Letters
Volume: 127
Issue number: 1
Publication status: Published - 10 Jun 1991
MoE publication type: A1 Journal article-refereed

Keywords

  • Audio-visual interaction
  • Audition
  • Auditory cortex
  • Evoked responses
  • Intersensory convergence
  • Man
  • MEG
  • Neuromagnetism
  • Speech perception
