Abstract
To take a step towards real-life-like experimental setups, we simultaneously recorded magnetoencephalographic (MEG) signals and the subjects' gaze direction during audiovisual speech perception. The stimuli were utterances of /apa/ dubbed onto two side-by-side female faces articulating /apa/ (congruent) and /aka/ (incongruent) in synchrony, repeated once every 3 s. Subjects (N = 10) were free to decide which face they viewed, and responses were averaged into two categories according to gaze direction. The right-hemisphere 100-ms response to the onset of the second vowel (N100m') was a fifth smaller for incongruent than congruent stimuli. The results demonstrate the feasibility of realistic viewing conditions with gaze-based averaging of MEG signals.
| Original language | English |
|---|---|
| Article number | 17 |
| Pages (from-to) | 1-7 |
| Number of pages | 7 |
| Journal | Frontiers in Human Neuroscience |
| Volume | 4 |
| DOIs | |
| Publication status | Published - 8 Mar 2010 |
| MoE publication type | A1 Journal article-refereed |
Keywords
- auditory cortex
- eye tracking
- human
- magnetoencephalography
- McGurk illusion