Automatic processing of unattended lexical information in visual oddball presentation: Neurophysiological evidence

Research output: Contribution to journal › Article

Researchers

  • Yury Shtyrov
  • Galina Goryainova
  • Sergei Tugin
  • Alexey Ossadtchi
  • Anna Shestakova

Research units

  • Aarhus University
  • Lund University
  • Medical Research Council
  • St. Petersburg State University
  • Source Signal Imaging Inc
  • Moscow State University of Psychology and Education

Abstract

Previous electrophysiological studies of automatic language processing revealed early (100–200 ms) reflections of access to lexical characteristics of the speech signal using the so-called mismatch negativity (MMN), a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by the automatic activation of word memory traces realised as distributed, strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention to the spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies used auditory stimulation exclusively, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in the perifoveal area outside the visual focus of attention, while the subjects' attention was concentrated on a concurrent non-linguistic visual dual task in the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we found a significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. The data suggest early automatic lexical processing of visually presented language outside the focus of attention.
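
For illustration only, the minimal sketch below shows how an oddball ERP contrast of the kind described in the abstract (deviant vs. standard, word vs. pseudoword) is conventionally quantified with the open-source MNE-Python toolbox: epochs are averaged separately per stimulus type and difference waves are formed by subtraction. The file name, event codes and epoch window are hypothetical placeholders; this is not the authors' analysis pipeline, only a sketch of the standard ERP/MMN subtraction logic.

    # Minimal sketch of an oddball ERP / MMN-style analysis with MNE-Python.
    # File name, event codes and epoch window are hypothetical placeholders,
    # not the parameters used in the study described above.
    import mne

    raw = mne.io.read_raw_fif("oddball_eeg_raw.fif", preload=True)  # hypothetical recording
    raw.filter(l_freq=0.1, h_freq=30.0)                             # typical ERP band-pass

    events = mne.find_events(raw)                                   # trigger events from the stim channel
    event_id = {"standard": 1, "word_deviant": 2, "pseudoword_deviant": 3}  # assumed codes

    epochs = mne.Epochs(raw, events, event_id=event_id,
                        tmin=-0.1, tmax=0.5,                        # 100 ms baseline, 500 ms post-stimulus
                        baseline=(None, 0), preload=True)

    # Average per condition to obtain event-related potentials (ERPs).
    evoked_std = epochs["standard"].average()
    evoked_word = epochs["word_deviant"].average()
    evoked_pseudo = epochs["pseudoword_deviant"].average()

    # Difference waves: deviant minus standard (MMN-style), and the lexical
    # contrast (word minus pseudoword) analogous to the enhancement effect.
    mmn_word = mne.combine_evoked([evoked_word, evoked_std], weights=[1, -1])
    lexical_diff = mne.combine_evoked([evoked_word, evoked_pseudo], weights=[1, -1])

    mmn_word.plot()       # inspect the deviance response
    lexical_diff.plot()   # inspect the word vs. pseudoword difference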

Details

Original language: English
Article number: 421
Pages (from-to): 1-10
Journal: Frontiers in Human Neuroscience
Volume: 7
Publication status: Published - 14 Jul 2013
MoE publication type: A1 Journal article-refereed

Research areas

  • Brain, Event-related potential (ERP), Language, Lexical memory trace, Mismatch negativity (MMN, VMMN), Visual word comprehension

ID: 12958071