Automatic processing of unattended lexical information in visual oddball presentation: Neurophysiological evidence

Yury Shtyrov*, Galina Goryainova, Sergei Tugin, Alexey Ossadtchi, Anna Shestakova

*Corresponding author for this work

    Research output: Contribution to journal › Article › Scientific › peer-reviewed

    21 Citations (Scopus)
    92 Downloads (Pure)


    Previous electrophysiological studies of automatic language processing revealed early (100–200 ms) reflections of access to lexical characteristics of the speech signal using the so-called mismatch negativity (MMN), a negative event-related potential (ERP) deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by automatic activation of word memory traces realised as distributed, strongly intra-connected neuronal circuits, whose robustness guarantees memory-trace activation even in the absence of attention to the spoken input. Such an account predicts automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies used auditory stimulation exclusively, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in the perifoveal area, outside the visual focus of attention, while the subjects' attention was engaged by a concurrent non-linguistic visual dual task at the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain-response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found a significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. These data suggest early automatic lexical processing of visually presented language outside the focus of attention.

    Original language: English
    Article number: 421
    Pages (from-to): 1-10
    Publication status: Published - 14 Jul 2013
    MoE publication type: A1 Journal article - refereed


    • Brain
    • Event-related potential (ERP)
    • Language
    • Lexical memory trace
    • Mismatch negativity (MMN, VMMN)
    • Visual word comprehension

