Spatio-temporal dynamics of face perception

Teemu Muukkonen*, K. Olander, J. Numminen, V. R. Salmela

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

The temporal and spatial neural processing of faces has been investigated rigorously, but few studies have unified these dimensions to reveal the spatio-temporal dynamics postulated by the models of face processing. We used support vector machine decoding and representational similarity analysis to combine information from different locations (fMRI), time windows (EEG), and theoretical models. By correlating representational dissimilarity matrices (RDMs) derived from multiple pairwise classifications of neural responses to different facial expressions (neutral, happy, fearful, angry), we found early EEG time windows (starting around 130 ms) to match fMRI data from primary visual cortex (V1), and later time windows (starting around 190 ms) to match data from lateral occipital, fusiform face complex, and temporal-parietal-occipital junction (TPOJ). According to model comparisons, the EEG classification results were based more on low-level visual features than expression intensities or categories. In fMRI, the model comparisons revealed change along the processing hierarchy, from low-level visual feature coding in V1 to coding of intensity of expressions in the right TPOJ. The results highlight the importance of a multimodal approach for understanding the functional roles of different brain regions in face processing.
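The abstract describes fusing EEG and fMRI through representational similarity analysis: representational dissimilarity matrices (RDMs) are built from pairwise SVM classifications of responses to the four expressions, and the time-resolved EEG RDMs are then correlated with RDMs from each fMRI region. The sketch below illustrates that fusion step only; the array names, shapes, and the use of Spearman correlation are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch of EEG-fMRI fusion via RSA, under the assumptions stated above.
# Each RDM holds pairwise classification accuracies between the four expression
# conditions, treated as a dissimilarity measure.
import numpy as np
from scipy.stats import spearmanr

def lower_triangle(rdm):
    """Vectorize the off-diagonal lower triangle of a square RDM."""
    rows, cols = np.tril_indices(rdm.shape[0], k=-1)
    return rdm[rows, cols]

def fuse_eeg_fmri(eeg_rdms, fmri_rdms):
    """Correlate time-resolved EEG RDMs with fMRI ROI RDMs.

    eeg_rdms  : array, shape (n_timepoints, n_conditions, n_conditions)
    fmri_rdms : dict mapping ROI name -> array (n_conditions, n_conditions)
    Returns   : dict mapping ROI name -> array (n_timepoints,) of Spearman rhos
    """
    results = {}
    for roi, roi_rdm in fmri_rdms.items():
        roi_vec = lower_triangle(roi_rdm)
        results[roi] = np.array([
            spearmanr(lower_triangle(eeg_rdms[t]), roi_vec).correlation
            for t in range(eeg_rdms.shape[0])
        ])
    return results

# Toy usage with random data: 4 expression conditions, 100 EEG time points,
# two hypothetical ROIs named after regions mentioned in the abstract.
rng = np.random.default_rng(0)
eeg = rng.random((100, 4, 4))
fmri = {"V1": rng.random((4, 4)), "TPOJ": rng.random((4, 4))}
fusion = fuse_eeg_fmri(eeg, fmri)
print(fusion["V1"].shape)  # (100,) correlation values, one per time point
```

A time course of such correlations is what lets early EEG windows be matched to V1 and later windows to higher-level face-selective regions, as reported in the abstract.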

Original language: English
Article number: 116531
Number of pages: 12
Journal: NeuroImage
Volume: 209
DOIs
Publication status: Published - 1 Apr 2020
MoE publication type: A1 Journal article-refereed

Keywords

  • EEG
  • fMRI
  • RSA
  • Face expression
  • Decoding
  • Face perception
  • MULTIVARIATE PATTERN-ANALYSIS
  • OBJECT RECOGNITION
  • FACIAL EXPRESSIONS
  • REPRESENTATIONS
  • RESPONSES
  • IDENTITY
  • MEG
  • INVERSION
  • INFANTS
