Task-dependent activations of human auditory cortex to prototypical and nonprototypical vowels

Research output: Contribution to journal › Article › Scientific › peer-review

Researchers

  • Kirsi Harinen
  • Olli Aaltonen
  • Emma Salo
  • Oili Salonen
  • Teemu Rinne

Research units

  • University of Helsinki

Abstract

Research in auditory neuroscience has largely neglected the possible effects of different listening tasks on activations of auditory cortex (AC). In the present study, we used high-resolution fMRI to compare human AC activations to sounds presented during three auditory tasks and one visual task. In all tasks, subjects were presented with pairs of Finnish vowels, noise bursts with pitch, and Gabor patches. In each vowel pair, one vowel was always either a prototypical /i/ or /ae/ (defined separately for each subject) or a nonprototype. In different task blocks, subjects were required either to discriminate (same/different) vowel pairs, to rate vowel "goodness" (whether the first or second sound was a better exemplar of the vowel class), to discriminate pitch changes in the noise bursts, or to discriminate Gabor orientation changes. We obtained distinctly different AC activation patterns to identical sounds presented during the four task conditions. In particular, direct comparisons between the vowel tasks revealed stronger activations during vowel discrimination in the anterior and posterior superior temporal gyrus (STG), whereas the vowel rating task was associated with increased activations in the inferior parietal lobule (IPL). We also found that AC areas in or near Heschl's gyrus (HG) were sensitive to the speech-specific difference between a vowel prototype and a nonprototype during active listening tasks. These results show that AC activations to speech sounds are strongly dependent on the listening task.

Details

Original language: English
Pages (from-to): 1272-1281
Number of pages: 10
Journal: Human Brain Mapping
Volume: 34
Issue number: 6
Publication status: Published - Jun 2013
MoE publication type: A1 Journal article-refereed

Research areas

  • Attention, Auditory cortex, Functional magnetic resonance imaging, Humans, Speech

ID: 13554921