The Impact of Sound Systems on the Perception of Cinematic Content in Immersive Audiovisual Productions

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Research units

  • Kyushu University

Abstract

With rapid technological development, traditional perceptual environments disappear and new ones emerge. These changes require the human senses to adapt to new ways of perceptual understanding, for example, regarding the perceptual integration of sound and vision. Proceeding from the fact that hearing cooperates with visual attention processes, the aim of this study is to investigate the effect of different sound design conditions on the perception of cinematic content in immersive audiovisual reproductions. Here we present the results of a visual selective attention task (counting objects) performed by participants watching a 270-degree immersive audiovisual display, on which a movie ("Ego Cure") was shown. Four sound conditions were used, employing an increasing number of loudspeakers: mono, stereo, 5.1, and 7.1.4. Eye tracking was used to record the participants' eye gaze during the task. The eye-tracking data showed that an increased number of speakers and a wider spatial audio distribution diffused the participants' attention from the task-related part of the display to non-task-related directions. The number of participants looking at the task-irrelevant display in the 7.1.4 condition was significantly higher than in the mono audio condition. This implies that additional spatial cues in the auditory modality automatically influence human visual attention (involuntary eye movements) and human analysis of visual information. Sound engineers should consider this when mixing educational or any other information-oriented productions.
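The abstract reports that significantly more participants looked at the task-irrelevant display in the 7.1.4 condition than in mono. As a minimal sketch of how such a between-condition comparison of participant counts could be tested, the following uses a two-sided two-proportion z-test with purely hypothetical counts; the paper's actual sample sizes and statistical method are not given here, so all numbers and the choice of test are illustrative assumptions.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions.

    x1, x2: number of participants showing the behaviour in each condition
    n1, n2: total participants per condition
    Returns the z statistic and the two-sided p-value (normal approximation).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts (illustrative only): participants whose gaze
# entered the task-irrelevant part of the display in each condition.
z, p = two_proportion_z_test(x1=14, n1=20,  # 7.1.4 condition
                             x2=5,  n2=20)  # mono condition
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up counts the test reports a significant difference (p < 0.05), mirroring the direction of the reported result; real data would of course determine the actual outcome.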

Details

Original language: English
Title of host publication: Proceedings of the 2019 12th Asia Pacific Workshop on Mixed and Augmented Reality, APMAR 2019
Editors: Dongdong Weng, Liwei Chan, Youngho Lee, Xiaohui Liang, Nobuchika Sakata
Publication status: Published - 7 May 2019
MoE publication type: A4 Article in a conference publication
Event: Asia Pacific Workshop on Mixed and Augmented Reality - Ikoma, Nara, Japan
Duration: 28 Mar 2019 - 29 Mar 2019
Conference number: 12
http://sigmr.vrsj.org/apmar2019/

Publication series

Name: Proceedings of the 2019 12th Asia Pacific Workshop on Mixed and Augmented Reality, APMAR 2019

Workshop

Workshop: Asia Pacific Workshop on Mixed and Augmented Reality
Abbreviated title: APMAR
Country: Japan
City: Ikoma, Nara
Period: 28/03/2019 - 29/03/2019

Research areas

  • audiovisual perception, eye gaze, eye tracking, immersive environments, sound systems, spatial sound

ID: 34667158