Abstract
Auditory localization cues in the near-field (<1.0 m) differ significantly from those in the far-field. Because the near-field region lies within arm’s length of the listener, proprioceptive cues can be integrated to determine the location of an object in space. This perceptual study compares three non-individualized methods of applying head-related transfer functions (HRTFs) in six-degrees-of-freedom near-field audio rendering: far-field measured HRTFs, multi-distance measured HRTFs, and spherical-model-based HRTFs with near-field extrapolation. To set our findings in context, we provide a real-world hand-held audio source for comparison, along with a distance-invariant condition. Two modes of interaction are compared in an audio-visual virtual reality environment: one allowing the participant to move the audio object dynamically, and the other with a stationary audio object but a freely moving listener.
Original language | English |
---|---|
Title of host publication | 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) |
Publisher | IEEE |
Number of pages | 7 |
ISBN (Electronic) | 9781728113777 |
DOIs | |
Publication status | Published - 2019 |
MoE publication type | A4 Conference publication |
Event | IEEE Conference on Virtual Reality and 3D User Interfaces - Osaka, Japan. Duration: 23 Mar 2019 → 27 Mar 2019 |
Conference
Conference | IEEE Conference on Virtual Reality and 3D User Interfaces |
---|---|
Abbreviated title | VR |
Country/Territory | Japan |
City | Osaka |
Period | 23/03/2019 → 27/03/2019 |
Keywords
- Human-centered computing—Human computer interaction (HCI)—Interaction paradigms—Virtual Reality
- Human-centered computing—Human computer interaction (HCI)—Interaction paradigms—Mixed / augmented reality
- Human-centered computing—Human computer interaction (HCI)—HCI design and evaluation methods—User studies