Abstract
Auditory localization cues in the near-field (<1.0 m) differ significantly from those in the far-field. The near-field region lies within arm's length of the listener, allowing the listener to integrate proprioceptive cues when determining the location of an object in space. This perceptual study compares three non-individualized methods of applying head-related transfer functions (HRTFs) in six-degrees-of-freedom near-field audio rendering: far-field measured HRTFs, multi-distance measured HRTFs, and spherical-model-based HRTFs with near-field extrapolation. To set our findings in context, we provide a real-world hand-held audio source for comparison, along with a distance-invariant condition. Two modes of interaction are compared in audio-visual virtual reality: one allowing the participant to move the audio object dynamically, and the other with a stationary audio object but a freely moving listener.
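The spherical-model-based condition refers to rigid-sphere head models whose near-field response can be evaluated analytically (e.g. in the style of Duda and Martens). The sketch below is a minimal illustration of that idea only, not the implementation evaluated in the paper: the head radius, frequency grid, fixed series truncation, and the helper name `sphere_hrtf` are all illustrative assumptions.

```python
import numpy as np
from scipy.special import spherical_jn, spherical_yn, eval_legendre


def sphere_hrtf(f, r, theta, a=0.0875, c=343.0, order=50):
    """Rigid-sphere transfer function for a point source at distance r (m)
    and incidence angle theta (rad) between the source direction and the
    ear position on the sphere, normalized to free-field propagation."""
    f = np.atleast_1d(np.asarray(f, dtype=float))
    mu = 2.0 * np.pi * f * a / c          # normalized frequency  ka
    rho = r / a                           # normalized distance   r/a
    x = np.cos(theta)
    acc = np.zeros_like(mu, dtype=complex)
    for m in range(order + 1):
        # spherical Hankel function of the first kind at the source distance
        h_src = spherical_jn(m, mu * rho) + 1j * spherical_yn(m, mu * rho)
        # its derivative evaluated on the sphere surface (argument ka)
        dh_sf = (spherical_jn(m, mu, derivative=True)
                 + 1j * spherical_yn(m, mu, derivative=True))
        acc += (2 * m + 1) * eval_legendre(m, x) * h_src / dh_sf
    return -(rho / mu) * np.exp(-1j * mu * rho) * acc


# Near-to-far magnitude ratio for the ipsilateral ear (theta = 0):
# a 0.25 m source against a 1.5 m reference.
f = np.logspace(np.log10(100.0), np.log10(16000.0), 64)
dvf_db = 20 * np.log10(np.abs(sphere_hrtf(f, 0.25, 0.0))
                       / np.abs(sphere_hrtf(f, 1.5, 0.0)))
print(dvf_db[:5])   # near-to-far level difference (dB) at the lowest frequencies
```

Taking only magnitudes sidesteps the time-convention ambiguity of the Hankel-function formulation; the near-to-far ratio computed here is roughly the kind of frequency- and angle-dependent correction a near-field extrapolation stage applies on top of a far-field HRTF.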
| Original language | English |
| --- | --- |
| Title of host publication | 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) |
| Publisher | IEEE |
| Number of pages | 7 |
| ISBN (Electronic) | 9781728113777 |
| DOIs | |
| Publication status | Published - 2019 |
| OKM publication type | A4 Article in a conference publication |
| Event | IEEE Conference on Virtual Reality and 3D User Interfaces - Osaka, Japan, Duration: 23 Mar 2019 → 27 Mar 2019 |
Conference
| Conference | IEEE Conference on Virtual Reality and 3D User Interfaces |
| --- | --- |
| Abbreviated title | VR |
| Country/Territory | Japan |
| City | Osaka |
| Period | 23/03/2019 → 27/03/2019 |