Abstract
We define four tasks that are common in immersive visualization: orientation, localization, navigation, and sonification. Immersive visualization takes place in virtual environments, which provide an integrated system of 3D auditory and 3D visual display. The main objective of our research is to determine how audio can best be used in each of these tasks; in the long run, the goal is more efficient utilization of spatial audio across immersive visualization application areas. The results of our first experiment show that navigation is possible using auditory cues.
Original language | English |
---|---|
Title of host publication | Proceedings of SPIE - The International Society for Optical Engineering |
Editors | R.F. Erbacher, P.C. Chen, J.C. Roberts, C.M. Wittenbrink, M. Grohn |
Pages | 13-22 |
Number of pages | 10 |
Volume | 4302 |
Publication status | Published - 2001 |
MoE publication type | A4 Article in a conference publication |
Event | Visual Data Exploration and Analysis - San Jose, United States; Duration: 22 Jan 2001 → 23 Jan 2001; Conference number: 8 |
Conference
Conference | Visual Data Exploration and Analysis |
---|---|
Country | United States |
City | San Jose |
Period | 22/01/2001 → 23/01/2001 |
Keywords
- Immersive visualization
- Multi-modal perception
- Spatial audio
- Virtual environment