Abstract
The use of visual user interfaces on smartphones and other personal media devices (PMDs) decreases situational awareness, for example in city traffic. This paper proposes that many menu navigation functions in PMDs can be replaced by an eyes-free auditory interface combined with an input device based on acoustic recognition of tactile gestures. Using a novel experimental setup, we demonstrate that the proposed auditory interface reduces reaction times to external events compared with a visual UI. In addition, although task completion times in menu navigation are somewhat longer with the auditory interface, the subjects were able to complete the given interaction tasks correctly within a reasonable time.
Original language | English |
---|---|
Title of host publication | 136th Audio Engineering Society Convention 2014 |
Publisher | Audio Engineering Society |
Pages | 317-325 |
Number of pages | 9 |
ISBN (Print) | 9781632665065 |
Publication status | Published - 2014 |
MoE publication type | A4 Conference publication |
Event | Audio Engineering Society Convention, Berlin, Germany; duration: 26 Apr 2014 → 29 Apr 2014; conference number: 136 |
Conference
Conference | Audio Engineering Society Convention |
---|---|
Abbreviated title | AES |
Country/Territory | Germany |
City | Berlin |
Period | 26/04/2014 → 29/04/2014 |