Abstract
We explore the combination of above-surface sensing with eye tracking to facilitate concurrent interaction with multiple regions on touch screens. Conventional touch input relies on positional accuracy, thereby requiring tight visual monitoring of one's own motor action. In contrast, above-surface sensing and eye tracking provide information about how the user's hands and gaze are distributed across the interface. In these situations we facilitate interaction by 1) showing visual feedback of the hovering hand near the user's gaze point and 2) decreasing the required positional accuracy by employing gestural information. We contribute input and visual feedback techniques that combine these modalities and demonstrate their use in example applications. A controlled study showed the effectiveness of our techniques for manipulation tasks compared with conventional touch, while effectiveness in acquisition tasks depended on the amount of mid-air motion, leading to our conclusion that the techniques can benefit interaction with multiple interface regions. Copyright is held by the owner/author(s).
Original language | English |
---|---|
Title of host publication | DIS 2017 - Proceedings of the 2017 ACM Conference on Designing Interactive Systems |
Publisher | ACM |
Pages | 115-127 |
Number of pages | 13 |
ISBN (Electronic) | 9781450349222 |
DOIs | |
Publication status | Published - 10 Jun 2017 |
MoE publication type | A4 Article in a conference publication |
Event | ACM Conference on Designing Interactive Systems - Centre for Interaction Design, Edinburgh Napier University, Edinburgh, United Kingdom |
Duration | 10 Jun 2017 → 14 Jun 2017 |
Conference number | 12 |
Internet address | http://dis2017.org/ |
Conference
Conference | ACM Conference on Designing Interactive Systems |
---|---|
Abbreviated title | DIS |
Country | United Kingdom |
City | Edinburgh |
Period | 10/06/2017 → 14/06/2017 |
Internet address | http://dis2017.org/ |
Keywords
- Above-surface interaction
- Eye tracking
- Gaze interaction
- Multi-touch