Traditionally, touchscreen typing has been studied in terms of motor performance. However, recent research has exposed a decisive role of visual attention being shared between the keyboard and the text area. Strategies for this are known to adapt to the task, design, and user. In this paper, we propose a unifying account of touchscreen typing, regarding it as optimal supervisory control. Under this theory, rules for controlling visuo-motor resources are learned via exploration in pursuit of maximal typing performance. The paper outlines the control problem and explains how visual and motor limitations affect it. We then present a model, implemented via reinforcement learning, that simulates coordination of eye and finger movements. Comparison with human data affirms that the model creates realistic finger- and eye-movement patterns and shows human-like adaptation. We demonstrate the model's utility for interface development in evaluating touchscreen keyboard designs.
|Title of host publication||CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems|
|Number of pages||14|
|Publication status||Published - 6 May 2021|
|MoE publication type||A4 Article in a conference publication|
|Event||ACM SIGCHI Annual Conference on Human Factors in Computing Systems - Virtual, Online|
Duration: 8 May 2021 → 13 May 2021
|Conference||ACM SIGCHI Annual Conference on Human Factors in Computing Systems|
|Abbreviated title||ACM CHI|
|Period||08/05/2021 → 13/05/2021|
- Computational modelling
- Rational adaptation
- Touchscreen typing