Abstract
We present a model for inferring where users look during interaction from keypress data alone. Given a key log, it outputs a scanpath describing, moment by moment, how the user moved their eyes while entering those keys. The model can serve as a proxy for human data in cases where collecting real eye-tracking data is expensive or impossible. Our technical insight is an inference architecture that accounts for the individual characteristics of the user, inferred as a low-dimensional parameter vector. We present a novel loss function for synchronizing inferred eye movements with the keypresses. Evaluations on touchscreen typing demonstrate accurate gaze inference.
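The abstract mentions a loss function for synchronizing inferred eye movements with keypresses. The paper's actual formulation is not given here, but the idea can be illustrated with a minimal sketch: a hypothetical soft-alignment loss in which each keypress is softly matched to the temporally nearest predicted fixation onset, penalizing the squared time offset. All names (`sync_loss`, `sigma`) and the specific formulation are assumptions for illustration, not the authors' method.

```python
import numpy as np

def sync_loss(fixation_times, keypress_times, sigma=0.1):
    """Hypothetical synchronization loss (illustrative only, not the
    paper's actual formulation).

    For each keypress, computes a soft nearest-neighbour assignment over
    predicted fixation onsets (Gaussian weights in time) and penalizes
    the expected squared time offset under that assignment.
    """
    fixation_times = np.asarray(fixation_times, dtype=float)
    keypress_times = np.asarray(keypress_times, dtype=float)
    # Pairwise time offsets, shape (num_keypresses, num_fixations).
    diffs = keypress_times[:, None] - fixation_times[None, :]
    # Soft assignment weights: each keypress attends mostly to the
    # fixation onset closest in time.
    w = np.exp(-diffs**2 / (2 * sigma**2))
    w /= w.sum(axis=1, keepdims=True)
    # Mean expected squared offset across keypresses.
    return float((w * diffs**2).sum(axis=1).mean())
```

With perfectly aligned fixation and keypress times the loss is near zero, and it grows as the inferred scanpath drifts out of sync with the key log, giving a differentiable training signal of the kind the abstract alludes to.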
| Original language | English |
|---|---|
| Title of host publication | ISWC 2025 - Proceedings of the 2025 ACM International Symposium on Wearable Computers |
| Publisher | ACM |
| Pages | 191-194 |
| Number of pages | 4 |
| ISBN (Electronic) | 979-8-4007-1481-8 |
| DOIs | |
| Publication status | Published - 7 Oct 2025 |
| MoE publication type | A4 Conference publication |
| Event | International Symposium on Wearable Computers, Espoo, Finland. Duration: 12 Oct 2025 → 16 Oct 2025. Conference number: 29 |
Conference
| Conference | International Symposium on Wearable Computers |
|---|---|
| Abbreviated title | ISWC |
| Country/Territory | Finland |
| City | Espoo |
| Period | 12/10/2025 → 16/10/2025 |
Funding
This paper is supported by the Research Council of Finland (FCAI: 328400, 345604, 341763; Subjective Functions: 357578), the ERC (AdG project Artificial User: 101141916), and a Google grant (DeepTypist).
Keywords
- eye movement
- scanpath prediction
- text entry
- user model