WigglyEyes: Inferring Eye Movements from Keypress Data

Yujun Zhu, Danqing Shi*, Hee-Seung Moon, Antti Oulasvirta

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

Abstract

We present a model for inferring where users look during interaction based on keypress data only. Given a key log, it outputs a scanpath that tells, moment by moment, how the user moved their eyes while entering those keys. The model can be used as a proxy for human data in cases where collecting real eye-tracking data is expensive or impossible. Our technical insight is an inference architecture that considers the individual characteristics of the user, inferred as a low-dimensional parameter vector. We present a novel loss function for synchronizing inferred eye movements with the keypresses. Evaluations on touchscreen typing demonstrate accurate gaze inference.
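The paper itself does not include code in this record; purely as an illustration of what a keypress-synchronization loss of the general flavor described above might look like, the sketch below pairs each keypress with the spatially nearest inferred fixation and penalizes both the spatial error and the temporal misalignment. All names (`sync_loss`, the `alpha` weight) are hypothetical, not from the paper.

```python
# Hypothetical sketch, NOT the paper's loss: align an inferred scanpath
# with a key log by matching each keypress to its spatially nearest
# fixation and penalizing the temporal offset between the two.

def sync_loss(gaze_times, gaze_xy, key_times, key_xy, alpha=1.0):
    """Average per-keypress penalty combining spatial and temporal error.

    gaze_times : fixation timestamps (seconds) along the inferred scanpath
    gaze_xy    : (x, y) positions of those fixations
    key_times  : keypress timestamps (seconds) from the key log
    key_xy     : (x, y) positions of the pressed keys
    alpha      : weight on the temporal term (assumed hyperparameter)
    """
    total = 0.0
    for kt, (kx, ky) in zip(key_times, key_xy):
        # Euclidean distance from each fixation to this key.
        dists = [((gx - kx) ** 2 + (gy - ky) ** 2) ** 0.5
                 for gx, gy in gaze_xy]
        i = min(range(len(dists)), key=dists.__getitem__)
        # Spatial error plus weighted temporal misalignment for the
        # best-matching fixation.
        total += dists[i] + alpha * abs(gaze_times[i] - kt)
    return total / len(key_times)
```

With a scanpath that fixates exactly on each key at the moment it is pressed, this loss is zero; it grows as the inferred gaze drifts in space or time relative to the key log.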
Original language: English
Title of host publication: ISWC 2025 - Proceedings of the 2025 ACM International Symposium on Wearable Computers
Publisher: ACM
Pages: 191-194
Number of pages: 4
ISBN (Electronic): 979-8-4007-1481-8
DOIs
Publication status: Published - 7 Oct 2025
MoE publication type: A4 Conference publication
Event: International Symposium on Wearable Computers - Espoo, Finland
Duration: 12 Oct 2025 - 16 Oct 2025
Conference number: 29

Conference

Conference: International Symposium on Wearable Computers
Abbreviated title: ISWC
Country/Territory: Finland
City: Espoo
Period: 12/10/2025 - 16/10/2025

Funding

This paper was supported by the Research Council of Finland (FCAI: 328400, 345604, 341763; Subjective Functions: 357578), the ERC (AdG project Artificial User: 101141916), and a Google grant (DeepTypist).

Keywords

  • eye movement
  • scanpath prediction
  • text entry
  • user model
