Stroke gesture synthesis in human-computer interaction

Luis A. Leiva, Daniel Martín-Albo, Radu Daniel Vatavu, Réjean Plamondon

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Scientific › peer-review


Gesture recognizers usually require a large number of examples to achieve good accuracy. Collecting these examples entails a series of time-consuming and expensive activities, e.g. preparing the lab, recruiting participants, collecting and labeling data, and often reporting to review boards. Fortunately, the Kinematic Theory makes it easy to bootstrap gesture data generation. The synthesized data, in turn, may enable further applications of interest. In this chapter, we review the foundations of synthetic stroke gesture generation, i.e. the synthesis of data sequences comprising 2D points and associated timestamps, as produced e.g. by electronic pens and touchscreens. We show that synthesized gestures not only perform on par with gestures produced by human users, but also “look and feel” the same. We also discuss how the synthesized gestures can be used to estimate production time, a fundamental measure of performance in Human-Computer Interaction. Ultimately, this work benefits researchers and designers who wish to create gesture-driven prototypes or use the synthesized data to build more sophisticated applications.
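To make the idea concrete, the Kinematic Theory models a stroke gesture as a sum of overlapping lognormal velocity pulses (the Sigma-Lognormal model), which can be integrated over time to yield the 2D points and timestamps mentioned above. The sketch below is a simplified illustration under that assumption; the parameter names (`D`, `t0`, `mu`, `sigma`, `theta_s`, `theta_e`) follow the usual Sigma-Lognormal notation, but the specific values and the `synthesize` helper are hypothetical, not taken from the chapter:

```python
import math

def lognormal(t, t0, mu, sigma):
    """Lognormal speed profile Lambda(t; t0, mu, sigma)."""
    if t <= t0:
        return 0.0
    x = math.log(t - t0)
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (
        (t - t0) * sigma * math.sqrt(2 * math.pi))

def lognormal_cdf(t, t0, mu, sigma):
    """Fraction of the lognormal pulse completed by time t."""
    if t <= t0:
        return 0.0
    return 0.5 * (1 + math.erf(
        (math.log(t - t0) - mu) / (sigma * math.sqrt(2))))

def synthesize(strokes, dt=0.01, duration=1.5):
    """Synthesize a gesture as a list of (x, y, timestamp) samples.

    strokes: list of dicts, one per lognormal pulse, with keys
      D (amplitude), t0 (onset), mu/sigma (log-time parameters),
      theta_s/theta_e (start/end direction, radians).
    """
    x, y = 0.0, 0.0
    points = [(x, y, 0.0)]
    for i in range(1, int(duration / dt) + 1):
        t = i * dt
        vx = vy = 0.0
        for s in strokes:
            # Speed of this pulse at time t, and its current direction
            # (interpolated along the pulse's cumulative progress).
            v = s["D"] * lognormal(t, s["t0"], s["mu"], s["sigma"])
            phi = s["theta_s"] + (s["theta_e"] - s["theta_s"]) * \
                lognormal_cdf(t, s["t0"], s["mu"], s["sigma"])
            vx += v * math.cos(phi)
            vy += v * math.sin(phi)
        # Euler integration of the velocity field gives the pen trajectory.
        x += vx * dt
        y += vy * dt
        points.append((x, y, t))
    return points
```

For a single straight pulse (equal start and end directions), the trajectory ends roughly `D` units away along that direction, so perturbing the parameters of a template gesture yields plausible human-like variants, which is the bootstrapping idea the chapter builds on.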

Original language: English
Title of host publication: The Lognormality Principle and Its Applications in E-Security, E-Learning and E-Health
Number of pages: 25
ISBN (Electronic): 9789811226830
Publication status: Published - 1 Jan 2020
MoE publication type: A3 Part of a book or another research book

