Neural Network Based Facial Expression Analysis of Game Events: A Cautionary Tale

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

We present an exploratory study of analyzing and visualizing player facial expressions from video with deep neural networks. We contribute a novel data processing and visualization technique we call Affect Gradients, which provides descriptive statistics of the expressive responses to game events, such as player death or collecting a power-up. As an additional contribution, we show that although there has been tremendous recent progress in deep neural networks and computer vision, interpreting the results as direct read-outs of experiential states is not advised. According to our data, getting killed appears to make players happy, and much more so than killing enemies, although one might expect the exact opposite. A visual inspection of the data reveals that our classifier works as intended, and our results illustrate the limitations of making inferences based on facial images and discrete emotion labels. For example, players may laugh off the death, in which case the closest label for the facial expression is "happy", but the true emotional state is complex and ambiguous. On the other hand, players may frown in concentration while killing enemies or escaping a tight spot, which can easily be interpreted as an "angry" expression.
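To make the idea of event-aligned expression statistics concrete, the sketch below shows one plausible way to aggregate per-frame emotion labels around game-event timestamps. It is an illustrative reconstruction, not the authors' actual Affect Gradients implementation: the function names, the fixed frame rate, the discrete label set, and the baseline-versus-window comparison are all assumptions made for the example.

```python
from statistics import mean

# Assumed frame rate of the facial-expression video (illustrative value).
FPS = 30

def label_frequencies(labels, emotions):
    """Fraction of frames assigned to each discrete emotion label."""
    n = len(labels) or 1
    return {e: labels.count(e) / n for e in emotions}

def affect_delta(frame_labels, event_frames, window_s, emotions):
    """For each event (e.g. player death), compare emotion-label frequencies
    in a short window after the event against the whole-session baseline,
    then average the per-event differences across events."""
    baseline = label_frequencies(frame_labels, emotions)
    window = int(window_s * FPS)
    deltas = {e: [] for e in emotions}
    for f in event_frames:
        segment = frame_labels[f:f + window]
        freqs = label_frequencies(segment, emotions)
        for e in emotions:
            deltas[e].append(freqs[e] - baseline[e])
    return {e: mean(v) if v else 0.0 for e, v in deltas.items()}

# Toy usage: "happy" frames clustered after two hypothetical death events
# reproduce the paper's surprising pattern without implying the emotion
# label reflects the player's true experiential state.
labels = ["neutral"] * 300
for f in (100, 200):
    labels[f:f + 30] = ["happy"] * 30
print(affect_delta(labels, [100, 200], 1.0, ["happy", "neutral"]))
```

A positive delta for "happy" after death events in such data would mirror the paper's finding, and the caveat stands: the classifier output is only the nearest discrete label, not a read-out of the underlying emotional state.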


Original language: English
Title of host publication: Proceedings of the 2018 Annual Symposium on Computer-Human Interaction in Play
Publication status: Published - 2018
MoE publication type: A4 Article in a conference publication
Event: ACM SIGCHI Annual Symposium on Computer-Human Interaction in Play - Melbourne, Australia
Duration: 28 Oct 2018 - 31 Oct 2018
Conference number: 5


Conference: ACM SIGCHI Annual Symposium on Computer-Human Interaction in Play
Abbreviated title: CHI PLAY

ID: 29225404