Recognizing Emotional Expression in Game Streams

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Standard

Recognizing Emotional Expression in Game Streams. / Roohi, Shaghayegh; Mekler, Elisa; Tavast, Mikke; Blomqvist, Tatu; Hämäläinen, Perttu.

CHI PLAY '19 - Proceedings of the Annual Symposium on Computer-Human Interaction in Play. ACM, 2019. pp. 301-311.

Harvard

Roohi, S, Mekler, E, Tavast, M, Blomqvist, T & Hämäläinen, P 2019, Recognizing Emotional Expression in Game Streams. in CHI PLAY '19 - Proceedings of the Annual Symposium on Computer-Human Interaction in Play. ACM, pp. 301-311, ACM SIGCHI Annual Symposium on Computer-Human Interaction in Play, Barcelona, Spain, 22/10/2019. https://doi.org/10.1145/3311350.3347197

APA

Roohi, S., Mekler, E., Tavast, M., Blomqvist, T., & Hämäläinen, P. (2019). Recognizing Emotional Expression in Game Streams. In CHI PLAY '19 - Proceedings of the Annual Symposium on Computer-Human Interaction in Play (pp. 301-311). ACM. https://doi.org/10.1145/3311350.3347197

Vancouver

Roohi S, Mekler E, Tavast M, Blomqvist T, Hämäläinen P. Recognizing Emotional Expression in Game Streams. In CHI PLAY '19 - Proceedings of the Annual Symposium on Computer-Human Interaction in Play. ACM; 2019. p. 301-311. https://doi.org/10.1145/3311350.3347197

Author

Roohi, Shaghayegh ; Mekler, Elisa ; Tavast, Mikke ; Blomqvist, Tatu ; Hämäläinen, Perttu. / Recognizing Emotional Expression in Game Streams. CHI PLAY '19 - Proceedings of the Annual Symposium on Computer-Human Interaction in Play. ACM, 2019. pp. 301-311

BibTeX

@inproceedings{a3cbe2b4341f4df39219ebf18999fbfe,
title = "Recognizing Emotional Expression in Game Streams",
abstract = "Gameplay is often an emotionally charged activity, in particular when streaming in front of a live audience. From a games user research perspective, it would be beneficial to automatically detect and recognize players’ and streamers’ emotional expression, as this data can be used for identifying gameplay highlights, computing emotion metrics or to select parts of the videos for further analysis, e.g., through assisted recall. We contribute the first automatic game stream emotion annotation system that combines neural network analysis of facial expressions, video transcript sentiment, voice emotion, and low-level audio features (pitch, loudness). Using human-annotated emotional expression data as the ground truth, we reach accuracies of up to 70.7{\%}, on par with the inter-rater agreement of the human annotators. In detecting the 5 most intense events of each video, we reach a higher accuracy of 80.4{\%}. Our system is particularly accurate in detecting clearly positive emotions like amusement and excitement, but more limited with subtle emotions like puzzlement.",
author = "Shaghayegh Roohi and Elisa Mekler and Mikke Tavast and Tatu Blomqvist and Perttu H{\"a}m{\"a}l{\"a}inen",
year = "2019",
doi = "10.1145/3311350.3347197",
language = "English",
pages = "301--311",
booktitle = "CHI PLAY '19 - Proceedings of the Annual Symposium on Computer-Human Interaction in Play",
publisher = "ACM",

}

RIS

TY - GEN

T1 - Recognizing Emotional Expression in Game Streams

AU - Roohi, Shaghayegh

AU - Mekler, Elisa

AU - Tavast, Mikke

AU - Blomqvist, Tatu

AU - Hämäläinen, Perttu

PY - 2019

Y1 - 2019

N2 - Gameplay is often an emotionally charged activity, in particular when streaming in front of a live audience. From a games user research perspective, it would be beneficial to automatically detect and recognize players’ and streamers’ emotional expression, as this data can be used for identifying gameplay highlights, computing emotion metrics or to select parts of the videos for further analysis, e.g., through assisted recall. We contribute the first automatic game stream emotion annotation system that combines neural network analysis of facial expressions, video transcript sentiment, voice emotion, and low-level audio features (pitch, loudness). Using human-annotated emotional expression data as the ground truth, we reach accuracies of up to 70.7%, on par with the inter-rater agreement of the human annotators. In detecting the 5 most intense events of each video, we reach a higher accuracy of 80.4%. Our system is particularly accurate in detecting clearly positive emotions like amusement and excitement, but more limited with subtle emotions like puzzlement.

AB - Gameplay is often an emotionally charged activity, in particular when streaming in front of a live audience. From a games user research perspective, it would be beneficial to automatically detect and recognize players’ and streamers’ emotional expression, as this data can be used for identifying gameplay highlights, computing emotion metrics or to select parts of the videos for further analysis, e.g., through assisted recall. We contribute the first automatic game stream emotion annotation system that combines neural network analysis of facial expressions, video transcript sentiment, voice emotion, and low-level audio features (pitch, loudness). Using human-annotated emotional expression data as the ground truth, we reach accuracies of up to 70.7%, on par with the inter-rater agreement of the human annotators. In detecting the 5 most intense events of each video, we reach a higher accuracy of 80.4%. Our system is particularly accurate in detecting clearly positive emotions like amusement and excitement, but more limited with subtle emotions like puzzlement.

U2 - 10.1145/3311350.3347197

DO - 10.1145/3311350.3347197

M3 - Conference contribution

SP - 301

EP - 311

BT - CHI PLAY '19 - Proceedings of the Annual Symposium on Computer-Human Interaction in Play

PB - ACM

ER -