TY - GEN
T1 - Decision Explanation: Applying Contextual Importance and Contextual Utility in Affect Detection
AU - Fouladgar, Nazanin
AU - Alirezaie, Marjan
AU - Främling, Kary
PY - 2020
Y1 - 2020
AB - Explainable AI has recently paved the way to justify decisions made by black-box models in various areas. However, mature work in the field of affect detection remains limited. In this work, we evaluate a black-box outcome explanation for understanding humans’ affective states. We employ the two concepts of Contextual Importance (CI) and Contextual Utility (CU), emphasizing a context-aware decision explanation of a non-linear model, namely a neural network. The neural model is designed to detect individual mental states, measured by wearable sensors, in order to monitor the human user’s well-being. We conduct our experiments and outcome explanation on WESAD and MAHNOB-HCI, two multimodal affective computing datasets. The results reveal that, for a specific participant, the electrodermal activity, respiration, and accelerometer signals in the first experiment, and the electrocardiogram and respiration signals in the second experiment, contribute significantly to the classification of mental states. To the best of our knowledge, this is the first study leveraging the CI and CU concepts in the outcome explanation of an affect detection model.
KW - Explainable AI
KW - Affect detection
KW - Black-box decision
KW - Contextual Importance and Utility
UR - http://www.scopus.com/inward/record.url?scp=85098919275&partnerID=8YFLogxK
M3 - Conference article in proceedings
T3 - CEUR Workshop Proceedings
SP - 1
EP - 13
BT - Proceedings of the Italian Workshop on Explainable Artificial Intelligence
PB - CEUR
T2 - Italian Workshop on Explainable Artificial Intelligence
Y2 - 25 November 2020 through 26 November 2020
ER -