Deep learning-based approach for continuous affect prediction from facial expression images in valence-arousal space

Bibliographic Details
Main Authors: Hwooi, Stephen Khor Wen, Othmani, Alice, Sabri, Aznul Qalid Md
Format: Article
Published: Institute of Electrical and Electronics Engineers 2022
Subjects:
Online Access:http://eprints.um.edu.my/41266/
Description
Summary: Facial emotion recognition has attracted extensive attention from the affective computing community, and several approaches have been proposed, mainly providing classification of facial expression images using a set of discrete emotional labels. The quantification of human emotions from faces has been studied in a continuous 2D emotional space of valence and arousal, which describe the level of pleasantness and the intensity of excitement, respectively. Emotion assessment using valence-arousal computation is a challenging topic with several possible applications in health monitoring, e-learning, mental health diagnosis, and monitoring of customer interest. Supervised learning of emotional valence-arousal for continuous affect prediction requires labeled data; however, annotating facial images with valence and arousal values requires trained human experts. In this paper, we propose a new and robust deep learning-based approach for continuous affect recognition and prediction. The novelty of our approach is that it maps discrete emotion labels and a learned facial expression representation to the continuous valence-arousal dimensional space. Given a discrete emotion class and a facial image, our deep learning-based approach can predict valence and arousal values accurately. Our proposed approach outperforms existing approaches for arousal and valence prediction on the AffectNet dataset and shows impressive generalization ability on an unseen dataset for valence prediction.
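
As a rough illustration of the kind of mapping the summary describes (a learned facial expression representation combined with a discrete emotion label, regressed to continuous valence-arousal values), the sketch below is a minimal, hypothetical PyTorch model. The ResNet-18 backbone, the label embedding size, and the head dimensions are assumptions for illustration only, not the authors' published architecture.

```python
# Minimal, hypothetical sketch of a valence-arousal regressor conditioned on a
# discrete emotion label. Assumes PyTorch and a recent torchvision (>= 0.13);
# backbone, embedding size, and head widths are illustrative choices only.
import torch
import torch.nn as nn
from torchvision import models


class ValenceArousalRegressor(nn.Module):
    def __init__(self, num_emotion_classes: int = 8):
        super().__init__()
        # Learned facial expression representation: a standard CNN backbone
        # with its classification head removed (512-d features for ResNet-18).
        self.backbone = models.resnet18(weights=None)
        self.backbone.fc = nn.Identity()
        # Discrete emotion label embedded into a small dense vector.
        self.label_embed = nn.Embedding(num_emotion_classes, 32)
        # Regression head maps [image features ; label embedding] to (valence, arousal).
        self.head = nn.Sequential(
            nn.Linear(512 + 32, 128),
            nn.ReLU(),
            nn.Linear(128, 2),
            nn.Tanh(),  # constrain outputs to the [-1, 1] valence-arousal range
        )

    def forward(self, image: torch.Tensor, emotion_label: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(image)           # (B, 512) facial representation
        lab = self.label_embed(emotion_label)  # (B, 32) embedded discrete label
        return self.head(torch.cat([feats, lab], dim=1))  # (B, 2): valence, arousal


if __name__ == "__main__":
    model = ValenceArousalRegressor()
    imgs = torch.randn(4, 3, 224, 224)          # batch of face crops
    labels = torch.tensor([0, 3, 5, 7])         # discrete emotion class ids
    va = model(imgs, labels)
    print(va.shape)                             # torch.Size([4, 2])
```

In this sketch the model would typically be trained with a regression loss (e.g., mean squared error or a concordance-based objective) against annotated valence-arousal targets such as those in AffectNet; the choice of loss here is likewise an assumption, not a detail given in the record.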