Please use this identifier to cite or link to this item: http://hdl.handle.net/10609/93201
Title: Automatic recognition of facial displays of unfelt emotions
Author: Kulkarni, Kaustubh
Corneanu, Ciprian
Ofodile, Ikechukwu
Escalera, Sergio  
Baró, Xavier  
Hyniewska, Sylwia
Allik, Jüri  
Anbarjafari, Gholamreza  
Others: Universitat Autònoma de Barcelona (UAB)
University of Tartu
Institute of Physiology and Pathology of Hearing
Hasan Kalyoncu University
Universitat Oberta de Catalunya (UOC)
Citation: Kulkarni, K., Corneanu, C., Ofodile, I., Escalera Guerrero, S., Baró Solé, X., Hyniewska, S., Allik, J. & Anbarjafari, G. (2018). Automatic recognition of facial displays of unfelt emotions. IEEE Transactions on Affective Computing. doi: 10.1109/TAFFC.2018.2874996
Abstract: Humans modify their facial expressions in order to communicate their internal states and sometimes to mislead observers regarding their true emotional states. Evidence in experimental psychology shows that discriminative facial responses are short and subtle. This suggests that such behavior would be easier to distinguish when captured in high resolution at an increased frame rate. We propose SASE-FE, the first dataset of facial expressions that are either congruent or incongruent with underlying emotional states. We show that, overall, the problem of recognizing whether facial movements express authentic emotions can be successfully addressed by learning spatio-temporal representations of the data. For this purpose, we propose a method that aggregates features along fiducial trajectories in a deeply learnt space. Performance of the proposed model shows that, on average, it is easier to distinguish among genuine facial expressions of emotion than among unfelt facial expressions of emotion, and that certain emotion pairs, such as contempt and disgust, are more difficult to distinguish than the rest. Furthermore, the proposed methodology improves state-of-the-art results on the CK+ and OULU-CASIA datasets for video emotion recognition, and achieves competitive results when classifying facial action units on the BP4D dataset.
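The sketch below illustrates, in broad strokes, the kind of aggregation the abstract refers to: sampling deep per-frame features at tracked facial landmark ("fiducial") positions and pooling them along each landmark's trajectory into a single video-level descriptor. It is a minimal NumPy sketch under assumed shapes; the random feature maps and landmarks are stand-ins, the function names are hypothetical, and it is not the authors' implementation.

import numpy as np

def sample_features_at_landmarks(frame_feature_map, landmarks):
    # Nearest-neighbour sampling of a (C, H, W) feature map at (x, y) landmark positions.
    C, H, W = frame_feature_map.shape
    xs = np.clip(np.round(landmarks[:, 0]).astype(int), 0, W - 1)
    ys = np.clip(np.round(landmarks[:, 1]).astype(int), 0, H - 1)
    return frame_feature_map[:, ys, xs].T  # (num_landmarks, C)

def aggregate_video(feature_maps, trajectories):
    # feature_maps: (T, C, H, W) deep features per frame (stand-in for CNN outputs).
    # trajectories: (T, L, 2) tracked landmark positions per frame.
    # Averages features along each landmark trajectory, then concatenates over landmarks.
    per_frame = np.stack([
        sample_features_at_landmarks(fm, lm)
        for fm, lm in zip(feature_maps, trajectories)
    ])                                      # (T, L, C)
    per_landmark = per_frame.mean(axis=0)   # temporal aggregation: (L, C)
    return per_landmark.reshape(-1)         # video descriptor: (L * C,)

# Toy usage with random data standing in for deep features and tracked landmarks.
T, C, H, W, L = 30, 64, 14, 14, 68
descriptor = aggregate_video(
    np.random.randn(T, C, H, W),
    np.random.rand(T, L, 2) * [W, H],
)
print(descriptor.shape)  # (4352,) i.e. 68 landmarks * 64 channels

In practice the pooled descriptor would be fed to a classifier that decides between genuine and unfelt expressions; mean pooling is used here only as the simplest possible temporal aggregation.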
Keywords: affective computing
facial expression recognition
unfelt facial expression of emotion
human behaviour analysis
DOI: 10.1109/TAFFC.2018.2874996
Document type: info:eu-repo/semantics/article
Version: info:eu-repo/semantics/submittedVersion
Issue Date: 9-Jan-2018
Publication license: http://creativecommons.org/licenses/by-nc-nd/3.0/es/  
Appears in Collections: Scientific articles
Articles

Files in This Item:
File: unfeltemotions.pdf (Preprint, 5.1 MB, Adobe PDF)

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.