Please use this identifier to cite or link to this item: http://hdl.handle.net/10609/93201
Full metadata record
DC Field | Value | Language
dc.contributor.author: Kulkarni, Kaustubh
dc.contributor.author: Corneanu, Ciprian
dc.contributor.author: Ofodile, Ikechukwu
dc.contributor.author: Escalera, Sergio
dc.contributor.author: Baró, Xavier
dc.contributor.author: Hyniewska, Sylwia
dc.contributor.author: Allik, Jüri
dc.contributor.author: Anbarjafari, Gholamreza
dc.contributor.other: Universitat Autònoma de Barcelona (UAB)
dc.contributor.other: University of Tartu
dc.contributor.other: Institute of Physiology and Pathology of Hearing
dc.contributor.other: Hasan Kalyoncu University
dc.contributor.other: Universitat Oberta de Catalunya (UOC)
dc.date.accessioned: 2019-04-15T11:37:16Z
dc.date.available: 2019-04-15T11:37:16Z
dc.date.issued: 2018-01-09
dc.identifier.citation: Kulkarni, K., Corneanu, C., Ofodile, I., Escalera Guerrero, S., Baró Solé, X., Hyniewska, S., Allik, J. & Anbarjafari, G. (2018). Automatic recognition of facial displays of unfelt emotions. IEEE Transactions on Affective Computing. doi: 10.1109/TAFFC.2018.2874996
dc.identifier.issn: 1949-3045
dc.identifier.issn: 2371-9850
dc.identifier.uri: http://hdl.handle.net/10609/93201
dc.description.abstract: Humans modify their facial expressions in order to communicate their internal states and sometimes to mislead observers regarding their true emotional states. Evidence in experimental psychology shows that discriminative facial responses are short and subtle. This suggests that such behavior would be easier to distinguish when captured in high resolution at an increased frame rate. We propose SASE-FE, the first dataset of facial expressions that are either congruent or incongruent with underlying emotional states. We show that, overall, the problem of recognizing whether facial movements are expressions of authentic emotions or not can be successfully addressed by learning spatio-temporal representations of the data. For this purpose, we propose a method that aggregates features along fiducial trajectories in a deeply learnt space. Performance of the proposed model shows that, on average, it is easier to distinguish among genuine facial expressions of emotion than among unfelt facial expressions of emotion, and that certain emotion pairs, such as contempt and disgust, are more difficult to distinguish than the rest. Furthermore, the proposed methodology improves state-of-the-art results on the CK+ and OULU-CASIA datasets for video emotion recognition, and achieves competitive results when classifying facial action units on the BP4D dataset. [en]
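The abstract's core idea, aggregating per-frame features along fiducial (facial landmark) trajectories into one fixed-length video descriptor, can be illustrated with a minimal sketch. This is not the paper's actual model: the shapes (T frames, L landmarks, D-dimensional features), the random stand-ins for deeply learnt descriptors, and the displacement-based weighting are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: T frames, L landmarks, D-dim features (illustrative sizes).
T, L, D = 30, 68, 16
landmarks = rng.normal(size=(T, L, 2))  # (x, y) fiducial positions per frame
features = rng.normal(size=(T, L, D))   # per-frame, per-landmark descriptors

# Aggregate each landmark's features along its temporal trajectory,
# weighting frames by the landmark's displacement magnitude so that
# short, subtle motions contribute in proportion to their movement.
disp = np.linalg.norm(np.diff(landmarks, axis=0), axis=-1)  # (T-1, L)
weights = disp / (disp.sum(axis=0, keepdims=True) + 1e-8)   # normalise per landmark
video_descriptor = np.einsum('tl,tld->ld', weights, features[1:]).reshape(-1)

print(video_descriptor.shape)  # one fixed-length vector per video: (L * D,)
```

A classifier (genuine vs. unfelt expression) would then be trained on such fixed-length descriptors, one per video clip.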
dc.language.iso: eng
dc.publisher: IEEE Transactions on Affective Computing
dc.relation.ispartof: IEEE Transactions on Affective Computing, 2018
dc.relation.uri: https://doi.org/10.1109/TAFFC.2018.2874996
dc.rights: CC BY-NC-ND
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/es/
dc.subject: affective computing [en]
dc.subject: facial expression recognition [en]
dc.subject: unfelt facial expression of emotion [en]
dc.subject: human behaviour analysis [en]
dc.subject: computación afectiva [es]
dc.subject: reconocimiento de la expresión facial [es]
dc.subject: expresión facial sin emoción [es]
dc.subject: análisis del comportamiento humano [es]
dc.subject: computació afectiva [ca]
dc.subject: reconeixement d'expressió facial [ca]
dc.subject: expressió facial sense emoció [ca]
dc.subject: anàlisi del comportament humà [ca]
dc.subject.lcsh: Human face recognition (Computer science) [en]
dc.title: Automatic recognition of facial displays of unfelt emotions
dc.type: info:eu-repo/semantics/article
dc.subject.lemac: Reconeixement facial (Informàtica) [ca]
dc.subject.lcshes: Reconocimiento facial (Informática) [es]
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.identifier.doi: 10.1109/TAFFC.2018.2874996
dc.gir.id: AR/0000006549
dc.type.version: info:eu-repo/semantics/submittedVersion
Appears in collections: Articles científics
Articles

Files in this item:
File | Description | Size | Format
unfeltemotions.pdf | Preprint | 5.1 MB | Adobe PDF

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.