Use this identifier to cite or link to this item: http://hdl.handle.net/10609/93198
Full metadata record
DC Field | Value | Language
dc.contributor.authorGuo, Jianzhu-
dc.contributor.authorLei, Zhen-
dc.contributor.authorWan, Jun-
dc.contributor.authorAvots, Egils-
dc.contributor.authorHajarolasvadi, Noushin-
dc.contributor.authorKnyazev, Boris-
dc.contributor.authorKuharenko, Artem-
dc.contributor.authorSilveira Jacques Junior, Julio Cezar-
dc.contributor.authorBaró, Xavier-
dc.contributor.authorDemirel, Hasan-
dc.contributor.authorEscalera, Sergio-
dc.contributor.authorAllik, Jüri-
dc.contributor.authorAnbarjafari, Gholamreza-
dc.contributor.otherChinese Academy of Sciences-
dc.contributor.otherUniversity of Tartu-
dc.contributor.otherEastern Mediterranean University-
dc.contributor.otherNTechLab-
dc.contributor.otherUniversitat de Barcelona (UB)-
dc.contributor.otherUniversitat Oberta de Catalunya (UOC)-
dc.date.accessioned2019-04-15T11:37:15Z-
dc.date.available2019-04-15T11:37:15Z-
dc.date.issued2018-01-01-
dc.identifier.citationGuo, J., Lei, Z., Wan, J., Avots, E., Hajarolasvadi, N., Knyazev, B., Kuharenko, A., Silveira Jacques Junior, J.C., Baró Solé, X., Demirel, H., Escalera Guerrero, S., Allik, J. & Anbarjafari, G. (2018). Dominant and Complementary Emotion Recognition from Still Images of Faces. IEEE Access, 6, 26391-26403. doi: 10.1109/ACCESS.2018.2831927-
dc.identifier.issn2169-3536-
dc.identifier.urihttp://hdl.handle.net/10609/93198-
dc.description.abstractEmotion recognition has a key role in affective computing. Recently, fine-grained emotion analysis, such as compound facial expressions of emotions, has attracted great interest from researchers working on affective computing. A compound facial emotion includes dominant and complementary emotions (e.g., happily-disgusted and sadly-fearful), which is more detailed than the seven classical facial emotions (e.g., happy, disgust, and so on). Existing studies on compound emotions are limited to data sets with a small number of categories and unbalanced data distributions, whose labels were obtained automatically by machine learning-based algorithms, which can lead to inaccuracies. To address these problems, we released the iCV-MEFED data set, which includes 50 classes of compound emotions with labels assessed by psychologists. The task is challenging due to the high similarity of compound facial emotions across different categories. In addition, we organized a challenge based on the proposed iCV-MEFED data set, held at the FG 2017 workshop. In this paper, we analyze the top three winning methods and perform further detailed experiments on the proposed data set. Experiments indicate that pairs of compound emotions (e.g., surprisingly-happy vs. happily-surprised) are more difficult to recognize than the seven basic emotions. We hope the proposed data set can help pave the way for further research on compound facial emotion recognition.en
dc.language.isoeng-
dc.publisherIEEE Access-
dc.relation.ispartofIEEE Access, 2018, 6-
dc.relation.urihttps://doi.org/10.1109/access.2018.2831927-
dc.rights(c) Author/s & (c) Journal-
dc.subjectdominant and complementary emotion recognitionen
dc.subjectfine-grained face emotion dataseten
dc.subjectconjunto de datos de emocioneses
dc.subjectconjunt de dades d'emocionsca
dc.subjectcompound emotionsen
dc.subjectemociones compuestases
dc.subjectemocions compostesca
dc.subjectreconocimiento de emociones dominantes y complementariases
dc.subjectreconeixement d'emocions dominants i complementàriesca
dc.subject.lcshBiometryen
dc.titleDominant and complementary emotion recognition from still images of faces-
dc.typeinfo:eu-repo/semantics/article-
dc.subject.lemacBiometriaca
dc.subject.lcshesBiometríaes
dc.rights.accessRightsinfo:eu-repo/semantics/openAccess-
dc.identifier.doi10.1109/ACCESS.2018.2831927-
dc.gir.idAR/0000006292-
dc.relation.projectIDinfo:eu-repo/grantAgreement/PUT638-
dc.relation.projectIDinfo:eu-repo/grantAgreement/IUT213-
dc.relation.projectIDinfo:eu-repo/grantAgreement/TIN2015-66951-C2-2-R-
dc.relation.projectIDinfo:eu-repo/grantAgreement/TIN2016-74946-P-
dc.relation.projectIDinfo:eu-repo/grantAgreement/H2020-ICT-2015-
dc.relation.projectIDinfo:eu-repo/grantAgreement/2016YFC0801002-
dc.relation.projectIDinfo:eu-repo/grantAgreement/61502491-
dc.relation.projectIDinfo:eu-repo/grantAgreement/61572501-
dc.relation.projectIDinfo:eu-repo/grantAgreement/61572536-
dc.relation.projectIDinfo:eu-repo/grantAgreement/61673052-
dc.relation.projectIDinfo:eu-repo/grantAgreement/61473291-
dc.relation.projectIDinfo:eu-repo/grantAgreement/61773392-
dc.relation.projectIDinfo:eu-repo/grantAgreement/61403405-
dc.relation.projectIDinfo:eu-repo/grantAgreement/116E097-
dc.type.versioninfo:eu-repo/semantics/publishedVersion-
Appears in collections: Scientific articles
Articles

Files in this item:
File | Size | Format
dominantcomplementary.pdf | 8.8 MB | Adobe PDF
Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.