
Conference Paper: Multi-region ensemble convolutional neural network for facial expression recognition

Title: Multi-region ensemble convolutional neural network for facial expression recognition
Authors: Fan, Y; Lam, JCK; Li, VOK
Keywords: Expression recognition; Deep learning; Convolutional Neural Network; Multi-region ensemble
Issue Date: 2018
Publisher: Springer. The Proceedings' web site is located at https://link.springer.com/conference/icann
Citation: European Neural Network Society 27th International Conference on Artificial Neural Networks (ICANN2018), Rhodes, Greece, 4-7 October 2018. In Artificial Neural Networks and Machine Learning – ICANN 2018, pt. 1, p. 84-94
Abstract: Facial expressions play an important role in conveying the emotional states of human beings. Recently, deep learning approaches have been applied to the image recognition field due to the discriminative power of the Convolutional Neural Network (CNN). In this paper, we first propose a novel Multi-Region Ensemble CNN (MRE-CNN) framework for facial expression recognition, which aims to enhance the learning power of CNN models by capturing both the global and the local features from multiple human face sub-regions. Second, the weighted prediction scores from each sub-network are aggregated to produce a final prediction of high accuracy. Third, we investigate the effects of different sub-regions of the whole face on facial expression recognition. Our proposed method is evaluated on two well-known, publicly available facial expression databases, AFEW 7.0 and RAF-DB, and has been shown to achieve state-of-the-art recognition accuracy.
Persistent Identifier: http://hdl.handle.net/10722/263545
ISBN: 9783030014179
ISSN: 0302-9743
2020 SCImago Journal Rankings: 0.249
ISI Accession Number ID: WOS:000463336400009
Series/Report no.: Lecture Notes in Computer Science; v. 11139
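The aggregation step described in the abstract — each sub-network produces class scores, and a weighted combination yields the final prediction — can be sketched as follows. The region names, weights, and 7-class setup below are illustrative assumptions for the sketch, not values taken from the paper.

```python
import numpy as np

NUM_CLASSES = 7  # e.g. a basic-expression setup as in AFEW 7.0 / RAF-DB

def aggregate_predictions(scores, weights):
    """Weighted aggregation of per-sub-network probability scores.

    scores  : list of (NUM_CLASSES,) probability vectors, one per sub-network
              (whole face plus face sub-regions)
    weights : list of floats, one per sub-network; normalized internally
    returns : (final_probs, predicted_class_index)
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                # normalize so the weights sum to 1
    s = np.stack(scores)           # shape: (num_subnets, NUM_CLASSES)
    final = w @ s                  # weighted sum of the score vectors
    return final, int(np.argmax(final))

# Toy example with three hypothetical sub-networks
# (whole face, eyes region, mouth region):
rng = np.random.default_rng(0)
raw = rng.random((3, NUM_CLASSES))
probs = [r / r.sum() for r in raw]           # stand-in softmax outputs
final, label = aggregate_predictions(probs, weights=[0.5, 0.25, 0.25])
print(final, label)
```

Because each input vector sums to 1 and the weights are normalized, the aggregated vector is again a valid probability distribution over the expression classes.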

 

DC Field: Value
dc.contributor.author: Fan, Y
dc.contributor.author: Lam, JCK
dc.contributor.author: Li, VOK
dc.date.accessioned: 2018-10-22T07:40:40Z
dc.date.available: 2018-10-22T07:40:40Z
dc.date.issued: 2018
dc.identifier.citation: European Neural Network Society 27th International Conference on Artificial Neural Networks (ICANN2018), Rhodes, Greece, 4-7 October 2018. In Artificial Neural Networks and Machine Learning – ICANN 2018, pt. 1, p. 84-94
dc.identifier.isbn: 9783030014179
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://hdl.handle.net/10722/263545
dc.description.abstract: Facial expressions play an important role in conveying the emotional states of human beings. Recently, deep learning approaches have been applied to the image recognition field due to the discriminative power of the Convolutional Neural Network (CNN). In this paper, we first propose a novel Multi-Region Ensemble CNN (MRE-CNN) framework for facial expression recognition, which aims to enhance the learning power of CNN models by capturing both the global and the local features from multiple human face sub-regions. Second, the weighted prediction scores from each sub-network are aggregated to produce a final prediction of high accuracy. Third, we investigate the effects of different sub-regions of the whole face on facial expression recognition. Our proposed method is evaluated on two well-known, publicly available facial expression databases, AFEW 7.0 and RAF-DB, and has been shown to achieve state-of-the-art recognition accuracy.
dc.language: eng
dc.publisher: Springer. The Proceedings' web site is located at https://link.springer.com/conference/icann
dc.relation.ispartof: International Conference on Artificial Neural Networks (ICANN2018): Artificial Neural Networks and Machine Learning
dc.relation.ispartofseries: Lecture Notes in Computer Science; v. 11139
dc.subject: Expression recognition
dc.subject: Deep learning
dc.subject: Convolutional Neural Network
dc.subject: Multi-region ensemble
dc.title: Multi-region ensemble convolutional neural network for facial expression recognition
dc.type: Conference_Paper
dc.identifier.email: Fan, Y: yrfan@HKUCC-COM.hku.hk
dc.identifier.email: Lam, JCK: h9992013@hkucc.hku.hk
dc.identifier.email: Li, VOK: vli@eee.hku.hk
dc.identifier.authority: Lam, JCK=rp00864
dc.identifier.authority: Li, VOK=rp00150
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1007/978-3-030-01418-6_9
dc.identifier.scopus: eid_2-s2.0-85054879043
dc.identifier.hkuros: 294326
dc.identifier.hkuros: 306540
dc.identifier.volume: 1
dc.identifier.spage: 84
dc.identifier.epage: 94
dc.identifier.eissn: 1611-3349
dc.identifier.isi: WOS:000463336400009
dc.publisher.place: Cham
dc.identifier.issnl: 0302-9743
