Conference Paper: An explainable deep fusion network for affect recognition using physiological signals

Title: An explainable deep fusion network for affect recognition using physiological signals
Authors: Lin, Jionghao; Pan, Shirui; Lee, Cheng Siong; Oviatt, Sharon
Keywords: Affect recognition; Deep learning; Explainability; Multimodal fusion
Issue Date: 2019
Citation: International Conference on Information and Knowledge Management, Proceedings, 2019, p. 2069-2072
Abstract: Affective computing is an emerging research area that provides insights into a person's mental state through human-machine interaction. During the interaction process, bio-signal analysis is essential for detecting changes in human affect. Machine learning methods for analysing bio-signals are currently the state of the art in detecting affective states, but most empirical works deploy traditional machine learning methods rather than deep learning models because of the need for explainability. In this paper, we propose a deep learning model that processes multimodal, multisensory bio-signals for affect recognition. It supports batch training on signals with different sampling rates at the same time, and our results show significant improvement over the state of the art. Furthermore, the results are interpreted at the sensor and signal levels to improve the explainability of our deep learning model.
Persistent Identifier: http://hdl.handle.net/10722/354145
ISI Accession Number ID: WOS:000539898202014

 

DC Field: Value
dc.contributor.author: Lin, Jionghao
dc.contributor.author: Pan, Shirui
dc.contributor.author: Lee, Cheng Siong
dc.contributor.author: Oviatt, Sharon
dc.date.accessioned: 2025-02-07T08:46:44Z
dc.date.available: 2025-02-07T08:46:44Z
dc.date.issued: 2019
dc.identifier.citation: International Conference on Information and Knowledge Management, Proceedings, 2019, p. 2069-2072
dc.identifier.uri: http://hdl.handle.net/10722/354145
dc.description.abstract: Affective computing is an emerging research area that provides insights into a person's mental state through human-machine interaction. During the interaction process, bio-signal analysis is essential for detecting changes in human affect. Machine learning methods for analysing bio-signals are currently the state of the art in detecting affective states, but most empirical works deploy traditional machine learning methods rather than deep learning models because of the need for explainability. In this paper, we propose a deep learning model that processes multimodal, multisensory bio-signals for affect recognition. It supports batch training on signals with different sampling rates at the same time, and our results show significant improvement over the state of the art. Furthermore, the results are interpreted at the sensor and signal levels to improve the explainability of our deep learning model.
dc.language: eng
dc.relation.ispartof: International Conference on Information and Knowledge Management, Proceedings
dc.subject: Affect recognition
dc.subject: Deep learning
dc.subject: Explainability
dc.subject: Multimodal fusion
dc.title: An explainable deep fusion network for affect recognition using physiological signals
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1145/3357384.3358160
dc.identifier.scopus: eid_2-s2.0-85075478890
dc.identifier.spage: 2069
dc.identifier.epage: 2072
dc.identifier.isi: WOS:000539898202014
