Article: A Novel Method of Emotion Classification and Reconstruction Using VGGNet and StarGAN for Mixed-Reality Interactions in HKU Campusland Metaverse
| Title | A Novel Method of Emotion Classification and Reconstruction Using VGGNet and StarGAN for Mixed-Reality Interactions in HKU Campusland Metaverse |
|---|---|
| Authors | Lau, Adela S.M.; Luan, Jianduo; Cheung, Liege; Ma, Patrick; Lee, Herbert |
| Issue Date | 30-Sep-2025 |
| Publisher | IOS Press |
| Citation | Frontiers in Artificial Intelligence and Applications, 2025, v. 412, p. 399-415 |
| Abstract | Verbal, written, visual, and nonverbal communication are the four main forms of communication. Nonverbal communication, including facial expressions and body gestures, is an effective means of conveying emotion. With the recent growth of the metaverse, more people and companies have started using the metaverse for their business activities. However, communication in the metaverse is currently mainly written, so the environment cannot provide human-like interactions, which prevents participants from joining the metaverse and conducting real-world social and business activities there. To make human interactions in the metaverse more realistic, this research develops a novel method for nonverbal communication in the metaverse. We reviewed current methods of emotion classification and regeneration and propose a novel method of emotion reconstruction in the metaverse that comprises facial emotion detection and reconstruction. We used the FER2013 dataset to train an improved VGG19 CNN with a residual masking network and data augmentation for facial emotion classification, and compared it with a support vector machine baseline. Accuracy improved from 50.2% for the baseline to 70.8% for VGG19 and 88.33% for VGGNet with the residual masking network and data augmentation. Finally, we used the StarGAN model to reconstruct the detected emotion on the avatar’s face in a case study in HKU Campusland. |
| Persistent Identifier | http://hdl.handle.net/10722/366027 |
| ISSN | 0922-6389 |
| 2023 SCImago Journal Rankings | 0.281 |
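
The classification stage described in the abstract fine-tunes a VGG19 CNN on FER2013 face crops. The following is a minimal PyTorch/torchvision sketch of that setup only; `build_vgg19_classifier`, `train_transform`, and the augmentation choices are illustrative assumptions, and the authors' residual masking network and exact augmentation policy are not reproduced here.

```python
# Minimal sketch, assuming PyTorch + torchvision (>= 0.13): fine-tune a VGG19
# backbone on FER2013-style 48x48 grayscale face crops mapped to the seven
# FER2013 emotion classes. The residual masking network used in the paper is
# omitted; the transforms below are illustrative, not the authors' policy.
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_EMOTIONS = 7  # FER2013: angry, disgust, fear, happy, sad, surprise, neutral

# Illustrative augmentation; the record does not specify the paper's policy.
train_transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # VGG19 expects 3-channel input
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),
    transforms.ToTensor(),
])

def build_vgg19_classifier(num_classes: int = NUM_EMOTIONS) -> nn.Module:
    """VGG19 backbone with its final fully connected layer replaced."""
    model = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1)
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)
    return model

if __name__ == "__main__":
    model = build_vgg19_classifier()
    dummy = torch.randn(1, 3, 224, 224)  # one augmented face crop
    logits = model(dummy)
    print(logits.shape)                  # torch.Size([1, 7])
```

Training this classifier with standard cross-entropy on FER2013 would yield the seven-class emotion prediction that the abstract compares against an SVM baseline.
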
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Lau, Adela S.M. | - |
| dc.contributor.author | Luan, Jianduo | - |
| dc.contributor.author | Cheung, Liege | - |
| dc.contributor.author | Ma, Patrick | - |
| dc.contributor.author | Lee, Herbert | - |
| dc.date.accessioned | 2025-11-14T02:41:02Z | - |
| dc.date.available | 2025-11-14T02:41:02Z | - |
| dc.date.issued | 2025-09-30 | - |
| dc.identifier.citation | Frontiers in Artificial Intelligence and Applications, 2025, v. 412, p. 399-415 | - |
| dc.identifier.issn | 0922-6389 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/366027 | - |
| dc.description.abstract | Verbal, written, visual, and nonverbal communication are the four main forms of communication. Nonverbal communication, including facial expressions and body gestures, is an effective means of conveying emotion. With the recent growth of the metaverse, more people and companies have started using the metaverse for their business activities. However, communication in the metaverse is currently mainly written, so the environment cannot provide human-like interactions, which prevents participants from joining the metaverse and conducting real-world social and business activities there. To make human interactions in the metaverse more realistic, this research develops a novel method for nonverbal communication in the metaverse. We reviewed current methods of emotion classification and regeneration and propose a novel method of emotion reconstruction in the metaverse that comprises facial emotion detection and reconstruction. We used the FER2013 dataset to train an improved VGG19 CNN with a residual masking network and data augmentation for facial emotion classification, and compared it with a support vector machine baseline. Accuracy improved from 50.2% for the baseline to 70.8% for VGG19 and 88.33% for VGGNet with the residual masking network and data augmentation. Finally, we used the StarGAN model to reconstruct the detected emotion on the avatar’s face in a case study in HKU Campusland. | - |
| dc.language | eng | - |
| dc.publisher | IOS Press | - |
| dc.relation.ispartof | Frontiers in Artificial Intelligence and Applications | - |
| dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
| dc.title | A Novel Method of Emotion Classification and Reconstruction Using VGGNet and StarGAN for Mixed-Reality Interactions in HKU Campusland Metaverse | - |
| dc.type | Article | - |
| dc.identifier.doi | 10.3233/FAIA250738 | - |
| dc.identifier.volume | 412 | - |
| dc.identifier.spage | 399 | - |
| dc.identifier.epage | 415 | - |
| dc.identifier.eissn | 1535-6698 | - |
| dc.identifier.issnl | 0922-6389 | - |
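
The reconstruction stage described in the abstract uses StarGAN to transfer the detected emotion onto the avatar’s face. The sketch below shows only the StarGAN-style conditioning mechanics (spatially replicating a one-hot target label and concatenating it with the input image); `EmotionGenerator` is a hypothetical stand-in for a trained StarGAN generator, not the authors' model.

```python
# Minimal sketch, assuming PyTorch: condition a StarGAN-style generator on the
# emotion class predicted by the classifier so the expression can be rendered
# on an avatar face. EmotionGenerator is a hypothetical placeholder; a real
# StarGAN generator uses down-sampling, residual blocks, and up-sampling.
import torch
import torch.nn as nn

NUM_EMOTIONS = 7

class EmotionGenerator(nn.Module):
    """Hypothetical placeholder for a trained StarGAN generator."""
    def __init__(self, num_domains: int = NUM_EMOTIONS):
        super().__init__()
        # A single conv keeps the sketch self-contained and runnable.
        self.net = nn.Conv2d(3 + num_domains, 3, kernel_size=7, padding=3)

    def forward(self, image: torch.Tensor, target_label: torch.Tensor) -> torch.Tensor:
        # Replicate the one-hot emotion label over the spatial dimensions and
        # concatenate it with the image channels, as in StarGAN's conditioning.
        b, _, h, w = image.shape
        label_map = target_label.view(b, -1, 1, 1).expand(b, target_label.size(1), h, w)
        return torch.tanh(self.net(torch.cat([image, label_map], dim=1)))

if __name__ == "__main__":
    avatar_face = torch.randn(1, 3, 128, 128)   # avatar face render
    detected_emotion = torch.tensor([3])        # e.g. class predicted by the classifier
    one_hot = torch.zeros(1, NUM_EMOTIONS).scatter_(1, detected_emotion.unsqueeze(1), 1.0)
    reconstructed = EmotionGenerator()(avatar_face, one_hot)
    print(reconstructed.shape)                  # torch.Size([1, 3, 128, 128])
```

In the pipeline outlined in the abstract, the classifier's predicted class would supply `detected_emotion`, and a generator trained in the StarGAN framework would produce the avatar's updated expression for HKU Campusland.
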

