A Comparative Analysis Between Real Human and Virtual Human Interactions in an Academic Learning Context Using Emotion Recognition
- Authors
- Sardar, Suman Kalyan; Cha, Min Chul; Lee, Seul Chan
- Issue Date
- Jun-2025
- Publisher
- TAYLOR & FRANCIS INC
- Keywords
- Virtual human; emotion analysis; academic learning; convolutional neural networks; human-avatar interactions
- Citation
- INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, pp. 1-10
- Pages
- 10
- Indexed
- SCIE; SSCI; SCOPUS
- Journal Title
- INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION
- Start Page
- 1
- End Page
- 10
- URI
- https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/125677
- DOI
- 10.1080/10447318.2025.2512526
- ISSN
- 1044-7318; 1532-7590
- Abstract
- In today's academic landscape, understanding learners' emotional responses during academic learning is important for improving learning outcomes. This study presents a comparative analysis of facial emotions elicited by interactions with a real human (RH) and a virtual human (VH) in the context of online academic learning. Facial video data were collected from participants engaged in both RH and VH learning sessions. Facial landmarks were extracted using the MediaPipe Face Mesh model, and six emotional states were mapped from computed action unit (AU) scores. A convolutional neural network (CNN) trained on the FER-2013 and Extended CK+ datasets was used to classify the six facial emotional states in the acquired dataset. Emotion intensity was computed from the AU scores for each detected state. Results revealed that happiness and surprise intensities were significantly higher during VH interactions than during RH interactions, and an ANOVA confirmed statistically significant differences in emotional intensity between the two conditions.
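The abstract's pipeline step of mapping AU scores to emotion intensities can be sketched as follows. The paper does not disclose its exact AU-to-emotion table or aggregation rule, so this minimal Python sketch assumes a common FACS-based association of action units with the six basic emotions and uses a simple mean of the relevant AU scores as the intensity; both choices are illustrative assumptions, not the authors' method.

```python
# Hypothetical AU-score -> emotion-intensity mapping (assumption: the AU
# sets below follow common FACS-based associations; the paper's exact
# mapping and aggregation are not specified).

# Action units commonly associated with each basic emotion.
EMOTION_AUS = {
    "happiness": [6, 12],
    "surprise": [1, 2, 5, 26],
    "sadness": [1, 4, 15],
    "anger": [4, 5, 7, 23],
    "fear": [1, 2, 4, 5, 7, 20, 26],
    "disgust": [9, 15, 16],
}

def emotion_intensities(au_scores):
    """Map per-frame AU scores (dict: AU number -> score in [0, 1])
    to one intensity per emotion (mean of that emotion's AU scores;
    missing AUs count as 0.0)."""
    intensities = {}
    for emotion, aus in EMOTION_AUS.items():
        vals = [au_scores.get(au, 0.0) for au in aus]
        intensities[emotion] = sum(vals) / len(vals)
    return intensities

def dominant_emotion(au_scores):
    """Return the emotion with the highest computed intensity."""
    scores = emotion_intensities(au_scores)
    return max(scores, key=scores.get)

# Example frame: strong AU6 (cheek raiser) and AU12 (lip corner puller),
# the classic happiness signature.
frame = {6: 0.8, 12: 0.9, 1: 0.1}
print(dominant_emotion(frame))  # -> happiness
```

In the study's setting, intensities computed this way per frame could then be averaged over each RH or VH session before the ANOVA comparison.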
- Appears in Collections
- COLLEGE OF COMPUTING > SCHOOL OF MEDIA, CULTURE, AND DESIGN TECHNOLOGY > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.