
Emotion and Body Movement: A Comparative Study of Automatic Emotion Recognition Using Body Motions

Full metadata record
dc.contributor.author: Cho, Youngwug
dc.contributor.author: Jung, Myeongul
dc.contributor.author: Kim, Kwanguk
dc.date.accessioned: 2023-02-21T06:04:54Z
dc.date.available: 2023-02-21T06:04:54Z
dc.date.created: 2023-02-08
dc.date.issued: 2022-10
dc.identifier.issn: 2771-1102
dc.identifier.uri: https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/182408
dc.description.abstract: Emotion recognition through body movement, in both real and virtual worlds, is an important research topic alongside facial expression and voice recognition. Computational methods for recognizing emotions from body movement have been developed using skeletal data and motion capture systems, and 2D and 3D pose estimation methods have recently been proposed. Although each of these methodologies has advantages and disadvantages, they have not been compared on the same data. In this study, we collected seven types of motion data associated with specified emotional states (happiness, sadness, anger, disgust, fear, surprise, and a neutral emotion) from 25 participants. We compared three methodologies (motion capture, 2D pose estimation, and 3D pose estimation), with human evaluation as a baseline. The results show that motion capture achieved the highest performance, and 2D and 3D pose estimation also performed relatively well compared with the human evaluators' results. These findings suggest that the existing methodologies can be utilized to perform emotion recognition.
dc.language: English
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.title: Emotion and Body Movement: A Comparative Study of Automatic Emotion Recognition Using Body Motions
dc.type: Article
dc.contributor.affiliatedAuthor: Kim, Kwanguk
dc.identifier.doi: 10.1109/ISMAR-Adjunct57072.2022.00162
dc.identifier.scopusid: 2-s2.0-85146051394
dc.identifier.wosid: 000918030200151
dc.identifier.bibliographicCitation: Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022, pp. 768-771
dc.relation.isPartOf: Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
dc.citation.title: Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
dc.citation.startPage: 768
dc.citation.endPage: 771
dc.type.rims: ART
dc.type.docType: Proceedings Paper
dc.description.journalClass: 1
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Imaging Science & Photographic Technology
dc.relation.journalWebOfScienceCategory: Computer Science, Cybernetics
dc.relation.journalWebOfScienceCategory: Computer Science, Software Engineering
dc.relation.journalWebOfScienceCategory: Imaging Science & Photographic Technology
dc.subject.keywordPlus: Deep learning
dc.subject.keywordPlus: Human computer interaction
dc.subject.keywordPlus: Speech recognition
dc.subject.keywordPlus: Virtual reality
dc.subject.keywordPlus: Emotion recognition
dc.subject.keywordPlus: Computing methodologies - Artificial intelligence
dc.subject.keywordPlus: Emotion
dc.subject.keywordPlus: Human-centered computing
dc.subject.keywordPlus: Human-centered computing - Human computer interaction - Interaction paradigms - Virtual reality
dc.subject.keywordPlus: Human-centered computing - Human computer interaction - Interaction paradigms - Mixed/augmented reality
dc.subject.keywordPlus: Human-centered computing - Human computer interaction - Interaction techniques - Gestural input
dc.subject.keywordPlus: Interaction paradigm
dc.subject.keywordPlus: Motion capture
dc.subject.keywordPlus: Pose estimation
dc.subject.keywordAuthor: Computing methodologies - Artificial intelligence
dc.subject.keywordAuthor: Deep learning
dc.subject.keywordAuthor: Emotion
dc.subject.keywordAuthor: Human-centered computing - Human computer interaction (HCI) - Interaction paradigms - Virtual reality
dc.subject.keywordAuthor: Human-centered computing - Human computer interaction (HCI) - Interaction paradigms - Mixed/augmented reality
dc.subject.keywordAuthor: Human-centered computing - Human computer interaction (HCI) - Interaction techniques - Gestural input
dc.subject.keywordAuthor: Motion capture
dc.subject.keywordAuthor: Pose estimation
dc.identifier.url: https://ieeexplore.ieee.org/document/9974493
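
The abstract describes classifying seven emotions from body-motion data captured as skeletal keypoints. As a rough illustration of that pipeline (not the authors' method), the sketch below extracts simple kinematic features from a 2D pose-keypoint sequence and applies a nearest-centroid classifier; the feature choices, the 17-joint layout, and the synthetic data are all assumptions for demonstration.

```python
# Minimal sketch of pose-based emotion classification (illustrative only).
# Everything below uses synthetic data; real systems would use labeled
# motion-capture or pose-estimation output.
import numpy as np

# The seven emotional states named in the abstract.
EMOTIONS = ["happiness", "sadness", "anger", "disgust",
            "fear", "surprise", "neutral"]

def kinematic_features(seq):
    """seq: (frames, joints, 2) array of 2D keypoints.
    Returns a small feature vector: mean joint speed, speed variance,
    and mean pose spread (illustrative feature choices)."""
    vel = np.diff(seq, axis=0)              # per-frame joint displacement
    speed = np.linalg.norm(vel, axis=2)     # (frames-1, joints)
    spread = seq.std(axis=1).mean()         # how "expanded" the pose is
    return np.array([speed.mean(), speed.var(), spread])

rng = np.random.default_rng(0)

# Synthetic training clips: 5 per emotion, 60 frames, 17 joints (COCO-style).
X, y = [], []
for label in range(len(EMOTIONS)):
    for _ in range(5):
        seq = rng.normal(scale=1.0 + 0.2 * label, size=(60, 17, 2))
        X.append(kinematic_features(seq))
        y.append(label)
X, y = np.stack(X), np.array(y)

# Nearest-centroid classifier over the feature vectors.
centroids = np.stack([X[y == k].mean(axis=0) for k in range(len(EMOTIONS))])

def predict(seq):
    """Assign the emotion whose feature centroid is closest."""
    f = kinematic_features(seq)
    return EMOTIONS[int(np.argmin(np.linalg.norm(centroids - f, axis=1)))]

test_clip = rng.normal(scale=1.0, size=(60, 17, 2))
print(predict(test_clip))  # one of the seven emotion labels
```

In the paper's setting, the same skeleton of a pipeline would be fed by motion capture, 2D pose estimation, or 3D pose estimation, which is what makes a like-for-like comparison on shared data possible.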
Appears in Collections:
College of Engineering (Seoul) > School of Computer Software (Seoul) > 1. Journal Articles



Related Researcher

Kim, Kwanguk, College of Engineering (School of Computer Science)
