Study of emotion recognition based on facial image for emotional rehabilitation biofeedback
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ko, K.-E. | - |
dc.contributor.author | Sim, K.-B. | - |
dc.date.available | 2019-05-30T02:32:39Z | - |
dc.date.issued | 2010-10 | - |
dc.identifier.issn | 1976-5622 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/22755 | - |
dc.description.abstract | To recognize human emotion from facial images, we must first extract emotional features from the face using a feature extraction algorithm, and then classify the emotional state with a pattern classification method. The AAM (Active Appearance Model) is a well-known method for representing non-rigid objects such as faces and facial expressions. The Bayesian network is a probability-based classifier that can represent the probabilistic relationships among a set of facial features. In this paper, we propose a facial feature extraction method that combines AAM with FACS (Facial Action Coding System) to automatically model and extract facial emotional features. To recognize facial emotion, we use DBNs (Dynamic Bayesian Networks) to model and interpret the temporal phases of facial expressions in image sequences. The recognition results can be used in biofeedback-based rehabilitation for the emotionally disabled. © ICROS 2010. | - |
dc.format.extent | 6 | - |
dc.language | Korean | - |
dc.language.iso | KOR | - |
dc.publisher | Institute of Control, Robotics and Systems (ICROS) | - |
dc.title | Study of emotion recognition based on facial image for emotional rehabilitation biofeedback | - |
dc.title.alternative | Study of Emotion Recognition based on Facial Image for Emotional Rehabilitation Biofeedback | - |
dc.type | Article | - |
dc.identifier.doi | 10.5302/J.ICROS.2010.16.10.957 | - |
dc.identifier.bibliographicCitation | Journal of Institute of Control, Robotics and Systems, v.16, no.10, pp 957 - 962 | - |
dc.identifier.kciid | ART001483976 | - |
dc.description.isOpenAccess | N | - |
dc.identifier.scopusid | 2-s2.0-84860242232 | - |
dc.citation.endPage | 962 | - |
dc.citation.number | 10 | - |
dc.citation.startPage | 957 | - |
dc.citation.title | Journal of Institute of Control, Robotics and Systems | - |
dc.citation.volume | 16 | - |
dc.type.docType | Article | - |
dc.publisher.location | Republic of Korea | - |
dc.subject.keywordAuthor | Active appearance model | - |
dc.subject.keywordAuthor | Dynamic Bayesian network | - |
dc.subject.keywordAuthor | Facial action coding system | - |
dc.subject.keywordAuthor | Facial emotion recognition | - |
dc.subject.keywordAuthor | Facial feature extraction | - |
dc.subject.keywordPlus | Active appearance models | - |
dc.subject.keywordPlus | Dynamic Bayesian networks | - |
dc.subject.keywordPlus | Emotion recognition | - |
dc.subject.keywordPlus | Facial Action Coding System | - |
dc.subject.keywordPlus | Facial emotions | - |
dc.subject.keywordPlus | Facial Expressions | - |
dc.subject.keywordPlus | Facial feature | - |
dc.subject.keywordPlus | Facial feature extraction | - |
dc.subject.keywordPlus | Facial images | - |
dc.subject.keywordPlus | Feature extraction algorithms | - |
dc.subject.keywordPlus | Feature extraction methods | - |
dc.subject.keywordPlus | Image sequence | - |
dc.subject.keywordPlus | Non-rigid objects | - |
dc.subject.keywordPlus | Probability-based classifier | - |
dc.subject.keywordPlus | Bayesian networks | - |
dc.subject.keywordPlus | Biofeedback | - |
dc.subject.keywordPlus | Face recognition | - |
dc.subject.keywordPlus | Feature extraction | - |
dc.subject.keywordPlus | Gesture recognition | - |
dc.subject.keywordPlus | Computer keyboards | - |
dc.description.journalRegisteredClass | scopus | - |
dc.description.journalRegisteredClass | kci | - |
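The abstract describes classifying the temporal phases of facial expressions with dynamic Bayesian networks. As an illustrative sketch only (not the paper's implementation), a hidden Markov model — the simplest dynamic Bayesian network — can score a sequence of discrete facial-feature observations against per-emotion temporal models via the forward algorithm. All emotion labels, state counts, and probabilities below are hypothetical placeholders.

```python
import math

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under an HMM.

    pi:  initial state probabilities, length n
    A:   n x n state-transition matrix (rows sum to 1)
    B:   n x m emission matrix (state -> observation symbol)
    obs: sequence of observation symbol indices
    """
    n = len(pi)
    # Initialize forward variables with the first observation.
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    s = sum(alpha)
    log_like = math.log(s)
    alpha = [a / s for a in alpha]  # rescale to avoid underflow
    for o in obs[1:]:
        # Propagate one time step: transition, then emit.
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][o]
                 for i in range(n)]
        s = sum(alpha)
        log_like += math.log(s)
        alpha = [a / s for a in alpha]
    return log_like

def classify(models, obs):
    """Pick the emotion whose temporal model best explains the sequence."""
    return max(models, key=lambda name: forward_log_likelihood(*models[name], obs))

# Hypothetical two-state models over two observation symbols:
# the "happy" model tends to emit symbol 0, the "sad" model symbol 1.
pi = [0.5, 0.5]
A = [[0.9, 0.1], [0.1, 0.9]]
models = {
    "happy": (pi, A, [[0.9, 0.1], [0.8, 0.2]]),
    "sad":   (pi, A, [[0.2, 0.8], [0.1, 0.9]]),
}
```

A full DBN, as used in the paper, generalizes this by factoring each time slice into multiple facial-feature variables (e.g. FACS action units) rather than a single hidden state.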