Face Detection and Emotion Classification Using Single Deep Convolutional Network for Facial Emotion Recognition
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Shim, H.R. | - |
dc.contributor.author | Sim, K.-B. | - |
dc.date.available | 2019-05-28T03:32:49Z | - |
dc.date.issued | 2019 | - |
dc.identifier.issn | 1976-5622 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/18482 | - |
dc.description.abstract | In this paper, we propose a facial expression recognition system using a deep convolutional network. Previous works used the facial action coding system (FACS) to classify emotions; consequently, those systems consist of a face detector, a feature extractor, a facial action classifier, and an emotional state classifier in series. In contrast, the proposed system is a simplified emotion recognition system that performs face detection and emotion classification in parallel. Moreover, the model was trained without any prior knowledge of FACS. We evaluated its performance on four different databases. Our main contributions are twofold: 1) Our simplified facial expression recognition system processes images in real time. 2) Our model was trained to classify facial expressions without any action unit (AU) related information. The proposed method achieved a classification accuracy of 98.6% on six basic emotions and a neutral state from faces with five different angles. The experimental results showed that the deep convolutional network could classify emotional states from a multi-angle facial expression database and various other facial expression databases without the use of hand-crafted features. © ICROS 2019. | - |
dc.format.extent | 7 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Institute of Control, Robotics and Systems | - |
dc.title | Face Detection and Emotion Classification Using Single Deep Convolutional Network for Facial Emotion Recognition | - |
dc.title.alternative | 얼굴 감정 인식을 위한 단일 딥 컨볼루션 네트워크를 이용한 얼굴 검출 및 감정 분류 | - |
dc.type | Article | - |
dc.identifier.doi | 10.5302/J.ICROS.2019.18.0110 | - |
dc.identifier.bibliographicCitation | Journal of Institute of Control, Robotics and Systems, v.25, no.1, pp 49 - 55 | - |
dc.identifier.kciid | ART002429748 | - |
dc.description.isOpenAccess | N | - |
dc.identifier.scopusid | 2-s2.0-85059688474 | - |
dc.citation.endPage | 55 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 49 | - |
dc.citation.title | Journal of Institute of Control, Robotics and Systems | - |
dc.citation.volume | 25 | - |
dc.type.docType | Article | - |
dc.publisher.location | Republic of Korea | - |
dc.subject.keywordAuthor | Deep convolutional network | - |
dc.subject.keywordAuthor | Emotion recognition | - |
dc.subject.keywordAuthor | Facial expression recognition | - |
dc.subject.keywordPlus | Classification (of information) | - |
dc.subject.keywordPlus | Computer keyboards | - |
dc.subject.keywordPlus | Convolution | - |
dc.subject.keywordPlus | Database systems | - |
dc.subject.keywordPlus | Speech recognition | - |
dc.subject.keywordPlus | Classification accuracy | - |
dc.subject.keywordPlus | Convolutional networks | - |
dc.subject.keywordPlus | Emotion classification | - |
dc.subject.keywordPlus | Emotion recognition | - |
dc.subject.keywordPlus | Facial Action Coding System | - |
dc.subject.keywordPlus | Facial expression recognition | - |
dc.subject.keywordPlus | Facial Expressions | - |
dc.subject.keywordPlus | Feature extractor | - |
dc.subject.keywordPlus | Face recognition | - |
dc.description.journalRegisteredClass | scopus | - |
dc.description.journalRegisteredClass | kci | - |
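The abstract's central idea is a single network whose shared features feed two parallel output heads, one for face detection and one for emotion classification, replacing the serial FACS pipeline (detector → feature extractor → AU classifier → emotion classifier). The following is only an illustrative sketch of that parallel-head structure using random weights and made-up layer sizes; it is not the authors' architecture, whose actual layers and training details are described in the paper itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Shared backbone (a stand-in for the convolutional layers):
# a flattened 48x48 grayscale image mapped to a feature vector.
# All sizes here are arbitrary, chosen for illustration only.
W_shared = rng.normal(scale=0.01, size=(48 * 48, 128))

# Two heads operating in parallel on the same shared features:
W_face = rng.normal(scale=0.01, size=(128, 2))     # face / no-face
W_emotion = rng.normal(scale=0.01, size=(128, 7))  # 6 basic emotions + neutral

def forward(image):
    feats = relu(image.reshape(-1) @ W_shared)  # shared representation
    face_prob = softmax(feats @ W_face)         # detection head
    emotion_prob = softmax(feats @ W_emotion)   # classification head
    return face_prob, emotion_prob

face_p, emo_p = forward(rng.random((48, 48)))
```

Because both heads read the same feature vector, one forward pass yields both outputs, which is what allows the simplified system to run detection and classification in parallel rather than as sequential stages.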
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.