Detailed Information

Cited 1 time in Web of Science; cited 1 time in Scopus

Role of Zoning in Facial Expression Using Deep Learning

Authors
Shahzad, Taimur; Iqbal, Khalid; Khan, Murad Ali; Imran; Iqbal, Naeem
Issue Date
Feb-2023
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Keywords
CK+; Computer vision; deep learning; FER2013; zoning
Citation
IEEE Access, v.11, pp.16493 - 16508
Journal Title
IEEE Access
Volume
11
Start Page
16493
End Page
16508
URI
https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/87142
DOI
10.1109/access.2023.3243850
ISSN
2169-3536
Abstract
Facial expressions are an unspoken message essential to collaboration and effective discourse. A person's inner emotional state is conveyed through facial expressions, making them highly effective for communicating genuine emotion. Anger, happiness, sadness, contempt, surprise, fear, disgust, and neutral are eight common human expressions. The scientific community has proposed several facial emotion recognition techniques. However, because deep learning models are given few face landmarks and little information about their intensity, the accuracy of facial expression recognition still leaves room for improvement. This study proposes zoning-based facial expression recognition (ZFER), which locates additional face landmarks to perceive deep facial emotion intensity through zoning. After face extraction, landmarks such as the eyes, eyebrows, nose, forehead, and mouth are extracted from the face. Each landmark is then zoned into four regions, and the zone-based face landmarks are passed to a VGG-16 model to generate a feature map. Finally, the feature map is fed to a fully connected neural network (FCNN) that classifies facial emotions into multiple classes. Experiments on the FER2013 and CK+ datasets compare the proposed model against state-of-the-art facial expression recognition approaches using performance metrics such as accuracy. The proposed method with face features achieves 98.4% accuracy on CK+ and 65% on FER2013. With zoning, accuracy on the CK+ dataset improves from 98.47% to 98.74%.
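The zoning step in the abstract (splitting each extracted landmark region into four zones before the VGG-16 stage) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each landmark patch is a 2-D grayscale array and that "four regions" means a 2×2 quadrant split; the `zone_patch` function name and the even-dimension cropping are assumptions for this sketch.

```python
import numpy as np

def zone_patch(patch):
    """Split a 2-D landmark patch into four quadrants (2x2 zoning).

    Crops one trailing row/column if a dimension is odd, so all
    four zones have identical shape.
    """
    h, w = patch.shape
    h2, w2 = h // 2, w // 2
    patch = patch[: 2 * h2, : 2 * w2]  # crop to even dimensions
    return [
        patch[:h2, :w2],  # top-left zone
        patch[:h2, w2:],  # top-right zone
        patch[h2:, :w2],  # bottom-left zone
        patch[h2:, w2:],  # bottom-right zone
    ]

# Example: an 8x8 "mouth" patch yields four 4x4 zones, which would
# then be passed on to the feature-extraction backbone.
mouth = np.arange(64, dtype=np.float32).reshape(8, 8)
zones = zone_patch(mouth)
print(len(zones), zones[0].shape)
```

In a full pipeline, each zone (for every landmark) would be resized and stacked before being fed to VGG-16; the quadrant split shown here is one plausible reading of the paper's four-region zoning.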
Files in This Item
There are no files associated with this item.
Appears in Collections
ETC > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Imran
College of IT Convergence (Department of Biomedical Engineering)
