Emotion detection from handwriting and drawing samples using an attention-based transformer model
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Khan, Zohaib Ahmad | - |
dc.contributor.author | Xia, Yuanqing | - |
dc.contributor.author | Aurangzeb, Khursheed | - |
dc.contributor.author | Khaliq, Fiza | - |
dc.contributor.author | Alam, Mahmood | - |
dc.contributor.author | Khan, Javed Ali | - |
dc.contributor.author | Anwar, Muhammad Shahid | - |
dc.date.accessioned | 2024-05-07T13:00:20Z | - |
dc.date.available | 2024-05-07T13:00:20Z | - |
dc.date.issued | 2024-03 | - |
dc.identifier.issn | 2376-5992 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/91132 | - |
dc.description.abstract | Emotion detection (ED) involves identifying and understanding an individual's emotional state through cues such as facial expressions, voice tone, physiological changes, and behavioral patterns. In this context, behavioral analysis observes actions and behaviors for emotional interpretation. This work employs behavioral metrics, specifically drawing and handwriting, to determine a person's emotional state, recognizing these actions as physical functions that integrate motor and cognitive processes. The study proposes an attention-based transformer model as an innovative approach to identifying emotions from handwriting and drawing samples, thereby extending ED into the domains of fine motor skills and artistic expression. The raw data consist of a sequence of points corresponding to the handwriting or drawing strokes. Each stroke point is fed to the attention-based transformer model, which embeds it into a high-dimensional vector space. Using self-attention, the model integrates the most informative components and patterns across the input sequence to predict the emotional state of the person who produced the sample. The proposed approach has a distinct advantage over conventional recurrent neural networks (RNNs) in its enhanced capacity to capture long-range correlations, making it particularly well suited to identifying emotions from handwriting and drawing samples and marking a notable advance in emotion detection. The proposed method achieved state-of-the-art accuracy of 92.64% on the EMOTHAW (Emotion Recognition via Handwriting and Drawing) benchmark dataset. | - |
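The abstract's pipeline (embed each stroke point into a high-dimensional vector, mix information across the sequence with self-attention, then predict an emotional state) can be sketched minimally in numpy. This is an illustrative sketch only: the dimensions, random weights, pooling choice, and the assumed four emotion classes are not taken from the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of embeddings."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # (T, T) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # (T, d) attended output

# 16 stroke points with (x, y, pressure) features -- dimensions are assumed.
T, d_in, d = 16, 3, 8
strokes = rng.normal(size=(T, d_in))

# Embed stroke points into a d-dimensional vector space.
W_embed = rng.normal(size=(d_in, d))
X = strokes @ W_embed

# One self-attention layer lets every point attend to the whole sequence,
# which is how long-range correlations between strokes are captured.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
H = self_attention(X, Wq, Wk, Wv)

# Pool to a sequence-level representation and classify (4 classes assumed).
pooled = H.mean(axis=0)
W_cls = rng.normal(size=(d, 4))
logits = pooled @ W_cls
predicted = int(np.argmax(logits))                   # predicted emotion index
```

Unlike an RNN, which must propagate information step by step, the attention weights here connect any two stroke points directly, which is the long-range advantage the abstract refers to.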
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | PEERJ INC | - |
dc.title | Emotion detection from handwriting and drawing samples using an attention-based transformer model | - |
dc.type | Article | - |
dc.identifier.wosid | 001194752800004 | - |
dc.identifier.doi | 10.7717/peerj-cs.1887 | - |
dc.identifier.bibliographicCitation | PEERJ COMPUTER SCIENCE, v.10 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.scopusid | 2-s2.0-85190278271 | - |
dc.citation.title | PEERJ COMPUTER SCIENCE | - |
dc.citation.volume | 10 | - |
dc.type.docType | Article | - |
dc.publisher.location | United Kingdom | - |
dc.subject.keywordAuthor | Emotional state recognition | - |
dc.subject.keywordAuthor | Handwriting/Drawing analysis | - |
dc.subject.keywordAuthor | Behavioral biometrics | - |
dc.subject.keywordAuthor | Emotion detection | - |
dc.subject.keywordAuthor | Human-computer Interaction | - |
dc.subject.keywordAuthor | Emotional intelligence | - |
dc.subject.keywordAuthor | Transformer model | - |
dc.subject.keywordPlus | STATE RECOGNITION | - |
dc.subject.keywordPlus | FUSION | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Theory & Methods | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.