Detailed Information

Cited 21 times in Web of Science; cited 34 times in Scopus

Spatio-temporal representation of an electroencephalogram for emotion recognition using a three-dimensional convolutional neural network

Full metadata record
DC Field: Value
dc.contributor.author: Cho J.
dc.contributor.author: Hwang H.
dc.date.available: 2020-08-10T00:36:28Z
dc.date.created: 2020-06-29
dc.date.issued: 2020-06
dc.identifier.issn: 1424-8220
dc.identifier.uri: https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/76190
dc.description.abstract: Emotion recognition plays an important role in the field of human–computer interaction (HCI). An electroencephalogram (EEG) is widely used to estimate human emotion owing to its convenience and mobility. Deep neural network (DNN) approaches using an EEG for emotion recognition have recently shown remarkable improvement in recognition accuracy. However, most studies in this field still require a separate process for extracting handcrafted features, despite the ability of a DNN to extract meaningful features by itself. In this paper, we propose a novel method for recognizing emotion based on three-dimensional convolutional neural networks (3D CNNs) with an efficient spatio-temporal representation of EEG signals. First, we spatially reconstruct raw EEG signals, represented as stacks of one-dimensional (1D) time series, into two-dimensional (2D) EEG frames according to the original electrode positions. We then form a 3D EEG stream by concatenating the 2D EEG frames along the time axis. These 3D reconstructions of the raw EEG signals can be efficiently combined with 3D CNNs, which have shown remarkable feature representation for spatio-temporal data. We demonstrate the emotion classification accuracy of the proposed method through extensive experiments on the DEAP (Dataset for Emotion Analysis using EEG, Physiological, and video signals) dataset. Experimental results show that the proposed method achieves classification accuracies of 99.11%, 99.74%, and 99.73% for the binary classification of valence, the binary classification of arousal, and four-class classification, respectively. We investigate the spatio-temporal effectiveness of the proposed method by comparing it to several types of input representations with 2D/3D CNNs. We then experimentally determine the best-performing shapes of both the kernel and the input data. We verify that an efficient representation of an EEG, together with a network that fully takes advantage of the data characteristics, can outperform methods that rely on handcrafted features. © 2020 by the authors.
dc.language: English
dc.language.iso: en
dc.publisher: MDPI AG
dc.relation.isPartOf: Sensors (Switzerland)
dc.title: Spatio-temporal representation of an electroencephalogram for emotion recognition using a three-dimensional convolutional neural network
dc.type: Article
dc.type.rims: ART
dc.description.journalClass: 1
dc.identifier.wosid: 000553112500001
dc.identifier.doi: 10.3390/s20123491
dc.identifier.bibliographicCitation: Sensors (Switzerland), v.20, no.12, pp.1 - 18
dc.description.isOpenAccess: N
dc.identifier.scopusid: 2-s2.0-85086758492
dc.citation.endPage: 18
dc.citation.startPage: 1
dc.citation.title: Sensors (Switzerland)
dc.citation.volume: 20
dc.citation.number: 12
dc.contributor.affiliatedAuthor: Cho J.
dc.contributor.affiliatedAuthor: Hwang H.
dc.type.docType: Article
dc.subject.keywordAuthor: Convolutional neural network
dc.subject.keywordAuthor: DEAP
dc.subject.keywordAuthor: EEG
dc.subject.keywordAuthor: Emotion recognition
dc.subject.keywordAuthor: Three-dimensional CNN
dc.subject.keywordPlus: Classification (of information)
dc.subject.keywordPlus: Convolution
dc.subject.keywordPlus: Convolutional neural networks
dc.subject.keywordPlus: Deep neural networks
dc.subject.keywordPlus: Electroencephalography
dc.subject.keywordPlus: Human computer interaction
dc.subject.keywordPlus: One dimensional
dc.subject.keywordPlus: Speech recognition
dc.subject.keywordPlus: Binary classification
dc.subject.keywordPlus: Classification accuracy
dc.subject.keywordPlus: Computer interaction
dc.subject.keywordPlus: Electro-encephalogram (EEG)
dc.subject.keywordPlus: Emotional classification
dc.subject.keywordPlus: Feature representation
dc.subject.keywordPlus: Spatio-temporal data
dc.subject.keywordPlus: Two Dimensional (2 D)
dc.subject.keywordPlus: Biomedical signal processing
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
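
As a companion to the abstract above, the following is a minimal sketch of how the described spatio-temporal reconstruction and 3D CNN input could look in code. The 9x9 scalp grid, the partial channel-to-position mapping, the toy network, and all tensor shapes are illustrative assumptions, not the authors' actual electrode layout or architecture; consult the full paper (doi: 10.3390/s20123491) for the exact configuration.

```python
# Illustrative sketch only: grid size, channel mapping, and network are assumptions.
import numpy as np
import torch
import torch.nn as nn

# Hypothetical placement of DEAP channels on a 9x9 scalp grid: channel index -> (row, col).
# Only a few entries are shown; the paper defines the real layout for all 32 channels.
ELECTRODE_GRID = {0: (0, 3), 1: (1, 3), 2: (2, 2), 3: (2, 0)}

def eeg_to_3d_stream(raw: np.ndarray, grid=ELECTRODE_GRID, size: int = 9) -> np.ndarray:
    """Map raw EEG (channels x samples) to a 3D stream (samples x size x size).

    Each time step becomes a sparse 2D frame in which every channel's sample is
    written at its scalp position; cells without an electrode stay zero.
    """
    n_channels, n_samples = raw.shape
    frames = np.zeros((n_samples, size, size), dtype=np.float32)
    for ch, (r, c) in grid.items():
        if ch < n_channels:
            frames[:, r, c] = raw[ch]
    return frames

# Toy 3D CNN over input shaped (batch, 1, time, height, width), e.g. 128 frames of 9x9.
model = nn.Sequential(
    nn.Conv3d(1, 8, kernel_size=3, padding=1),  # jointly convolves over time and space
    nn.ReLU(),
    nn.AdaptiveAvgPool3d(1),
    nn.Flatten(),
    nn.Linear(8, 2),                            # e.g. binary valence (or arousal) output
)

raw = np.random.randn(32, 128).astype(np.float32)   # 32 channels, 128 samples (stand-in data)
stream = eeg_to_3d_stream(raw)                      # (128, 9, 9)
x = torch.from_numpy(stream)[None, None]            # (1, 1, 128, 9, 9)
logits = model(x)                                   # (1, 2)
```

In this sketch each raw EEG sample is written into a sparse 2D frame at its channel's scalp position, the frames are stacked along the time axis into a 3D volume, and a 3D convolution then processes space and time jointly, which is the core idea the abstract describes.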
Files in This Item
There are no files associated with this item.
Appears in Collections
College of IT Convergence > Department of Software > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher


Cho, Jung Chan
College of IT Convergence (Department of Software)
