Integrated CWT-CNN for Epilepsy Detection Using Multiclass EEG Dataset
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Naseem, Sidra | - |
dc.contributor.author | Javed, Kashif | - |
dc.contributor.author | Khan, Muhammad Jawad | - |
dc.contributor.author | Rubab, Saddaf | - |
dc.contributor.author | Khan, Muhammad Attique | - |
dc.contributor.author | Nam, Yunyoung | - |
dc.date.accessioned | 2021-09-10T06:27:12Z | - |
dc.date.available | 2021-09-10T06:27:12Z | - |
dc.date.issued | 2021 | - |
dc.identifier.issn | 1546-2218 | - |
dc.identifier.issn | 1546-2226 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/sch/handle/2021.sw.sch/19088 | - |
dc.description.abstract | Electroencephalography (EEG) is a common clinical procedure for recording the brain signals generated by human activity. EEGs are useful in brain-controlled interfaces and other intelligent neuroscience applications, but manual analysis of these brainwaves is complicated and time-consuming even for neuroscience experts. Various EEG analysis and classification techniques have been proposed to address this problem; however, conventional classification methods require the identification and learning of specific EEG characteristics beforehand. Deep learning models can learn features from data without in-depth knowledge of the data or prior feature identification. One notable deep learning architecture is the Convolutional Neural Network (CNN), which has outperformed traditional neural networks in pattern recognition and image classification. The Continuous Wavelet Transform (CWT) is an efficient signal analysis technique that presents the magnitude of EEG signals as time-related frequency components. Existing deep learning architectures perform poorly when classifying EEG signals in the time-frequency domain. To improve classification accuracy, we propose an integrated CWT and CNN technique that classifies five types of EEG signals. We compared the results of the proposed integrated CWT and CNN method with existing deep learning models, e.g., GoogLeNet, VGG16, and AlexNet. Furthermore, the accuracy and loss of the proposed method were cross-validated using K-fold cross-validation. The average accuracy and loss of K-fold cross-validation for the proposed integrated CWT and CNN method are 76.12% and 56.02%, respectively. The model was evaluated on a publicly available dataset: the Epilepsy dataset from the UCI Machine Learning Repository. | - |
dc.format.extent | 16 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Tech Science Press | - |
dc.title | Integrated CWT-CNN for Epilepsy Detection Using Multiclass EEG Dataset | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.32604/cmc.2021.018239 | - |
dc.identifier.scopusid | 2-s2.0-85107842129 | - |
dc.identifier.wosid | 000659131200032 | - |
dc.identifier.bibliographicCitation | Computers, Materials and Continua, v.69, no.1, pp 471 - 486 | - |
dc.citation.title | Computers, Materials and Continua | - |
dc.citation.volume | 69 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 471 | - |
dc.citation.endPage | 486 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Materials Science | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Materials Science, Multidisciplinary | - |
dc.subject.keywordPlus | WAVELET TRANSFORM | - |
dc.subject.keywordPlus | CLASSIFICATION | - |
dc.subject.keywordPlus | FRAMEWORK | - |
dc.subject.keywordAuthor | Deep learning | - |
dc.subject.keywordAuthor | electroencephalography | - |
dc.subject.keywordAuthor | epilepsy | - |
dc.subject.keywordAuthor | continuous wavelet transform | - |
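The abstract's core idea is to convert each one-dimensional EEG segment into a two-dimensional time-frequency scalogram via the CWT, which a CNN can then classify like an image. The following is a minimal numpy sketch of that preprocessing step, not the paper's exact configuration: the Morlet mother wavelet, the scale range, and the wavelet support width are illustrative assumptions. The segment length of 178 samples per one-second window matches the UCI epilepsy dataset's row format.

```python
import numpy as np

def morlet(t, w=5.0):
    # Complex Morlet mother wavelet (illustrative; normalization constant simplified)
    return np.pi ** -0.25 * np.exp(1j * w * t) * np.exp(-t ** 2 / 2)

def cwt_scalogram(signal, scales):
    # For each scale, correlate the signal with a scaled, energy-normalized wavelet;
    # the magnitude of the coefficients forms the 2-D scalogram "image".
    n = len(signal)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Truncate the wavelet support so it never exceeds the signal length
        m = min(int(10 * s), (n - 1) // 2)
        t = np.arange(-m, m + 1) / s
        psi = morlet(t) / np.sqrt(s)
        # Convolving with the time-reversed conjugate implements cross-correlation
        out[i] = np.abs(np.convolve(signal, np.conj(psi)[::-1], mode="same"))
    return out

# Toy EEG-like segment: 178 samples per one-second window, as in the UCI dataset
n = 178
t = np.arange(n) / n
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 25 * t)

scalogram = cwt_scalogram(sig, scales=np.arange(1, 33))
print(scalogram.shape)  # (32, 178): one 2-D array per segment, ready for a CNN input layer
```

Stacking one such scalogram per labeled segment yields an image-like dataset, which is the form the paper's CNN (and the GoogLeNet/VGG16/AlexNet baselines it is compared against) consumes.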