Movement state classification for bimanual BCI from non-human primate's epidural ECoG using three-dimensional convolutional neural network
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Choi, Hoseok | - |
dc.contributor.author | Lee, Jeyeon | - |
dc.contributor.author | Park, Jinsick | - |
dc.contributor.author | Cho, Baek Hwan | - |
dc.contributor.author | Lee, Kyoung-Min | - |
dc.contributor.author | Jang, Dong Pyo | - |
dc.date.accessioned | 2022-07-12T12:43:35Z | - |
dc.date.available | 2022-07-12T12:43:35Z | - |
dc.date.created | 2021-05-11 | - |
dc.date.issued | 2018-03 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/150501 | - |
dc.description.abstract | During bimanual movement, the brain state is known to differ from that during unimanual movement; thus, conventional classifiers designed for unimanual arm-movement decoding appear insufficient for decoding bimanual movement. In this research, we propose a convolutional neural network (CNN) for movement-state classification to improve decoding accuracy in bimanual movement estimation. We recorded the monkey's cortical signals during a bimanual task and converted them into a spectrogram dataset for decoding. To evaluate the CNN, we stacked several layers to form a deep structure and determined the best configuration. As a result, this method improved arm-movement state classification performance for bimanual tasks. This technique could be applied to real-world arm-movement brain-computer interfaces (BCIs) and various neuro-prosthetics fields. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | - |
dc.title | Movement state classification for bimanual BCI from non-human primate's epidural ECoG using three-dimensional convolutional neural network | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Jang, Dong Pyo | - |
dc.identifier.doi | 10.1109/IWW-BCI.2018.8311534 | - |
dc.identifier.scopusid | 2-s2.0-85050796986 | - |
dc.identifier.bibliographicCitation | 2018 6th International Conference on Brain-Computer Interface, BCI 2018, v.2018-January, pp.1 - 3 | - |
dc.relation.isPartOf | 2018 6th International Conference on Brain-Computer Interface, BCI 2018 | - |
dc.citation.title | 2018 6th International Conference on Brain-Computer Interface, BCI 2018 | - |
dc.citation.volume | 2018-January | - |
dc.citation.startPage | 1 | - |
dc.citation.endPage | 3 | - |
dc.type.rims | ART | - |
dc.type.docType | Conference Paper | - |
dc.description.journalClass | 1 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordPlus | Convolution | - |
dc.subject.keywordPlus | Decoding | - |
dc.subject.keywordPlus | Human computer interaction | - |
dc.subject.keywordPlus | Interfaces (computer) | - |
dc.subject.keywordPlus | Motion estimation | - |
dc.subject.keywordPlus | Neural networks | - |
dc.subject.keywordPlus | Bimanual movement | - |
dc.subject.keywordPlus | Brain computer interfaces (BCIs) | - |
dc.subject.keywordPlus | Convolutional neural network | - |
dc.subject.keywordPlus | Convolutional Neural Networks (CNN) | - |
dc.subject.keywordPlus | Decoding methods | - |
dc.subject.keywordPlus | Deep structure | - |
dc.subject.keywordPlus | Non-human primate | - |
dc.subject.keywordPlus | State classification | - |
dc.subject.keywordPlus | Brain computer interface | - |
dc.subject.keywordAuthor | bimanual movement | - |
dc.subject.keywordAuthor | movement state classification | - |
dc.identifier.url | https://ieeexplore.ieee.org/document/8311534 | - |
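The abstract describes a pipeline of converting multi-channel epidural ECoG into a spectrogram dataset and classifying movement states with a three-dimensional CNN. The paper's actual architecture, channel count, and sampling rate are not given in this record, so the following is only a minimal NumPy sketch of the general idea: each channel is turned into an STFT magnitude spectrogram, the channels are stacked into a 3D (channel × frequency × time) tensor, and a single hypothetical 3D convolution kernel is applied as the first feature-extraction step a CNN would perform.

```python
import numpy as np

def spectrogram(signal, win_len=64, hop=32):
    """STFT magnitude spectrogram of a 1-D signal: (n_freq, n_frames)."""
    window = np.hanning(win_len)
    n_frames = 1 + (len(signal) - win_len) // hop
    frames = np.stack([signal[i * hop:i * hop + win_len] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1)).T  # freq along rows, time along cols

def conv3d_valid(x, kernel):
    """Naive 'valid'-mode 3D cross-correlation, as in a CNN's first layer."""
    d, h, w = kernel.shape
    D, H, W = x.shape
    out = np.zeros((D - d + 1, H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(x[i:i + d, j:j + h, k:k + w] * kernel)
    return out

rng = np.random.default_rng(0)
# 8 hypothetical ECoG channels, 1024 samples each (placeholder data, not the paper's).
ecog = rng.standard_normal((8, 1024))
# Stack per-channel spectrograms into a (channel, frequency, time) tensor.
stack = np.stack([spectrogram(ch) for ch in ecog])      # shape (8, 33, 31)
# One random 3D kernel stands in for a learned CNN filter.
features = conv3d_valid(stack, rng.standard_normal((2, 3, 3)))
print(stack.shape, features.shape)                       # (8, 33, 31) (7, 31, 29)
```

A full classifier would stack several such convolutional layers (the "deep structure" mentioned in the abstract) and end in a movement-state output layer; frameworks like PyTorch or TensorFlow provide trainable 3D convolutions for this.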
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.