Development of Multi-DoFs Prosthetic Forearm based on EMG Pattern Recognition and Classification
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 이슬아 | - |
dc.contributor.author | 최유나 | - |
dc.contributor.author | 양세동 | - |
dc.contributor.author | 홍근영 | - |
dc.contributor.author | 최영진 | - |
dc.date.accessioned | 2021-06-22T10:42:31Z | - |
dc.date.available | 2021-06-22T10:42:31Z | - |
dc.date.issued | 2019-09 | - |
dc.identifier.issn | 1975-6291 | - |
dc.identifier.issn | 2287-3961 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/4076 | - |
dc.description.abstract | This paper presents a multi-DoF (degrees-of-freedom) prosthetic forearm together with sEMG (surface electromyogram) pattern recognition and motion-intent classification for forearm amputees. The developed prosthetic forearm has a 9-DoF hand and a single-DoF wrist, and its socket is designed with wearability in mind. In addition, sEMG-based pattern recognition is proposed for prosthetic control. Several experiments were conducted to substantiate the performance of the prosthetic forearm. First, the developed prosthetic forearm could perform various motions required for the activities of daily living of a forearm amputee, and its grasp could be controlled according to the shape and size of the object. The amputee was also able to tie a shoelace using the prosthetic forearm. Second, pattern-recognition and classification experiments using the sEMG signals were performed to verify whether the motions could be classified according to the user's intent. For this purpose, sEMG signals were fed to a multilayer perceptron (MLP) for training and testing. As a result, the overall classification accuracy reached 99.6% across all participants, and every posture was classified with more than 97% accuracy. | - |
dc.format.extent | 8 | - |
dc.language | Korean | - |
dc.language.iso | KOR | - |
dc.publisher | 한국로봇학회 | - |
dc.title | 근전도 패턴 인식 및 분류 기반 다자유도 전완 의수 개발 | - |
dc.title.alternative | Development of Multi-DoFs Prosthetic Forearm based on EMG Pattern Recognition and Classification | - |
dc.type | Article | - |
dc.publisher.location | Republic of Korea | - |
dc.identifier.doi | 10.7746/jkros.2019.14.3.228 | - |
dc.identifier.bibliographicCitation | 로봇학회 논문지, v.14, no.3, pp 228 - 235 | - |
dc.citation.title | 로봇학회 논문지 | - |
dc.citation.volume | 14 | - |
dc.citation.number | 3 | - |
dc.citation.startPage | 228 | - |
dc.citation.endPage | 235 | - |
dc.identifier.kciid | ART002494690 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | kci | - |
dc.subject.keywordAuthor | Prosthetic Forearm | - |
dc.subject.keywordAuthor | Myoelectric Prosthesis | - |
dc.subject.keywordAuthor | Multilayer Perceptron | - |
dc.subject.keywordAuthor | Pattern Recognition | - |
dc.identifier.url | https://www.kci.go.kr/kciportal/ci/sereArticleSearch/ciSereArtiView.kci?sereArticleSearchBean.artiId=ART002494690 | - |
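The abstract describes a pipeline in which sEMG signals are fed to a multilayer perceptron to classify the wearer's intended posture. The following is a minimal sketch of such a pipeline, not the authors' implementation: the channel count, window length, feature set (MAV, RMS, zero crossings), posture count, and network size are all assumptions, and randomly initialized weights stand in for a trained network.

```python
# Hypothetical sketch of an sEMG -> features -> MLP classification pipeline,
# loosely following the abstract. All dimensions and features are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 4      # assumed number of sEMG electrodes
WINDOW = 200        # assumed samples per analysis window (e.g. 200 ms at 1 kHz)
N_CLASSES = 6       # assumed number of hand/wrist postures

def extract_features(window):
    """Common time-domain sEMG features per channel: MAV, RMS, zero-crossing rate."""
    mav = np.mean(np.abs(window), axis=1)
    rms = np.sqrt(np.mean(window**2, axis=1))
    zc = np.sum(np.diff(np.sign(window), axis=1) != 0, axis=1) / WINDOW
    return np.concatenate([mav, rms, zc])          # 3 features x N_CHANNELS

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer perceptron with a softmax output over postures."""
    h = np.tanh(x @ W1 + b1)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())              # numerically stable softmax
    return e / e.sum()

# Randomly initialized weights stand in for a trained network.
n_feat = 3 * N_CHANNELS
W1 = rng.normal(0.0, 0.1, (n_feat, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, (16, N_CLASSES)); b2 = np.zeros(N_CLASSES)

emg = rng.normal(0.0, 1.0, (N_CHANNELS, WINDOW))   # one synthetic sEMG window
probs = mlp_forward(extract_features(emg), W1, b1, W2, b2)
predicted_posture = int(np.argmax(probs))
```

In a real system the weights would be trained on labeled windows recorded from the amputee, and the argmax over the softmax output would select the posture command sent to the prosthesis.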