Wearable fabric sensor for controlling myoelectric hand prosthesis via classification of foot postures
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lee, Seulah | - |
dc.contributor.author | Sung, Minchang | - |
dc.contributor.author | Choi, Youngjin | - |
dc.date.accessioned | 2021-06-22T09:07:17Z | - |
dc.date.available | 2021-06-22T09:07:17Z | - |
dc.date.issued | 2020-03 | - |
dc.identifier.issn | 0964-1726 | - |
dc.identifier.issn | 1361-665X | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/1243 | - |
dc.description.abstract | The degrees of freedom of robotic prosthetic hands have recently increased, but high-level amputees, such as those with shoulder disarticulation or trans-humeral amputation, do not have enough muscular area on the upper limb from which to acquire surface electromyogram (sEMG) signals. In this paper, a wearable fabric sensor is proposed to measure sEMG on the lower limb and to classify foot postures with the proposed convolutional neural network (CNN), ultimately for application to high-level upper-limb amputees. First, we determined that sEMG signal levels of the lower limb can be classified for eight postures in a manner similar to those of the upper limb. Second, a multilayer perceptron (MLP) and the proposed CNN were compared in terms of pattern-recognition accuracy for classifying the eight postures. Finally, the wearable fabric sensor and the proposed CNN were demonstrated with trans-radial amputees. The results showed that the wearable fabric sensor distinguished eight different patterns based on similar motions of both limbs (p < 0.001). In addition, the classification accuracy of the proposed CNN (91.3%) was much higher than that of the MLP (79%) (p < 0.05). The wearable fabric sensor allowed the measurement location to move from the upper limb to the lower limb and allowed the number of classifiable patterns to increase, thanks to the sixteen-channel sEMG signals acquired from 32 fabric electrodes. The high classification accuracy of the proposed CNN will be useful for users who wear a myoelectric prosthesis every day. | - |
dc.format.extent | 13 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Institute of Physics Publishing | - |
dc.title | Wearable fabric sensor for controlling myoelectric hand prosthesis via classification of foot postures | - |
dc.type | Article | - |
dc.publisher.location | United Kingdom | - |
dc.identifier.doi | 10.1088/1361-665X/ab6690 | - |
dc.identifier.scopusid | 2-s2.0-85082241251 | - |
dc.identifier.wosid | 000520126800001 | - |
dc.identifier.bibliographicCitation | Smart Materials and Structures, v.29, no.3, pp 1 - 13 | - |
dc.citation.title | Smart Materials and Structures | - |
dc.citation.volume | 29 | - |
dc.citation.number | 3 | - |
dc.citation.startPage | 1 | - |
dc.citation.endPage | 13 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Instruments & Instrumentation | - |
dc.relation.journalResearchArea | Materials Science | - |
dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
dc.relation.journalWebOfScienceCategory | Materials Science, Multidisciplinary | - |
dc.subject.keywordPlus | EMG PATTERN-RECOGNITION | - |
dc.subject.keywordPlus | SEMG SENSORS | - |
dc.subject.keywordPlus | ELECTRODES | - |
dc.subject.keywordPlus | PLACEMENT | - |
dc.subject.keywordPlus | INTERFACE | - |
dc.subject.keywordPlus | AMPUTEE | - |
dc.subject.keywordPlus | DESIGN | - |
dc.subject.keywordAuthor | textile electrode | - |
dc.subject.keywordAuthor | fabric sensor | - |
dc.subject.keywordAuthor | wearable device | - |
dc.subject.keywordAuthor | sEMG (surface electromyogram) | - |
dc.subject.keywordAuthor | classification | - |
dc.identifier.url | https://iopscience.iop.org/article/10.1088/1361-665X/ab6690 | - |
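The abstract above describes a CNN that classifies eight foot postures from sixteen-channel sEMG acquired with 32 fabric electrodes. The record does not include the network architecture, so the following is a minimal sketch, assuming a PyTorch 1-D CNN over fixed-length sEMG windows; the window length (200 samples), filter counts, and kernel sizes are illustrative assumptions, not the authors' design.

```python
# Minimal sketch of an 8-class posture classifier for 16-channel sEMG windows.
# Layer sizes and the 200-sample window are assumptions for illustration only.
import torch
import torch.nn as nn

class PostureCNN(nn.Module):
    """1-D CNN over a (channels=16, time) sEMG window, predicting one of 8 foot postures."""
    def __init__(self, n_channels: int = 16, n_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),  # temporal filters over all channels
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the remaining time axis
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 16, window) band-pass filtered sEMG
        h = self.features(x).squeeze(-1)  # (batch, 64)
        return self.classifier(h)         # (batch, 8) class logits

if __name__ == "__main__":
    model = PostureCNN()
    dummy = torch.randn(4, 16, 200)  # 4 windows of 16-channel sEMG, 200 samples each
    print(model(dummy).shape)        # torch.Size([4, 8])
```

For the MLP baseline mentioned in the abstract, the same window would typically be flattened into a single feature vector before one or two fully connected layers; the exact baseline used for the reported 79% accuracy is described in the full article.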