Twenty-one degrees of freedom model based hand pose tracking using a monocular RGB camera
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Choi, Junyeong | - |
dc.contributor.author | Park, Jong-Il | - |
dc.contributor.author | Park, Hanhoon | - |
dc.date.accessioned | 2022-07-15T19:20:40Z | - |
dc.date.available | 2022-07-15T19:20:40Z | - |
dc.date.created | 2021-05-12 | - |
dc.date.issued | 2016-01 | - |
dc.identifier.issn | 0091-3286 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/155340 | - |
dc.description.abstract | It is difficult to visually track a user's hand because of the many degrees of freedom (DOF) a hand has. For this reason, most model-based hand pose tracking methods have relied on the use of multiview images or RGB-D images. This paper proposes a model-based method that accurately tracks three-dimensional hand poses using monocular RGB images in real time. The main idea of the proposed method is to reduce hand tracking ambiguity by adopting a step-by-step estimation scheme consisting of three steps performed in consecutive order: palm pose estimation, finger yaw motion estimation, and finger pitch motion estimation. In addition, this paper proposes highly effective algorithms for each step. With the assumption that a human hand can be considered as an assemblage of articulated planes, the proposed method uses a piecewise planar hand model which enables hand model regeneration. The hand model regeneration modifies the hand model to fit the current user's hand and improves the accuracy of the hand pose estimation results. Above all, the proposed method can operate in real time using only CPU-based processing. Consequently, it can be applied to various platforms, including egocentric vision devices such as wearable glasses. The results of several experiments verify the efficiency and accuracy of the proposed method. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | SPIE-SOC PHOTO-OPTICAL INSTRUMENTATION ENGINEERS | - |
dc.title | Twenty-one degrees of freedom model based hand pose tracking using a monocular RGB camera | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Park, Jong-Il | - |
dc.identifier.doi | 10.1117/1.OE.55.1.013101 | - |
dc.identifier.scopusid | 2-s2.0-84954138512 | - |
dc.identifier.wosid | 000371283500012 | - |
dc.identifier.bibliographicCitation | OPTICAL ENGINEERING, v.55, no.1, pp.1 - 14 | - |
dc.relation.isPartOf | OPTICAL ENGINEERING | - |
dc.citation.title | OPTICAL ENGINEERING | - |
dc.citation.volume | 55 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 1 | - |
dc.citation.endPage | 14 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Optics | - |
dc.relation.journalWebOfScienceCategory | Optics | - |
dc.subject.keywordPlus | Degrees of freedom (mechanics) | - |
dc.subject.keywordPlus | End effectors | - |
dc.subject.keywordPlus | Gesture recognition | - |
dc.subject.keywordPlus | Mechanics | - |
dc.subject.keywordPlus | Motion estimation | - |
dc.subject.keywordAuthor | hand pose tracking | - |
dc.subject.keywordAuthor | stepwise estimation | - |
dc.subject.keywordAuthor | planar hand model | - |
dc.subject.keywordAuthor | model regeneration | - |
dc.subject.keywordAuthor | recursive hypothesis assignment | - |
dc.subject.keywordAuthor | bidirectional particle swarm optimization | - |
dc.subject.keywordAuthor | gesture-based interface | - |
dc.identifier.url | https://www.spiedigitallibrary.org/journals/optical-engineering/volume-55/issue-1/013101/Twenty-one-degrees-of-freedom-model-based-hand-pose-tracking/10.1117/1.OE.55.1.013101.short?SSO=1 | - |
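The abstract's step-by-step scheme, in which palm pose, finger yaw, and finger pitch are estimated in consecutive order so that each later step searches a smaller parameter space, can be sketched as below. This is a minimal illustrative outline, not the authors' implementation: all function names, data shapes, and the exact 6 + 5 + 10 split of the 21 DOF are assumptions.

```python
# Hypothetical sketch of the paper's three-step estimation order:
# palm pose -> finger yaw -> finger pitch. The 6/5/10 DOF split
# (6 palm DOF, 1 yaw + 2 pitch angles per finger) is an assumed
# decomposition that sums to the 21 DOF in the title.

def estimate_palm_pose(frame):
    # Placeholder: would fit the global 6-DOF palm pose
    # (translation + rotation) to the monocular RGB frame.
    return {"palm": [0.0] * 6}

def estimate_finger_yaw(frame, pose):
    # Placeholder: would estimate side-to-side (abduction) angles,
    # conditioned on the palm pose fixed in the previous step.
    pose["yaw"] = [0.0] * 5
    return pose

def estimate_finger_pitch(frame, pose):
    # Placeholder: would estimate bending angles, conditioned on
    # the palm pose and yaw angles already fixed.
    pose["pitch"] = [0.0] * 10
    return pose

def track_hand(frame):
    # The three steps run in consecutive order; holding earlier
    # parameters fixed reduces the ambiguity of each later search.
    pose = estimate_palm_pose(frame)
    pose = estimate_finger_yaw(frame, pose)
    pose = estimate_finger_pitch(frame, pose)
    return pose

pose = track_hand(frame=None)
total_dof = len(pose["palm"]) + len(pose["yaw"]) + len(pose["pitch"])
print(total_dof)  # 21
```

The sketch only conveys the ordering constraint; the paper's actual per-step algorithms (recursive hypothesis assignment, bidirectional particle swarm optimization) are not reproduced here.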