Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Patil, Shashidhar | - |
dc.contributor.author | Chintalapalli, Harinadha Reddy | - |
dc.contributor.author | Kim, Dubeom | - |
dc.contributor.author | Chai, Youngho | - |
dc.date.available | 2019-03-08T17:37:09Z | - |
dc.date.issued | 2015-06 | - |
dc.identifier.issn | 1424-8220 | - |
dc.identifier.issn | 1424-3210 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/9528 | - |
dc.description.abstract | In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six-degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion-control input device together with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular-rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time warping is developed for efficient recognition of dynamic hand gestures, with real-time automatic hand-gesture segmentation. Our approach both recognizes gestures and estimates gesture variations for continuous interaction. We demonstrate this gesture expressivity with an interactive, flexible gesture-mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking the user's dynamic hand gestures. The system synthesizes stylistic variations of the avatar's motion, producing motions that are not present in the motion database, from hand-gesture sequences captured by a single inertial motion sensor. | - |
dc.format.extent | 23 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | MDPI AG | - |
dc.title | Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars | - |
dc.type | Article | - |
dc.identifier.doi | 10.3390/s150614435 | - |
dc.identifier.bibliographicCitation | SENSORS, v.15, no.6, pp 14435 - 14457 | - |
dc.description.isOpenAccess | N | - |
dc.identifier.wosid | 000357869200117 | - |
dc.identifier.scopusid | 2-s2.0-84934949425 | - |
dc.citation.endPage | 14457 | - |
dc.citation.number | 6 | - |
dc.citation.startPage | 14435 | - |
dc.citation.title | SENSORS | - |
dc.citation.volume | 15 | - |
dc.type.docType | Article | - |
dc.publisher.location | Switzerland | - |
dc.subject.keywordAuthor | inertial sensors | - |
dc.subject.keywordAuthor | gestural interfaces | - |
dc.subject.keywordAuthor | expressive control | - |
dc.subject.keywordAuthor | gesture recognition | - |
dc.subject.keywordAuthor | gesture variations | - |
dc.subject.keywordAuthor | interactive systems | - |
dc.subject.keywordAuthor | touch and shake | - |
dc.subject.keywordAuthor | virtual avatar | - |
dc.subject.keywordPlus | INDEPENDENT COMPONENT ANALYSIS | - |
dc.subject.keywordPlus | GESTURE RECOGNITION | - |
dc.subject.keywordPlus | MOTION SYNTHESIS | - |
dc.subject.keywordPlus | ALGORITHMS | - |
dc.subject.keywordPlus | FUSION | - |
dc.relation.journalResearchArea | Chemistry | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Instruments & Instrumentation | - |
dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
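The abstract's quaternion-based complementary filter for noise and drift reduction can be sketched as follows. This is a generic illustration under assumed conventions (Hamilton quaternions in `[w, x, y, z]` order, `q` mapping sensor frame to world frame, accelerometer-only gravity correction), not the authors' implementation; the names `complementary_step` and `alpha` are hypothetical.

```python
import numpy as np

def quat_mul(q, r):
    # Hamilton product of quaternions [w, x, y, z].
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def rotate(q, v):
    # Rotate vector v by unit quaternion q: q ⊗ [0, v] ⊗ q*.
    return quat_mul(quat_mul(q, np.array([0.0, *v])), quat_conj(q))[1:]

def complementary_step(q, gyro, accel, dt, alpha=0.98):
    """One filter step: integrate the gyro, then nudge the estimate
    toward the accelerometer's gravity direction to cancel drift.
    q: orientation [w,x,y,z]; gyro: rad/s; accel: m/s^2."""
    # 1) Predict: integrate angular rate (q_dot = 0.5 * q ⊗ [0, ω]).
    q = q + 0.5 * quat_mul(q, np.array([0.0, *gyro])) * dt
    q /= np.linalg.norm(q)
    # 2) Correct: compare predicted gravity (sensor frame) with the
    #    measured acceleration direction; apply a small rotation
    #    weighted by (1 - alpha).
    a_n = accel / np.linalg.norm(accel)
    g_pred = rotate(quat_conj(q), np.array([0.0, 0.0, 1.0]))
    err = np.cross(a_n, g_pred)  # small-angle correction axis
    q = quat_mul(q, np.array([1.0, *(0.5 * (1.0 - alpha) * err)]))
    return q / np.linalg.norm(q)
```

With `alpha` close to 1 the gyro dominates short-term response while the accelerometer slowly pulls roll and pitch back to gravity, which is the standard trade-off such a filter makes.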
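The dynamic-time-warping recognition mentioned in the abstract can be illustrated with the textbook DTW recurrence plus nearest-template classification. This is a minimal sketch, not the paper's algorithm: the template set, distance metric, and feature choice (raw 3D motion samples here) are assumptions.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two gesture trajectories,
    each an (n, d) array of motion samples (e.g. accelerations)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            # Classic recurrence: cheapest of insertion, deletion, match.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(gesture, templates):
    """Label of the template with the smallest DTW distance to the
    observed gesture (nearest-template classification)."""
    return min(templates, key=lambda label: dtw_distance(gesture, templates[label]))
```

Because DTW aligns sequences nonlinearly in time, a gesture performed faster or slower than its template still matches, which is why it suits variable-speed hand gestures.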
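The abstract also mentions real-time automatic hand-gesture segmentation. The record does not state the segmentation rule, so the following is only a common heuristic sketch: thresholding instantaneous motion energy and keeping runs above a minimum duration. All thresholds and names here are hypothetical.

```python
import numpy as np

def segment_gestures(samples, rate_hz=100.0, energy_thresh=1.0, min_len_s=0.2):
    """Split a motion stream (n, d array) into candidate gesture segments
    by thresholding per-sample motion energy. Returns (start, end) index
    pairs. A heuristic illustration, not the paper's method."""
    energy = np.linalg.norm(samples, axis=1)
    active = energy > energy_thresh
    segments, start = [], None
    for i, on in enumerate(active):
        if on and start is None:
            start = i                      # gesture onset
        elif not on and start is not None:
            if (i - start) / rate_hz >= min_len_s:
                segments.append((start, i))  # keep sufficiently long runs
            start = None
    # Close a segment still open at the end of the stream.
    if start is not None and (len(active) - start) / rate_hz >= min_len_s:
        segments.append((start, len(active)))
    return segments
```

Each detected segment could then be passed to a DTW classifier; the minimum-duration check discards brief sensor jitter that crosses the energy threshold.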
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
Certain data included herein are derived from Web of Science © Clarivate Analytics. All rights reserved. You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.