Dance motion capture and composition using multiple RGB and depth sensors
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Yejin | - |
dc.date.available | 2020-07-10T05:22:18Z | - |
dc.date.created | 2020-07-06 | - |
dc.date.issued | 2017-02-01 | - |
dc.identifier.issn | 1550-1477 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/hongik/handle/2020.sw.hongik/6125 | - |
dc.description.abstract | Dynamic human movements such as dance are difficult to capture without external markers due to the high complexity of a dancer's body. This article introduces a marker-free motion capture and composition system for dance motion that uses multiple RGB and depth sensors. Our motion capture system utilizes a set of high-speed RGB and depth sensors to generate skeletal motion data from an expert dancer. During the motion acquisition process, a skeleton tracking method based on a particle filter estimates the motion parameters for each frame from a sequence of color images and depth features retrieved from the sensors. The expert motion data are archived in a database. The authoring methods in our composition system automate most of the motion editing process for general users by providing an online motion search with an input posture and then performing motion synthesis along an arbitrary motion path. Using the proposed system, we demonstrate that various dance performances can be composed intuitively and efficiently on client devices such as tablets and kiosk PCs. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | SAGE PUBLICATIONS INC | - |
dc.subject | TRACKING | - |
dc.title | Dance motion capture and composition using multiple RGB and depth sensors | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Kim, Yejin | - |
dc.identifier.doi | 10.1177/1550147717696083 | - |
dc.identifier.scopusid | 2-s2.0-85014552673 | - |
dc.identifier.wosid | 000394847200025 | - |
dc.identifier.bibliographicCitation | INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS, v.13, no.2 | - |
dc.relation.isPartOf | INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS | - |
dc.citation.title | INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS | - |
dc.citation.volume | 13 | - |
dc.citation.number | 2 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Telecommunications | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Telecommunications | - |
dc.subject.keywordPlus | TRACKING | - |
dc.subject.keywordAuthor | Motion capture | - |
dc.subject.keywordAuthor | dance motion | - |
dc.subject.keywordAuthor | motion acquisition | - |
dc.subject.keywordAuthor | motion composition | - |
dc.subject.keywordAuthor | motion authoring | - |
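The abstract describes estimating per-frame skeleton parameters with a particle filter over depth-derived features. As a rough illustration of that idea only (the article's actual tracker is multi-sensor and full-body), here is a minimal one-dimensional particle filter sketch: it tracks a single hypothetical joint parameter from a sequence of noisy measurements using the standard predict–weight–resample loop. All names and noise values below are illustrative assumptions, not taken from the paper.

```python
import math
import random

def particle_filter_track(observations, n_particles=500,
                          motion_noise=0.05, obs_noise=0.1, seed=0):
    """Toy 1-D particle filter: estimate one joint parameter per frame
    from noisy depth-derived measurements (illustrative sketch only)."""
    rng = random.Random(seed)
    # Initialize particles around the first observation.
    particles = [rng.gauss(observations[0], obs_noise) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: diffuse particles with a simple random-walk motion model.
        particles = [p + rng.gauss(0.0, motion_noise) for p in particles]
        # Weight: Gaussian likelihood of the measurement given each particle.
        weights = [math.exp(-0.5 * ((z - p) / obs_noise) ** 2) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Estimate: weighted mean of the particle states for this frame.
        estimates.append(sum(p * w for p, w in zip(particles, weights)))
        # Resample: draw a new particle set proportionally to the weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

A full-body tracker would carry a vector of joint angles per particle and score each hypothesis against the observed color and depth features, but the predict–weight–resample structure is the same.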
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
94, Wausan-ro, Mapo-gu, Seoul, 04066, Korea | Tel. 02-320-1314
COPYRIGHT 2020 HONGIK UNIVERSITY. ALL RIGHTS RESERVED.
Certain data included herein are derived from the Web of Science © of Clarivate Analytics. All rights reserved.
You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.