Upper Body Pose Estimation Using Deep Learning for a Virtual Reality Avatar
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Anvari, Taravat | - |
dc.contributor.author | Park, Kyoungju | - |
dc.contributor.author | Kim, Ganghyun | - |
dc.date.accessioned | 2023-08-07T04:41:26Z | - |
dc.date.available | 2023-08-07T04:41:26Z | - |
dc.date.issued | 2023-02 | - |
dc.identifier.issn | 2076-3417 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/67323 | - |
dc.description.abstract | With the growing popularity of virtual reality (VR) games and devices, demand is increasing for estimating and displaying user motion in VR applications. Most pose estimation methods for VR avatars rely on inverse kinematics (IK) or online motion capture. In contrast to existing approaches, we aim for a stable, low-computation process that is usable in a small space, so that latency stays minimal for VR users on both high- and low-performance devices in networked multi-user applications. In this study, we estimate a VR user's upper-body pose in real time using a deep learning method. We propose a novel method inspired by a classical regression model and trained on 3D motion capture data: a convolutional neural network (CNN)-based architecture learned from the joint information of motion capture data, with the network input and output modified to take the head and both hands as input. After feeding the model properly normalized inputs from a head-mounted display (HMD) and two controllers, we render the user's corresponding avatar in VR applications. Using the proposed pose estimation method, we built single-user and multi-user applications, measured their performance, conducted a user study, and compared the results with previous methods for VR avatars. | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | MDPI | - |
dc.title | Upper Body Pose Estimation Using Deep Learning for a Virtual Reality Avatar | - |
dc.type | Article | - |
dc.identifier.doi | 10.3390/app13042460 | - |
dc.identifier.bibliographicCitation | APPLIED SCIENCES-BASEL, v.13, no.4 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.wosid | 000938656300001 | - |
dc.identifier.scopusid | 2-s2.0-85149318899 | - |
dc.citation.number | 4 | - |
dc.citation.title | APPLIED SCIENCES-BASEL | - |
dc.citation.volume | 13 | - |
dc.type.docType | Article | - |
dc.publisher.location | Switzerland | - |
dc.subject.keywordAuthor | avatar | - |
dc.subject.keywordAuthor | immersion | - |
dc.subject.keywordAuthor | pose estimation | - |
dc.subject.keywordAuthor | virtual reality | - |
dc.subject.keywordPlus | EMBODIMENT | - |
dc.subject.keywordPlus | KINECT | - |
dc.relation.journalResearchArea | Chemistry | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Materials Science | - |
dc.relation.journalResearchArea | Physics | - |
dc.relation.journalWebOfScienceCategory | Chemistry, Multidisciplinary | - |
dc.relation.journalWebOfScienceCategory | Engineering, Multidisciplinary | - |
dc.relation.journalWebOfScienceCategory | Materials Science, Multidisciplinary | - |
dc.relation.journalWebOfScienceCategory | Physics, Applied | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
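The abstract above mentions feeding the regressor "properly normalized inputs" from an HMD and two controllers. The sketch below illustrates one plausible normalization step for such a three-tracker setup: expressing both controller positions in a head-centred, yaw-aligned frame so the feature vector is invariant to where the user stands and which way they face. The function name, the yaw-only rotation, and the axis convention (y up) are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

def normalize_tracker_input(head_pos, head_yaw, lhand_pos, rhand_pos):
    """Hypothetical normalization: translate by the head position and
    rotate by the inverse head yaw, yielding a flat feature vector of
    both hand positions in a head-centred frame."""
    c, s = np.cos(-head_yaw), np.sin(-head_yaw)
    # Rotation about the vertical (y) axis by -head_yaw.
    rot = np.array([[c, 0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s, 0.0, c]])
    lhand = rot @ (np.asarray(lhand_pos, float) - np.asarray(head_pos, float))
    rhand = rot @ (np.asarray(rhand_pos, float) - np.asarray(head_pos, float))
    # Concatenate into the 6-D input vector fed to the CNN regressor.
    return np.concatenate([lhand, rhand])
```

A normalization of this kind is what lets a model trained on motion capture data generalize across users and play spaces: the network only ever sees hand poses relative to the head, never absolute room coordinates.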