Fusion of multiple lidars and inertial sensors for the real-time pose tracking of human motion
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Patil, A.K. | - |
dc.contributor.author | Balasubramanyam, A. | - |
dc.contributor.author | Ryu, J.Y. | - |
dc.contributor.author | Pavan, Kumar B.N. | - |
dc.contributor.author | Chakravarthi, B. | - |
dc.contributor.author | Chai, Y.H. | - |
dc.date.accessioned | 2021-08-19T05:40:32Z | - |
dc.date.available | 2021-08-19T05:40:32Z | - |
dc.date.issued | 2020-09 | - |
dc.identifier.issn | 1424-8220 | - |
dc.identifier.issn | 1424-3210 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/48741 | - |
dc.description.abstract | Today, enhancement in sensing technology enables the use of multiple sensors to track human motion/activity precisely. Tracking human motion has various applications, such as fitness training, healthcare, rehabilitation, human-computer interaction, virtual reality, and activity recognition. Therefore, the fusion of multiple sensors creates new opportunities to develop and improve an existing system. This paper proposes a pose-tracking system by fusing multiple three-dimensional (3D) light detection and ranging (lidar) and inertial measurement unit (IMU) sensors. The initial step estimates the human skeletal parameters proportional to the target user’s height by extracting the point cloud from lidars. Next, IMUs are used to capture the orientation of each skeleton segment and estimate the respective joint positions. In the final stage, the displacement drift in the position is corrected by fusing the data from both sensors in real time. The installation setup is relatively effortless, flexible for sensor locations, and delivers results comparable to the state-of-the-art pose-tracking system. We evaluated the proposed system regarding its accuracy in the user’s height estimation, full-body joint position estimation, and reconstruction of the 3D avatar. We used a publicly available dataset for the experimental evaluation wherever possible. The results reveal that the accuracy of height and the position estimation is well within an acceptable range of ±3–5 cm. The reconstruction of the motion based on the publicly available dataset and our data is precise and realistic. | - |
dc.format.extent | 16 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | MDPI AG | - |
dc.title | Fusion of multiple lidars and inertial sensors for the real-time pose tracking of human motion | - |
dc.type | Article | - |
dc.identifier.doi | 10.3390/s20185342 | - |
dc.identifier.bibliographicCitation | Sensors (Switzerland), v.20, no.18, pp 1 - 16 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.wosid | 000580255400001 | - |
dc.identifier.scopusid | 2-s2.0-85091112300 | - |
dc.citation.endPage | 16 | - |
dc.citation.number | 18 | - |
dc.citation.startPage | 1 | - |
dc.citation.title | Sensors (Switzerland) | - |
dc.citation.volume | 20 | - |
dc.type.docType | Article | - |
dc.publisher.location | Switzerland | - |
dc.subject.keywordAuthor | Activity recognition | - |
dc.subject.keywordAuthor | Human motion | - |
dc.subject.keywordAuthor | Inertial sensor | - |
dc.subject.keywordAuthor | Lidar | - |
dc.subject.keywordAuthor | Locomotion | - |
dc.subject.keywordAuthor | Motion reconstruction | - |
dc.subject.keywordAuthor | Position estimation | - |
dc.subject.keywordAuthor | Position tracking | - |
dc.subject.keywordPlus | Human computer interaction | - |
dc.subject.keywordPlus | Medical computing | - |
dc.subject.keywordPlus | Motion tracking | - |
dc.subject.keywordPlus | Optical radar | - |
dc.subject.keywordPlus | Three dimensional computer graphics | - |
dc.subject.keywordPlus | Tracking (position) | - |
dc.subject.keywordPlus | Activity recognition | - |
dc.subject.keywordPlus | Experimental evaluation | - |
dc.subject.keywordPlus | Height estimation | - |
dc.subject.keywordPlus | Inertial measurement unit | - |
dc.subject.keywordPlus | Light detection and ranging | - |
dc.subject.keywordPlus | Position estimation | - |
dc.subject.keywordPlus | Sensing technology | - |
dc.subject.keywordPlus | Three-dimensional (3-D) | - |
dc.subject.keywordPlus | Gesture recognition | - |
dc.relation.journalResearchArea | Chemistry | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Instruments & Instrumentation | - |
dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
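The abstract describes a two-stage pipeline: IMU orientations are chained along the skeleton to estimate joint positions, and the resulting positional drift is corrected in real time with lidar-derived position data. The sketch below illustrates those two steps in a minimal form; the function names, the single-chain skeleton, and the complementary-filter blend (with a hypothetical weight `alpha`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def forward_kinematics(root, segments):
    """Chain joint positions along one limb: each joint equals its parent
    position plus the segment's IMU-reported rotation applied to the bone
    vector (illustrative single-chain skeleton, not the paper's full model)."""
    joints = [np.asarray(root, dtype=float)]
    for rotation, bone in segments:
        joints.append(joints[-1] + rotation @ np.asarray(bone, dtype=float))
    return joints

def fuse_position(p_imu, p_lidar, alpha=0.98):
    """Correct IMU positional drift with a lidar position estimate using a
    simple complementary blend; `alpha` is an assumed tuning weight."""
    return alpha * np.asarray(p_imu, dtype=float) + (1 - alpha) * np.asarray(p_lidar, dtype=float)
```

For example, with identity rotations and two unit bone vectors along z, `forward_kinematics` places the end joint two units above the root; `fuse_position` then pulls that estimate toward the lidar-measured position, which is how a slowly accumulating IMU drift can be bounded without discarding the IMU's smooth high-rate signal.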