HAPtics: Human Action Prediction in Real-time via Pose Kinematics
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ahmad, N. | - |
dc.contributor.author | Ullah, S. | - |
dc.contributor.author | Khan, J. | - |
dc.contributor.author | Choi, C. | - |
dc.contributor.author | Lee, Y. | - |
dc.date.accessioned | 2025-02-13T08:00:24Z | - |
dc.date.available | 2025-02-13T08:00:24Z | - |
dc.date.issued | 2024-12 | - |
dc.identifier.issn | 0302-9743 | - |
dc.identifier.issn | 1611-3349 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/122073 | - |
dc.description.abstract | Recognizing human actions in real-time presents a fundamental challenge, particularly when humans interact with other humans or objects in a shared space. Such systems must be able to recognize and assess real-world human actions from different angles and viewpoints. Consequently, a substantial volume of multi-dimensional human action training data is essential to enable data-driven algorithms to operate effectively in real-world scenarios. This paper introduces the Action Clip dataset, which provides a comprehensive 360-degree view of human actions, capturing rich features from multiple angles. Additionally, we describe the design and implementation of Human Action Prediction via Pose Kinematics (HAPtics), a comprehensive pipeline for real-time human pose estimation and action recognition, all achievable with standard monocular camera sensors. HAPtics utilizes a skeleton modality by transforming initially noisy human pose kinematic structures into skeletal features, such as body velocity, joint velocity, joint angles, and limb lengths derived from joint positions, followed by a classification layer. We have implemented and evaluated HAPtics using four different datasets, demonstrating competitive state-of-the-art performance in pose-based action recognition and real-time performance at 30 frames per second on a live camera. The code and dataset are available at: https://github.com/RaiseLab/HAPtics. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025. | - |
dc.format.extent | 17 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Springer Science and Business Media Deutschland GmbH | - |
dc.title | HAPtics: Human Action Prediction in Real-time via Pose Kinematics | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.1007/978-3-031-78354-8_10 | - |
dc.identifier.scopusid | 2-s2.0-85212501704 | - |
dc.identifier.bibliographicCitation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), v.15315 LNCS, pp. 145-161 | - |
dc.citation.title | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | - |
dc.citation.volume | 15315 LNCS | - |
dc.citation.startPage | 145 | - |
dc.citation.endPage | 161 | - |
dc.type.docType | Conference Paper | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
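
The abstract above describes HAPtics' skeleton modality: raw joint positions are converted into body velocity, joint velocities, joint angles, and limb lengths before classification. The sketch below illustrates how such features could be computed from per-frame joint positions. It is a minimal, assumption-laden illustration, not the authors' released implementation (that is linked from the abstract); the skeleton layout in `LIMBS` and `ANGLE_TRIPLES`, the function name `skeletal_features`, and the feature ordering are all hypothetical.

```python
import numpy as np

# Hypothetical skeleton layout: (parent, child) pairs for limbs and
# (a, b, c) triples for joint angles. The real HAPtics layout may differ.
LIMBS = [(0, 1), (1, 2), (2, 3), (1, 4), (4, 5)]
ANGLE_TRIPLES = [(0, 1, 2), (1, 2, 3), (1, 4, 5)]

def skeletal_features(poses, fps=30.0):
    """Map a (T, J, 2) array of per-frame 2D joint positions to the
    feature families named in the abstract: body velocity, joint
    velocities, joint angles, and limb lengths."""
    dt = 1.0 / fps

    # Body velocity: frame-to-frame displacement of the skeleton centroid.
    centroids = poses.mean(axis=1)                 # (T, 2)
    body_vel = np.diff(centroids, axis=0) / dt     # (T-1, 2)

    # Joint velocities: per-joint displacement between consecutive frames.
    joint_vel = np.diff(poses, axis=0) / dt        # (T-1, J, 2)

    # Joint angles: angle at joint b between the bones b->a and b->c.
    angles = []
    for a, b, c in ANGLE_TRIPLES:
        v1 = poses[:, a] - poses[:, b]
        v2 = poses[:, c] - poses[:, b]
        cos = np.sum(v1 * v2, axis=-1) / (
            np.linalg.norm(v1, axis=-1) * np.linalg.norm(v2, axis=-1) + 1e-8)
        angles.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    angles = np.stack(angles, axis=-1)             # (T, n_angles)

    # Limb lengths: Euclidean distance between connected joints.
    lengths = np.stack(
        [np.linalg.norm(poses[:, i] - poses[:, j], axis=-1) for i, j in LIMBS],
        axis=-1)                                   # (T, n_limbs)

    # Drop the first frame of the positional features so they align with
    # the finite-difference (velocity) features, then concatenate.
    return np.concatenate(
        [body_vel,
         joint_vel.reshape(len(joint_vel), -1),
         angles[1:],
         lengths[1:]], axis=-1)                    # (T-1, feature_dim)
```

In a pipeline like the one described, the resulting per-frame feature vectors would feed a classification layer; velocities capture motion dynamics while angles and limb lengths normalize away some viewpoint and scale variation, which is consistent with the paper's emphasis on multi-angle robustness.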