Detailed Information


HAPtics: Human Action Prediction in Real-time via Pose Kinematics

Authors
Ahmad, N.; Ullah, S.; Khan, J.; Choi, C.; Lee, Y.
Issue Date
Dec-2024
Publisher
Springer Science and Business Media Deutschland GmbH
Citation
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), v.15315 LNCS, pp. 145-161
Pages
17
Indexed
SCIE
SCOPUS
Journal Title
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume
15315 LNCS
Start Page
145
End Page
161
URI
https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/122073
DOI
10.1007/978-3-031-78354-8_10
ISSN
0302-9743
1611-3349
Abstract
Recognizing human actions in real-time presents a fundamental challenge, particularly when humans interact with other humans or objects in a shared space. Such systems must be able to recognize and assess real-world human actions from different angles and viewpoints. Consequently, a substantial volume of multi-dimensional human action training data is essential to enable data-driven algorithms to operate effectively in real-world scenarios. This paper introduces the Action Clip dataset, which provides a comprehensive 360-degree view of human actions, capturing rich features from multiple angles. Additionally, we describe the design and implementation of Human Action Prediction via Pose Kinematics (HAPtics), a comprehensive pipeline for real-time human pose estimation and action recognition, all achievable with standard monocular camera sensors. HAPtics utilizes a skeleton modality by transforming initially noisy human pose kinematic structures into skeletal features, such as body velocity, joint velocity, joint angles, and limb lengths derived from joint positions, followed by a classification layer. We have implemented and evaluated HAPtics using four different datasets, demonstrating competitive state-of-the-art performance in pose-based action recognition and real-time performance at 30 frames per second on a live camera. The code and dataset are available at: https://github.com/RaiseLab/HAPtics. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
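The abstract describes HAPtics as transforming noisy pose kinematics into skeletal features — body velocity, joint velocity, joint angles, and limb lengths derived from joint positions — before classification. The authors' implementation is in the linked repository; as a minimal illustrative sketch only (not the paper's code), these features can be computed from a sequence of estimated joint positions roughly as follows, assuming a `(frames, joints, 2)` array of 2D coordinates:

```python
import numpy as np

def pose_features(poses, fps=30.0):
    """Derive simple skeletal motion features from a pose sequence.

    poses: array of shape (T, J, 2) -- T frames, J joints, (x, y) coords.
    Returns per-joint velocities and a per-frame body (centroid) velocity.
    """
    poses = np.asarray(poses, dtype=float)
    dt = 1.0 / fps
    # Joint velocity: per-joint displacement between consecutive frames.
    joint_vel = np.diff(poses, axis=0) / dt        # shape (T-1, J, 2)
    # Body velocity: displacement of the pose centroid between frames.
    centroid = poses.mean(axis=1)                  # shape (T, 2)
    body_vel = np.diff(centroid, axis=0) / dt      # shape (T-1, 2)
    return joint_vel, body_vel

def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by segments b->a and b->c."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def limb_length(a, b):
    """Euclidean distance between two connected joints."""
    return np.linalg.norm(np.asarray(a, dtype=float) - np.asarray(b, dtype=float))
```

Feeding such features, rather than raw coordinates, to the classification layer makes the representation less sensitive to absolute position and pose-estimation jitter, which is consistent with the pipeline the abstract outlines.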
Appears in
Collections
COLLEGE OF ENGINEERING SCIENCES > DEPARTMENT OF ROBOT ENGINEERING > 1. Journal Articles



Related Researcher

LEE, YOUNG MOON
ERICA College of Engineering (Department of Robot Engineering)
