Detailed Information


HAPtics: Human Action Prediction in Real-time via Pose Kinematics

Full metadata record
dc.contributor.author: Ahmad, N.
dc.contributor.author: Ullah, S.
dc.contributor.author: Khan, J.
dc.contributor.author: Choi, C.
dc.contributor.author: Lee, Y.
dc.date.accessioned: 2025-02-13T08:00:24Z
dc.date.available: 2025-02-13T08:00:24Z
dc.date.issued: 2024-12
dc.identifier.issn: 0302-9743
dc.identifier.issn: 1611-3349
dc.identifier.uri: https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/122073
dc.description.abstract: Recognizing human actions in real-time presents a fundamental challenge, particularly when humans interact with other humans or objects in a shared space. Such systems must be able to recognize and assess real-world human actions from different angles and viewpoints. Consequently, a substantial volume of multi-dimensional human action training data is essential to enable data-driven algorithms to operate effectively in real-world scenarios. This paper introduces the Action Clip dataset, which provides a comprehensive 360-degree view of human actions, capturing rich features from multiple angles. Additionally, we describe the design and implementation of Human Action Prediction via Pose Kinematics (HAPtics), a comprehensive pipeline for real-time human pose estimation and action recognition, all achievable with standard monocular camera sensors. HAPtics utilizes a skeleton modality by transforming initially noisy human pose kinematic structures into skeletal features, such as body velocity, joint velocity, joint angles, and limb lengths derived from joint positions, followed by a classification layer. We have implemented and evaluated HAPtics using four different datasets, demonstrating competitive state-of-the-art performance in pose-based action recognition and real-time performance at 30 frames per second on a live camera. The code and dataset are available at: https://github.com/RaiseLab/HAPtics. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
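The abstract describes deriving skeletal features (joint velocity, joint angles, limb lengths) from joint positions. A minimal sketch of how such features can be computed from 2D keypoints is shown below; the function names and the 30 fps frame interval are illustrative assumptions, not the authors' implementation.

```python
import math

def limb_length(a, b):
    """Euclidean distance between two joints given as (x, y) tuples."""
    return math.hypot(b[0] - a[0], b[1] - a[1])

def joint_angle(a, b, c):
    """Angle at joint b formed by segments b->a and b->c, in degrees."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp for float safety
    return math.degrees(math.acos(cos))

def joint_velocity(prev, curr, dt=1 / 30):
    """Per-joint speed between consecutive frames (assumed 30 fps)."""
    return [limb_length(p, c) / dt for p, c in zip(prev, curr)]

# Hypothetical elbow joint: shoulder, elbow, wrist forming a right angle.
shoulder, elbow, wrist = (0.0, 0.0), (1.0, 0.0), (1.0, 1.0)
print(joint_angle(shoulder, elbow, wrist))  # right angle at the elbow
```

Features like these, stacked per frame, would form the input vector that a classification layer consumes.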
dc.format.extent: 17
dc.language: English
dc.language.iso: ENG
dc.publisher: Springer Science and Business Media Deutschland GmbH
dc.title: HAPtics: Human Action Prediction in Real-time via Pose Kinematics
dc.type: Article
dc.publisher.location: United States
dc.identifier.doi: 10.1007/978-3-031-78354-8_10
dc.identifier.scopusid: 2-s2.0-85212501704
dc.identifier.bibliographicCitation: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), v.15315 LNCS, pp. 145-161
dc.citation.title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
dc.citation.volume: 15315 LNCS
dc.citation.startPage: 145
dc.citation.endPage: 161
dc.type.docType: Conference Paper
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
Files in This Item
There are no files associated with this item.
Appears in Collections
COLLEGE OF ENGINEERING SCIENCES > DEPARTMENT OF ROBOT ENGINEERING > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher


LEE, YOUNG MOON
ERICA College of Engineering (DEPARTMENT OF ROBOT ENGINEERING)
