Detailed Information


Angular features-based human action recognition system for a real application with subtle unit actions (open access)

Authors
Ryu, J.; Patil, A.K.; Chakravarthi, B.; Balasubramanyam, A.; Park, S.; Chai, Young Ho
Issue Date
2022
Publisher
Institute of Electrical and Electronics Engineers Inc.
Keywords
Benchmark testing; Data mining; ELM classifier; Feature extraction; Human action recognition; Motion capture; Sensor systems; Sensors; Skeleton; Surveillance; Training data
Citation
IEEE Access, v.10, pp. 9645-9657
Pages
13
Journal Title
IEEE Access
Volume
10
Start Page
9645
End Page
9657
URI
https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/54904
DOI
10.1109/ACCESS.2022.3144456
ISSN
2169-3536
Abstract
Human action recognition (HAR) technology is receiving considerable attention in the field of human-computer interaction. We present a HAR system that works stably in real-world applications, where the system must identify detailed actions for specific purposes and the action data include many variations. Accordingly, we conducted three experiments. First, we tested our recognition system's performance on the UTD-MHAD dataset, compared its accuracy with results from previous research, and confirmed that it achieves average performance among recognition systems. Furthermore, we hypothesized that a HAR system could be used to detect burglary. In the second experiment, we compared the existing benchmark data with our crime-detection dataset: we recognized the test scenarios' data using recognition systems trained on each dataset, and the system trained on our dataset achieved higher accuracy than the one trained on the existing benchmark. These results show that the training data should contain detailed actions for a real application. In the third experiment, we sought a motion-data format that recognizes actions stably regardless of data variation. In a real application, the action data vary from person to person, so we introduced variations using the cross-subject protocol and a moving-area setting, trained the recognition system on position data and on angle data, and compared the accuracy of each system. We found that the angle format yields better accuracy because angle data convert action variation into a consistent pattern.
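The abstract's claim that angle features convert action variation into a consistent pattern can be illustrated with a minimal sketch (this is not the authors' actual feature pipeline; the joint names and coordinates below are made up): an angle computed at a joint from three skeleton positions is unchanged when the whole skeleton is translated or uniformly scaled, whereas the raw positions differ.

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle (radians) at `joint` between the bones joint->parent and joint->child."""
    v1 = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
    v2 = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards against rounding error

# The same elbow pose expressed at two different body positions and scales
# (hypothetical shoulder, elbow, wrist coordinates):
pose_a = joint_angle([0, 1, 0], [0, 0, 0], [1, 0, 0])
pose_b = joint_angle([5, 7, 0], [5, 5, 0], [7, 5, 0])  # translated and scaled

print(pose_a, pose_b)  # both equal pi/2: the angle feature is invariant
```

The position features of the two poses differ in every coordinate, but the angle feature is identical, which is the intuition behind the paper's third experiment.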
Appears in Collections
Graduate School of Advanced Imaging Sciences, Multimedia and Film > Department of Imaging Science and Arts > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Chai, Young Ho
Graduate School of Advanced Imaging Science (Department of Imaging Science)
