Detailed Information


Transferable Convolutional Neural Networks for IMU-based Motion Gesture Recognition in Human-Machine Interaction

Authors
Kim, J.; Lee, J.; Kim, W.
Issue Date
Nov-2024
Publisher
IEEE Computer Society
Keywords
Convolutional Neural Networks (CNNs); Hand gesture Recognition (HGR); Human-Robot Interaction (HRI); Transfer Learning
Citation
International Conference on Control, Automation and Systems, pp. 61-66
Pages
6
Indexed
SCOPUS
Journal Title
International Conference on Control, Automation and Systems
Start Page
61
End Page
66
URI
https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/122222
DOI
10.23919/ICCAS63016.2024.10773204
ISSN
1598-7833
Abstract
Hand gestures, a fundamental aspect of human non-verbal communication, are often leveraged in the domain of Human-Machine Interaction (HMI) to implement more user-friendly interfaces. In this study, we propose a Convolutional Neural Network (CNN) model for efficient motion gesture recognition, designed to be deployed on a smartwatch using only a single Inertial Measurement Unit (IMU) sensor worn on the wrist. By directly processing low-dimensional motion data (linear acceleration and angular velocity), our model achieves high performance with a simplified model structure. Furthermore, we explore the potential of applying a transfer learning approach to our CNN model for novel gesture classification problems. This approach demonstrates that a well-trained CNN model's backbone network effectively extracts the motion features needed to recognize new gestures. Validation in scenarios with limited data (training-to-test ratios of 1:3, 1:7, and 1:19) allowed a comparison of our model's performance against baseline models trained from scratch. Our approach initially achieves an accuracy of 99.48±0.25% in recognizing ten distinct motion gestures by directly processing raw linear-acceleration and angular-velocity data. Moreover, the transfer learning model outperformed the baseline model trained from scratch, achieving 95.62±0.99%, 93.23±1.41%, and 92.81±1.62% accuracy when learning four new gestures under the respective data limitations. This study shows that the proposed model maintains high performance with a lightweight structure, while also highlighting how a transfer learning approach can address the challenges of data collection and set the stage for more intuitive and user-centric interaction systems. © 2024 ICROS.
Appears in
Collections
COLLEGE OF ENGINEERING SCIENCES > DEPARTMENT OF ROBOT ENGINEERING > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

KIM, WANSOO
ERICA College of Engineering (DEPARTMENT OF ROBOT ENGINEERING)
