Human Activity Recognition based on Deep-Temporal Learning using Convolution Neural Networks Features and Bidirectional Gated Recurrent Unit with Features Selection (open access)
- Authors
- 이영문
- Issue Date
- Mar-2023
- Publisher
- Institute of Electrical and Electronics Engineers Inc.
- Keywords
- bidirectional-gated recurrent unit (Bi-GRU); convolution neural networks (CNNs); deep learning; Human activity recognition; recurrent neural networks (RNNs)
- Citation
- IEEE Access, v.1, no.1, pp 1 - 12
- Pages
- 12
- Indexed
- SCIE
SCOPUS
- Journal Title
- IEEE Access
- Volume
- 1
- Number
- 1
- Start Page
- 1
- End Page
- 12
- URI
- https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/112942
- DOI
- 10.1109/ACCESS.2023.3263155
- ISSN
- 2169-3536
- Abstract
- Recurrent Neural Networks (RNNs) and their variants have demonstrated tremendous success in modeling sequential data in domains such as audio processing, video processing, time series analysis, and text mining. Inspired by these results, we propose a human activity recognition technique that processes visual data using a convolutional neural network (CNN) and a bidirectional gated recurrent unit (Bi-GRU). Firstly, we extract deep features from the frame sequences of human activity videos using a CNN and then select the most important features from these deep appearances to improve performance and decrease the computational complexity of the model. Secondly, to learn the temporal motion of the frame sequences, we design a Bi-GRU and feed it the selected deep features, so that it learns temporal dynamics in both the forward and backward directions at each time step. We conduct extensive experiments on realistic videos from the human activity recognition datasets YouTube11, HMDB51, and UCF101. Lastly, we compare the obtained results with existing methods to show the competence of the proposed technique. © 2013 IEEE.
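The pipeline described in the abstract (per-frame CNN features → feature selection → Bi-GRU over the frame sequence) can be sketched as below. This is a minimal illustration, not the authors' implementation: the 512-dim feature size, the slice-based "selection" placeholder, and all layer sizes are assumptions, and the paper's actual importance-based selection criterion is not reproduced here.

```python
import torch
import torch.nn as nn

class BiGRUActivityClassifier(nn.Module):
    """Hypothetical sketch: precomputed per-frame CNN features are reduced
    by a feature-selection step, then a bidirectional GRU reads the
    sequence forward and backward to classify the activity."""

    def __init__(self, feat_dim=512, selected_dim=256, hidden=128, num_classes=11):
        super().__init__()
        # Placeholder for the paper's selection of the most important
        # features: here we simply keep the first `selected_dim` dimensions.
        self.selected_dim = selected_dim
        self.bigru = nn.GRU(selected_dim, hidden,
                            batch_first=True, bidirectional=True)
        # Bidirectional GRU concatenates forward and backward states.
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, feats):
        # feats: (batch, time, feat_dim) per-frame CNN features
        x = feats[:, :, :self.selected_dim]   # feature selection (placeholder)
        out, _ = self.bigru(x)                # (batch, time, 2 * hidden)
        return self.fc(out[:, -1, :])         # classify from the last time step

model = BiGRUActivityClassifier()
frames = torch.randn(4, 16, 512)  # 4 clips, 16 frames, 512-dim features each
logits = model(frames)
print(logits.shape)  # torch.Size([4, 11])
```

In practice the per-frame features would come from a pretrained CNN backbone applied to each video frame; the random tensor above only stands in for that stage.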
- Appears in Collections
- COLLEGE OF ENGINEERING SCIENCES > DEPARTMENT OF ROBOT ENGINEERING > 1. Journal Articles
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.