Physical Activity Recognition With Statistical-Deep Fusion Model Using Multiple Sensory Data for Smart Health
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Huynh-The, Thien | - |
dc.contributor.author | Hua, Cam-Hao | - |
dc.contributor.author | Tu, Nguyen Anh | - |
dc.contributor.author | Kim, Dong-Seong | - |
dc.date.available | 2021-03-31T02:40:07Z | - |
dc.date.issued | 2021-02-01 | - |
dc.identifier.issn | 2327-4662 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/kumoh/handle/2020.sw.kumoh/19019 | - |
dc.description.abstract | Nowadays, enhancing living standards with smart healthcare via the Internet of Things is one of the most critical goals of smart cities, in which artificial intelligence serves as the core technology. Many smart services, deployed on top of wearable sensor-based physical activity recognition, can detect unhealthy daily behaviors early and flag further medical risks. Numerous approaches have studied shallow handcrafted features coupled with traditional machine learning (ML) techniques, which struggle to model real-world activities. In this work, by fusing deep features extracted from deep convolutional neural networks (DCNNs) with conventional handcrafted features, we propose an intermediate fusion framework for human activity recognition (HAR). By transforming raw signal values into pixel intensities, segmented data acquired from a multisensor system are encoded into an activity image for deep model learning. Built from several novel residual triple convolutional blocks, the proposed DCNN extracts multiscale spatiotemporal signal-level and sensor-level correlations simultaneously from the activity image. In the fusion model, the hybrid feature vector, formed by merging the handcrafted and deep features, is learned by a multiclass support vector machine (SVM) classifier. In several performance evaluation experiments, our fusion approach achieves an accuracy above 96.0% on three public benchmark data sets: Daily and Sport Activities, Daily Life Activities, and RealWorld. Furthermore, the method outperforms several state-of-the-art HAR approaches, demonstrating the superiority of the proposed intermediate fusion model in multisensor systems. | - |
dc.format.extent | 11 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.title | Physical Activity Recognition With Statistical-Deep Fusion Model Using Multiple Sensory Data for Smart Health | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.1109/JIOT.2020.3013272 | - |
dc.identifier.wosid | 000612146000022 | - |
dc.identifier.bibliographicCitation | IEEE INTERNET OF THINGS JOURNAL, v.8, no.3, pp. 1533-1543 | - |
dc.citation.title | IEEE INTERNET OF THINGS JOURNAL | - |
dc.citation.volume | 8 | - |
dc.citation.number | 3 | - |
dc.citation.startPage | 1533 | - |
dc.citation.endPage | 1543 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Telecommunications | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.relation.journalWebOfScienceCategory | Telecommunications | - |
dc.subject.keywordAuthor | Feature extraction | - |
dc.subject.keywordAuthor | Support vector machines | - |
dc.subject.keywordAuthor | Activity recognition | - |
dc.subject.keywordAuthor | Medical services | - |
dc.subject.keywordAuthor | Correlation | - |
dc.subject.keywordAuthor | Internet of Things | - |
dc.subject.keywordAuthor | Machine learning | - |
dc.subject.keywordAuthor | Deep learning | - |
dc.subject.keywordAuthor | intermediate fusion | - |
dc.subject.keywordAuthor | physical activity (PA) recognition | - |
dc.subject.keywordAuthor | wearable sensor system | - |
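The abstract describes a concrete pipeline: min-max scaling raw multisensor segments into pixel intensities to form an "activity image", extracting deep features from it, concatenating them with handcrafted statistical features, and classifying the hybrid vector with a multiclass SVM. The following is a minimal, hypothetical sketch of that intermediate-fusion idea, not the authors' implementation: the residual triple-convolutional DCNN is stubbed with a fixed random ReLU projection, the handcrafted features are simple per-channel statistics, and the sensor count (9), segment length (128), and toy two-class data are all assumptions for illustration.

```python
# Hedged sketch of the intermediate-fusion HAR pipeline from the abstract.
# Assumptions (not from the paper): 9 sensor channels, 128-sample segments,
# a random ReLU projection standing in for the DCNN, and synthetic data.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def to_activity_image(segment):
    """Encode a (sensors x timesteps) segment as uint8 pixel intensities."""
    lo, hi = segment.min(), segment.max()
    return np.round(255 * (segment - lo) / (hi - lo + 1e-12)).astype(np.uint8)

def handcrafted_features(segment):
    """Per-channel statistics (mean, std, min, max) as one flat vector."""
    return np.concatenate([segment.mean(1), segment.std(1),
                           segment.min(1), segment.max(1)])

# Stand-in for the trained DCNN feature extractor: a fixed random projection.
W = rng.normal(size=(9 * 128, 32))

def deep_features(image):
    return np.maximum(image.astype(np.float64).ravel() @ W, 0.0)

def hybrid_features(segment):
    """Intermediate fusion: concatenate handcrafted and deep features."""
    return np.concatenate([handcrafted_features(segment),
                           deep_features(to_activity_image(segment))])

# Toy data: two synthetic "activities" differing in dominant frequency.
def make_segment(freq):
    t = np.linspace(0, 1, 128)
    return np.sin(2 * np.pi * freq * t)[None, :] + 0.1 * rng.normal(size=(9, 128))

X = np.stack([hybrid_features(make_segment(f)) for f in [2.0] * 20 + [8.0] * 20])
X = (X - X.mean(0)) / (X.std(0) + 1e-12)   # standardize the hybrid vectors
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)  # multiclass-capable SVM
acc = clf.score(X, y)
```

On this trivially separable toy data the SVM fits the training set almost perfectly; the point is only to show how the two feature families are merged before classification, which is what distinguishes intermediate fusion from training the DCNN end-to-end.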