WHITE STAG model: wise human interaction tracking and estimation (WHITE) using spatio-temporal and angular-geometric (STAG) descriptors
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Mahmood, Maria | - |
dc.contributor.author | Jalal, Ahmad | - |
dc.contributor.author | Kim, Kibum | - |
dc.date.accessioned | 2021-06-22T09:07:20Z | - |
dc.date.available | 2021-06-22T09:07:20Z | - |
dc.date.issued | 2020-03 | - |
dc.identifier.issn | 1380-7501 | - |
dc.identifier.issn | 1432-1882 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/1248 | - |
dc.description.abstract | To understand human-to-human interactions accurately, human interaction recognition (HIR) systems require robust feature extraction and selection methods based on vision sensors. In this paper, we propose the WHITE STAG model to wisely track human interactions using space-time methods as well as shape-based angular-geometric sequential approaches over full-body silhouettes and skeleton joints, respectively. After feature extraction, the feature space is reduced by employing codebook generation and linear discriminant analysis (LDA). Finally, a kernel sliding perceptron is used to recognize multiple classes of human interactions. The proposed WHITE STAG model is validated using two publicly available RGB datasets and one novel self-annotated intensity-interactive dataset. For evaluation, four experiments are performed using leave-one-out and cross-validation testing schemes. Our WHITE STAG model with the kernel sliding perceptron outperformed well-known state-of-the-art statistical methods, achieving weighted average recognition rates of 87.48% on the UT-Interaction dataset, 87.5% on the BIT-Interaction dataset, and 85.7% on the proposed IM-IntensityInteractive7 dataset. The proposed system should be applicable to various multimedia content and security applications such as surveillance systems, video-based learning, medical futurists, service cobots, and interactive gaming. | - |
dc.format.extent | 32 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | SPRINGER | - |
dc.title | WHITE STAG model: wise human interaction tracking and estimation (WHITE) using spatio-temporal and angular-geometric (STAG) descriptors | - |
dc.type | Article | - |
dc.publisher.location | Netherlands | - |
dc.identifier.doi | 10.1007/s11042-019-08527-8 | - |
dc.identifier.scopusid | 2-s2.0-85077083095 | - |
dc.identifier.wosid | 000523441100001 | - |
dc.identifier.bibliographicCitation | MULTIMEDIA TOOLS AND APPLICATIONS, v.79, no.11-12, pp 6919 - 6950 | - |
dc.citation.title | MULTIMEDIA TOOLS AND APPLICATIONS | - |
dc.citation.volume | 79 | - |
dc.citation.number | 11-12 | - |
dc.citation.startPage | 6919 | - |
dc.citation.endPage | 6950 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Software Engineering | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Theory & Methods | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.subject.keywordPlus | REPRESENTATION | - |
dc.subject.keywordPlus | RECOGNITION | - |
dc.subject.keywordPlus | VIDEOS | - |
dc.subject.keywordAuthor | Full body silhouettes | - |
dc.subject.keywordAuthor | Human interaction recognition | - |
dc.subject.keywordAuthor | Kernel sliding perceptron | - |
dc.subject.keywordAuthor | Spatio-temporal angular-geometric features | - |
dc.subject.keywordAuthor | Skeleton joints | - |
dc.identifier.url | https://link.springer.com/article/10.1007/s11042-019-08527-8 | - |
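The abstract above outlines a recognition pipeline: feature extraction over full-body silhouettes and skeleton joints, feature-space reduction via codebook generation and LDA, and classification with a kernel sliding perceptron. The sketch below illustrates only the last two stages under stated assumptions, not the authors' implementation: synthetic feature vectors stand in for the WHITE STAG descriptors, and a plain one-vs-rest kernel perceptron with an RBF kernel stands in for the paper's kernel sliding perceptron, whose exact update rule is not reproduced here.

```python
# Minimal sketch (not the authors' implementation): LDA-based feature
# reduction followed by a kernel perceptron-style classifier. The feature
# vectors, dataset split, and RBF kernel choice are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split


def rbf_kernel(a, b, gamma=0.5):
    """RBF kernel between two sets of row vectors."""
    d = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-gamma * d)


class KernelPerceptron:
    """Plain multi-class (one-vs-rest) kernel perceptron; a stand-in for the
    paper's 'kernel sliding perceptron', not a reproduction of it."""

    def __init__(self, epochs=20, gamma=0.5):
        self.epochs, self.gamma = epochs, gamma

    def fit(self, X, y):
        self.X_, self.classes_ = X, np.unique(y)
        K = rbf_kernel(X, X, self.gamma)
        # One vector of dual coefficients (alphas) per class.
        self.alpha_ = np.zeros((len(self.classes_), len(X)))
        for c_idx, c in enumerate(self.classes_):
            t = np.where(y == c, 1, -1)
            for _ in range(self.epochs):
                for i in range(len(X)):
                    # Classic perceptron update in the dual form.
                    if np.sign(self.alpha_[c_idx] @ K[:, i]) != t[i]:
                        self.alpha_[c_idx, i] += t[i]
        return self

    def predict(self, X):
        K = rbf_kernel(self.X_, X, self.gamma)
        scores = self.alpha_ @ K  # shape: (n_classes, n_samples)
        return self.classes_[np.argmax(scores, axis=0)]


# Synthetic stand-in for spatio-temporal / angular-geometric feature vectors,
# with 7 classes to mirror the IM-IntensityInteractive7 setting.
X, y = make_classification(n_samples=400, n_features=60, n_informative=20,
                           n_classes=7, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# LDA reduces the feature space to at most (n_classes - 1) dimensions.
lda = LinearDiscriminantAnalysis(n_components=6).fit(X_tr, y_tr)
Z_tr, Z_te = lda.transform(X_tr), lda.transform(X_te)

clf = KernelPerceptron().fit(Z_tr, y_tr)
print("accuracy:", np.mean(clf.predict(Z_te) == y_te))
```

LDA caps the reduced dimensionality at one less than the number of classes, which is why the sketch keeps six components for seven interaction classes.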