MultiPoseSeg: Feedback Knowledge Transfer for Multi-Person Pose Estimation and Instance Segmentation
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ahmad, Niaz | - |
dc.contributor.author | Khan, Jawad | - |
dc.contributor.author | Kim, Jeremy Yuhyun | - |
dc.contributor.author | Lee, Youngmoon | - |
dc.date.accessioned | 2023-07-05T05:33:15Z | - |
dc.date.available | 2023-07-05T05:33:15Z | - |
dc.date.issued | 2022-11 | - |
dc.identifier.issn | 1051-4651 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/112952 | - |
dc.description.abstract | Multi-person pose estimation and instance segmentation suffer large performance losses as the number of people in an image grows and their appearance becomes complex and uncontrolled. Moreover, existing models cannot efficiently leverage unbalanced training sets: few images contain multiple people while most contain a single person, making these models ineffective in challenging multi-person scenarios. To tackle multi-person cases with only a limited share of such training data, we propose MultiPoseSeg, a data-preparation and feedback knowledge-transfer system for multi-person pose estimation and instance segmentation. First, MultiPoseSeg performs categorical random data reduction to reduce the single-person bias in the training dataset. Second, MultiPoseSeg employs knowledge transfer from ancestor models so that model learning converges with a limited amount of data and time. In this way, our model jointly learns human pose and instance segmentation, improving both training and testing accuracy. Finally, MultiPoseSeg proposes keypoint maps that identify the coordinates of soft and hard keypoints, and segmentation maps that assign a centroid to each human instance, which helps cluster pixels to a particular instance. We evaluated MultiPoseSeg on the challenging COCO and OCHuman datasets and demonstrated that it outperforms state-of-the-art bottom-up models in both accuracy and runtime, achieving 0.728 mAP for pose and 0.445 mAP for segmentation on COCO. All the unbiased data and code have been made available at: https://github.com/RaiseLab/MultiPoseSeg | - |
dc.format.extent | 7 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | IEEE | - |
dc.title | MultiPoseSeg: Feedback Knowledge Transfer for Multi-Person Pose Estimation and Instance Segmentation | - |
dc.type | Article | - |
dc.publisher.location | USA | - |
dc.identifier.doi | 10.1109/ICPR56361.2022.9956648 | - |
dc.identifier.scopusid | 2-s2.0-85143629082 | - |
dc.identifier.wosid | 000897707602012 | - |
dc.identifier.bibliographicCitation | 2022 26th International Conference on Pattern Recognition (ICPR), pp. 2086-2092 | - |
dc.citation.title | 2022 26th International Conference on Pattern Recognition (ICPR) | - |
dc.citation.startPage | 2086 | - |
dc.citation.endPage | 2092 | - |
dc.type.docType | Proceedings Paper | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Imaging Science & Photographic Technology | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.relation.journalWebOfScienceCategory | Imaging Science & Photographic Technology | - |
dc.identifier.url | https://ieeexplore.ieee.org/document/9956648 | - |
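The abstract's first step, categorical random data reduction, amounts to downsampling the over-represented single-person images so multi-person images make up a larger share of the training set. As a rough illustration only (the paper's actual procedure, function names, and the `keep_ratio` knob are assumptions, not taken from the record), such a step could look like:

```python
import random

def reduce_single_person_bias(image_ids_by_count, keep_ratio=0.3, seed=0):
    """Randomly downsample single-person images to reduce dataset bias.

    image_ids_by_count: dict mapping person-count per image -> list of image ids.
    keep_ratio: assumed fraction of single-person images to retain.
    """
    rng = random.Random(seed)
    kept = []
    for count, ids in image_ids_by_count.items():
        if count == 1:
            # Keep only a random subset of single-person images.
            n_keep = max(1, int(len(ids) * keep_ratio))
            kept.extend(rng.sample(ids, n_keep))
        else:
            # Keep every multi-person image.
            kept.extend(ids)
    return sorted(kept)
```

This is a minimal sketch of the balancing idea; MultiPoseSeg's released code at the GitHub link above is the authoritative implementation.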