Progressive Multi-View Instance Matching: Occlusion-Robust Approach with Initial Segmentation Enhancement
- Authors
- 고현석
- Issue Date
- Feb-2025
- Publisher
- IEEE - Institute of Electrical and Electronics Engineers Inc.
- Keywords
- Instance segmentation; multi-view image dataset; multi-view instance matching; occlusion; segmentation refinement
- Citation
- IEEE SENSORS JOURNAL, v.25, no.3, pp. 1-13
- Pages
- 13
- Indexed
- SCIE
SCOPUS
- Journal Title
- IEEE SENSORS JOURNAL
- Volume
- 25
- Number
- 3
- Start Page
- 1
- End Page
- 13
- URI
- https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/121998
- DOI
- 10.1109/JSEN.2024.3515115
- ISSN
- 1530-437X
1558-1748
- Abstract
- The growing demand for immersive media has led to active research on multi-view images. However, challenges such as occlusion and the scarcity of publicly available multi-view datasets hinder progress in this field. To address these issues, we introduce a novel dataset, SMIIM (Synthetic Multi-view Images for Instance Matching), designed for multi-view instance matching. We also propose a progressive matching algorithm, a three-stage process that effectively handles occlusion. Finally, we improve instance segmentation quality by refining the masks produced by existing networks using our matching results. Compared with existing instance matching algorithms, our method is both faster and substantially more accurate, achieving an average ID matching accuracy of 97.5% and a mean Intersection over Union (mIoU) improvement of 19.8%, not only on our dataset but also on standard benchmarks and real-world datasets. Research on multi-object matching in multi-view environments remains scarce, making our work a valuable contribution to this field. The SMIIM dataset will be released to facilitate further research and development in multi-view image processing.
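The record does not detail the three matching stages themselves, so the sketch below is only a generic, hypothetical illustration of progressive cross-view instance matching with occlusion-aware deferral, not the authors' actual algorithm. The helper names (warp_mask_to_reference, mask_iou) and the thresholds are assumptions introduced for illustration.

```python
# Illustrative sketch only: a greedy cross-view matcher that fixes confident
# matches first and revisits low-overlap (possibly occluded) instances in a
# second pass. All names and thresholds are hypothetical.
import numpy as np

def mask_iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection over Union of two boolean masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return float(inter) / float(union) if union else 0.0

def match_view_to_reference(ref_masks, view_masks, warp_mask_to_reference,
                            accept_thr=0.5, defer_thr=0.2):
    """Assign each instance mask in a view to a reference-view instance ID.

    ref_masks: dict {instance_id: boolean mask in the reference view}
    view_masks: list of boolean masks in the current view
    warp_mask_to_reference: callable that reprojects a mask into the reference view
    """
    assignments = {}   # view instance index -> reference instance ID
    deferred = []      # low-confidence candidates to revisit later
    used_ids = set()

    # First pass: accept only confident, unambiguous matches.
    for idx, mask in enumerate(view_masks):
        warped = warp_mask_to_reference(mask)
        scores = [(mask_iou(warped, ref), rid) for rid, ref in ref_masks.items()]
        best_iou, best_id = max(scores, default=(0.0, None))
        if best_iou >= accept_thr and best_id not in used_ids:
            assignments[idx] = best_id
            used_ids.add(best_id)
        elif best_iou >= defer_thr:
            deferred.append((idx, best_id, best_iou))

    # Second pass: resolve deferred (likely occluded) instances among the IDs
    # left unclaimed after the confident first pass.
    for idx, best_id, _ in sorted(deferred, key=lambda t: -t[2]):
        if best_id is not None and best_id not in used_ids:
            assignments[idx] = best_id
            used_ids.add(best_id)
    return assignments
```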
- Appears in Collections
- COLLEGE OF ENGINEERING SCIENCES > SCHOOL OF ELECTRICAL ENGINEERING > 1. Journal Articles