Unsupervised Deep Event Stereo for Depth Estimation
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Uddin, S. M. Nadim | - |
dc.contributor.author | Ahmed, Soikat Hasan | - |
dc.contributor.author | Jung, Yong Ju | - |
dc.date.accessioned | 2023-01-03T01:40:14Z | - |
dc.date.available | 2023-01-03T01:40:14Z | - |
dc.date.created | 2022-12-16 | - |
dc.date.issued | 2022-11 | - |
dc.identifier.issn | 1051-8215 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/86388 | - |
dc.description.abstract | Bio-inspired event cameras have been considered effective alternatives to traditional frame-based cameras for stereo depth estimation, especially in challenging conditions such as low-light or high-speed environments. Recently, deep learning-based supervised event stereo matching methods have achieved significant performance improvements over the traditional event stereo methods. However, the supervised methods depend on ground-truth disparity maps for training, and it is difficult to secure a large amount of ground-truth disparity maps. A feasible alternative is to devise an unsupervised event stereo method that can be trained without ground-truth disparity maps. To this end, we propose the first unsupervised event stereo matching method that can predict dense disparity maps, and is trained by transforming the depth estimation problem into a warping-based reconstruction problem. We propose a novel unsupervised loss function that forces the network to minimize the feature-level epipolar correlation difference between the ground-truth intensity images and warped images. Moreover, we propose a novel event embedding mechanism that utilizes both temporal and spatial neighboring events to capture spatio-temporal relationships among the events for stereo matching. Experimental results reveal that the proposed method outperforms the baseline unsupervised methods by significant margins (e.g., up to 16.88% improvement) and achieves comparable results with the existing supervised methods. Extensive ablation studies validate the efficacy of the proposed modules and architectural choices. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.relation.isPartOf | IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY | - |
dc.title | Unsupervised Deep Event Stereo for Depth Estimation | - |
dc.type | Article | - |
dc.type.rims | ART | - |
dc.description.journalClass | 1 | - |
dc.identifier.wosid | 000876020600018 | - |
dc.identifier.doi | 10.1109/TCSVT.2022.3189480 | - |
dc.identifier.bibliographicCitation | IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, v.32, no.11, pp.7489 - 7504 | - |
dc.description.isOpenAccess | N | - |
dc.identifier.scopusid | 2-s2.0-85134204897 | - |
dc.citation.endPage | 7504 | - |
dc.citation.startPage | 7489 | - |
dc.citation.title | IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY | - |
dc.citation.volume | 32 | - |
dc.citation.number | 11 | - |
dc.contributor.affiliatedAuthor | Uddin, S. M. Nadim | - |
dc.contributor.affiliatedAuthor | Ahmed, Soikat Hasan | - |
dc.contributor.affiliatedAuthor | Jung, Yong Ju | - |
dc.type.docType | Article | - |
dc.subject.keywordAuthor | Cameras | - |
dc.subject.keywordAuthor | Estimation | - |
dc.subject.keywordAuthor | Image matching | - |
dc.subject.keywordAuthor | Correlation | - |
dc.subject.keywordAuthor | Training | - |
dc.subject.keywordAuthor | Lighting | - |
dc.subject.keywordAuthor | Image reconstruction | - |
dc.subject.keywordAuthor | Event camera | - |
dc.subject.keywordAuthor | stereo matching | - |
dc.subject.keywordAuthor | depth estimation | - |
dc.subject.keywordAuthor | unsupervised deep learning | - |
dc.subject.keywordPlus | DISPARITY ESTIMATION | - |
dc.subject.keywordPlus | OPTICAL-FLOW | - |
dc.subject.keywordPlus | IMAGE | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
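The warping-based reconstruction idea summarized in the abstract can be illustrated with a minimal sketch. This is not the paper's actual pipeline (which operates on event embeddings and uses a feature-level epipolar correlation loss); it only shows the core training signal for rectified stereo: warp the right image toward the left view using a predicted disparity map, then penalize the photometric difference against the left image. All names are hypothetical, and nearest-neighbour sampling stands in for the bilinear sampling used in practice.

```python
import numpy as np

def warp_right_to_left(right, disparity):
    """Warp the right image toward the left view along epipolar lines.

    For rectified stereo, pixel (x, y) in the left view corresponds to
    (x - d, y) in the right view, where d is the disparity at (x, y).
    Sampling is nearest-neighbour here; real systems use differentiable
    bilinear sampling. Out-of-bounds coordinates are clamped to the border.
    """
    h, w = right.shape
    rows = np.arange(h)[:, None]                      # (h, 1), broadcasts
    cols = np.arange(w)[None, :] - disparity          # shift by disparity
    cols = np.clip(np.rint(cols).astype(int), 0, w - 1)
    return right[rows, cols]

def reconstruction_loss(left, right, disparity):
    """Photometric L1 loss between the left image and the warped right image.

    A correct disparity map makes the warped right image match the left
    image, so minimizing this loss supervises disparity without any
    ground-truth disparity maps.
    """
    return float(np.mean(np.abs(left - warp_right_to_left(right, disparity))))
```

With a horizontal-gradient right image and a constant disparity of 3 pixels, the loss is zero for the true disparity and positive for a wrong one, which is exactly the signal an unsupervised stereo network is trained on.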