Robust Feature Tracking in DVS Event Stream using Bezier Mapping
- Authors
- Seok, Hochang; Lim, Jong woo
- Issue Date
- Mar-2020
- Publisher
- IEEE
- Citation
- 2020 IEEE Winter Conference on Applications of Computer Vision (WACV), pp. 1658-1667
- Indexed
- OTHER
- Journal Title
- 2020 IEEE Winter Conference on Applications of Computer Vision (WACV)
- Start Page
- 1658
- End Page
- 1667
- URI
- https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/145977
- DOI
- 10.1109/WACV45572.2020.9093607
- Abstract
- Unlike conventional cameras, event cameras capture the intensity changes at each pixel with very little delay. Such changes are recorded continuously as an event stream with their positions, timestamps, and polarities, so there is no notion of a 'frame' as in conventional cameras. As many applications, including 3D pose estimation, use 2D trajectories of feature points, it is necessary to detect and track the feature points robustly and accurately in a continuous event stream. In conventional feature tracking algorithms for event streams, the events in fixed time intervals are converted into event images by stacking the events at their pixel locations, and the features are tracked in the event images. Such simple stacking of events yields blurry event images due to the camera motion, and it can significantly degrade the tracking quality. We propose to align the events in the time intervals along Bézier curves to minimize the misalignment. Since the camera motion is unknown, the Bézier curve is estimated to maximize the variance of the warped event pixels. Instead of the initial patches for tracking, we use temporally integrated template patches, as they capture rich texture information from accurately aligned events. Extensive experimental evaluations in 2D feature tracking as well as 3D pose estimation show that our method significantly outperforms the conventional approaches.
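The core idea in the abstract — warp each event back along a motion curve, stack the warped events into an image, and score the alignment by the variance of that image — can be illustrated with a minimal sketch. The function names, the choice of a quadratic Bézier with its first control point fixed at the origin, and the nearest-pixel stacking below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def bezier(t, p0, p1, p2):
    # Quadratic Bezier displacement evaluated at normalized time t in [0, 1].
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def warp_and_stack(events, p1, p2, shape):
    # events: iterable of (x, y, t) with t normalized to [0, 1].
    # Each event is shifted back along the Bezier curve (p0 fixed at the
    # origin) to its position at t = 0, then stacked into an event image.
    img = np.zeros(shape)
    p0 = np.zeros(2)
    for x, y, t in events:
        dx, dy = bezier(t, p0, p1, p2)
        xi, yi = int(round(x - dx)), int(round(y - dy))
        if 0 <= yi < shape[0] and 0 <= xi < shape[1]:
            img[yi, xi] += 1
    return img

def sharpness(img):
    # Variance of the stacked event image; a well-chosen curve
    # concentrates events on few pixels and raises this score.
    return img.var()
```

In this sketch, estimating the curve amounts to searching over the control points `p1`, `p2` (e.g. with a grid search or a gradient-free optimizer) for the pair that maximizes `sharpness(warp_and_stack(...))`; events from a point moving under the true camera motion then collapse onto a single sharp pixel instead of a blurred streak.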
- Appears in
Collections - Seoul College of Engineering > Seoul School of Computer Software > 1. Journal Articles