Optical flow-based real-time object tracking using non-prior training active feature model
- Authors
- Shin, J; Kim, S; Kang, S; Lee, SW; Paik, Joonki; Abidi, B; Abidi, M
- Issue Date
- Jun-2005
- Publisher
- Academic Press Ltd / Elsevier Science Ltd
- Citation
- REAL-TIME IMAGING, v.11, no.3, pp. 204-218
- Pages
- 15
- Journal Title
- REAL-TIME IMAGING
- Volume
- 11
- Number
- 3
- Start Page
- 204
- End Page
- 218
- URI
- https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/24587
- DOI
- 10.1016/j.rti.2005.03.006
- ISSN
- 1077-2014
- 1096-116X
- Abstract
- This paper presents a feature-based object tracking algorithm using optical flow under the non-prior training (NPT) active feature model (AFM) framework. The proposed tracking procedure can be divided into three steps: (i) localization of an object-of-interest, (ii) prediction and correction of the object's position using spatio-temporal information, and (iii) restoration of occluded regions using NPT-AFM. The proposed algorithm can track both rigid and deformable objects, and is robust against sudden object motion because each feature point and its motion direction are tracked simultaneously. Tracking performance does not degrade even with a complicated background, because feature points inside the object are completely separated from the background. Finally, the AFM enables stable tracking of occluded objects with up to 60% occlusion. NPT-AFM, one of the major contributions of this paper, removes the off-line preprocessing step for generating an a priori training set. The training set used for model fitting can be updated at each frame, making the object's features more robust under occlusion. The proposed AFM can track deformable, partially occluded objects using a greatly reduced number of feature points rather than the entire shapes used by existing shape-based methods. On-line updating of the training set and the reduced number of feature points enable a real-time, robust tracking system. Experiments were performed using several in-house video clips from a static camera, including objects such as a robot moving on a floor and people walking both indoors and outdoors. To demonstrate the performance of the proposed tracking algorithm, some experiments were also performed in noisy and low-contrast environments. For a more objective comparison, the PETS 2001 and PETS 2002 datasets were also used. (C) 2005 Elsevier Ltd. All rights reserved.
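The per-feature optical flow that steps (i)-(ii) of the abstract rely on is commonly computed with the Lucas-Kanade least-squares method. The sketch below is a minimal, hedged illustration of that generic technique for a single feature point; it is not the authors' NPT-AFM implementation, and the function name, window size, and synthetic test frames are my own choices.

```python
import numpy as np

def lucas_kanade_point(prev, curr, x, y, win=7):
    """Estimate the optical-flow displacement (u, v) of the feature point
    (x, y) between two grayscale frames, using the classic Lucas-Kanade
    least-squares solution over a small square window."""
    h = win // 2
    # Spatial derivatives of the previous frame and the temporal derivative.
    Ix = np.gradient(prev, axis=1)
    Iy = np.gradient(prev, axis=0)
    It = curr - prev
    # Collect derivative samples inside the window around the feature point.
    ys, xs = np.mgrid[y - h:y + h + 1, x - h:x + h + 1]
    A = np.stack([Ix[ys, xs].ravel(), Iy[ys, xs].ravel()], axis=1)
    b = -It[ys, xs].ravel()
    # Brightness constancy: Ix*u + Iy*v + It = 0, solved in least squares.
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic check: a Gaussian blob shifted by one pixel to the right.
yy, xx = np.mgrid[0:40, 0:40].astype(float)
prev = np.exp(-((xx - 20.0)**2 + (yy - 20.0)**2) / (2 * 5.0**2))
curr = np.exp(-((xx - 21.0)**2 + (yy - 20.0)**2) / (2 * 5.0**2))
u, v = lucas_kanade_point(prev, curr, 23, 20)
# The recovered (u, v) should be close to the true shift (1, 0).
```

In the paper's setting, such a flow estimate would be evaluated at each tracked feature point every frame, with the AFM constraining the resulting point configuration; a production tracker would also use pyramidal (coarse-to-fine) refinement to handle large motions.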
- Files in This Item
- There are no files associated with this item.
- Appears in
Collections - Graduate School of Advanced Imaging Sciences, Multimedia and Film > Department of Imaging Science and Arts > 1. Journal Articles
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.