Detailed Information

ROVO: Robust omnidirectional visual odometry for wide-baseline wide-FOV camera systems

Full metadata record
DC Field | Value | Language
dc.contributor.author | Seok, Hochang | -
dc.contributor.author | Lim, Jongwoo | -
dc.date.accessioned | 2022-07-09T15:02:28Z | -
dc.date.available | 2022-07-09T15:02:28Z | -
dc.date.created | 2021-05-13 | -
dc.date.issued | 2019-05 | -
dc.identifier.issn | 1050-4729 | -
dc.identifier.uri | https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/147830 | -
dc.description.abstract | In this paper, we propose a robust visual odometry system for a wide-baseline camera rig with wide field-of-view (FOV) fisheye lenses, which provides full omnidirectional stereo observations of the environment. For more robust and accurate ego-motion estimation, we add three components to the standard VO pipeline: 1) a hybrid projection model for improved feature matching, 2) a multi-view P3P RANSAC algorithm for pose estimation, and 3) online update of the rig extrinsic parameters. The hybrid projection model combines perspective and cylindrical projection to maximize the overlap between views and minimize the image distortion that degrades feature matching performance. The multi-view P3P RANSAC algorithm extends the conventional P3P RANSAC to multi-view images so that all feature matches in all views are considered in the inlier counting for robust pose estimation. Finally, the online extrinsic calibration is seamlessly integrated into the back-end optimization framework so that changes in camera poses due to shocks or vibrations are corrected automatically. The proposed system is extensively evaluated on synthetic datasets with ground truth and on real sequences of highly dynamic environments, and its superior performance is demonstrated. | -
dc.language | English | -
dc.language.iso | en | -
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | -
dc.title | ROVO: Robust omnidirectional visual odometry for wide-baseline wide-FOV camera systems | -
dc.type | Article | -
dc.contributor.affiliatedAuthor | Lim, Jongwoo | -
dc.identifier.doi | 10.1109/ICRA.2019.8793758 | -
dc.identifier.scopusid | 2-s2.0-85071495022 | -
dc.identifier.wosid | 000494942304095 | -
dc.identifier.bibliographicCitation | Proceedings - IEEE International Conference on Robotics and Automation, v.2019-May, pp.6344 - 6350 | -
dc.relation.isPartOf | Proceedings - IEEE International Conference on Robotics and Automation | -
dc.citation.title | Proceedings - IEEE International Conference on Robotics and Automation | -
dc.citation.volume | 2019-May | -
dc.citation.startPage | 6344 | -
dc.citation.endPage | 6350 | -
dc.type.rims | ART | -
dc.type.docType | Conference Paper | -
dc.description.journalClass | 1 | -
dc.description.isOpenAccess | N | -
dc.description.journalRegisteredClass | scopus | -
dc.relation.journalResearchArea | Automation & Control Systems | -
dc.relation.journalResearchArea | Robotics | -
dc.relation.journalWebOfScienceCategory | Automation & Control Systems | -
dc.relation.journalWebOfScienceCategory | Robotics | -
dc.identifier.url | https://ieeexplore.ieee.org/document/8793758 | -
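The abstract above highlights what distinguishes the rig-level P3P RANSAC from the single-camera version: every pose hypothesis is scored against feature matches from all cameras of the rig, not only the camera that supplied the minimal sample. The sketch below illustrates that scoring step only; it assumes a simple pinhole projection and illustrative names (count_inliers_multiview, T_world_rig, cam_extrinsics) rather than the paper's hybrid perspective-cylindrical model or its full RANSAC loop.

```python
import numpy as np

def count_inliers_multiview(T_world_rig, cam_extrinsics, intrinsics,
                            points_3d, observations, thresh_px=2.0):
    """Score one hypothesized rig pose by counting reprojection inliers
    over every camera in the rig.

    T_world_rig    : 4x4 pose of the rig in the world frame (hypothesis).
    cam_extrinsics : list of 4x4 camera-in-rig transforms T_rig_cam.
    intrinsics     : list of 3x3 pinhole matrices K (illustrative stand-in
                     for the paper's hybrid projection model).
    points_3d      : (N, 3) array of world-frame landmarks.
    observations   : per camera, a list of (landmark_index, (u, v)) matches.
    """
    inliers = 0
    T_rig_world = np.linalg.inv(T_world_rig)
    for T_rig_cam, K, obs in zip(cam_extrinsics, intrinsics, observations):
        # World -> camera transform for this view of the rig.
        T_cam_world = np.linalg.inv(T_rig_cam) @ T_rig_world
        for idx, uv in obs:
            p_cam = (T_cam_world @ np.append(points_3d[idx], 1.0))[:3]
            if p_cam[2] <= 1e-6:            # landmark behind the camera
                continue
            proj = K @ (p_cam / p_cam[2])   # pinhole projection to pixels
            if np.linalg.norm(proj[:2] - np.asarray(uv)) < thresh_px:
                inliers += 1
    return inliers
```

Scoring every hypothesis against all views makes a pose drawn from one camera's minimal sample accountable to the whole rig, which is what keeps the estimate stable when a single view is dominated by moving objects, as in the highly dynamic sequences mentioned in the abstract.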
Appears in Collections
College of Engineering (Seoul) > School of Computer Science (Seoul) > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lim, Jongwoo
COLLEGE OF ENGINEERING (SCHOOL OF COMPUTER SCIENCE)
