CalibBD: Extrinsic Calibration of the LiDAR and Camera Using a Bidirectional Neural Network (Open Access)
- Authors: Nguyen, An Duy; Yoo, Myungsik
- Issue Date:
- Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
- Keywords: LiDAR; camera; calibration; deep learning; accuracy
- Citation: IEEE ACCESS, v.10, pp.121261 - 121271
- Journal Title: IEEE ACCESS
- Start Page: 121261
- End Page: 121271
- Abstract: With the rapid growth of self-driving vehicles, automobiles demand diverse data from multiple sensors to perceive the surrounding environment. Calibration preprocessing between multiple sensors is necessary to use the data effectively. In particular, the LiDAR-camera pair, whose 2D and 3D information complement each other, has been widely used in autonomous vehicles. Most traditional calibration methods require specific calibration targets set up under complicated environmental conditions, which demands expensive manual work. In this study, we propose a deep neural network that requires neither specific targets nor an offline setup to find the six degrees of freedom (6 DoF) transformation between the LiDAR and the camera. Unlike previous deep learning CNN-based methods, which process raw 3D point clouds and 2D images frame by frame, CalibBD utilizes a Bi-LSTM on sequence data to extract temporal features between consecutive frames. It not only predicts the calibration parameters by minimizing both transformation and depth losses but also refines them with a temporal loss. The proposed model achieves stable performance across a range of mis-calibration deviations and higher accuracy than the state-of-the-art CNN-based method on the KITTI dataset.
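The abstract describes predicting a 6 DoF LiDAR-to-camera transformation. As an illustration only (this is not the paper's code), the sketch below shows how a predicted rotation quaternion and translation vector can be assembled into a 4x4 homogeneous extrinsic matrix and applied to a LiDAR point; the function names `quat_to_rot` and `extrinsic_matrix` are hypothetical.

```python
import numpy as np

def quat_to_rot(q):
    # Normalize a quaternion (w, x, y, z) and convert it to a 3x3 rotation matrix.
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def extrinsic_matrix(q, t):
    # Assemble a 4x4 homogeneous LiDAR-to-camera transform from a
    # (hypothetical) predicted quaternion q and translation vector t.
    T = np.eye(4)
    T[:3, :3] = quat_to_rot(np.asarray(q, dtype=float))
    T[:3, 3] = t
    return T

# Example: identity rotation with a 10 cm lateral offset.
T = extrinsic_matrix([1.0, 0.0, 0.0, 0.0], [0.1, 0.0, 0.0])
p_lidar = np.array([2.0, 0.0, 0.0, 1.0])  # homogeneous LiDAR point
p_cam = T @ p_lidar                        # same point in the camera frame
```

A calibration network such as the one described would regress `q` and `t`; the losses mentioned in the abstract (transformation, depth, temporal) would then be computed by comparing points transformed with the predicted matrix against the ground-truth extrinsics.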