An Extrinsic Calibration Method between LiDAR and GNSS/INS for Autonomous Driving (2209.07694v2)

Published 16 Sep 2022 in cs.RO

Abstract: Accurate and reliable sensor calibration is critical for fusing LiDAR and inertial measurements in autonomous driving. This paper proposes a novel three-stage extrinsic calibration method between LiDAR and GNSS/INS for autonomous driving. The first stage quickly calibrates the extrinsic parameters between the sensors through point cloud surface features, narrowing the extrinsic from a large initial error to a small error range in little time. The second stage further calibrates the extrinsic parameters based on LiDAR-mapping space occupancy while removing motion distortion. In the final stage, the z-axis errors caused by the planar motion of the autonomous vehicle are corrected, and an accurate extrinsic parameter is finally obtained. Specifically, this method utilizes the planar features in the environment, making it possible to carry out calibration quickly. Experimental results on real-world data sets demonstrate the reliability and accuracy of our method. The code is open-sourced at https://github.com/OpenCalib/LiDAR2INS.


Summary

  • The paper introduces a novel three-stage extrinsic calibration method that effectively narrows large initial offsets using adaptive voxelization and sliding window optimization.
  • It refines alignment through an octree-based spatial occupancy analysis, significantly reducing processing time to under 30 seconds while improving 3D point cloud accuracy.
  • It corrects z-axis discrepancies using fiducial reference points, achieving high calibration precision with minimized translation and rotation errors.

Overview of "An Extrinsic Calibration Method between LiDAR and GNSS/INS for Autonomous Driving"

The paper by Yan et al. presents a novel extrinsic calibration methodology between LiDAR and GNSS/INS, a capability crucial for advancing autonomous driving systems, where accurate sensor fusion underpins effective navigation and mapping. Built around a three-stage procedure, the method aims to improve the precision of the transformation between the LiDAR and GNSS/INS frames while addressing some of the limitations found in prior techniques.

Methodology and Process

The proposed calibration method is divided into three stages:

  1. Rough Calibration: This initial stage is geared towards quickly narrowing down errors from potentially large initial offsets in the extrinsic parameters. Using adaptive voxelization for point cloud feature extraction, rough calibration involves a sliding window optimization which aligns multi-frame point cloud features. This process leverages plane feature points and is shown to be computationally efficient, reducing time and resource demands.
  2. Calibration Refinement: To further enhance the alignment accuracy, this stage employs an octree-based spatial occupancy analysis. The goal is to minimize the three-dimensional voxels occupied by the merged point clouds, suggesting better alignment and less redundancy in space. The refinement process benefits from the robustness of the initial estimation from rough calibration and significantly improves the quality of the mapping.
  3. Z-Axis Correction: Recognizing that typical calibration scenarios may not provide sufficient excitation along the z-axis, the paper proposes a fiducial-point-based correction method. By utilizing predefined reference points, this approach rectifies potential height discrepancies, ensuring that the calibration remains valid across the full 3D space.
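
The occupancy criterion behind the refinement stage can be sketched concretely: scans mapped into a common frame through a better extrinsic overlap more tightly and therefore fill fewer voxels. The snippet below is a minimal illustration of that objective on synthetic data, not the paper's implementation; the voxel size, the poses, and the helper names are assumptions.

```python
import numpy as np

def occupied_voxel_count(points, voxel_size=0.2):
    """Number of distinct voxels covered by an (N, 3) point cloud."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    return np.unique(keys, axis=0).shape[0]

def merge_with_extrinsic(scans, poses, R_ext, t_ext):
    """Map every LiDAR scan into the map frame through the INS pose
    composed with a candidate LiDAR-to-INS extrinsic (R_ext, t_ext)."""
    merged = []
    for pts, (R_pose, t_pose) in zip(scans, poses):
        pts_ins = pts @ R_ext.T + t_ext             # LiDAR frame -> INS frame
        merged.append(pts_ins @ R_pose.T + t_pose)  # INS frame -> map frame
    return np.vstack(merged)

# Synthetic setup: one flat wall observed from two INS poses with different
# headings, so an extrinsic error smears the merged cloud instead of merely
# shifting it.
rng = np.random.default_rng(0)
wall = np.column_stack([rng.uniform(0, 10, 2000),
                        np.zeros(2000),
                        rng.uniform(0, 3, 2000)])
Rz90 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
poses = [(np.eye(3), np.zeros(3)), (Rz90, np.array([5.0, 5.0, 0.0]))]
t_true = np.array([0.5, 0.0, 0.0])  # ground-truth extrinsic translation
# Back out what each pose would have seen in its own LiDAR frame.
scans = [(wall - t_pose) @ R_pose - t_true for R_pose, t_pose in poses]

good = occupied_voxel_count(merge_with_extrinsic(scans, poses, np.eye(3), t_true))
bad = occupied_voxel_count(merge_with_extrinsic(scans, poses, np.eye(3),
                                                t_true + np.array([0.4, 0.4, 0.0])))
print(good < bad)  # the correct extrinsic fills fewer voxels
```

In the paper's setting this count, evaluated over an octree rather than a flat grid, is the cost that the refinement stage minimizes over candidate extrinsics.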

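The z-axis correction can likewise be illustrated in a few lines: because planar vehicle motion leaves the vertical component of the extrinsic weakly observable, surveyed fiducial points pin it down. The fiducial coordinates, the stage-two translation, and the simple mean-residual estimator below are all invented for illustration; the paper's actual procedure may differ.

```python
import numpy as np

def z_offset_correction(mapped_fiducials, surveyed_fiducials):
    """Residual z offset between fiducials as they appear in the LiDAR map
    and their surveyed ground-truth positions (both (N, 3) arrays)."""
    residuals = surveyed_fiducials[:, 2] - mapped_fiducials[:, 2]
    return float(np.mean(residuals))

# Invented example: three fiducials whose mapped heights all sit ~0.12 m low,
# with a few millimetres of per-point noise.
surveyed = np.array([[10.0, 2.0, 1.50], [25.0, -3.0, 1.20], [40.0, 1.0, 0.90]])
mapped = surveyed - np.array([0.0, 0.0, 0.12]) + 0.005 * np.array(
    [[0, 0, 1.0], [0, 0, -1.0], [0, 0, 0.5]])

dz = z_offset_correction(mapped, surveyed)
t_ext = np.array([1.10, 0.02, 0.85])  # hypothetical translation after stage two
t_ext_corrected = t_ext + np.array([0.0, 0.0, dz])
print(round(dz, 3))
```

The sign convention for folding dz back into the extrinsic depends on the frame definitions; here the map is assumed to sit dz metres below the surveyed truth, so dz is added to the z component.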
Numerical Results and Claims

The numerical results obtained from real-world scenarios demonstrate the efficacy of the proposed method. Processing time is reduced substantially, to under 30 seconds, and results across the various testing scenarios show high calibration accuracy and consistency. Mean absolute errors (MAE) in both translation and rotation are significantly lower than those of existing techniques such as Lidar-align.

Implications and Future Work

The implications of this research are notable for the field of autonomous driving. By providing a fast yet precise calibration process, the deployment of autonomous systems in diverse and large-scale environments becomes more feasible. This method's ability to streamline the integration of GNSS/INS and LiDAR data can greatly aid in enhancing vehicle localization and mapping tasks, critical components in the autonomous driving pipeline.

The paper also opens avenues for further research, particularly in environments with fewer planar features or more complex structures where current calibration techniques may falter. Future work could focus on enhancing the adaptability and robustness of the proposed method under different environmental constraints. Additionally, implementing machine learning algorithms to predict optimal calibration parameters might be another promising direction to explore, further reducing manual intervention and computational overhead.

In conclusion, the paper contributes a well-structured approach to tackling the challenges of sensor calibration in autonomous vehicles. By advancing this fundamental aspect, it supports improving the broader deployment and reliability of autonomous systems in real-world applications.
