
SensorX2car: Sensors-to-car calibration for autonomous driving in road scenarios (2301.07279v2)

Published 18 Jan 2023 in cs.RO and cs.CV

Abstract: Properly-calibrated sensors are the prerequisite for a dependable autonomous driving system. However, most prior methods focus on extrinsic calibration between sensors, and few focus on the misalignment between the sensors and the vehicle coordinate system. Existing targetless approaches rely on specific prior knowledge, such as driving routes and road features, to handle this misalignment. This work removes these limitations and proposes more general calibration methods for four commonly used sensors: Camera, LiDAR, GNSS/INS, and millimeter-wave Radar. By utilizing sensor-specific patterns: image feature, 3D LiDAR points, GNSS/INS solved pose, and radar speed, we design four corresponding methods to mainly calibrate the rotation from sensor to car during normal driving within minutes, composing a toolbox named SensorX2car. Real-world and simulated experiments demonstrate the practicality of our proposed methods. Meanwhile, the related codes have been open-sourced to benefit the community. To the best of our knowledge, SensorX2car is the first open-source sensor-to-car calibration toolbox. The code is available at https://github.com/OpenCalib/SensorX2car.

Citations (8)

Summary

  • The paper introduces SensorX2car, an open-source toolbox that calibrates cameras, LiDAR, GNSS/INS, and radar to the vehicle coordinate system during normal driving.
  • The paper employs deep learning for camera calibration and SLAM for LiDAR, ensuring robust sensor alignment even in challenging road scenarios.
  • The paper validates the toolbox through simulations and real-world tests, demonstrating precise calibration with low variance in yaw and pitch measurements.

Evaluation of SensorX2car: A Calibration Toolbox for Autonomous Driving Sensors

The paper under review, titled "SensorX2car: Sensors-to-car calibration for autonomous driving in road scenarios," introduces SensorX2car, a comprehensive, open-source toolbox for calibrating multiple sensors used in autonomous vehicles. The work identifies and addresses the challenge of on-road, targetless calibration of sensors relative to the vehicle's coordinate system, achievable within minutes of normal driving. The authors focus on four key sensors: Camera, LiDAR, GNSS/INS, and millimeter-wave Radar, each requiring a distinct calibration methodology to ensure reliable operation under diverse driving conditions.

Technical Contributions

  1. Camera-to-Car Calibration: The paper puts forward a novel calibration method for cameras using a deep learning-based detection network to estimate the vanishing point and horizon line from images. These features allow for the derivation of the camera's roll, pitch, and yaw. The authors effectively address common obstacles such as absent lane markings by utilizing horizon lines, thereby enhancing calibration reliability across varied driving environments.
  2. LiDAR-to-Car Calibration: LiDAR systems are calibrated through a combination of SLAM algorithms for trajectory estimation and ground plane extraction for pitch and roll calibration. The proposed approach offers robust solutions irrespective of the driving trajectory, effectively minimizing dependencies on pre-defined road features.
  3. GNSS/INS-to-Car Calibration: The calibration focuses primarily on the yaw angle, a critical factor for accurate vehicle orientation. Using the pose solved by the GNSS/INS, the method estimates and removes the yaw offset between the sensor's reported heading and the vehicle's actual direction of travel, minimizing rotational discrepancies in the vehicle's orientation estimate.
  4. Radar-to-Car Calibration: Calibration of millimeter-wave Radar is facilitated through a two-step process that relates measured object speeds to the vehicle's own speed, refined with the spatial localization of both the vehicle and observed static objects. This method confronts the substantial noise inherent in radar measurements.
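To make the geometry behind the camera method concrete, the sketch below shows how yaw and pitch relative to the driving direction can be read off the ray through the forward vanishing point under a simple pinhole model. This is illustrative only, not the paper's learning-based pipeline: the function name and the back-projection conventions (x right, y down, z forward) are assumptions.

```python
import numpy as np

def yaw_pitch_from_vanishing_point(vp, K):
    """Estimate camera yaw and pitch relative to the driving direction
    from the vanishing point of the road's forward direction.

    vp : (u, v) pixel coordinates of the forward vanishing point
    K  : 3x3 camera intrinsic matrix
    """
    u, v = vp
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    # Back-project the vanishing point to a ray in camera coordinates
    # (x right, y down, z forward); the road direction maps to this ray.
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray /= np.linalg.norm(ray)
    yaw = np.arctan2(ray[0], ray[2])                        # left/right offset
    pitch = np.arctan2(-ray[1], np.hypot(ray[0], ray[2]))   # up/down offset
    return yaw, pitch

# A camera aligned with the driving direction puts the vanishing point
# at the principal point, so both angles should come out as zero.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
yaw, pitch = yaw_pitch_from_vanishing_point((640.0, 360.0), K)
```

A vanishing point displaced right of the principal point yields a positive yaw under this convention, which is the intuition the detection network's output feeds into.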

Experimental Results and Key Findings

The authors validate the practicality and effectiveness of SensorX2car through a blend of simulated and real-world experiments. Calibration stability and accuracy are the primary evaluation metrics. The results show high precision across all sensor types, with notably low variance in the calibration outcomes, particularly for yaw and pitch, underscoring the reliability of these methods in real-world autonomous driving contexts.

Implications and Future Directions

Practically, SensorX2car opens opportunities for improved on-road sensor calibration that could simplify deployment and reduce the operational costs of autonomous vehicles. The toolbox's open-source nature invites further development and adaptation by the research community, fostering advancements in autonomous driving technology.

Theoretically, the work establishes a foundation for exploring comprehensive six-degrees-of-freedom (6-DoF) calibration, paving the way for advancements in full positional and orientational calibration. Future explorations could benefit from integrating these methodologies with adaptive algorithms capable of self-calibration amidst dynamically changing driving environments.
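Since the toolbox mainly recovers rotation, moving to full 6-DoF means adding translation. As a notational sketch (not code from SensorX2car; the intrinsic z-y-x Euler convention chosen here is an assumption), the per-sensor yaw, pitch, and roll estimates compose into a sensor-to-car rotation, which together with a translation forms the 4x4 extrinsic:

```python
import numpy as np

def rotation_from_ypr(yaw, pitch, roll):
    """Compose yaw (about z), pitch (about y), roll (about x) into a
    rotation matrix as R = Rz @ Ry @ Rx; other conventions exist."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def extrinsic(R, t):
    """Stack rotation R and translation t into a 4x4 homogeneous
    sensor-to-car transform; translation is what 6-DoF work would add."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

R = rotation_from_ypr(0.1, 0.02, -0.01)
T = extrinsic(R, [1.5, 0.0, 1.2])
```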

In conclusion, SensorX2car represents a significant advancement in autonomous vehicle technology, providing a robust, flexible, and accessible framework for sensor calibration. This paper's contributions lay the groundwork for ongoing improvements and developments in the accuracy and reliability of autonomous driving systems.
