Tightly Coupled 3D Lidar Inertial Odometry and Mapping (1904.06993v1)

Published 15 Apr 2019 in cs.RO and cs.CV

Abstract: Ego-motion estimation is a fundamental requirement for most mobile robotic applications. By sensor fusion, we can compensate the deficiencies of stand-alone sensors and provide more reliable estimations. We introduce a tightly coupled lidar-IMU fusion method in this paper. By jointly minimizing the cost derived from lidar and IMU measurements, the lidar-IMU odometry (LIO) can perform well with acceptable drift after long-term experiment, even in challenging cases where the lidar measurements can be degraded. Besides, to obtain more reliable estimations of the lidar poses, a rotation-constrained refinement algorithm (LIO-mapping) is proposed to further align the lidar poses with the global map. The experiment results demonstrate that the proposed method can estimate the poses of the sensor pair at the IMU update rate with high precision, even under fast motion conditions or with insufficient features.

Authors (3)
  1. Haoyang Ye (27 papers)
  2. Yuying Chen (18 papers)
  3. Ming Liu (421 papers)
Citations (392)

Summary

  • The paper introduces a tightly coupled lidar-IMU fusion system that minimizes drift and enhances pose estimation accuracy.
  • It employs joint optimization and a rotation-constrained refinement to reliably combine lidar and IMU data, even in degraded conditions.
  • Comprehensive indoor and outdoor tests validate its superior real-time performance over traditional lidar-only and loosely coupled methods.

Overview of Tightly Coupled 3D Lidar Inertial Odometry and Mapping

The paper "Tightly Coupled 3D Lidar Inertial Odometry and Mapping" addresses a fundamental challenge in mobile robotics: ego-motion estimation. By leveraging sensor fusion, the authors introduce a method that tightly integrates lidar and inertial measurement unit (IMU) data to enhance estimation accuracy and robustness, particularly under challenging conditions. This method is named lidar-IMU odometry (LIO) and is supplemented by a rotation-constrained refinement algorithm, referred to as LIO-mapping. The paper demonstrates that the proposed approach mitigates the limitations inherent to standalone sensors, offering reliable performance in environments where lidar measurements may be degraded and supporting high-accuracy pose estimation even under rapid motion.

Methodology

The fusion method presented in this paper employs a joint optimization strategy: it minimizes a single cost combining residuals derived from both lidar and IMU measurements. This joint formulation enables the LIO system to deliver accurate state estimates with minimal drift over long-term operation, even when lidar data quality is compromised. Moreover, the authors propose a rotation-constrained refinement procedure to fine-tune lidar poses within the global map, ensuring that pose estimates remain consistently aligned with gravity. This is particularly useful for enhancing the robustness and consistency of the generated point-cloud maps.
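To make the tight-coupling idea concrete, here is a deliberately minimal sketch of a joint cost: lidar correspondence residuals and an IMU-derived pose prediction are stacked into one weighted least-squares problem, so the estimate balances both sensors rather than filtering one against the other. This toy reduces the state to a 3D translation and uses made-up data and weights; it is not the paper's actual formulation, which optimizes full poses with IMU preintegration factors.

```python
import numpy as np

# Toy data (illustrative only): a scan of three points observed in the body
# frame, their known map positions, and a translation predicted by the IMU.
map_points = np.array([[1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0],
                       [0.0, 0.0, 1.0]])
true_t = np.array([0.5, -0.2, 0.1])
scan_points = map_points - true_t                 # ideal lidar observation
imu_pred_t = true_t + np.array([0.05, 0.0, -0.02])  # slightly drifted IMU guess

W_LIDAR, W_IMU = 1.0, 2.0   # assumed information weights for each sensor

# Both residual types are linear in t here, so the joint minimization of
#   W_LIDAR * ||t - (map - scan)||^2  (per point)  +  W_IMU * ||t - imu_pred||^2
# can be solved as one stacked least-squares system A t = b.
n = len(map_points)
A = np.vstack([np.tile(np.eye(3) * np.sqrt(W_LIDAR), (n, 1)),
               np.eye(3) * np.sqrt(W_IMU)])
b = np.concatenate([((map_points - scan_points) * np.sqrt(W_LIDAR)).ravel(),
                    imu_pred_t * np.sqrt(W_IMU)])
t_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The solved `t_hat` lands between the lidar-implied and IMU-predicted translations, weighted by the assumed information matrices; in the real system the same trade-off happens over full SE(3) poses inside a sliding-window optimizer.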

Contributions and Results

The paper makes several key contributions, notably:

  1. Development of a highly accurate, tightly coupled lidar-IMU odometry system capable of real-time estimation at high update rates.
  2. Introduction of a rotation-constrained refinement process that optimizes the final poses and point-cloud maps, enhancing reliability in lidar-compromised scenarios.
  3. Comprehensive validation through extensive indoor and outdoor testing, showcasing superior performance relative to existing state-of-the-art approaches, both lidar-only and loosely coupled lidar-IMU algorithms.
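The gravity-alignment idea behind the rotation-constrained refinement (contribution 2) can be sketched geometrically: find the smallest rotation that maps the currently estimated gravity direction onto the reference gravity, and apply it as a correction to the pose's rotation. The function name, inputs, and use of the Rodrigues formula here are our illustrative choices, not the paper's actual refinement, which solves this as a constraint inside the mapping optimization.

```python
import numpy as np

def gravity_align(R_est, g_world_est, g_ref=np.array([0.0, 0.0, -9.81])):
    """Apply the smallest rotation correction that maps the estimated
    gravity direction onto the reference gravity direction.

    R_est       : (3,3) estimated rotation to refine
    g_world_est : gravity vector implied by the current estimate
    """
    a = g_world_est / np.linalg.norm(g_world_est)
    b = g_ref / np.linalg.norm(g_ref)
    v = np.cross(a, b)             # rotation axis (scaled by sin of angle)
    c = np.dot(a, b)               # cosine of the correction angle
    # Rodrigues-style formula for the rotation taking unit vector a to b
    # (assumes a and b are not exactly opposite).
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    R_corr = np.eye(3) + K + K @ K / (1.0 + c)
    return R_corr @ R_est

# Usage: a pose whose estimated gravity is tilted gets its roll/pitch corrected.
g_tilted = np.array([0.5, 0.0, -9.8])
R_refined = gravity_align(np.eye(3), g_tilted)
```

Because only the roll/pitch components are observable from gravity, a correction of this kind pins down exactly the rotational degrees of freedom that pure lidar matching can leave weakly constrained.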

The authors provide source code for researchers and developers, marking the first open-source implementation of tightly coupled lidar and IMU fusion, which stands to benefit the broader research community significantly.

Implications and Future Directions

The implications of this research are substantial in both practical and theoretical terms. Practically, the capability to estimate poses accurately with reduced drift, without heavy reliance on environment-specific features, positions this work to advance autonomous navigation systems. Theoretically, the integration framework may inspire further research into tighter coupling of other sensor modalities for diverse robotic applications. Future work could explore improving computational efficiency, expanding the range of applicable environments, and tailoring the system to specific applications such as autonomous vehicles or drones.

In conclusion, "Tightly Coupled 3D Lidar Inertial Odometry and Mapping" provides significant insights into lidar-IMU fusion methodologies for robotics. By addressing the inconsistencies of standalone sensors and offering a robust framework for pose estimation, this work opens up new avenues for further exploration in sensor fusion and integrative mapping techniques.