LIC-Fusion: LiDAR-Inertial-Camera Odometry (1909.04102v2)

Published 9 Sep 2019 in cs.RO

Abstract: This paper presents a tightly-coupled multi-sensor fusion algorithm termed LiDAR-inertial-camera fusion (LIC-Fusion), which efficiently fuses IMU measurements, sparse visual features, and extracted LiDAR points. In particular, the proposed LIC-Fusion performs online spatial and temporal sensor calibration between all three asynchronous sensors, in order to compensate for possible calibration variations. The key contribution is the optimal (up to linearization errors) multi-modal sensor fusion of detected and tracked sparse edge/surf feature points from LiDAR scans within an efficient MSCKF-based framework, alongside sparse visual feature observations and IMU readings. We perform extensive experiments in both indoor and outdoor environments, showing that the proposed LIC-Fusion outperforms the state-of-the-art visual-inertial odometry (VIO) and LiDAR odometry methods in terms of estimation accuracy and robustness to aggressive motions.

Authors (5)
  1. Xingxing Zuo (36 papers)
  2. Patrick Geneva (10 papers)
  3. Woosik Lee (3 papers)
  4. Yong Liu (721 papers)
  5. Guoquan Huang (32 papers)
Citations (140)

Summary

  • The paper introduces a tightly-coupled sensor fusion framework that integrates LiDAR, IMU, and camera data within an MSCKF, enhancing 3D odometry.
  • The system achieves robust feature extraction by selectively tracking edge and surf features, reducing computational load while preserving vital environmental details.
  • The algorithm demonstrates superior accuracy and robustness in diverse experiments, significantly reducing trajectory estimation errors compared to standalone methods.

Expert Overview of LIC-Fusion: LiDAR-Inertial-Camera Odometry

The paper introduces "LIC-Fusion," a tightly-coupled multi-sensor fusion algorithm that integrates data from LiDAR, inertial measurement units (IMUs), and cameras for improved odometry. It addresses the challenge of 3D motion tracking for autonomous systems, fusing the three asynchronous sensors in a way that is optimal up to linearization errors while calibrating their spatial and temporal relationships online.
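Concretely, a filter of this kind keeps the calibration unknowns inside the estimator state, alongside the IMU state and a sliding window of cloned poses, and applies the same EKF update to both visual and LiDAR residuals. The Python sketch below illustrates that idea; the state layout and all names are assumptions made for illustration, not LIC-Fusion's actual implementation.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class CalibState:
    """Illustrative MSCKF-style state with online spatio-temporal calibration."""
    # Core IMU state
    q_WI: np.ndarray = field(default_factory=lambda: np.array([0., 0., 0., 1.]))  # orientation (quaternion)
    p_WI: np.ndarray = field(default_factory=lambda: np.zeros(3))  # position in world frame
    v_WI: np.ndarray = field(default_factory=lambda: np.zeros(3))  # velocity
    b_g:  np.ndarray = field(default_factory=lambda: np.zeros(3))  # gyroscope bias
    b_a:  np.ndarray = field(default_factory=lambda: np.zeros(3))  # accelerometer bias
    # Online spatial calibration: 6-DoF extrinsics of camera and LiDAR w.r.t. the IMU
    T_IC: np.ndarray = field(default_factory=lambda: np.eye(4))    # camera-to-IMU transform
    T_IL: np.ndarray = field(default_factory=lambda: np.eye(4))    # LiDAR-to-IMU transform
    # Online temporal calibration: per-sensor time offsets relative to the IMU clock
    td_cam: float = 0.0
    td_lidar: float = 0.0
    # MSCKF sliding window of cloned IMU poses (stochastic clones)
    clones: list = field(default_factory=list)

def ekf_update(P, H, r, R):
    """Generic EKF update shared by visual and LiDAR measurement models.

    P: (n, n) error-state covariance
    H: (m, n) measurement Jacobian w.r.t. the error state
    r: (m,)   measurement residual
    R: (m, m) measurement noise covariance
    Returns the error-state correction and the updated covariance.
    """
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    dx = K @ r                               # correction to apply to the state
    I_KH = np.eye(P.shape[0]) - K @ H
    P_new = I_KH @ P @ I_KH.T + K @ R @ K.T  # Joseph form for numerical stability
    return dx, P_new
```

Because the extrinsics and time offsets sit in the state, every measurement that depends on them also refines them, which is what allows the filter to compensate for calibration drift online.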

Key Contributions and Methodology

  1. Sensor Fusion Framework: LIC-Fusion tightly couples sparse visual features from cameras, IMU readings, and environmental features extracted from LiDAR. This fusion occurs within an MSCKF (Multi-State Constraint Kalman Filter) framework, which keeps the estimator efficient while resolving the spatial and temporal disparities between the asynchronous sensors.
  2. Robust Feature Extraction: The system selectively extracts and tracks both edge and surf (planar) features from LiDAR scans, reducing the computational load while preserving the environmental structure needed for reliable odometry (see the feature-selection sketch after this list).
  3. Real-time Calibration: A crucial innovation lies in LIC-Fusion's capability to conduct real-time calibration for both spatial and temporal misalignments across sensors, ensuring that data from all sources are synchronized and accurately interpreted within a single cohesive system.
  4. Performance Validation: Extensive experiments in both indoor and outdoor environments show LIC-Fusion's superior estimation accuracy and robustness to aggressive motion compared to standalone visual-inertial and LiDAR odometry systems, with quantitative evaluations demonstrating a clear reduction in trajectory estimation error.
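The edge/surf selection in item 2 follows the LOAM-style local smoothness criterion, in which a point's curvature is judged from its neighbors along the scan line: high curvature marks edge candidates, low curvature marks planar (surf) candidates. Below is a minimal Python sketch of that criterion; the neighborhood size and thresholds are illustrative assumptions, not values from the paper.

```python
import numpy as np

def classify_scan_points(scan_xyz, k=5, edge_thresh=1.0, surf_thresh=0.1):
    """Label points along one LiDAR scan line as edge or surf candidates.

    scan_xyz: (N, 3) points ordered along the scan line.
    For each point, sum the offsets to its k neighbors on either side and
    normalize: a large score indicates a sharp (edge) region, a small score
    a flat (surf/planar) region. Thresholds here are illustrative only.
    """
    n = len(scan_xyz)
    labels = np.full(n, "none", dtype=object)
    for i in range(k, n - k):
        window = scan_xyz[i - k:i + k + 1]
        # Equivalent to summing (x_j - x_i) over the 2k neighbors
        diff = window.sum(axis=0) - (2 * k + 1) * scan_xyz[i]
        c = np.linalg.norm(diff) / (2 * k * np.linalg.norm(scan_xyz[i]) + 1e-9)
        if c > edge_thresh:
            labels[i] = "edge"
        elif c < surf_thresh:
            labels[i] = "surf"
    return labels
```

Selecting only these sparse, geometrically informative points is what keeps the LiDAR update cheap enough to run inside the filter at scan rate.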

Results and Comparative Analysis

The experiments reported are extensive and demonstrate LIC-Fusion's robustness across diverse conditions. The paper details tests over varying trajectories and motion dynamics, pointing out tangible improvements in accuracy and reliability over existing solutions. For instance, compared to an MSCKF-based VIO and to LOAM for LiDAR odometry, LIC-Fusion notably reduced average drift and maintained robustness even during aggressive maneuvers.
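Drift comparisons of this kind are conventionally reported as absolute trajectory error (ATE) after a best-fit rigid alignment of the estimated and ground-truth trajectories. As a reference for how that metric is computed, here is a minimal sketch using the closed-form Umeyama/Horn alignment; this reflects the standard evaluation recipe, not the paper's own evaluation code.

```python
import numpy as np

def ate_rmse(est_xyz, gt_xyz):
    """Absolute trajectory error (RMSE) after rigid alignment.

    est_xyz, gt_xyz: (N, 3) arrays of associated positions (same timestamps).
    Aligns the estimate to ground truth with the closed-form SVD solution,
    then reports the root-mean-square translational error.
    """
    mu_e, mu_g = est_xyz.mean(axis=0), gt_xyz.mean(axis=0)
    E, G = est_xyz - mu_e, gt_xyz - mu_g
    U, _, Vt = np.linalg.svd(E.T @ G)        # cross-covariance of centered sets
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:            # guard against a reflection
        S[2, 2] = -1.0
    R = Vt.T @ S @ U.T                       # rotation taking estimate -> ground truth
    t = mu_g - R @ mu_e
    aligned = est_xyz @ R.T + t
    err = np.linalg.norm(aligned - gt_xyz, axis=1)
    return float(np.sqrt((err ** 2).mean()))
```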

Implications and Future Directions

The integration of multiple sensor modalities as seen in LIC-Fusion suggests a shift towards more comprehensive and robust navigational frameworks in AI-powered systems. Practically, this enhances applicability in autonomous vehicles and mobile robotics by improving navigation reliability in complex environments and under challenging illumination conditions. Theoretically, it supports advancements in SLAM technologies and opens avenues for further research in real-time sensor fusion and optimization algorithms.

Future developments may focus on refining computational efficiency, possibly integrating machine learning techniques to optimize the feature extraction and data fusion processes dynamically. Also, incorporating loop closure mechanisms in real-time could further mitigate cumulative errors, enhancing long-term localization fidelity.

In conclusion, LIC-Fusion exemplifies a sophisticated advancement in the multi-sensor fusion domain, significantly contributing to the accuracy and robustness of autonomous system navigation. Its blend of real-time calibration and integrated sensor data handling offers a template for future research and technological deployments in autonomous mobility solutions.
