Tightly-Coupled LiDAR-Visual-Inertial SLAM and Large-Scale Volumetric Occupancy Mapping (2403.02280v1)

Published 4 Mar 2024 in cs.RO

Abstract: Autonomous navigation is one of the key requirements for every potential application of mobile robots in the real world. Besides high-accuracy state estimation, a suitable and globally consistent representation of the 3D environment is indispensable. We present a fully tightly-coupled LiDAR-Visual-Inertial SLAM system and 3D mapping framework applying local submapping strategies to achieve scalability to large-scale environments. A novel, correspondence-free and inherently probabilistic formulation of LiDAR residuals is introduced, expressed only in terms of the occupancy fields and their respective gradients. These residuals can be added to a factor graph optimisation problem, either as frame-to-map factors for the live estimates or as map-to-map factors aligning the submaps with respect to one another. Experimental validation demonstrates that the approach achieves state-of-the-art pose accuracy and furthermore produces globally consistent volumetric occupancy submaps which can be directly used in downstream tasks such as navigation or exploration.

Authors (3)
  1. Simon Boche (7 papers)
  2. Sebastián Barbas Laina (12 papers)
  3. Stefan Leutenegger (66 papers)
Citations (4)

Summary

Tightly-Coupled LiDAR-Visual-Inertial SLAM and Large-Scale Volumetric Occupancy Mapping

Introduction

In autonomous navigation, precise localisation is essential, but so is an accurate representation of the 3D environment. Traditional SLAM (Simultaneous Localisation And Mapping) systems that fuse different sensory inputs (such as stereo vision, Inertial Measurement Units (IMU), and Light Detection and Ranging (LiDAR) sensors) have shown promise in achieving accurate localisation. However, most current systems represent the 3D world in formats not immediately suitable for navigation and exploration tasks, which require explicit knowledge of free space. This paper presents a novel approach that integrates LiDAR, visual, and inertial data in a tightly-coupled SLAM system. The system produces globally consistent volumetric occupancy maps, improving both localisation accuracy and the practical utility of the generated maps for robotic navigation.
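
To make the free-space requirement concrete, the sketch below shows how a volumetric occupancy map distinguishes observed free space from occupied surfaces via per-voxel log-odds fusion, in the spirit of OctoMap-style probabilistic mapping. The constants and function names are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

# Per-voxel log-odds occupancy fusion, in the spirit of OctoMap-style
# probabilistic mapping. All constants are illustrative, not tuned
# values from the paper.

L_HIT, L_MISS = 0.85, -0.4   # log-odds increments for a hit / a pass-through
L_MIN, L_MAX = -2.0, 3.5     # clamping bounds for numerical stability

def update_voxel(log_odds: float, hit: bool) -> float:
    """Fuse one LiDAR observation into a voxel's occupancy log-odds."""
    log_odds += L_HIT if hit else L_MISS
    return float(np.clip(log_odds, L_MIN, L_MAX))

def occupancy_probability(log_odds: float) -> float:
    """Recover P(occupied) from the stored log-odds."""
    return 1.0 / (1.0 + np.exp(-log_odds))

# A voxel repeatedly traversed by rays converges towards "free" (P < 0.5),
# while a voxel containing ray endpoints converges towards "occupied".
```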

System Overview

The core innovation lies in the fusion of LiDAR, visual, and inertial measurements in a tightly-coupled SLAM system that also incorporates a volumetric mapping approach. The system leverages LiDAR data not only to improve localisation accuracy but also to update occupancy maps of the environment in real time. A significant contribution is the introduction of novel LiDAR residuals based on occupancy fields and their gradients, allowing LiDAR data to be added to the factor graph optimisation without expensive data-association steps.
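
A minimal sketch of such a correspondence-free residual is shown below, assuming the submap exposes an interpolated occupancy field and its spatial gradient: the residual is simply the occupancy value (in log-odds) at the transformed LiDAR endpoint, which should vanish at the surface crossing, and the field gradient serves as the Jacobian with respect to the point. The callables `occupancy` and `occupancy_gradient` are hypothetical stand-ins for the map interface, not the paper's actual API.

```python
import numpy as np

def lidar_residual(T_WS, p_S, occupancy, occupancy_gradient):
    """Correspondence-free occupancy residual for one LiDAR endpoint.

    T_WS : (4, 4) pose of the sensor frame S in the world/submap frame W.
    p_S  : (3,) LiDAR endpoint expressed in the sensor frame.
    occupancy, occupancy_gradient : hypothetical interpolators exposed by
        the submap, returning the log-odds field value and its spatial
        gradient at a world-frame point.
    """
    p_W = T_WS[:3, :3] @ p_S + T_WS[:3, 3]  # endpoint in the map frame
    r = occupancy(p_W)                      # ~0 at the surface crossing
    J_point = occupancy_gradient(p_W)       # d r / d p_W, shape (3,)
    return r, J_point

# Toy check with a planar field occ(p) = n . p - d (zero on the plane z = 1):
n, d = np.array([0.0, 0.0, 1.0]), 1.0
r, J = lidar_residual(np.eye(4), np.array([0.0, 0.0, 1.2]),
                      lambda p: float(n @ p - d), lambda p: n)
# r == 0.2: the endpoint overshoots the assumed surface by 0.2 m.
```

Because the residual is evaluated directly against the field, no nearest-neighbour search or explicit point correspondence is needed, which is what makes the factors cheap enough to add per-point to the optimisation.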

Mapping Approach

The mapping module employs a submapping strategy to remain scalable in large-scale environments, dividing the map into local submaps that are individually consistent. These submaps are then globally aligned and integrated into the SLAM system through novel frame-to-map and map-to-map optimisation factors. This strategy not only helps maintain the global consistency of the map but also improves the robustness and accuracy of the SLAM system by leveraging the volumetric information in the optimisation process.
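
The bookkeeping behind such a submapping strategy can be sketched as follows: a new submap is spawned once the sensor has moved sufficiently far from the current submap's origin, and a map-to-map relative-pose factor links consecutive submaps for later joint refinement. The distance criterion, threshold, and data layout here are illustrative assumptions, not the paper's actual submap-creation policy.

```python
import numpy as np
from dataclasses import dataclass, field

SUBMAP_RADIUS = 10.0  # metres travelled before a new submap is spawned (assumed)

@dataclass
class Submap:
    T_W_M: np.ndarray                    # submap origin pose in the world frame
    factors: list = field(default_factory=list)

def maybe_spawn_submap(submaps: list, T_W_S: np.ndarray) -> Submap:
    """Return the active submap, spawning a new one if the sensor left it."""
    current = submaps[-1]
    if np.linalg.norm(T_W_S[:3, 3] - current.T_W_M[:3, 3]) < SUBMAP_RADIUS:
        return current
    new = Submap(T_W_M=T_W_S.copy())
    # Map-to-map factor: relative pose between consecutive submap origins,
    # to be refined later by aligning the overlapping occupancy fields.
    T_rel = np.linalg.inv(current.T_W_M) @ new.T_W_M
    new.factors.append(("map_to_map", len(submaps) - 1, T_rel))
    submaps.append(new)
    return new

# Usage: start with one submap at the origin and feed in live pose estimates.
submaps = [Submap(T_W_M=np.eye(4))]
```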

Experimental Results

The system was comprehensively evaluated on the HILTI 2022 SLAM Challenge, showing competitive localisation accuracy against state-of-the-art methods. Additionally, qualitative evaluation of the occupancy maps demonstrates their consistency and utility for navigation tasks. The system runs efficiently in real time, with further performance gains achievable through parameter adjustments tailored to the processing capabilities of the deployment platform.

Conclusion and Future Work

This work introduces a state-of-the-art approach to tightly-coupled LiDAR-Visual-Inertial SLAM, capable of producing accurate, globally consistent volumetric maps. Future developments will focus on refining the uncertainty model for LiDAR measurements, enhancing robustness in difficult scenarios where visual tracking may fail, and extending the framework to support autonomous exploration and navigation through dynamically generated submaps. This research represents a significant step towards fully autonomous robotic systems capable of navigating and understanding complex 3D environments in real time.

Implications

The presented system has broad implications for the development of autonomous robotic navigation. By providing highly accurate localisation and a detailed, navigable map of the environment, robots can operate more effectively in complex, unstructured settings. This capability is crucial for a wide range of applications, including search and rescue operations in disaster-stricken areas, autonomous exploration in unknown territories, and sophisticated navigation tasks in industrial automation.
