
RELEAD: Resilient Localization with Enhanced LiDAR Odometry in Adverse Environments (2402.18934v2)

Published 29 Feb 2024 in cs.RO

Abstract: LiDAR-based localization is valuable for applications like mining surveys and underground facility maintenance. However, existing methods can struggle when dealing with uninformative geometric structures in challenging scenarios. This paper presents RELEAD, a LiDAR-centric solution designed to address scan-matching degradation. Our method enables degeneracy-free point cloud registration by solving constrained ESIKF updates in the front end and incorporates multisensor constraints, even when dealing with outlier measurements, through graph optimization based on Graduated Non-Convexity (GNC). Additionally, we propose a robust Incremental Fixed Lag Smoother (rIFL) for efficient GNC-based optimization. RELEAD has undergone extensive evaluation in degenerate scenarios and has outperformed existing state-of-the-art LiDAR-Inertial odometry and LiDAR-Visual-Inertial odometry methods.
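The Graduated Non-Convexity (GNC) outlier rejection the abstract refers to can be illustrated on a toy problem. The sketch below is a minimal, hypothetical Python example, not the paper's implementation: it estimates a scalar from measurements contaminated by outliers using GNC with a Geman-McClure robust loss, in the style of Yang et al. The surrogate cost starts near-convex (large control parameter `mu`) and is gradually tightened toward the true robust loss, so outlier weights decay toward zero while inliers keep weight close to one. RELEAD applies the same idea inside a factor graph over multisensor constraints rather than to a scalar.

```python
import numpy as np

def gnc_robust_mean(z, c=1.0, mu_factor=1.4, iters=50):
    """Robustly estimate a scalar from measurements z using GNC with a
    Geman-McClure loss (illustrative sketch, not the RELEAD pipeline).

    z         : 1-D array of measurements, possibly with outliers
    c         : inlier threshold scale of the robust loss
    mu_factor : factor by which the convexity parameter mu is tightened
    Returns (estimate, final per-measurement weights).
    """
    x = np.median(z)                      # rough, outlier-tolerant init
    r = z - x
    mu = max(2.0 * np.max(r**2) / c**2, 1.0)  # start near-convex
    for _ in range(iters):
        # GNC-GM weights: inliers stay near 1, large residuals -> 0
        w = (mu * c**2 / (r**2 + mu * c**2))**2
        x = np.sum(w * z) / np.sum(w)     # weighted least-squares update
        r = z - x
        if mu <= 1.0:                     # surrogate has reached true loss
            break
        mu /= mu_factor                   # tighten toward the robust loss
    return x, w
```

With ten inlier measurements near 5.0 and two gross outliers at 100 and 120, the estimate settles near 5.0 and the outlier weights collapse toward zero, which is the behavior that lets RELEAD keep multisensor constraints in the graph without letting corrupted measurements dominate.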

