
Degradation Resilient LiDAR-Radar-Inertial Odometry (2403.05332v1)

Published 8 Mar 2024 in cs.RO and eess.SP

Abstract: Enabling autonomous robots to operate robustly in challenging environments is necessary in a future with increased autonomy. For many autonomous systems, estimation and odometry remain a single point of failure, from which it can be difficult, if not impossible, to recover; robust odometry solutions are therefore of key importance. In this work, a method for tightly-coupled LiDAR-Radar-Inertial fusion for odometry is proposed, mitigating the effects of LiDAR degeneracy by leveraging a complementary perception modality while preserving the accuracy of LiDAR in well-conditioned environments. The proposed approach combines modalities in a factor graph-based windowed smoother with sensor-specific factor formulations which, in the case of degeneracy, allow partial information to be conveyed to the graph along the non-degenerate axes. The proposed method is evaluated in real-world tests on a flying robot experiencing degraded conditions, including geometric self-similarity and obscurant occlusion. For the benefit of the community, we release the presented datasets: https://github.com/ntnu-arl/lidar_degeneracy_datasets.
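The core mechanism described above, passing only partial LiDAR information to the factor graph along non-degenerate directions, can be sketched compactly. Below is a minimal NumPy illustration of degeneracy-aware projection in the spirit of solution remapping; it is not the authors' implementation, and the function name, Jacobian shape, and eigenvalue threshold `eig_threshold` are assumptions for illustration only.

```python
import numpy as np

def degeneracy_aware_projection(J, eig_threshold=100.0):
    """Build a projector onto the well-conditioned subspace of a
    least-squares registration problem (illustrative sketch).

    J: (m, 6) stacked Jacobian of, e.g., point-to-plane residuals
       w.r.t. a 6-DoF pose perturbation.
    eig_threshold: assumed tuning parameter; eigen-directions of
       J^T J with eigenvalues below it are treated as degenerate.
    """
    H = J.T @ J                           # Gauss-Newton Hessian approximation
    eigvals, eigvecs = np.linalg.eigh(H)
    observable = eigvals > eig_threshold  # mask of non-degenerate directions
    V = eigvecs[:, observable]            # basis of the observable subspace
    P = V @ V.T                           # projector that discards degenerate axes
    return P, observable

# Usage sketch: confine the LiDAR-derived update to the observable
# subspace; degenerate axes are left to be constrained by the radar
# and IMU factors in the graph.
rng = np.random.default_rng(0)
J = rng.standard_normal((500, 6))         # placeholder Jacobian
r = rng.standard_normal(500)              # placeholder residuals
P, observable = degeneracy_aware_projection(J)
delta = np.linalg.lstsq(J, -r, rcond=None)[0]
delta_projected = P @ delta               # update restricted to observable axes
```

In a windowed smoother like the one described, this kind of projection lets a LiDAR factor contribute information only along the directions the scan geometry actually constrains, which is what allows the radar and inertial measurements to take over along the degenerate ones.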
