ATI-CTLO: Adaptive Temporal Interval-based Continuous-Time LiDAR-Only Odometry (2407.20619v3)

Published 30 Jul 2024 in cs.RO

Abstract: The motion distortion in LiDAR scans caused by aggressive robot motion and varying terrain features significantly impacts the positioning and mapping performance of 3D LiDAR odometry. Existing distortion correction solutions often struggle to balance computational complexity and accuracy. In this work, we propose an Adaptive Temporal Interval-based Continuous-Time LiDAR-only Odometry (ATI-CTLO) method that relies on straightforward and efficient linear interpolation. Our method flexibly adjusts the temporal intervals between control nodes according to the dynamics of motion and environmental characteristics. This adaptability enhances performance across various motion states and improves robustness in challenging, feature-sparse environments. We validate the effectiveness of our method on multiple datasets across different platforms, achieving accuracy comparable to state-of-the-art LiDAR-only odometry methods. Notably, in scenarios involving aggressive motion and sparse features, our method outperforms existing solutions.
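
The abstract describes two ingredients: per-point pose interpolation between control nodes (used to deskew points within a scan), and a rule that adapts the temporal spacing of those nodes to motion dynamics and feature availability. The paper's exact formulation is not reproduced on this page; the following is a minimal Python sketch of the general idea, assuming standard SE(3) linear interpolation (SLERP plus LERP). The helper names (interp_pose, deskew_point, adapt_interval) and the thresholds in the adaptation rule are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.spatial.transform import Rotation, Slerp


def interp_pose(t, t0, R0, p0, t1, R1, p1):
    # Linear interpolation between two control nodes: SLERP for rotation,
    # LERP for translation. This is the per-point pose used for deskewing.
    alpha = (t - t0) / (t1 - t0)
    rots = Rotation.from_quat(np.stack([R0.as_quat(), R1.as_quat()]))
    R_t = Slerp([t0, t1], rots)(t)
    p_t = (1.0 - alpha) * p0 + alpha * p1
    return R_t, p_t


def deskew_point(point, stamp, t0, R0, p0, t1, R1, p1):
    # Map a point measured at `stamp` (sensor frame) into the scan-end
    # control-node frame, removing the motion accumulated within the scan.
    R_t, p_t = interp_pose(stamp, t0, R0, p0, t1, R1, p1)
    world = R_t.apply(point) + p_t          # sensor frame at `stamp` -> world
    return R1.inv().apply(world - p1)       # world -> scan-end sensor frame


def adapt_interval(base_dt, angular_rate, feature_count,
                   rate_thresh=1.0, feat_thresh=200):
    # Hypothetical adaptation rule (thresholds are made up): shorter node
    # spacing under aggressive rotation, longer spacing when features are
    # sparse so each node is constrained by more correspondences.
    dt = base_dt
    if angular_rate > rate_thresh:   # rad/s
        dt *= 0.5
    if feature_count < feat_thresh:
        dt *= 2.0
    return dt


if __name__ == "__main__":
    # Two control nodes 0.1 s apart; deskew one point sampled mid-scan.
    R0, p0 = Rotation.identity(), np.zeros(3)
    R1, p1 = Rotation.from_euler("z", 10, degrees=True), np.array([0.5, 0.0, 0.0])
    pt = deskew_point(np.array([5.0, 1.0, 0.2]), 0.05, 0.0, R0, p0, 0.1, R1, p1)
    print("deskewed point:", pt)
    print("adapted interval:", adapt_interval(0.1, angular_rate=2.0, feature_count=150))

In the full method the control-node poses themselves are estimated by the odometry back-end; this sketch only illustrates the interpolation step and the direction of the interval adaptation (denser nodes under aggressive motion, sparser nodes in feature-poor scenes).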
