Less is More: Physical-enhanced Radar-Inertial Odometry

Published 3 Feb 2024 in cs.RO (arXiv:2402.02200v1)

Abstract: Radar offers the advantage of providing additional physical properties of observed objects. In this study, we design a physical-enhanced radar-inertial odometry system that capitalizes on Doppler velocity and radar cross-section information. The static radar point filter, correspondence estimation, and residual functions are all strengthened by integrating these physical properties. We conduct experiments on both public datasets and our self-collected data, across different mobile platforms and sensor types. Our quantitative results demonstrate that, owing to the physical-enhanced components, the proposed radar-inertial odometry system outperforms alternative methods. Our findings also reveal that exploiting the physical properties leaves fewer radar points for odometry estimation, yet performance is maintained and even improved, in line with the "less is more" principle.
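One of the physical-enhanced components the abstract mentions is the static radar point filter based on Doppler velocities. The underlying idea is that, for a point that is static in the world, the measured radial (Doppler) velocity should match the projection of the sensor's ego velocity onto the line of sight. The sketch below illustrates this consistency check; it is an assumption-laden illustration of the general principle, not the paper's actual implementation, and the function name, array layout, and threshold value are all hypothetical.

```python
import numpy as np

def filter_static_points(points, doppler, ego_velocity, threshold=0.5):
    """Keep radar points whose measured Doppler velocity is consistent
    with ego motion, i.e. points that are likely static in the world.

    points       -- (N, 3) point positions in the radar frame
    doppler      -- (N,) measured radial (Doppler) velocities
    ego_velocity -- (3,) estimated sensor velocity in the radar frame
    threshold    -- residual tolerance in m/s (hypothetical value)

    Returns the retained points and the boolean inlier mask.
    """
    # Unit line-of-sight direction to each point.
    directions = points / np.linalg.norm(points, axis=1, keepdims=True)
    # For a static point, the measured radial velocity equals the
    # negative projection of the ego velocity onto the line of sight
    # (the point appears to move toward/away from a moving sensor).
    predicted = -directions @ ego_velocity
    residual = np.abs(doppler - predicted)
    mask = residual < threshold
    return points[mask], mask
```

Dynamic objects (e.g. other vehicles) violate the predicted radial velocity and are rejected by the threshold, which is one way the system can end up with fewer, but more reliable, radar points.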
