Towards introspective loop closure in 4D radar SLAM (2404.03940v1)
Abstract: Imaging radar is an emerging sensor modality for Simultaneous Localization and Mapping (SLAM), especially suitable for vision-obstructed environments. This article investigates the use of 4D imaging radar for SLAM and analyzes the challenges of robust loop closure. Previous work indicates that 4D radar, together with inertial measurements, offers ample information for accurate odometry estimation. However, the narrow field of view, limited resolution, and sparse and noisy measurements render loop closure a significantly more challenging problem. Our work builds on the previous TBV SLAM framework, which was proposed for robust loop closure with 360$^\circ$ spinning radars. This article highlights and addresses challenges inherited from a directional 4D radar, such as sparsity, noise, and a reduced field of view, and discusses why the common definition of a loop closure is unsuitable. By combining multiple quality measures for accurate loop closure detection adapted to 4D radar data, significant improvements in trajectory estimation are achieved: the absolute trajectory error is as low as 0.46 m over a distance of 1.8 km, with consistent operation across multiple environments.
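The abstract mentions combining multiple quality measures to decide whether a loop-closure candidate should be accepted. A minimal sketch of that idea, assuming a logistic combination of illustrative measures (the measure names, weights, and threshold below are hypothetical, not the paper's actual model):

```python
import math

def loop_closure_probability(measures, weights, bias):
    """Combine several candidate quality measures into one confidence
    score via a logistic model: p = sigmoid(w . x + b)."""
    z = bias + sum(w * m for w, m in zip(weights, measures))
    return 1.0 / (1.0 + math.exp(-z))

# Three illustrative measures per candidate: descriptor similarity,
# alignment quality after scan registration, and odometry-based
# place consistency. Weights would be learned from labeled data.
weights = [2.0, 3.0, 1.5]
bias = -3.0

good_candidate = [0.9, 0.8, 0.7]  # scores well on all measures
bad_candidate = [0.9, 0.1, 0.2]   # similar descriptor, but aligns poorly

p_good = loop_closure_probability(good_candidate, weights, bias)
p_bad = loop_closure_probability(bad_candidate, weights, bias)

# Accept a loop closure only above a confidence threshold; verifying
# candidates this way guards the pose graph against false constraints.
accept_good = p_good > 0.5
accept_bad = p_bad > 0.5
```

The point of combining measures is that a single cue (e.g. descriptor similarity alone) can be fooled by perceptually similar places, while a poorly aligning candidate is rejected once registration quality is also considered.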
- J. Zhang, H. Zhuge, Z. Wu, G. Peng, M. Wen, Y. Liu, and D. Wang, “4dradarslam: A 4d imaging radar slam system for large-scale environments based on pose graph optimization,” in 2023 IEEE International Conference on Robotics and Automation (ICRA), 2023, pp. 8333–8340.
- X. Li, H. Zhang, and W. Chen, “4d radar-based pose graph slam with ego-velocity pre-integration factor,” IEEE Robotics and Automation Letters, vol. 8, no. 8, pp. 5124–5131, 2023.
- Y. Zhuang, B. Wang, J. Huai, and M. Li, “4d iriom: 4d imaging radar inertial odometry and mapping,” IEEE Robotics and Automation Letters, vol. 8, no. 6, pp. 3246–3253, 2023.
- G. Kim and A. Kim, “Scan context: Egocentric spatial descriptor for place recognition within 3d point cloud map,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018, pp. 4802–4809.
- D. Adolfsson, M. Karlsson, V. Kubelka, M. Magnusson, and H. Andreasson, “Tbv radar slam – trust but verify loop candidates,” IEEE Robotics and Automation Letters, vol. 8, no. 6, pp. 3613–3620, 2023.
- V. Kubelka, E. Fritz, and M. Magnusson, “Do we need scan-matching in radar odometry?” arXiv preprint arXiv:2310.18117, 2023.
- C. Doer and G. F. Trommer, “An ekf based approach to radar inertial odometry,” in 2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), 2020, pp. 152–159.
- A. Kramer, C. Stahoviak, A. Santamaria-Navarro, A.-a. Agha-mohammadi, and C. Heckman, “Radar-inertial ego-velocity estimation for visually degraded environments,” in 2020 IEEE International Conference on Robotics and Automation (ICRA), 2020, pp. 5739–5746.
- C. Doer and G. F. Trommer, “Yaw aided radar inertial odometry using manhattan world assumptions,” in 2021 28th Saint Petersburg International Conference on Integrated Navigation Systems (ICINS). IEEE, 2021, pp. 1–9.
- Y. Z. Ng, B. Choi, R. Tan, and L. Heng, “Continuous-time radar-inertial odometry for automotive radars,” in 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2021, pp. 323–330.
- A. Galeote-Luque, V. Kubelka, M. Magnusson, J.-R. Ruiz-Sarmiento, and J. Gonzalez-Jimenez, “Doppler-only single-scan 3d vehicle odometry,” arXiv preprint arXiv:2310.04113, 2023.
- J.-T. Huang, R. Xu, A. Hinduja, and M. Kaess, “Multi-radar inertial odometry for 3d state estimation using mmwave imaging radar,” arXiv preprint arXiv:2311.08608, 2023.
- K. Retan, F. Loshaj, and M. Heizmann, “Radar odometry on SE(3) with constant velocity motion prior,” IEEE Robotics and Automation Letters, vol. 6, no. 4, pp. 6386–6393, 2021.
- ——, “Radar odometry on SE(3) with constant acceleration motion prior and polar measurement model,” arXiv preprint arXiv:2209.05956, 2022.
- J. Michalczyk, R. Jung, and S. Weiss, “Tightly-coupled ekf-based radar-inertial odometry,” in 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2022, pp. 12336–12343.
- J. Michalczyk, R. Jung, C. Brommer, and S. Weiss, “Multi-state tightly-coupled ekf-based radar-inertial odometry with persistent landmarks,” in 2023 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2023, pp. 4011–4017.
- S. Lu, G. Zhuo, L. Xiong, X. Zhu, L. Zheng, Z. He, M. Zhou, X. Lu, and J. Bai, “Efficient deep-learning 4d automotive radar odometry method,” IEEE Transactions on Intelligent Vehicles, 2023.
- H. Wang, C. Wang, and L. Xie, “Intensity scan context: Coding intensity and geometry relations for loop closure detection,” in 2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2020, pp. 2095–2101.
- B. Wang, Y. Zhuang, and N. El-Bendary, “4d radar/imu/gnss integrated positioning and mapping for large-scale environments,” The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 48, pp. 1223–1228, 2023.
- D. Adolfsson, M. Magnusson, A. Alhashimi, A. J. Lilienthal, and H. Andreasson, “Lidar-level localization with radar? the cfear approach to accurate, fast, and robust large-scale radar odometry in diverse environments,” IEEE Transactions on Robotics, vol. 39, no. 2, pp. 1476–1495, 2023.
- D. Adolfsson, M. Castellano-Quero, M. Magnusson, A. J. Lilienthal, and H. Andreasson, “Coral: Introspection for robust radar and lidar perception in diverse environments using differential entropy,” Robotics and Autonomous Systems, vol. 155, p. 104136, 2022.
- A. Geiger, P. Lenz, C. Stiller, and R. Urtasun, “Vision meets robotics: The kitti dataset,” The International Journal of Robotics Research, vol. 32, no. 11, pp. 1231–1237, 2013.
- S. Gupta, T. Guadagnino, B. Mersch, I. Vizzo, and C. Stachniss, “Effectively detecting loop closures using point cloud density maps,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2024.