ROAMER: Robust Offroad Autonomy using Multimodal State Estimation with Radar Velocity Integration (2401.17404v1)
Abstract: Reliable offroad autonomy requires low-latency, high-accuracy estimates of pose and velocity that remain viable in environments with sub-optimal operating conditions for the perception modalities in use. Because state estimation remains a single point of failure in most aspiring autonomous systems, failing to address the environmental degradation that the perception sensors may experience under the given operating conditions can be a mission-critical shortcoming. In this work, a method for integrating radar velocity information into a LiDAR-inertial odometry solution is proposed, enabling consistent estimation performance even when LiDAR-inertial odometry is degraded. The proposed method exploits the direct velocity-measuring capability of a Frequency Modulated Continuous Wave (FMCW) radar sensor to enhance the LiDAR-inertial smoother onboard the vehicle by integrating the forward velocity measurement into the graph-based smoother. This increases the robustness of the overall estimation solution, even in the absence of LiDAR data. The method was validated through hardware experiments onboard an all-terrain vehicle traveling at high speed, ~12 m/s, in demanding offroad environments.
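The sketch below is a minimal, hedged illustration (not the authors' implementation) of how a radar forward-velocity measurement could be folded into a graph-based smoother as an additional factor on the velocity state, here using GTSAM's Python bindings. The function name `add_radar_velocity_factor`, the noise values, and the assumption of negligible lateral and vertical slip are all illustrative choices, not details taken from the paper.

```python
# Sketch: loosely-coupled radar forward-velocity factor for a GTSAM smoother.
# Assumes the velocity state V(i) is expressed in the navigation frame, as in
# GTSAM's IMU-preintegration examples.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import V  # velocity keys v0, v1, ...


def add_radar_velocity_factor(graph: gtsam.NonlinearFactorGraph,
                              key_index: int,
                              radar_fwd_vel: float,
                              nav_R_body: gtsam.Rot3,
                              sigma_fwd: float = 0.1,
                              sigma_lat: float = 1.0) -> None:
    """Add a velocity prior derived from an FMCW radar forward-speed reading.

    The radar tightly constrains only the body-forward component; the lateral
    and vertical axes are given a much larger standard deviation so they stay
    effectively unconstrained by this factor.
    """
    R = nav_R_body.matrix()  # current attitude estimate, body -> nav

    # Body-frame velocity under the (illustrative) no-slip assumption,
    # rotated into the navigation frame.
    v_nav = R @ np.array([radar_fwd_vel, 0.0, 0.0])

    # Anisotropic measurement covariance defined in the body frame,
    # then rotated into the navigation frame to match the state.
    cov_body = np.diag([sigma_fwd**2, sigma_lat**2, sigma_lat**2])
    noise = gtsam.noiseModel.Gaussian.Covariance(R @ cov_body @ R.T)

    graph.add(gtsam.PriorFactorVector(V(key_index), v_nav, noise))
```

In such a loosely-coupled scheme, the factor can be added at each radar update alongside the LiDAR and preintegrated IMU factors; when LiDAR registration degrades, the radar velocity factors continue to constrain the velocity states and limit drift.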