TAIL: A Terrain-Aware Multi-Modal SLAM Dataset for Robot Locomotion in Deformable Granular Environments (2403.16875v1)
Abstract: Terrain-aware perception holds the potential to improve the robustness and accuracy of autonomous robot navigation in the wild, thereby facilitating effective off-road traversal. However, the lack of multi-modal perception across varied motion patterns hinders Simultaneous Localization And Mapping (SLAM) solutions, especially when confronting non-geometric hazards in demanding landscapes. In this paper, we propose a Terrain-Aware multI-modaL (TAIL) dataset tailored to deformable, sandy terrains. It incorporates various types of robotic proprioception and distinct ground interactions, posing unique challenges for, and providing a benchmark of, multi-sensor fusion SLAM. The versatile sensor suite comprises stereo frame cameras, multiple ground-pointing RGB-D cameras, a rotating 3D LiDAR, an IMU, and an RTK device; the ensemble is hardware-synchronized, well-calibrated, and self-contained. Using both wheeled and quadrupedal locomotion, we efficiently collect comprehensive sequences that capture rich unstructured scenarios, spanning a spectrum of scope, terrain interactions, scene changes, ground-level properties, and dynamic robot characteristics. We benchmark several state-of-the-art SLAM methods against ground truth and provide performance validations, reporting the corresponding challenges and limitations. All associated resources are accessible upon request at \url{https://tailrobot.github.io/}.
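To make the benchmarking setup concrete, below is a minimal sketch of how an estimated SLAM trajectory could be scored against an RTK ground-truth trajectory with the common ATE RMSE metric after rigid (Umeyama, scale-free) alignment. This is not the authors' evaluation code; the file names and the whitespace-separated `timestamp x y z` trajectory format are assumptions for illustration only.

```python
# Minimal sketch (not the dataset's official tooling): Absolute Trajectory
# Error (ATE RMSE) between an estimated trajectory and RTK ground truth
# after SE(3) alignment. File names and the "timestamp x y z" row format
# are assumptions for illustration.
import numpy as np

def load_xyz(path):
    """Load a trajectory file with rows: timestamp x y z (assumed format)."""
    data = np.loadtxt(path)
    return data[:, 0], data[:, 1:4]  # timestamps, Nx3 positions

def associate(t_est, t_gt, max_dt=0.02):
    """Pair each estimated stamp with the nearest ground-truth stamp."""
    idx_gt = np.clip(np.searchsorted(t_gt, t_est), 1, len(t_gt) - 1)
    prev_closer = np.abs(t_gt[idx_gt - 1] - t_est) < np.abs(t_gt[idx_gt] - t_est)
    idx_gt[prev_closer] -= 1
    keep = np.abs(t_gt[idx_gt] - t_est) <= max_dt
    return np.nonzero(keep)[0], idx_gt[keep]

def umeyama_se3(src, dst):
    """Rigid (rotation + translation) alignment mapping src onto dst."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    return R, mu_d - R @ mu_s

if __name__ == "__main__":
    t_est, p_est = load_xyz("estimated_traj.txt")   # hypothetical file
    t_gt, p_gt = load_xyz("rtk_ground_truth.txt")   # hypothetical file
    i_est, i_gt = associate(t_est, t_gt)
    R, t = umeyama_se3(p_est[i_est], p_gt[i_gt])
    err = (p_est[i_est] @ R.T + t) - p_gt[i_gt]
    ate_rmse = np.sqrt((np.linalg.norm(err, axis=1) ** 2).mean())
    print(f"ATE RMSE: {ate_rmse:.3f} m over {len(err)} matched poses")
```

Established evaluation tools such as evo implement the same metric with richer timestamp association and plotting options, and would typically be used in practice.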
Authors: Chen Yao, Yangtao Ge, Guowei Shi, Zirui Wang, Ningbo Yang, Zheng Zhu, Hexiang Wei, Yuntian Zhao, Jing Wu, Zhenzhong Jia