MUN-FRL: A Visual Inertial LiDAR Dataset for Aerial Autonomous Navigation and Mapping (2310.08435v1)
Abstract: This paper presents a unique outdoor aerial visual-inertial-LiDAR dataset captured using a multi-sensor payload to promote research on navigation in global navigation satellite system (GNSS)-denied environments. The dataset features flight distances ranging from 300 m to 5 km, collected using a DJI M600 hexacopter drone and the National Research Council (NRC) Bell 412 Advanced Systems Research Aircraft (ASRA). The dataset consists of hardware-synchronized monocular images, IMU measurements, 3D LiDAR point clouds, and high-precision real-time kinematic (RTK)-GNSS-based ground truth. Ten datasets were collected as ROS bags, comprising over 100 minutes of outdoor footage spanning urban areas, highways, hillsides, prairies, and waterfronts. The datasets were collected to facilitate the development of visual-inertial-LiDAR odometry and mapping algorithms, visual-inertial navigation algorithms, and object detection, segmentation, and landing zone detection algorithms based on real-world drone and full-scale helicopter data. All the datasets contain raw sensor measurements, hardware timestamps, and spatio-temporally aligned ground truth. The intrinsic and extrinsic calibrations of the sensors are also provided, along with raw calibration datasets. A performance summary of state-of-the-art methods applied to the datasets is also provided.
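The abstract notes that state-of-the-art odometry methods were evaluated against the RTK-GNSS ground truth. A common metric for such evaluations is the absolute trajectory error (ATE) expressed as an RMSE over timestamp-associated pose pairs. The sketch below is illustrative only, not the paper's evaluation code: the function names and the nearest-timestamp association scheme are assumptions, and a full evaluation would typically also align the trajectories (e.g. Umeyama alignment) before computing the error.

```python
import math

def associate(est, gt, max_dt=0.02):
    """Pair estimated poses with ground-truth poses by nearest timestamp.
    est, gt: lists of (t, x, y, z) tuples, both sorted by time.
    max_dt: maximum allowed time offset (s) for a valid pairing."""
    pairs = []
    j = 0
    for t, *p in est:
        # Advance j while the next ground-truth stamp is closer to t.
        while j + 1 < len(gt) and abs(gt[j + 1][0] - t) < abs(gt[j][0] - t):
            j += 1
        if abs(gt[j][0] - t) <= max_dt:
            pairs.append((tuple(p), gt[j][1:]))
    return pairs

def ate_rmse(pairs):
    """Root-mean-square 3D position error over associated pose pairs."""
    se = sum((a - b) ** 2 for p, q in pairs for a, b in zip(p, q))
    return math.sqrt(se / len(pairs))
```

For example, an estimate offset by 1 m along x from a coincident ground-truth point yields an ATE RMSE of 1.0 m.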
- Ravindu G. Thalagala
- Sahan M. Gunawardena
- Oscar De Silva
- Awantha Jayasiri
- Arthur Gubbels
- George K. I. Mann
- Raymond G. Gosine