OmniNxt: A Fully Open-source and Compact Aerial Robot with Omnidirectional Visual Perception (2403.20085v1)
Abstract: Adopting omnidirectional Field of View (FoV) cameras in aerial robots vastly improves perception ability, significantly advancing the capabilities of aerial robotics in inspection, reconstruction, and rescue tasks. However, such sensors also increase system complexity, both in hardware design and in the corresponding algorithms, which limits researchers from utilizing aerial robots with omnidirectional FoV in their research. To bridge this gap, we propose OmniNxt, a fully open-source aerial robotics platform with omnidirectional perception. We design a high-performance flight controller, NxtPX4, and a multi-fisheye camera set for OmniNxt. Meanwhile, the compatible software is carefully devised, which empowers OmniNxt to achieve accurate localization and real-time dense mapping with low computational overhead. We conducted extensive real-world experiments to validate the superior performance of OmniNxt in practical applications. All the hardware and software are open-access at https://github.com/HKUST-Aerial-Robotics/OmniNxt, and we provide docker images of each crucial module in the proposed system. Project page: https://hkust-aerial-robotics.github.io/OmniNxt.