
Virtual Omnidirectional Perception for Downwash Prediction within a Team of Nano Multirotors Flying in Close Proximity (2303.03898v3)

Published 7 Mar 2023 in cs.RO

Abstract: Teams of flying robots can be used for inspection, delivery, and construction tasks, in which they might be required to fly very close to each other. In such close-proximity cases, nonlinear aerodynamic effects can cause catastrophic crashes, necessitating each robot's awareness of its surroundings. Existing approaches rely on multiple expensive or heavy perception sensors. Such perception methods are impractical to use on nano multirotors that are constrained with respect to weight, computation, and price. Instead, we propose to use the often-ignored yaw degree-of-freedom of multirotors to spin a single, cheap, and lightweight monocular camera at a high angular rate for omnidirectional awareness of the neighboring robots. We provide a dataset collected with real-world physical flights as well as with 3D-rendered scenes and compare two existing learning-based methods in different settings with respect to success rate, relative position estimation, and downwash prediction accuracy. We demonstrate that our proposed spinning camera is capable of predicting the presence of aerodynamic downwash with an $F_1$ score of over 80% in a challenging swapping task.
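The abstract reports downwash-presence prediction as a binary classification evaluated by $F_1$ score. As a hedged sketch of how such a metric is computed, the snippet below implements $F_1$ from scratch on illustrative per-frame labels; the labels are placeholders, not data from the paper.

```python
# Illustrative F1 computation for a binary downwash-presence classifier.
# 1 = downwash present, 0 = absent. Labels below are made up for demonstration.

def f1_score(y_true, y_pred):
    """F1 = 2 * precision * recall / (precision + recall)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# Example per-frame ground truth and predictions (hypothetical)
y_true = [1, 1, 0, 1, 0, 0, 1, 1]
y_pred = [1, 0, 0, 1, 0, 1, 1, 1]
print(f1_score(y_true, y_pred))  # precision = recall = 4/5, so F1 = 0.8
```

A score above 0.8 on this metric means the classifier balances missed downwash events (false negatives) against spurious alarms (false positives), which matters here because both error types are costly in close-proximity flight.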

