High-throughput Visual Nano-drone to Nano-drone Relative Localization using Onboard Fully Convolutional Networks (2402.13756v3)

Published 21 Feb 2024 in cs.CV and cs.RO

Abstract: Relative drone-to-drone localization is a fundamental building block for any swarm operations. We address this task in the context of miniaturized nano-drones, i.e., 10cm in diameter, which show an ever-growing interest due to novel use cases enabled by their reduced form factor. The price for their versatility comes with limited onboard resources, i.e., sensors, processing units, and memory, which limits the complexity of the onboard algorithms. A traditional solution to overcome these limitations is represented by lightweight deep learning models directly deployed aboard nano-drones. This work tackles the challenging relative pose estimation between nano-drones using only a gray-scale low-resolution camera and an ultra-low-power System-on-Chip (SoC) hosted onboard. We present a vertically integrated system based on a novel vision-based fully convolutional neural network (FCNN), which runs at 39Hz within 101mW onboard a Crazyflie nano-drone extended with the GWT GAP8 SoC. We compare our FCNN against three State-of-the-Art (SoA) systems. Considering the best-performing SoA approach, our model results in an R-squared improvement from 32 to 47% on the horizontal image coordinate and from 18 to 55% on the vertical image coordinate, on a real-world dataset of 30k images. Finally, our in-field tests show a reduction of the average tracking error of 37% compared to a previous SoA work and an endurance performance up to the entire battery lifetime of 4 minutes.

Authors (3)
  1. Luca Crupi (9 papers)
  2. Alessandro Giusti (38 papers)
  3. Daniele Palossi (28 papers)
Citations (1)

Summary

Enhancing Drone-to-Drone Localization Through Vision-Based Fully Convolutional Neural Networks

Introduction

Precise localization capabilities are paramount for the deployment of nano-drones in swarm operations, such as formation flying and coordinated maneuvers in search-and-rescue missions. Miniaturized drones, with their limited onboard computational and power resources, present a unique challenge in this domain. Traditional approaches often rely on the deployment of lightweight deep learning models to overcome these limitations. This paper presents a novel approach to the relative pose estimation problem, utilizing a vision-based fully convolutional neural network (FCNN) that efficiently operates on nano-drones equipped with ultra-low-power Systems-on-Chip (SoCs) and low-resolution cameras.

System Design

The proposed system is built on a commercial Crazyflie nano-drone extended with a GWT GAP8 SoC and a low-power grayscale camera. The FCNN is designed to fit within the computational and memory constraints of the GAP8 SoC. Training and testing datasets were collected in an indoor setup, with precise ground truth provided by an OptiTrack motion capture system. The deployment pipeline relies on int8 quantization of the network weights and an efficient memory management scheme to meet the operational requirements of the GAP8 SoC; a minimal illustration of this kind of weight quantization is sketched below.
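
As a rough illustration of the kind of weight quantization involved, the snippet below implements symmetric per-tensor int8 quantization in NumPy. The function names and the symmetric per-tensor scheme are assumptions made for illustration only; the authors' actual deployment toolchain for the GAP8 may differ.

```python
# Minimal sketch of symmetric per-tensor int8 weight quantization.
# Illustrative only: the scheme and function names are assumptions,
# not the paper's exact deployment flow.
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 values plus a single float scale."""
    scale = np.abs(weights).max() / 127.0            # largest magnitude maps to 127
    q = np.clip(np.round(weights / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 tensor, e.g. for accuracy checks."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    w = np.random.randn(16, 8).astype(np.float32) * 0.1
    q, s = quantize_int8(w)
    err = np.abs(w - dequantize(q, s)).max()
    print(f"scale={s:.6f}, max abs quantization error={err:.6f}")
```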

FCNN Architecture and Implementation

The FCNN is structured to accept a 160x160 grayscale image as input and produce three 20x20 output maps, predicting the presence of a drone, its depth (distance), and the state of its LEDs. This architecture not only allows for efficient inference on constrained hardware but also demonstrates significant improvements in inference speed and power consumption over state-of-the-art methods.
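
To make the input/output contract concrete, the following is a minimal PyTorch sketch of a fully convolutional network that maps a 1x160x160 grayscale image to three 20x20 output maps (presence, depth, LED state). The layer count, channel widths, and class name are illustrative assumptions under these stated shapes, not the paper's exact architecture.

```python
# Illustrative fully convolutional sketch: 1x160x160 input -> three 20x20 maps.
# Channel widths and depth are assumptions, not the authors' exact model.
import torch
import torch.nn as nn

class NanoDroneFCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 160 -> 80
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 80 -> 40
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 40 -> 20
        )
        # one 1x1 output channel per task: drone presence, depth, LED state
        self.head = nn.Conv2d(64, 3, kernel_size=1)

    def forward(self, x):
        maps = self.head(self.backbone(x))          # (N, 3, 20, 20)
        return maps[:, 0], maps[:, 1], maps[:, 2]   # presence, depth, LED

if __name__ == "__main__":
    presence, depth, led = NanoDroneFCNN()(torch.randn(1, 1, 160, 160))
    print(presence.shape, depth.shape, led.shape)   # each torch.Size([1, 20, 20])
```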

Results and Evaluation

The FCNN model outperformed state-of-the-art systems on the coefficient of determination (R²) and Pearson correlation coefficient metrics. Notably, the model runs at 39 Hz within a 101 mW power budget onboard the nano-drone, highlighting both its efficiency and its real-time processing capability. In-field tests further validated the model, demonstrating improved tracking accuracy and robustness in previously unseen environments. Additionally, the network's LED state prediction opens new avenues for low-bandwidth communication between drones.
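
For reference, the two evaluation metrics named above can be computed as follows. The helper names and the synthetic data in this sketch are purely illustrative and do not reproduce the paper's dataset or results.

```python
# Evaluation metrics used in the paper: R^2 and Pearson correlation.
# Synthetic data below is illustrative only.
import numpy as np

def r_squared(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

def pearson(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Pearson correlation coefficient between predictions and ground truth."""
    return float(np.corrcoef(y_true, y_pred)[0, 1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = rng.uniform(0, 160, size=1000)        # e.g. horizontal image coordinate
    pred = gt + rng.normal(0, 10, size=1000)   # noisy synthetic predictions
    print(f"R^2 = {r_squared(gt, pred):.3f}, Pearson = {pearson(gt, pred):.3f}")
```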

Implications and Future Directions

This work underscores the potential of specialized FCNN models for drone-to-drone localization within the constraints of nano-drones. The demonstrated improvements in localization accuracy and system efficiency pave the way for more advanced and reliable swarm operations in complex environments. Looking forward, the scalability of this approach and its application to other tasks and drone platforms merit further investigation. Moreover, the model's generalization capability invites exploration of its deployment in more varied and challenging scenarios without extensive retraining or fine-tuning.

In conclusion, the research presents a significant step forward in the pursuit of autonomous, efficient, and accurate drone-to-drone localization, with promising implications for the further advancement of drone swarm technologies.