
Robust and Efficient Depth-based Obstacle Avoidance for Autonomous Miniaturized UAVs (2208.12624v1)

Published 26 Aug 2022 in cs.RO, cs.SY, and eess.SY

Abstract: Nano-size drones hold enormous potential to explore unknown and complex environments. Their small size makes them agile and safe for operation close to humans and allows them to navigate through narrow spaces. However, their tiny size and payload restrict the possibilities for on-board computation and sensing, making fully autonomous flight extremely challenging. The first step towards full autonomy is reliable obstacle avoidance, which has proven to be technically challenging by itself in a generic indoor environment. Current approaches utilize vision-based or 1-dimensional sensors to support nano-drone perception algorithms. This work presents a lightweight obstacle avoidance system based on a novel millimeter form factor 64 pixels multi-zone Time-of-Flight (ToF) sensor and a generalized model-free control policy. Reported in-field tests are based on the Crazyflie 2.1, extended by a custom multi-zone ToF deck, featuring a total flight mass of 35g. The algorithm only uses 0.3% of the on-board processing power (210uS execution time) with a frame rate of 15fps, providing an excellent foundation for many future applications. Less than 10% of the total drone power is needed to operate the proposed perception system, including both lifting and operating the sensor. The presented autonomous nano-size drone reaches 100% reliability at 0.5m/s in a generic and previously unexplored indoor environment. The proposed system is released open-source with an extensive dataset including ToF and gray-scale camera data, coupled with UAV position ground truth from motion capture.

Citations (26)

Summary

  • The paper presents a lightweight, efficient depth-based obstacle avoidance system for nano-UAVs using a 64-pixel Time-of-Flight sensor and a decision tree algorithm for low power consumption.
  • Experimental results show the system achieved 100% reliability in obstacle avoidance in unexplored indoor spaces at 0.5 m/s and covered 206 meters autonomously in a maze.
  • This research demonstrates a scalable solution for indoor UAV navigation, highlighting the potential of simple depth-based perception systems over computationally intensive camera approaches for real-world applications.

Robust and Efficient Depth-based Obstacle Avoidance for Autonomous Miniaturized UAVs

The paper presents an obstacle avoidance system for autonomous exploration by nano-UAVs, focusing on on-board processing and low power consumption. The system uses a 64-pixel multi-zone Time-of-Flight (ToF) sensor to provide depth information, which is a crucial aspect in enabling fully autonomous operation in complex indoor environments.

Research Foundation and Methodology

Autonomous navigation for UAVs, especially those operating under the constraints of nano-size platforms, faces significant challenges, primarily limited on-board computation and sensing. The approach in this paper leverages a lightweight ToF array integrated onto a Crazyflie 2.1 drone. The novel aspect lies in achieving reliable obstacle avoidance with a minimal computational footprint: the algorithm processes each depth frame in 210 µs, consuming only 0.31% of the available processing power. This ensures that most of the power budget remains devoted to sustained flight.
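To make the perception step concrete, the sketch below shows one plausible way to reduce an 8x8 multi-zone ToF frame to coarse sector distances. The left/center/right split, millimeter units, and row-major ordering are illustrative assumptions, not the paper's actual firmware.

```python
# Illustrative sketch (assumed layout, not the paper's implementation):
# collapse a 64-zone ToF depth frame into per-sector minimum distances.

def sector_minima(frame):
    """Return the minimum distance (mm) in the left, center, and right
    portions of an 8x8 depth frame given as a row-major list of 64 values."""
    assert len(frame) == 64, "expected an 8x8 multi-zone frame"
    sectors = {"left": [], "center": [], "right": []}
    for i, d in enumerate(frame):
        col = i % 8  # column index within the 8x8 grid
        if col < 3:
            sectors["left"].append(d)
        elif col < 5:
            sectors["center"].append(d)
        else:
            sectors["right"].append(d)
    # The minimum per sector is a conservative obstacle-distance estimate.
    return {k: min(v) for k, v in sectors.items()}
```

Taking the per-sector minimum is a deliberately conservative choice: a single close-range zone is enough to flag the whole sector as blocked, which suits a safety-critical avoidance task better than averaging.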

Key Experimental Results

The experimental setup tested obstacle avoidance in various scenarios, including static and dynamic environments. The key results include a 100% reliability rate in previously unexplored indoor spaces at a speed of 0.5 m/s. Tests in a controlled maze environment demonstrated the system's ability to autonomously cover roughly 206 m in a single flight. The efficient, model-free decision-tree approach enabled real-time responses to environmental stimuli, substantially expanding the domain of use cases for nano-drones.
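The decision-tree-style policy described above can be sketched as a small set of threshold rules mapping sector distances to a velocity command. The thresholds, the three-sector abstraction, and the yaw commands below are illustrative assumptions for a 0.5 m/s cruise speed, not the paper's exact rules.

```python
# Hedged sketch of a model-free, decision-tree-style avoidance policy.
# STOP_MM and SLOW_MM are hypothetical thresholds, not values from the paper.

STOP_MM = 500   # assumed emergency-stop distance (mm)
SLOW_MM = 1200  # assumed slow-down distance (mm)

def avoid(left_mm, center_mm, right_mm, cruise=0.5):
    """Map three sector distances (mm) to (forward speed m/s, yaw command)."""
    if center_mm < STOP_MM:
        # Front blocked: stop forward motion and yaw toward the freer side.
        return 0.0, "yaw_left" if left_mm > right_mm else "yaw_right"
    if center_mm < SLOW_MM:
        # Obstacle ahead but not critical: scale speed down linearly
        # while steering toward the side with more clearance.
        speed = cruise * (center_mm - STOP_MM) / (SLOW_MM - STOP_MM)
        return speed, "yaw_left" if left_mm > right_mm else "yaw_right"
    # Path clear: fly at cruise speed, no yaw correction.
    return cruise, "hold"
```

Because the policy is a handful of comparisons and one multiplication, it is easy to see how a control loop of this shape could run in microseconds on a microcontroller, consistent with the tiny compute budget the paper reports.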

Implications of Findings

This research highlights the potential of a relatively simple, yet effective depth-based perception system that could significantly impact fields reliant on indoor drone navigation, such as surveillance or search and rescue operations in constrained environments. The low computational and power demands demonstrate that the system is easily adaptable to existing platforms, providing a scalable solution to a crucial problem in UAV autonomy.

The success in navigating complex environments using a depth model with minimal hardware suggests a shift from traditional camera-based approaches, which are computationally intensive and often limited by environmental factors such as lighting. The robustness of the presented system in handling challenges such as dynamic obstacles emphasizes its feasibility in real-world applications.

Future Directions

Further developments may focus on integrating additional sensors without surpassing the power budget, aiming to enhance environmental perception. There is potential to augment this system with machine learning frameworks, such as CNNs running on low-power processors, for more nuanced environmental understanding and improved decision-making capabilities.

Additionally, expansions on this work could explore the incorporation of additional ToF sensors to cover blind spots, or the potential integration of lightweight radar for enhanced detection of reflective and varied-material surfaces. The open-source release of this system, along with supplementary datasets, lays the groundwork for continued advancement and adaptation by the research community.

The development of this efficient depth-based obstacle avoidance mechanism opens up significant new avenues for autonomous UAV navigation, aligning with the increasing demand for versatile and reliable indoor drone technologies. The paper effectively demonstrates the utility of strategically combining minimalistic yet robust sensor arrays with intelligent control algorithms to overcome the constraints typically encountered by micro-UAV platforms.

In summary, this research paper provides a rigorous account of building a resource-efficient navigation system for small-scale UAVs, contributing valuable knowledge to the field of autonomous aerial robotics.
