
From Monocular SLAM to Autonomous Drone Exploration (1609.07835v3)

Published 26 Sep 2016 in cs.RO and cs.CV

Abstract: Micro aerial vehicles (MAVs) are strongly limited in their payload and power capacity. In order to implement autonomous navigation, algorithms are therefore desirable that use sensory equipment that is as small, low-weight, and low-power consuming as possible. In this paper, we propose a method for autonomous MAV navigation and exploration using a low-cost consumer-grade quadrocopter equipped with a monocular camera. Our vision-based navigation system builds on LSD-SLAM which estimates the MAV trajectory and a semi-dense reconstruction of the environment in real-time. Since LSD-SLAM only determines depth at high gradient pixels, texture-less areas are not directly observed so that previous exploration methods that assume dense map information cannot directly be applied. We propose an obstacle mapping and exploration approach that takes the properties of our semi-dense monocular SLAM system into account. In experiments, we demonstrate our vision-based autonomous navigation and exploration system with a Parrot Bebop MAV.

Citations (59)

Summary

  • The paper presents a vision-based system for autonomous drone navigation using monocular SLAM (LSD-SLAM) to achieve low-cost, semi-dense environmental reconstruction.
  • A novel two-step exploration strategy, including a "star discovery" local maneuver and global planning to unseen areas, is introduced to handle semi-dense mapping data.
  • Implemented on a consumer drone (Parrot Bebop) with IMU-EKF integration, the system demonstrates robust autonomous navigation and mapping in indoor environments, enabling practical MAV deployment.

Overview of "From Monocular SLAM to Autonomous Drone Exploration"

This paper presents a method for autonomous navigation of micro aerial vehicles (MAVs) utilizing monocular camera-based simultaneous localization and mapping (SLAM). The work aims to address the constraints on payload and power of MAVs by proposing a vision-based navigation system requiring minimal sensory equipment. The system builds upon the LSD-SLAM framework to navigate and explore environments using a consumer-grade drone, the Parrot Bebop, with a focus on semi-dense environment reconstruction.

Vision-Based Navigation and Mapping

At the core of the method is LSD-SLAM, a framework for large-scale direct SLAM with monocular cameras. The system estimates the drone's trajectory and reconstructs the environment semi-densely by evaluating depth only at high-gradient image regions, leaving texture-less areas unobserved. This semi-dense reconstruction poses unique challenges for obstacle avoidance and exploration, necessitating new methods that handle the incomplete map effectively.
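The high-gradient criterion can be illustrated with a small sketch: only pixels whose intensity gradient magnitude exceeds a threshold are candidates for depth estimation. The function name and threshold value below are illustrative, not taken from the LSD-SLAM implementation.

```python
import numpy as np

def high_gradient_mask(gray, threshold=20.0):
    """Select pixels with strong intensity gradients; semi-dense methods
    like LSD-SLAM estimate depth only at such pixels."""
    gx = np.zeros_like(gray, dtype=float)
    gy = np.zeros_like(gray, dtype=float)
    # Central differences on interior pixels; borders stay zero.
    gx[:, 1:-1] = (gray[:, 2:] - gray[:, :-2]) / 2.0
    gy[1:-1, :] = (gray[2:, :] - gray[:-2, :]) / 2.0
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold
```

Flat, texture-less regions fall below the threshold and receive no depth estimate, which is exactly the gap the exploration strategy must work around.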

Exploration Strategy

The paper devises a two-step exploration process to address the challenges posed by semi-dense SLAM. The local exploration strategy, termed "star discovery," maneuvers the drone in a star pattern to efficiently gather information about the local surroundings, exploiting motion parallax to improve the SLAM depth estimates. This maneuver helps fill in gaps in the map where depth information is sparse or missing.
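A star-shaped maneuver of this kind can be sketched as a waypoint sequence that flies out along evenly spaced rays and returns to the center after each leg, inducing translational motion (and hence parallax) in every direction. This is a hypothetical illustration of the pattern, not the paper's planner.

```python
import math

def star_waypoints(center, radius, rays=6):
    """Generate a star-pattern waypoint list: out along each ray,
    then back to the center before starting the next ray."""
    cx, cy = center
    path = []
    for i in range(rays):
        angle = 2.0 * math.pi * i / rays
        tip = (cx + radius * math.cos(angle), cy + radius * math.sin(angle))
        path.append(tip)     # fly outward to the ray tip
        path.append(center)  # return to the center
    return path
```

Each outward leg provides baseline for triangulating depth along that viewing direction, which is why the star shape covers the local surroundings efficiently.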

The global exploration strategy then identifies areas not directly visible from the current exploration vantage points. The system computes and navigates to these "interesting" regions to facilitate further mapping, ensuring that exploration is methodical and thorough and that the available sensory data guides the drone through an efficient mapping sequence.
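One common way to formalize "areas not directly visible" is a frontier-style criterion on an occupancy grid: free cells that border unknown space are candidate goals. The sketch below uses that standard criterion for illustration; the paper's own scoring of interesting regions differs in detail.

```python
import numpy as np

FREE, OCCUPIED, UNKNOWN = 0, 1, 2

def frontier_cells(grid):
    """Return free cells adjacent to at least one unknown cell,
    i.e. candidate goals at the boundary of the explored map."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers
```

Navigating to such cells and repeating a local discovery maneuver there progressively converts unknown space into mapped space.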

Implementation and Experiments

The methodology is implemented on the Parrot Bebop drone, equipped with a fisheye camera and IMU, keeping the hardware within the low-cost, low-power constraints of MAVs. The SLAM pose estimates and the attitude information from the IMU are fused in an extended Kalman filter (EKF) to maintain accurate state estimation during dynamic flight.
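The core of such a fusion is the Kalman measurement update: a prediction (e.g. from integrating IMU data) is corrected by a SLAM pose measurement, weighted by their relative uncertainties. The scalar version below is a minimal sketch; the paper's EKF tracks the full MAV state, not a single variable.

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.
    x, P: predicted state and its variance (e.g. from IMU integration)
    z, R: measurement and its variance (e.g. a SLAM pose estimate)"""
    K = P / (P + R)          # Kalman gain: how much to trust z over x
    x_new = x + K * (z - x)  # corrected state estimate
    P_new = (1.0 - K) * P    # variance shrinks after the update
    return x_new, P_new
```

With equal variances the update splits the difference between prediction and measurement; as the measurement noise R grows, the estimate leans toward the IMU prediction.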

The paper provides experimental validation in indoor environments to test the robustness of the proposed exploration strategy. The results demonstrate successful autonomous navigation and mapping in complex environments where traditional, dense-map reliant methods would struggle.

Implications and Future Work

The implications of this research extend to the practical deployment of autonomous MAVs in constrained environments where payload restrictions limit the use of traditional depth sensors like RGB-D cameras or LiDAR. The proposed monocular approach opens avenues for deploying MAVs in industries requiring low-cost and rapid environmental mapping.

Moving forward, integrating additional sensory data such as IMU measurements directly into the SLAM system could enhance robustness, particularly in feature-poor environments. Moreover, expanding the system to leverage stereo vision techniques could improve depth perception without compromising the minimalist hardware footprint.

In conclusion, this research contributes significantly to the development of autonomous navigation frameworks that capitalize on efficient, low-cost components to broaden the potential applications of MAVs in real-world scenarios.
