Build Your Own Visual-Inertial Drone: A Cost-Effective and Open-Source Autonomous Drone (1708.06652v3)

Published 22 Aug 2017 in cs.RO

Abstract: This paper describes an approach to building a cost-effective, research-grade, visual-inertial odometry aided vertical take-off and landing (VTOL) platform. We utilize an off-the-shelf visual-inertial sensor, an onboard computer, and a quadrotor platform that are factory-calibrated and mass-produced, and therefore share consistent hardware and sensor specifications (e.g., mass, dimensions, intrinsics and extrinsics of the camera-IMU system, and signal-to-noise ratio). We then perform system calibration and identification, enabling our visual-inertial odometry, multi-sensor fusion, and model predictive control frameworks to be used with these off-the-shelf products, which lets us partially avoid the tedious parameter-tuning procedures otherwise required to build a full system. The complete system is extensively evaluated both indoors using a motion capture system and outdoors using a laser tracker, performing hover, step-response, and trajectory-following tasks in the presence of external wind disturbances. We achieve a root-mean-square (RMS) pose error between the reference and actual trajectories of 0.036 m while hovering. We also conduct relatively long-distance flight (~180 m) experiments on a farm site and achieve a drift error of 0.82% of the total flight distance. This paper conveys the insights we acquired about the platform and sensor module and returns them to the community as open-source code with tutorial documentation.

Citations (45)

Summary

  • The paper introduces a cost-effective, open-source drone integrating off-the-shelf components to lower the barrier for autonomous research.
  • The paper details precise calibration of camera-IMU systems and robust visual-inertial odometry using the ROVIO framework for accurate state estimation.
  • The paper applies nonlinear model predictive control (NMPC) to achieve low RMS pose errors and reliable flight performance under dynamic conditions.

Overview of Visual-Inertial Odometry Aided Autonomous Drone Development

This paper presents a comprehensive methodology for developing a cost-effective, open-source visual-inertial odometry aided autonomous drone. The research is conducted by a team at the Autonomous Systems Lab, ETH Zurich, and focuses on leveraging commercially available hardware to create a reliable and affordable VTOL platform. Key components include an off-the-shelf visual-inertial sensor, an onboard computer, and a mass-produced quadrotor platform. The integration of these components is aimed at facilitating advanced research and applications in autonomous flight.

Key Contributions

  1. Utilization of Commercially Available Components: The researchers effectively utilize off-the-shelf hardware, including the DJI Matrice 100 quadrotor and the Intel RealSense ZR300 visual-inertial sensor. This approach reduces the entry barrier for research institutions by significantly lowering costs and simplifying the procurement of replacement parts.
  2. System Integration and Calibration: The paper describes detailed procedures for system calibration, including camera-IMU extrinsic calibration and time synchronization. These steps are crucial for achieving the precise state estimation and control necessary for autonomous operation (a sketch of how such calibration outputs are applied follows this list).
  3. Robust Visual-Inertial Odometry Framework: The paper employs the ROVIO framework, integrating it with a multi-sensor fusion strategy to enhance ego-motion estimation accuracy. This framework is pivotal in enabling precise trajectory tracking and disturbance rejection in both indoor and outdoor environments.
  4. Nonlinear Model Predictive Control (NMPC): An NMPC method is applied for high-level control, which considers vehicle dynamics and external disturbances, notably improving the platform's autonomous navigation capabilities.
  5. Comprehensive Evaluation: Extensive testing demonstrates the system's performance in dynamic environments, including controlled conditions with wind disturbances. Highlighted experimental results show the drone achieving RMS pose errors of 0.036 m during hovering tasks and a 0.82% drift over a flight distance of approximately 180 m (see the evaluation-metric sketch below).
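
The calibration step in contribution 2 ultimately yields a camera-IMU extrinsic transform and a camera-to-IMU clock offset that the downstream estimator consumes. Below is a minimal sketch of how such calibration outputs are typically applied; the numeric values, variable names, and functions are illustrative assumptions, not the paper's actual code or the ROVIO/MSF API.

```python
import numpy as np

# Hypothetical calibration outputs; real values come from an offline
# camera-IMU calibration of the actual sensor, not from this sketch.
R_ic = np.eye(3)                      # rotation: camera frame -> IMU frame
p_ic = np.array([0.015, 0.0, 0.02])   # camera origin expressed in the IMU frame (m)
t_offset = 0.004                      # camera clock lag relative to the IMU clock (s)

def T_ic():
    """Assemble the 4x4 homogeneous camera-to-IMU extrinsic transform."""
    T = np.eye(4)
    T[:3, :3] = R_ic
    T[:3, 3] = p_ic
    return T

def landmark_in_imu_frame(p_cam):
    """Express a 3D point observed in the camera frame in the IMU frame."""
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous coordinates
    return (T_ic() @ p_h)[:3]

def camera_stamp_to_imu_clock(stamp):
    """Map a camera timestamp onto the IMU clock using the calibrated offset."""
    return stamp + t_offset
```

Because the hardware is factory-calibrated and mass-produced, the point of the paper's calibration and identification step is that these quantities transfer across units of the same product with little re-tuning.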

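The reported figures in contribution 5 (0.036 m RMS hover error, 0.82% drift over roughly 180 m) are standard trajectory-evaluation metrics. Here is a minimal sketch of how such metrics are commonly computed, assuming time-aligned reference and estimated positions as N x 3 arrays; the function names are illustrative, not the authors' evaluation code.

```python
import numpy as np

def rms_position_error(reference, actual):
    """RMS of the Euclidean position error between two time-aligned Nx3 trajectories."""
    errors = np.linalg.norm(np.asarray(actual) - np.asarray(reference), axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))

def drift_percentage(estimated, ground_truth):
    """End-point drift as a percentage of the total ground-truth path length."""
    estimated = np.asarray(estimated)
    ground_truth = np.asarray(ground_truth)
    path_length = np.sum(np.linalg.norm(np.diff(ground_truth, axis=0), axis=1))
    end_error = np.linalg.norm(estimated[-1] - ground_truth[-1])
    return float(100.0 * end_error / path_length)
```

For scale, an end-point error of about 1.5 m over a 180 m path corresponds to roughly 0.8% drift, the same order as the result reported in the paper.
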
Implications and Future Directions

This research holds significant implications for both practical applications and theoretical advancements in autonomous aerial vehicles. By providing open-source code and documentation, the paper facilitates replication and adaptation, enabling a broader range of researchers to engage in UAV research without the prohibitive costs associated with specialized equipment. The robust methodology for state estimation and control presented extends the operational capabilities of low-cost VTOL platforms, suggesting applications in areas such as precision agriculture, environmental monitoring, and emergency response.

Future research could explore further integration of advanced perception and decision-making algorithms, potentially incorporating reinforcement learning for enhanced adaptability to complex environments. Additionally, scalability and adaptability to larger platforms or multi-drone systems could open new avenues in cooperative tasks and swarm robotics.

In conclusion, this research is a significant contribution to the field of autonomous drones, offering a practical and accessible framework for both researchers and practitioners interested in developing advanced aerial robotics solutions with commercially available technologies.
