FalconWing: An Open-Source Platform for Ultra-Light Fixed-Wing Aircraft Research

Published 2 May 2025 in cs.RO and cs.AI (arXiv:2505.01383v1)

Abstract: We present FalconWing -- an open-source, ultra-lightweight (150 g) fixed-wing platform for autonomy research. The hardware platform integrates a small camera, a standard airframe, offboard computation, and radio communication for manual overrides. We demonstrate FalconWing's capabilities by developing and deploying a purely vision-based control policy for autonomous landing (without IMU or motion capture) using a novel real-to-sim-to-real learning approach. Our learning approach: (1) constructs a photorealistic simulation environment via 3D Gaussian splatting trained on real-world images; (2) identifies nonlinear dynamics from vision-estimated real-flight data; and (3) trains a multi-modal Vision Transformer (ViT) policy through simulation-only imitation learning. The ViT architecture fuses single RGB image with the history of control actions via self-attention, preserving temporal context while maintaining real-time 20 Hz inference. When deployed zero-shot on the hardware platform, this policy achieves an 80% success rate in vision-based autonomous landings. Together with the hardware specifications, we also open-source the system dynamics, the software for photorealistic simulator and the learning approach.

Summary

The paper presents FalconWing, an open-source platform for autonomy research on ultra-light fixed-wing aircraft weighing just 150 g. It addresses a gap in fixed-wing research, which has generally focused on larger, sensor-rich platforms carrying expensive equipment such as GPS/GNSS receivers and high-resolution cameras. FalconWing, by contrast, uses a vision-only control system with minimal hardware, enabling repeatable and accessible indoor experimentation.

Key Components and Methodology Overview

The hardware platform integrates a small FPV camera, a standard airframe, offboard computation, and radio communication for manual override. Like birds, the aircraft relies primarily on visual cues, with no dependence on additional sensors such as an IMU. FalconWing uses a lightweight analog camera for visual input and a real-to-sim-to-real learning pipeline for control. The approach comprises three steps:

  1. Photorealistic Simulation Environment: A virtual flight environment is reconstructed from real-world images using 3D Gaussian splatting, providing photorealistic rendering in which the control policy can be trained and evaluated.
  2. Nonlinear Dynamics Identification: Dynamics parameters are estimated from real flight data using vision-based state estimation, removing the need for a motion-capture system and simplifying the modeling of aircraft behavior.
  3. Vision-Based Control Policy: A Vision Transformer (ViT) fuses a single RGB image with the history of control actions through self-attention, preserving temporal context while sustaining real-time inference at 20 Hz to execute autonomous landing maneuvers.
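The multi-modal fusion in step 3 can be illustrated with a minimal single-head self-attention sketch in numpy. The token counts, embedding sizes, and pooling scheme below are illustrative assumptions, not the paper's architecture, and the weights are random rather than trained:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, d_k=32, seed=0):
    # single-head scaled dot-product attention with random (untrained) projections
    rng = np.random.default_rng(seed)
    d = tokens.shape[-1]
    Wq, Wk, Wv = (rng.standard_normal((d, d_k)) / np.sqrt(d) for _ in range(3))
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d_k))   # every token attends to every other
    return attn @ V

# Hypothetical token layout: 64 patch embeddings from one RGB frame,
# plus 8 embedded past control actions, all in a shared 48-dim space.
img_tokens = np.random.default_rng(1).standard_normal((64, 48))
act_tokens = np.random.default_rng(2).standard_normal((8, 48))

# Joint attention over both modalities fuses current vision with action history.
fused = self_attention(np.vstack([img_tokens, act_tokens]))

# Pool and slice to a small control vector (e.g. 3 actuation channels).
control = fused.mean(axis=0)[:3]
print(control.shape)  # (3,)
```

Because image patches and action tokens share one attention matrix, each predicted control can depend on both what the camera currently sees and what commands were recently issued, which is how the policy preserves temporal context without an IMU.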

Results and Performance

In hardware experiments, the vision-based policy deployed zero-shot achieved an 80% success rate in autonomous landing. The trials spanned varied initial conditions and positions, indicating that the Gaussian-splatting simulator and simulation-only imitation learning transfer effectively to the real aircraft.

Implications and Prospective Directions

FalconWing offers a new, low-cost entry point for research on fixed-wing aircraft autonomy that emphasizes lightweight design. The platform could serve as a foundation for further studies of vision-based navigation in settings where traditional sensing is impractical, such as GPS-denied environments.

Future extensions could integrate better cameras for higher image quality, expand the dynamics models to capture more intricate aerodynamic interactions, or take the platform outdoors for broader applicability. The open-source release invites collaboration across the academic community and could accelerate progress in lightweight aerial autonomy and vision-based robotics, with practical relevance to delivery, monitoring, and search-and-rescue missions.

Future Research Recommendations

Open challenges include sensor noise and the limited hardware evaluation possible without ground-truth motion data; these motivate further work on robust sensing modalities and techniques for calibrating dynamics models against real-world flight. Extending the identified dynamics to support more aggressive maneuvers would also broaden the platform's applicability beyond its current low-complexity sensor setup.

In summary, FalconWing is a notable contribution to fixed-wing UAV research; its emphasis on minimalism and accessibility opens diverse avenues for advancing autonomous aviation technologies.
