- The paper presents a novel invisible projected tag (IPT) method that achieves centimeter-level quadrotor localization at a processing speed of about 10 FPS.
- It modulates augmented reality (AR) projections captured by a high-speed camera, manipulating the CIELAB color space to embed fiducial tags imperceptibly.
- The approach avoids the cost and interference issues of systems such as UWB, enabling robust, low-complexity indoor drone operations.
The paper presents a novel approach to indoor localization for quadrotors using invisible projected tags (IPT). The method leverages augmented reality (AR) to bridge the gap in visual perception between indoor and outdoor environments, providing real-time, centimeter-level localization without reliance on satellite navigation. The proposal integrates projection-based localization with screen-camera communication to encode fiducial tags imperceptibly within AR scenes, enabling both visualization and localization with a minimal hardware setup of a projector and a high-speed camera.
Problem Statement and Motivation
Indoor flight experiments with quadrotors face two primary challenges: visually simple indoor environments provide few visual cues, and satellite navigation signals are unavailable, necessitating alternative localization methods. Traditional solutions such as motion capture systems or Ultra-Wide Band (UWB) devices suffer from high cost, setup complexity, and susceptibility to environmental interference. The paper introduces IPT as a low-cost, easy-to-use localization approach that circumvents these limitations, thereby strengthening AR robotics platforms.
Methodology
The proposed system comprises a sender and a receiver. The sender embeds invisible fiducial tags into projected video frames through temporal modulation. The technique relies on the flicker-fusion threshold of human vision: frames that alternate faster than this threshold blend together, so the embedded pattern remains invisible to human observers. Tags are encoded by offsetting lightness in the CIELAB color space, whose perceptual uniformity keeps the complementary changes imperceptible across the image.
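To make the modulation concrete, here is a minimal sketch of the sender side using OpenCV's CIELAB conversion. The function name, the `delta_l` amplitude, and the binary tag mask are illustrative assumptions, not the paper's exact parameters.

```python
import cv2
import numpy as np

def embed_tag_frames(frame_bgr, tag_mask, delta_l=2.0):
    """Split one video frame into a complementary pair whose average
    matches the original, hiding the tag via flicker fusion.

    frame_bgr: uint8 BGR frame to be projected.
    tag_mask:  float32 mask in [0, 1] marking tag pixels (assumed input).
    delta_l:   lightness offset in CIELAB units (illustrative value).
    """
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    # Raise L* on tag pixels in one frame and lower it in the other; when
    # the projector alternates them above the flicker-fusion threshold,
    # the eye perceives only their average, i.e. the original frame.
    offset = delta_l * tag_mask
    lab_pos, lab_neg = lab.copy(), lab.copy()
    lab_pos[..., 0] = np.clip(lab[..., 0] + offset, 0, 255)
    lab_neg[..., 0] = np.clip(lab[..., 0] - offset, 0, 255)
    to_bgr = lambda x: cv2.cvtColor(x.astype(np.uint8), cv2.COLOR_LAB2BGR)
    return to_bgr(lab_pos), to_bgr(lab_neg)
```

On the receiver side, differencing consecutive captured frames cancels the shared scene content and doubles the embedded signal, which is the standard recovery step in screen-camera communication schemes of this kind.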
On the receiver end, a high-speed camera captures the projected frames, and demodulation extracts the 2D image coordinates and 3D world coordinates of the encoded tags. A sample-based image alignment method mitigates motion-induced noise caused by quadrotor movement. Pose estimation is then performed by solving the Perspective-n-Point (PnP) problem with the Infinitesimal Plane-based Pose Estimation (IPPE) algorithm, yielding the quadrotor's position and orientation.
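OpenCV exposes IPPE directly through `cv2.solvePnP`, so the receiver's pose step can be sketched as below. The camera intrinsics, tag size, and function name are placeholders, assuming a square planar tag centered at the world origin.

```python
import cv2
import numpy as np

def estimate_pose(corners_2d, tag_size, camera_matrix, dist_coeffs):
    """Recover pose from the four demodulated tag corners via IPPE.

    corners_2d:    (4, 2) float32 image points matching the model order.
    tag_size:      physical edge length of the tag (assumed unit: meters).
    camera_matrix: (3, 3) intrinsics from a prior calibration.
    dist_coeffs:   lens distortion coefficients.
    """
    half = tag_size / 2.0
    # Planar square model in the corner order SOLVEPNP_IPPE_SQUARE expects.
    corners_3d = np.array([[-half,  half, 0.0],
                           [ half,  half, 0.0],
                           [ half, -half, 0.0],
                           [-half, -half, 0.0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(
        corners_3d, corners_2d, camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        raise RuntimeError("IPPE pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)  # 3x3 rotation, tag-to-camera frame
    return rotation, tvec
```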
Experimental Results
The IPT method demonstrates strong localization accuracy, with positional errors within 10 centimeters and a processing speed of approximately 10 FPS. Compared to existing solutions such as OptiTrack and UWB, IPT is a competitive alternative: it maintains accuracy in Z-axis positioning and provides orientation data, which UWB systems lack. Furthermore, IPT requires no complex calibration process and is robust against the metal-induced interference common in UWB systems.
Implications and Future Work
The IPT method holds significant potential for applications in AR-based robotics, where cost-effectiveness and ease of deployment are crucial. This system offers a viable path forward for indoor operations of autonomous systems like drones, particularly in scenarios requiring high positional accuracy without extensive infrastructure.
Looking toward future work, the paper suggests integrating IPT with inertial measurement units (IMUs) to improve pose accuracy and testing the method across a wider range of visual content. Additionally, investigating interaction effects between quadrotor dynamics and IPT performance could yield insights for optimizing deployment across diverse indoor environments.
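As a rough illustration of the suggested IMU integration, a simple complementary filter could blend high-rate IMU dead reckoning with the roughly 10 FPS absolute IPT fixes. This is a generic filter sketch, not a method described in the paper; the blend factor `alpha` is an assumed tuning parameter.

```python
import numpy as np

class ComplementaryFuser:
    """Blend high-rate IMU dead reckoning with low-rate IPT position fixes.
    Generic complementary-filter sketch; not the paper's method."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha        # weight on the IMU estimate (assumed value)
        self.pos = np.zeros(3)
        self.vel = np.zeros(3)

    def propagate(self, accel_world, dt):
        # Integrate gravity-compensated, world-frame acceleration
        # between IPT fixes; this estimate drifts over time.
        self.vel += accel_world * dt
        self.pos += self.vel * dt

    def correct(self, ipt_position):
        # Pull the drifting estimate toward the absolute, drift-free
        # IPT measurement arriving at roughly 10 FPS.
        self.pos = self.alpha * self.pos + (1 - self.alpha) * ipt_position
```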
Conclusion
The presented method makes a meaningful contribution to indoor quadrotor localization, merging visualization and localization while minimizing hardware requirements. Its adoption could significantly influence the design and deployment of AR-driven robotic systems, paving the way for more efficient and accessible solutions in environments that have traditionally constrained autonomous drones.