- The paper introduces a novel IBVS method that decouples linear and angular motion using a spherical projection model and a super-twisting observer.
- It integrates a model predictive controller to compute precise joint torques, ensuring coordinated control for quadruped locomotion and manipulation.
- Experimental results in simulation and on an Aliengo platform demonstrate reliable tracking of targets with variable velocities.
Dynamic Object Tracking for Quadruped Manipulator with Spherical Image-Based Approach
This paper introduces a novel image-based visual servoing (IBVS) technique that enables quadruped manipulators to autonomously track dynamic objects using a spherical projection model. The authors address the challenge of tracking objects with unknown, time-varying velocities using only an onboard RGB camera, a valuable step toward greater autonomy for quadruped robots in applications such as disaster rescue and anti-terrorist operations.
Methodological Contributions
The proposed IBVS approach consists of three key components: a spherical projection model, a robust super-twisting observer (STO), and a model predictive controller (MPC).
- Spherical Projection Model: Rather than using a planar image plane, this model projects image features onto a unit sphere, which decouples the visual error into linear and angular components. This decoupling is crucial for accurately estimating the motion of dynamic targets and lets the system accommodate its own movement without explicit depth estimation.
- Super-Twisting Observer (STO): The STO robustly estimates the target's unknown, time-varying velocity directly from the visual error signal, so the system needs neither a depth sensor nor a separate velocity measurement.
- Model Predictive Controller (MPC): The velocity estimate from the STO feeds into the MPC, which computes the joint torques the quadruped manipulator needs to follow the target's motion. The MPC accounts for both locomotion and manipulation, yielding coordinated whole-body control for efficient target tracking.
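To make the decoupling concrete: the paper's exact feature definition is not reproduced here, but the standard spherical IBVS model illustrates the idea. A minimal numpy sketch of the unit-sphere projection and its interaction matrix (function names are illustrative, not from the paper):

```python
import numpy as np

def spherical_projection(p):
    """Project a 3-D point in the camera frame onto the unit sphere.

    The spherical feature s = p / ||p|| is defined for any nonzero
    point; the target depth ||p|| only scales p and drops out of s.
    """
    p = np.asarray(p, dtype=float)
    r = np.linalg.norm(p)
    if r == 0.0:
        raise ValueError("cannot project the camera origin")
    return p / r

def spherical_interaction(s, r):
    """Interaction matrix L such that s_dot = L @ [v, w].

    For a camera twist (v, w) and a static point at depth r, the
    classical spherical model gives
        s_dot = -(1/r) (I - s s^T) v - w x s,
    so the linear-velocity block depends on depth while the angular
    block is depth-free -- the linear/angular decoupling the method
    relies on.
    """
    P = np.eye(3) - np.outer(s, s)            # projector onto tangent plane at s
    skew_s = np.array([[0.0, -s[2], s[1]],    # [s]x, since -w x s == [s]x w
                       [s[2], 0.0, -s[0]],
                       [-s[1], s[0], 0.0]])
    return np.hstack([-P / r, skew_s])
```

A quick consistency check: a pure rotation about the optical axis leaves the feature on the axis unchanged, which the angular block reproduces since s x w = 0 there.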
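The paper's whole-body MPC over locomotion and manipulation dynamics is far richer than can be shown here, but the receding-horizon computation itself can be sketched with an unconstrained linear MPC solved in closed form. Everything below (the double-integrator model, weights, and function names) is an illustrative assumption, not the paper's formulation:

```python
import numpy as np

def condense(A, B, Q, R, N):
    """Precompute condensed matrices for unconstrained linear MPC.

    Stacking the predicted states x_1..x_N as X = Phi @ x0 + Gamma @ U
    turns the horizon cost into a quadratic in the input sequence U,
    which is then minimized in closed form.
    """
    n, m = B.shape
    Phi = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(N)])
    Gamma = np.zeros((N * n, N * m))
    for i in range(N):
        for j in range(i + 1):
            Gamma[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    Qbar = np.kron(np.eye(N), Q)              # block-diagonal state cost
    H = Gamma.T @ Qbar @ Gamma + np.kron(np.eye(N), R)
    return Phi, Gamma, Qbar, H

def mpc_step(x0, X_ref, Phi, Gamma, Qbar, H, m):
    """Solve the unconstrained QP and return only the first input
    (the receding-horizon principle)."""
    U = np.linalg.solve(H, Gamma.T @ Qbar @ (X_ref - Phi @ x0))
    return U[:m]
```

In a toy closed loop, a double-integrator "robot" fed a reference built from an estimated target velocity (standing in for the STO output) catches up with a constant-velocity target and then tracks it with vanishing error, since matching the target's velocity requires zero control effort at steady state.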
Experimental Validation and Results
The proposed system is evaluated in both simulation and real-world experiments. In a Gazebo simulation environment, it successfully tracked targets moving at constant and varying velocities along straight and S-shaped trajectories. The STO accurately estimated the target velocities, enabling robust tracking with lower error than conventional methods.
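The velocity estimation at the heart of these results can be illustrated with the standard super-twisting (first-order sliding-mode) differentiator in Levant's form; the paper's observer acts on the visual error rather than a scalar signal, so this is a simplified sketch under that caveat:

```python
import numpy as np

def sto_differentiator(sigma, dt, L=2.0):
    """Super-twisting differentiator: estimate d(sigma)/dt from samples.

    Assumes |sigma''(t)| <= L. Gains follow the common tuning
    lam0 = 1.5*sqrt(L), lam1 = 1.1*L; the discontinuous sign term
    gives finite-time convergence and robustness to bounded
    disturbances, which is why an STO can track a target's unknown,
    varying velocity without a model of the target.
    """
    lam0, lam1 = 1.5 * np.sqrt(L), 1.1 * L
    z0, z1 = sigma[0], 0.0          # z0 tracks sigma, z1 tracks sigma_dot
    est = np.zeros_like(sigma)
    for k, s in enumerate(sigma):
        e = z0 - s
        z0 += dt * (-lam0 * np.sqrt(abs(e)) * np.sign(e) + z1)
        z1 += dt * (-lam1 * np.sign(e))
        est[k] = z1
    return est
```

Fed a sinusoidal "target position", the estimate converges to the true velocity after a brief transient, with only small discretization chattering remaining.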
Hardware experiments used a physical robot comprising an Aliengo quadruped platform and a Kinova manipulator. The system successfully grasped a moving target using feature points detected by a YOLO neural network, achieving accurate and reliable tracking across a range of scenarios and marking a clear step toward functional autonomy in dynamic, unstructured environments.
Theoretical and Practical Implications
Theoretically, this work advances visual servoing by introducing a spherical model that decouples linear and angular motion components, improving robustness against the visual disturbances common in dynamic environments. It also demonstrates that super-twisting observers are practical in real-world applications, capturing rapid velocity changes without depth sensors or computationally expensive depth estimation.
Practically, these advancements improve the utility of quadruped manipulators in mission-critical scenarios, ensuring they can autonomously track and interact with moving objects efficiently, with applications ranging from search and rescue to complex manufacturing tasks.
Future Directions
Future research may extend this approach to targets with more complex motion, including angular velocity. Integrating additional perception modalities could further broaden the range of applications and improve the system's adaptability to varied environmental conditions.
In summary, this paper presents a robust framework for dynamic object tracking using spherical image-based visual servoing, supported by strong experimental results and showing clear potential for future advances in autonomous robotics and manipulator systems.