VILENS: Visual, Inertial, Lidar, and Leg Odometry for All-Terrain Legged Robots (2107.07243v3)

Published 15 Jul 2021 in cs.RO and cs.CV

Abstract: We present Visual Inertial Lidar Legged Navigation System (VILENS), an odometry system for legged robots based on factor graphs. The key novelty is the tight fusion of four different sensor modalities to achieve reliable operation when the individual sensors would otherwise produce degenerate estimation. To minimize leg odometry drift, we extend the robot's state with a linear velocity bias term, which is estimated online. This bias is observable because of the tight fusion of this preintegrated velocity factor with vision, lidar, and inertial measurement unit (IMU) factors. Extensive experimental validation on different ANYmal quadruped robots is presented, for a total duration of 2 h and 1.8 km traveled. The experiments involved dynamic locomotion over loose rocks, slopes, and mud, which caused challenges such as slippage and terrain deformation. Perceptual challenges included dark and dusty underground caverns, and open and feature-deprived areas. We show an average improvement of 62% translational and 51% rotational errors compared to a state-of-the-art loosely coupled approach. To demonstrate its robustness, VILENS was also integrated with a perceptive controller and a local path planner.

Citations (93)

Summary

  • The paper presents VILENS, a tightly coupled multi-sensor fusion framework that reduces translational and rotational errors by 62% and 51%, respectively.
  • It integrates visual, inertial, lidar, and leg odometry data within a factor graph to overcome limitations of single-sensor systems in extreme terrains.
  • Experimental validation on ANYmal quadruped robots demonstrates enhanced robustness, paving the way for reliable autonomous navigation under adverse conditions.

VILENS: Advancing Multi-Sensor Fusion for Legged Robots

The paper presents VILENS, a robust odometry system for all-terrain legged robots that tightly fuses visual, inertial, lidar, and leg odometry within a factor graph framework. Addressing the limitations of single-modality systems, VILENS targets scenarios where individual sensors fail due to environmental constraints, providing reliable state estimation in extreme conditions.
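To make the factor graph structure concrete, the sketch below builds a minimal two-state graph using the GTSAM library (a common choice for factor-graph smoothers; this summary does not state which library VILENS uses, so treat GTSAM here as an assumption). An IMU preintegration factor and a generic relative-pose factor, standing in for visual or lidar odometry, jointly constrain the same pose and velocity variables; VILENS's custom preintegrated leg-velocity factor with bias is not replicated here, and all noise values are illustrative.

```python
# Minimal sketch (not the authors' code): jointly optimizing an IMU
# preintegration factor and a relative-pose factor that stands in for
# visual/lidar odometry, over a shared two-state factor graph.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X, V, B  # pose, velocity, IMU-bias keys

graph = gtsam.NonlinearFactorGraph()

# Anchor the first state with priors (illustrative noise magnitudes).
graph.add(gtsam.PriorFactorPose3(
    X(0), gtsam.Pose3(), gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))))
graph.add(gtsam.PriorFactorVector(
    V(0), np.zeros(3), gtsam.noiseModel.Isotropic.Sigma(3, 1e-3)))
graph.add(gtsam.PriorFactorConstantBias(
    B(0), gtsam.imuBias.ConstantBias(), gtsam.noiseModel.Isotropic.Sigma(6, 1e-3)))

# IMU factor: preintegrate 1 s of 100 Hz measurements between the two states.
params = gtsam.PreintegrationParams.MakeSharedU(9.81)  # gravity along -z
params.setAccelerometerCovariance(np.eye(3) * 1e-3)
params.setGyroscopeCovariance(np.eye(3) * 1e-4)
params.setIntegrationCovariance(np.eye(3) * 1e-8)
pim = gtsam.PreintegratedImuMeasurements(params, gtsam.imuBias.ConstantBias())
for _ in range(100):  # 0.1 m/s^2 forward acceleration plus gravity reaction
    pim.integrateMeasurement(np.array([0.1, 0.0, 9.81]), np.zeros(3), 0.01)
graph.add(gtsam.ImuFactor(X(0), V(0), X(1), V(1), B(0), pim))

# Visual/lidar odometry enters as a relative-pose constraint on the same states
# (consistent with the IMU: x = 0.5 * 0.1 * 1^2 = 0.05 m of forward motion).
graph.add(gtsam.BetweenFactorPose3(
    X(0), X(1), gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(0.05, 0.0, 0.0)),
    gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-2))))

# Initial guess, then one joint optimization over all modalities at once.
initial = gtsam.Values()
initial.insert(X(0), gtsam.Pose3())
initial.insert(X(1), gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(0.05, 0.0, 0.0)))
initial.insert(V(0), np.zeros(3))
initial.insert(V(1), np.array([0.1, 0.0, 0.0]))
initial.insert(B(0), gtsam.imuBias.ConstantBias())
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(X(1)))
```

The key property this sketch illustrates is that every modality constrains the same state variables, so when one factor becomes uninformative (for example, visual odometry in darkness), its noise model down-weights it and the remaining factors carry the estimate.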

The core novelty of VILENS lies in the tight integration of data from all four sensors, which keeps pose estimation reliable even when any one sensor alone would produce degenerate estimates. Notably, the system augments the robot's state with a velocity bias term on the leg odometry inputs, compensating for the drift typical of legged locomotion over complex terrain. This bias is observable precisely because of the multi-modal fusion: the visual, lidar, and IMU factors collectively inform a more accurate estimate than any single modality could achieve.
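In a simplified form (the notation below is illustrative, not the paper's exact formulation), the leg-odometry velocity measurement can be modeled as the true base velocity corrupted by a slowly varying bias, which the factor graph estimates jointly with the rest of the state:

```latex
% Simplified leg-odometry measurement model: the measured velocity
% \tilde{v}_k is the true velocity v_k plus a slowly varying bias b^v_k
% (e.g., from foot slip and terrain deformation) and white noise; the
% bias is assumed here to evolve as a random walk.
\begin{align}
  \tilde{v}_k &= v_k + b^v_k + \eta^v_k, & \eta^v_k &\sim \mathcal{N}(0, \Sigma_v), \\
  b^v_{k+1}   &= b^v_k + \eta^b_k,       & \eta^b_k &\sim \mathcal{N}(0, \Sigma_b).
\end{align}
```

Without the other modalities, $v_k$ and $b^v_k$ could not be separated from leg odometry alone; the vision, lidar, and IMU factors pin down $v_k$, which is what renders the bias observable.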

The experimental validation of VILENS, conducted on ANYmal quadruped robots over a diverse range of terrains (2 h of operation and 1.8 km traveled in total), demonstrates its practical efficacy. The system reduced translational and rotational errors by 62% and 51%, respectively, compared to a state-of-the-art loosely coupled baseline. These metrics underscore the effectiveness of VILENS in limiting estimation drift across challenging environments, including loose rocks, slopes, mud, and visually feature-deprived regions such as dark caverns and open fields.

The paper also showcases the benefits of integrating VILENS with higher-level components such as a perceptive controller and a local path planner. This yields a complete system robust enough for real-world deployment in adverse conditions, such as those encountered in the DARPA Subterranean Challenge.

Beyond the immediate results, the implications of this work extend to the broader landscape of field robotics, where evolving sensor fusion techniques can significantly enhance autonomy and adaptability. The ability to accurately estimate state information despite complex interactions with unpredictable terrains broadens the operational envelope of legged robots, paving the way for enhanced exploratory and autonomous functions.

Looking forward, this work opens avenues for further refining sensor fusion algorithms and potentially adapting them to other mobile robots. Future advancements could focus on optimizing computational efficiency and on machine learning techniques to auto-tune system parameters for diverse environments, streamlining the deployment of VILENS across a range of platforms. In sum, the contribution to robotic perception and navigation is notable, setting a precedent for robust, multi-sensor integrated systems.
