- The paper introduces LION, a novel lidar-inertial fusion navigator designed for robust state estimation in vision-denied environments.
- It integrates an observability metric, the condition number of the Hessian of the point-to-plane ICP cost, used to trigger switching to alternative odometry sources when the geometry becomes degenerate.
- Extensive tests during the DARPA Subterranean Challenge validated LION's 200 Hz output rate, reduced drift, and improved roll and pitch accuracy.
LION: Lidar-Inertial Observability-Aware Navigator for Vision-Denied Environments
This paper presents LION, a Lidar-Inertial Observability-Aware Navigator designed for robust state estimation in scenarios where visual navigation is infeasible. Such environments include GPS-denied, perceptually degraded settings like underground mines and tunnels, where poor lighting and structural complexity can severely impair conventional odometry solutions. Developed by the CoSTAR team, LION addresses a need highlighted during the DARPA Subterranean Challenge, where the team demonstrated its potential by achieving top rankings.
Overview of LION's Technical Approach
LION fuses high-frequency Inertial Measurement Unit (IMU) data with low-rate lidar odometry in a fixed-lag sliding-window smoother. Because the lidar-IMU extrinsics are estimated online, no prior calibration of the relative pose between the lidar and the IMU is required. The system is split into a front end and a back end: the front end provides lidar odometry via Generalized Iterative Closest Point (GICP) and IMU pre-integration, while the back end performs factor-graph optimization for state estimation using GTSAM and its iSAM2 incremental solver.
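To make the back-end structure concrete, the sketch below shows a minimal incremental factor-graph loop in Python using GTSAM's iSAM2, where each lidar-odometry result is added as a between factor on consecutive poses. It is illustrative only: the relative-pose measurements, noise sigmas, and key naming are placeholders, and the IMU pre-integration factors, online extrinsic calibration, and fixed-lag marginalization that LION actually uses are omitted.

```python
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X  # pose keys X(0), X(1), ...

# Placeholder relative-pose measurements standing in for the GICP front end.
lidar_odometry = [gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0))] * 5

prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.01] * 6))
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05] * 6))

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Anchor the first pose with a prior factor.
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_noise))
initial.insert(X(0), gtsam.Pose3())

isam = gtsam.ISAM2()
prev = gtsam.Pose3()

for k, delta in enumerate(lidar_odometry, start=1):
    # Each lidar-odometry result links consecutive poses as a between factor.
    graph.add(gtsam.BetweenFactorPose3(X(k - 1), X(k), delta, odom_noise))
    prev = prev.compose(delta)
    initial.insert(X(k), prev)

    # Incrementally update the smoother, then start a fresh graph for new factors.
    isam.update(graph, initial)
    graph = gtsam.NonlinearFactorGraph()
    initial = gtsam.Values()

estimate = isam.calculateEstimate()
print(estimate.atPose3(X(len(lidar_odometry))))
```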
A significant advancement in LION is its integrated observability metric, which quantifies how well the environment's geometry constrains the pose estimate. The metric is the condition number of the Hessian matrix derived from the point-to-plane ICP cost, providing a quantitative measure of confidence in the lidar odometry. When observability deteriorates, as in long, geometrically self-similar tunnel sections, the metric allows the HeRO algorithm to switch to alternative odometry sources, preserving reliability.
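The metric itself is inexpensive to compute once point-to-plane correspondences are available. The NumPy sketch below, a simplified illustration rather than the paper's implementation, builds the Gauss-Newton Hessian of the point-to-plane cost from matched points and normals and returns its condition number; the threshold and random test data are hypothetical, and the actual fallback decision in LION is delegated to the HeRO supervisor rather than implemented here.

```python
import numpy as np

def icp_observability(points, normals):
    """Condition number of the approximate point-to-plane ICP Hessian.

    points:  (N, 3) source points after the current alignment.
    normals: (N, 3) unit surface normals at the matched target points.
    """
    # Residual r_i = n_i . (R p_i + t - q_i); its Jacobian w.r.t. the 6-DoF
    # perturbation [rotation, translation] is approximately [ p_i x n_i, n_i ].
    J = np.hstack([np.cross(points, normals), normals])  # (N, 6)
    H = J.T @ J                                           # Gauss-Newton Hessian
    eigvals = np.linalg.eigvalsh(H)                       # ascending order
    return eigvals[-1] / max(eigvals[0], 1e-12)

# Hypothetical usage: a large condition number flags degenerate geometry
# (e.g., a long featureless corridor), prompting a fallback to another source.
CONDITION_THRESHOLD = 1e4  # placeholder value, not taken from the paper
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3))
nrm = rng.normal(size=(500, 3))
nrm /= np.linalg.norm(nrm, axis=1, keepdims=True)
kappa = icp_observability(pts, nrm)
print("well observable" if kappa < CONDITION_THRESHOLD else "degenerate")
```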
Experimental Validation
The efficacy of LION is demonstrated through extensive testing, particularly during the 2019 Tunnel Circuit of the DARPA Subterranean Challenge. LION is benchmarked against wheel-inertial odometry, direct scan-to-scan matching, and LOAM, showing a significant reduction in drift and better preservation of roll and pitch accuracy. Notably, LION delivers state estimates at up to 200 Hz, matching the rate required by downstream processing.
Implications and Future Directions
LION is poised to significantly impact robotic navigation in vision-denied environments by providing an adaptable, modular framework for state estimation. Its loosely coupled architecture allows computational resources to be shared across odometry sources and fosters robustness through modular redundancy. The combination of real-time extrinsic calibration and observability-aware metrics offers a template for improving the reliability of navigation systems in structured and unstructured terrain alike.
The advancements presented in LION underscore a trajectory towards autonomous systems capable of operating in adverse conditions with minimal pre-deployment calibration. Future work could explore the integration of additional sensory data sources and further refinement of observability metrics to enhance LION’s adaptability and performance in even more challenging environments. The modular design inherently supports iterative improvements, potentially incorporating machine learning techniques for predictive failure handling and adaptive environmental sensing.
Conclusion
LION represents a robust approach to lidar-inertial odometry for environments where traditional methods fail. Through online extrinsic calibration, observability assessment, and a modular architecture, it delivers the high-rate, continuous state estimation essential for autonomous navigation in subterranean conditions. This work can inform subsequent developments in autonomous navigation technology and sets a benchmark for future advancements in the field.