Graph-based Multi-sensor Fusion for Consistent Localization of Autonomous Construction Robots

Published 2 Mar 2022 in cs.RO, cs.SY, and eess.SY | (2203.01389v1)

Abstract: Enabling autonomous operation of large-scale construction machines, such as excavators, can bring key benefits for human safety and operational opportunities for applications in dangerous and hazardous environments. To facilitate robot autonomy, robust and accurate state-estimation remains a core component to enable these machines for operation in a diverse set of complex environments. In this work, a method for multi-modal sensor fusion for robot state-estimation and localization is presented, enabling operation of construction robots in real-world scenarios. The proposed approach presents a graph-based prediction-update loop that combines the benefits of filtering and smoothing in order to provide consistent state estimates at high update rate, while maintaining accurate global localization for large-scale earth-moving excavators. Furthermore, the proposed approach enables a flexible integration of asynchronous sensor measurements and provides consistent pose estimates even during phases of sensor dropout. For this purpose, a dual-graph design for switching between two distinct optimization problems is proposed, directly addressing temporary failure and the subsequent return of global position estimates. The proposed approach is implemented on-board two Menzi Muck walking excavators and validated during real-world tests conducted in representative operational environments.

Citations (36)

Summary

  • The paper introduces a dual-graph design that integrates asynchronous IMU, lidar, and GNSS data to maintain consistent state estimation during sensor dropouts.
  • The paper leverages a multi-threaded prediction-update loop to achieve near-real-time performance with sub-centimeter global accuracy.
  • The paper validates the approach on heavy construction robots, demonstrating its superiority over traditional filters in dynamic, complex environments.

This paper addresses the critical requirement for robust and accurate state-estimation and localization of autonomous construction robots, with a specific focus on large-scale construction machines, such as excavators. These machines play a pivotal role in various industries, and their autonomy has been recognized for its potential to enhance safety and operational capabilities in hazardous environments. The authors propose a graph-based approach to multi-sensor fusion that integrates IMU, lidar, and GNSS data to achieve consistent and high-frequency state estimates.

The paper presents a dual-graph design within a prediction-update loop framework, aiming to address the inherent challenges of dynamically changing and complex environments. This design allows for the seamless integration of asynchronous measurements and effectively manages cases of sensor dropout, ensuring robust performance in real-world applications. The implementation leverages the GTSAM framework, with sensor inputs from a Leica GNSS system and an Ouster lidar, and is validated on two Menzi Muck walking excavators.
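The factor-graph formulation underlying such a pipeline can be illustrated with a toy example. The sketch below is not the paper's GTSAM implementation; `solve_pose_graph` and its inputs are invented for illustration. It solves a 1-D pose graph combining GNSS-like absolute factors with odometry-like relative factors as a weighted linear least-squares problem, assuming NumPy is available:

```python
import numpy as np

def solve_pose_graph(priors, odoms, n):
    """Solve a 1-D pose graph by weighted linear least squares.

    priors : list of (index, measured_position, sigma)  -- GNSS-like factors
    odoms  : list of (i, j, measured_delta, sigma)      -- odometry-like factors
    n      : number of poses to estimate
    """
    rows, rhs = [], []
    # Absolute factors pin individual poses to measured positions.
    for i, z, s in priors:
        r = np.zeros(n)
        r[i] = 1.0 / s
        rows.append(r)
        rhs.append(z / s)
    # Relative factors constrain the difference between pose pairs.
    for i, j, dz, s in odoms:
        r = np.zeros(n)
        r[i] = -1.0 / s
        r[j] = 1.0 / s
        rows.append(r)
        rhs.append(dz / s)
    A = np.vstack(rows)
    b = np.array(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With priors at both ends and consistent odometry between them, the recovered trajectory reproduces the measurements exactly; in the paper, the analogous role is played by GNSS, lidar, and IMU factors over full 6-DoF poses.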

Key aspects of this approach include the combination of filtering and smoothing methods, which traditionally trade off speed against accuracy. The prediction-update loop capitalizes on a multi-threaded architecture to achieve the low-latency, near-real-time state estimates required for control tasks, while maintaining global consistency through graph optimization. This approach is particularly beneficial for handling delayed and nonlinear sensor data, a significant limitation of conventional filtering techniques.
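The core idea of the prediction-update loop can be sketched in one dimension. The code below is a heavily simplified illustration, not the paper's implementation: `prediction_update_loop` and its interfaces are invented here. A high-rate IMU-like stream is integrated immediately for low latency, while delayed smoother corrections, when they arrive, rewind the state to the optimized anchor and replay the buffered measurements:

```python
from dataclasses import dataclass

@dataclass
class State:
    t: float  # timestamp [s]
    x: float  # position [m]
    v: float  # velocity [m/s]

def propagate(s, accel, dt):
    """One high-rate prediction step: integrate a single IMU-like sample."""
    return State(s.t + dt, s.x + s.v * dt + 0.5 * accel * dt * dt, s.v + accel * dt)

def prediction_update_loop(imu, corrections, delay, dt):
    """Fuse a high-rate IMU stream with delayed smoother corrections.

    corrections[k] holds the optimized (x, v) for the state after step k,
    but only becomes available `delay` steps later (optimizer latency).
    On arrival, the loop rewinds to the optimized anchor and replays the
    buffered IMU samples, so the output stays low-latency while absorbing
    the globally consistent estimate.
    """
    state = State(0.0, 0.0, 0.0)
    history = {}   # step index -> IMU sample, kept for re-propagation
    outputs = []
    for i, a in enumerate(imu):
        state = propagate(state, a, dt)
        history[i] = a
        k = i - delay
        if k in corrections:
            # Rewind to the optimized anchor, then replay newer samples.
            x_opt, v_opt = corrections[k]
            state = State((k + 1) * dt, x_opt, v_opt)
            for j in range(k + 1, i + 1):
                state = propagate(state, history[j], dt)
        outputs.append((state.x, state.v))
    return outputs
```

This replay mechanism is what lets a graph-based back end hand delayed, optimized estimates to a fast front end without discarding the measurements received in the meantime.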

The dual-graph system is particularly notable for its strategy to handle GNSS dropouts, a common issue due to environmental occlusions such as dense foliage or urban canyons. By retaining the consistency of pose estimates using lidar data during GNSS outages, the system exhibits resilience, allowing for smooth transitions when GNSS data is reacquired. This capability is validated in practical settings, demonstrating the system's ability to build accurate maps and effectively manage global localization corrections upon GNSS signal restoration.
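The switching behavior during dropout and reacquisition can be loosely illustrated in one dimension. The sketch below is not the paper's dual-graph design, which switches between two full factor-graph optimization problems; `dual_mode_fusion` and its blending scheme are invented for this example. During a dropout the estimate follows odometry alone; when the absolute signal returns, the accumulated drift is absorbed gradually instead of as a discontinuous jump:

```python
def dual_mode_fusion(odometry, gnss, blend_steps=5):
    """1-D sketch of switching between GNSS-anchored and odometry-only
    estimation, with a smooth correction when GNSS returns.

    odometry[i] : incremental displacement measured at step i
    gnss[i]     : absolute position, or None during a dropout
    """
    pose = 0.0
    pending = 0.0    # residual correction to spread over blend_steps
    per_step = 0.0
    dropped = False
    out = []
    for d, z in zip(odometry, gnss):
        pose += d                          # odometry prediction
        if z is None:
            dropped = True                 # odometry-only mode
        else:
            if dropped:
                # GNSS reacquired: schedule the accumulated drift to be
                # absorbed gradually instead of jumping.
                pending = z - pose
                per_step = pending / blend_steps
                dropped = False
            else:
                pose += 0.5 * (z - pose)   # simple absolute update
        if pending != 0.0:
            step = per_step if abs(pending) > abs(per_step) else pending
            pose += step
            pending -= step
        out.append(pose)
    return out
```

With biased odometry the estimate drifts during the outage, and the scheduled blending pulls it back onto the global reference over a few steps after reacquisition, mirroring the smooth transitions described above.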

The empirical evaluation demonstrates successful deployment of the proposed method in real-world scenarios, highlighting its efficacy in comparison to existing methods such as the Two-State Implicit Filter (TSIF) and traditional multi-sensor fusion approaches. The numerical results underline the method's ability to maintain sub-centimeter-level global accuracy and consistency, attesting to its robustness in practical autonomous navigation tasks.

In conclusion, this paper contributes significantly to the field of robotics by presenting a multi-modal sensor fusion technique that addresses the challenges of autonomous operation in construction environments. Its findings enhance our understanding of reliable localization systems in outdoor and dynamic settings, encouraging further research into flexible sensor fusion frameworks. Future work could explore the joint optimization of additional parameters, such as chassis orientation, encoder biases, and sensor time-offsets, to further refine the system's precision and applicability. This exploration is essential for advancing autonomous capabilities in construction robotics and other industrial applications.
