
RTAB-Map as an Open-Source Lidar and Visual SLAM Library for Large-Scale and Long-Term Online Operation (2403.06341v1)

Published 10 Mar 2024 in cs.RO

Abstract: Distributed as an open source library since 2013, RTAB-Map started as an appearance-based loop closure detection approach with memory management to deal with large-scale and long-term online operation. It then grew to implement Simultaneous Localization and Mapping (SLAM) on various robots and mobile platforms. As each application brings its own set of constraints on sensors, processing capabilities and locomotion, it raises the question of which SLAM approach is the most appropriate to use in terms of cost, accuracy, computation power and ease of integration. Since most SLAM approaches are either visual or lidar-based, comparison is difficult. Therefore, we decided to extend RTAB-Map to support both visual and lidar SLAM, providing in one package a tool allowing users to implement and compare a variety of 3D and 2D solutions for a wide range of applications with different robots and sensors. This paper presents this extended version of RTAB-Map and its use in comparing, both quantitatively and qualitatively, a large selection of popular real-world datasets (e.g., KITTI, EuRoC, TUM RGB-D, MIT Stata Center on PR2 robot), outlining strengths and limitations of visual and lidar SLAM configurations from a practical perspective for autonomous navigation applications.

Authors (2)
  1. Mathieu Labbé (12 papers)
  2. François Michaud (23 papers)
Citations (673)

Summary

RTAB-Map: An Open-Source Solution for Visual and Lidar SLAM

The paper "RTAB-Map as an Open-Source Lidar and Visual SLAM Library for Large-Scale and Long-Term Online Operation" by Mathieu Labbé and François Michaud presents a comprehensive overview of an extended version of the RTAB-Map library. This library offers a versatile platform for Simultaneous Localization and Mapping (SLAM) applicable to various robot platforms using either visual or lidar sensors. By benchmarking SLAM techniques under different operational constraints, the paper clarifies which configurations are appropriate for which real-world scenarios.

Contributions and Methodology

RTAB-Map (Real-Time Appearance-Based Mapping) was initially developed for appearance-based loop closure detection with efficient memory management to support large-scale and long-term operations. This work extends the capabilities of RTAB-Map, enabling it to process data from both visual and lidar sensors, thus facilitating comparisons across 2D and 3D SLAM solutions within a unified framework. The paper organizes an evaluation using several popular datasets, including KITTI, EuRoC, and TUM RGB-D, outlining the practical constraints and performance benchmarks for autonomous navigation applications.
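The core idea behind appearance-based loop closure detection is to score the current frame's visual appearance against frames held in memory and hypothesize a loop when the match is strong enough. The sketch below illustrates this with a simple cosine similarity over bag-of-visual-words histograms; it is a minimal illustration only, not RTAB-Map's actual implementation (which uses an incremental visual vocabulary, a Bayesian filter over match likelihoods, and memory management to bound the search set). All function names here are hypothetical.

```python
import numpy as np

def bow_similarity(hist_a, hist_b):
    """Cosine similarity between two bag-of-visual-words histograms."""
    a = np.asarray(hist_a, dtype=float)
    b = np.asarray(hist_b, dtype=float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def detect_loop_closure(current_hist, memory, threshold=0.8):
    """Compare the current frame's histogram against all frames in memory.

    Returns (best_index, best_score) when the best score clears the
    acceptance threshold, else (None, best_score). A real system would
    limit `memory` to a working set (RTAB-Map's memory management) and
    verify the candidate geometrically before accepting it.
    """
    best_idx, best_score = None, 0.0
    for idx, past_hist in enumerate(memory):
        score = bow_similarity(current_hist, past_hist)
        if score > best_score:
            best_idx, best_score = idx, score
    return (best_idx, best_score) if best_score >= threshold else (None, best_score)
```

Scoring against every stored frame is linear in map size, which is exactly why RTAB-Map's memory management (keeping only a working set of locations in RAM) matters for long-term online operation.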

Evaluation and Results

The paper presents a rigorous evaluation of RTAB-Map configurations using multiple sensor types:

  1. Visual vs. Lidar SLAM: Experiments with datasets such as KITTI and the MIT Stata Center reveal that while visual SLAM offers good accuracy, lidar SLAM tends to outperform it in environments lacking distinctive visual features or under challenging lighting conditions.
  2. Performance Metrics: Metrics like the Absolute Trajectory Error (ATE) and computation time highlight the robust performance of RTAB-Map, often comparable to state-of-the-art SLAM approaches like ORB-SLAM2 and LOAM.
  3. Contribution to SLAM Community: The release of RTAB-Map as an open-source tool broadens access, enabling researchers and developers to prototype and refine SLAM systems using diverse sensor data.
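The Absolute Trajectory Error cited above is typically computed by rigidly aligning the estimated trajectory to ground truth and taking the RMSE of the remaining translational errors. The sketch below shows this standard computation (Umeyama/Kabsch alignment without scale); it is a generic illustration of the metric, not the specific evaluation code used in the paper.

```python
import numpy as np

def absolute_trajectory_error(gt, est):
    """ATE RMSE: align `est` to `gt` with a least-squares rigid transform
    (Kabsch/Umeyama, no scale), then return the RMSE of point distances.

    gt, est: (N, 3) arrays of time-matched ground-truth and estimated positions.
    """
    gt = np.asarray(gt, dtype=float)
    est = np.asarray(est, dtype=float)
    # Center both trajectories on their centroids.
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    gt_c, est_c = gt - mu_gt, est - mu_est
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(est_c.T @ gt_c)
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:  # guard against a reflection
        S[2, 2] = -1.0
    R = Vt.T @ S @ U.T
    # Apply the alignment and compute per-pose translational errors.
    aligned = (R @ est_c.T).T + mu_gt
    errors = np.linalg.norm(aligned - gt, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))
```

Because the alignment removes any global rigid offset between the two trajectories, ATE measures only the drift and loop-closure quality of the SLAM estimate, which is why it is the standard headline metric on KITTI, EuRoC, and TUM RGB-D.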

Implications and Future Work

The extended version of RTAB-Map serves as an effective tool for researchers seeking to compare SLAM methodologies across different sensing modalities. Its flexibility and comprehensive ROS integration make it well-suited for prototyping and deployment on various autonomous robot platforms. Future directions include further integration of tight coupling strategies between visual and lidar odometry to overcome current limitations in environments with sparse or repetitive features.

Conclusion

RTAB-Map stands out as a versatile and accessible SLAM library capable of handling large-scale and long-term operations through both visual and lidar data. Its robust performance and modular design provide a solid foundation for continued research and development within the SLAM community, contributing significantly to advancements in autonomous navigation technologies.