RTAB-Map: An Open-Source Solution for Visual and Lidar SLAM
The paper "RTAB-Map as an Open-Source Lidar and Visual SLAM Library for Large-Scale and Long-Term Online Operation" by Mathieu Labbé and François Michaud presents a comprehensive overview of an extended version of the RTAB-Map library. This library offers a versatile platform for Simultaneous Localization and Mapping (SLAM) applicable to various robot platforms utilizing either visual or lidar sensors. The publication provides vital insights into the comparison of SLAM techniques under different operational constraints, enhancing the understanding of their applicability in real-world scenarios.
Contributions and Methodology
RTAB-Map (Real-Time Appearance-Based Mapping) was initially developed for appearance-based loop closure detection, with a memory management scheme that keeps processing time bounded during large-scale and long-term operation. This work extends RTAB-Map to process data from both visual and lidar sensors, making it possible to compare 2D and 3D SLAM solutions within a single framework. The evaluation uses several popular datasets, including KITTI, EuRoC, and TUM RGB-D, and outlines the practical constraints and performance benchmarks relevant to autonomous navigation applications.
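To make the appearance-based loop closure idea concrete, the following is a minimal, simplified sketch of comparing a new location against a bounded working memory of bag-of-visual-words histograms, with older locations transferred to long-term memory. The names SimpleLoopCloser, wm_size, and loop_threshold are illustrative assumptions, not RTAB-Map's API, and the cosine similarity and FIFO transfer policy are deliberate simplifications of the library's Bayesian filtering and weighted memory management.

```python
# Minimal, illustrative sketch of appearance-based loop closure detection
# with a bounded working memory, in the spirit of RTAB-Map's memory
# management. NOT the library's actual implementation: the similarity
# measure, threshold, and FIFO transfer policy are simplified assumptions.
from collections import deque

import numpy as np


def bow_similarity(hist_a, hist_b):
    """Cosine similarity between two bag-of-visual-words histograms."""
    denom = np.linalg.norm(hist_a) * np.linalg.norm(hist_b)
    return float(hist_a @ hist_b / denom) if denom > 0 else 0.0


class SimpleLoopCloser:
    def __init__(self, wm_size=100, loop_threshold=0.8):
        self.wm_size = wm_size              # max locations kept in working memory
        self.loop_threshold = loop_threshold
        self.working_memory = deque()       # (location_id, histogram) pairs
        self.long_term_memory = []          # locations transferred out of WM

    def process(self, location_id, histogram):
        """Compare a new location against working memory, then insert it.

        Returns the id of the best loop closure candidate, or None.
        """
        best_id, best_score = None, 0.0
        for old_id, old_hist in self.working_memory:
            score = bow_similarity(histogram, old_hist)
            if score > best_score:
                best_id, best_score = old_id, score

        self.working_memory.append((location_id, histogram))
        # Keep working memory bounded so per-frame cost stays roughly constant;
        # the oldest location is moved to long-term memory (RTAB-Map uses a
        # weighting/recency policy rather than plain FIFO).
        if len(self.working_memory) > self.wm_size:
            self.long_term_memory.append(self.working_memory.popleft())

        return best_id if best_score >= self.loop_threshold else None
```

The key point the sketch illustrates is that only the working memory is searched at each frame, which is what allows online operation to scale with map size.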
Evaluation and Results
The paper presents a rigorous evaluation of RTAB-Map configurations using multiple sensor types:
- Visual vs. Lidar SLAM: Experiments on datasets such as KITTI and the MIT Stata Center show that while visual SLAM achieves good accuracy, lidar SLAM tends to perform better in environments that lack distinctive visual features or have challenging lighting conditions.
- Performance Metrics: Metrics such as the Absolute Trajectory Error (ATE) and computation time show that RTAB-Map performs robustly, often comparably to state-of-the-art SLAM approaches such as ORB-SLAM2 and LOAM (a minimal sketch of the ATE computation appears after this list).
- Contribution to SLAM Community: The release of RTAB-Map as an open-source tool broadens access, enabling researchers and developers to prototype and refine SLAM systems using diverse sensor data.
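As a reference for the metric mentioned above, here is a minimal sketch of the ATE RMSE computation for two position trajectories that are already associated frame by frame, following the common practice (e.g., in the TUM RGB-D benchmark tools) of applying a rigid alignment before measuring errors. The function name ate_rmse and the NumPy-based Horn/Kabsch alignment are illustrative assumptions, not the evaluation code used in the paper.

```python
# Minimal sketch of the Absolute Trajectory Error (ATE) RMSE, assuming the
# estimated and ground-truth trajectories are already associated frame by
# frame (N x 3 arrays of positions). A rigid alignment (no scale) is applied
# before computing the per-frame position errors.
import numpy as np


def ate_rmse(estimated, ground_truth):
    est = np.asarray(estimated, dtype=float)
    gt = np.asarray(ground_truth, dtype=float)

    # Center both trajectories.
    est_c = est - est.mean(axis=0)
    gt_c = gt - gt.mean(axis=0)

    # Best-fit rotation via SVD (Kabsch/Horn method, no scale).
    u, _, vt = np.linalg.svd(est_c.T @ gt_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))
    rotation = vt.T @ np.diag([1.0, 1.0, d]) @ u.T

    # Translation that maps the rotated estimate onto the ground truth.
    translation = gt.mean(axis=0) - rotation @ est.mean(axis=0)

    aligned = (rotation @ est.T).T + translation
    errors = np.linalg.norm(aligned - gt, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))
```

The sketch assumes timestamp association between the two trajectories has already been done; in practice that step precedes the alignment.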
Implications and Future Work
The extended version of RTAB-Map is an effective tool for researchers who want to compare SLAM methodologies across sensing modalities. Its flexibility and comprehensive ROS integration make it well suited for prototyping and deployment on a variety of autonomous robot platforms. Future directions include tighter coupling of visual and lidar odometry to overcome current limitations in environments with sparse or repetitive features.
Conclusion
RTAB-Map stands out as a versatile and accessible SLAM library capable of handling large-scale and long-term operations through both visual and lidar data. Its robust performance and modular design provide a solid foundation for continued research and development within the SLAM community, contributing significantly to advancements in autonomous navigation technologies.