Visual Localization and Mapping in Dynamic and Changing Environments (2209.10710v1)

Published 21 Sep 2022 in cs.RO

Abstract: The real-world deployment of fully autonomous mobile robots depends on a robust SLAM (Simultaneous Localization and Mapping) system, capable of handling dynamic environments, where objects are moving in front of the robot, and changing environments, where objects are moved or replaced after the robot has already mapped the scene. This paper presents Changing-SLAM, a method for robust Visual SLAM in both dynamic and changing environments. This is achieved by using a Bayesian filter combined with a long-term data association algorithm. It also employs an efficient algorithm for dynamic keypoint filtering based on object detection that correctly identifies features inside the bounding box that are not dynamic, preventing a depletion of features that could cause lost tracks. Furthermore, a new dataset was developed with RGB-D data specially designed for the evaluation of changing environments on an object level, called the PUC-USP dataset. Six sequences were created using a mobile robot, an RGB-D camera and a motion capture system. The sequences were designed to capture different scenarios that could lead to a tracking failure or a map corruption. To the best of our knowledge, Changing-SLAM is the first Visual SLAM system that is robust to both dynamic and changing environments, not assuming a given camera pose or a known map, being also able to operate in real time. The proposed method was evaluated using benchmark datasets and compared with other state-of-the-art methods, proving to be highly accurate.

Authors (7)
Citations (3)

Summary

The paper "Visual Localization and Mapping in Dynamic and Changing Environments" introduces Changing-SLAM, a robust Visual SLAM solution tailored to both dynamic and changing environments. Changing-SLAM combines Bayesian filtering with a long-term data association algorithm, augmented by a keypoint filtering technique that retains static features lying inside the bounding boxes of detected dynamic objects.
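The keypoint filtering idea can be illustrated with a minimal sketch (not the paper's actual implementation): keypoints outside any detected bounding box are kept, while keypoints inside a box are retained only if their depth suggests they belong to the background seen around the object rather than to the object itself. The depth-consistency heuristic and tolerance below are illustrative assumptions.

```python
import numpy as np

def filter_keypoints(keypoints, depths, boxes, depth_tolerance=0.15):
    """Classify keypoints as static (kept) or dynamic (discarded).

    A keypoint inside a detected bounding box is kept only if it lies
    well behind the median depth of that box, i.e., it likely belongs
    to the static background visible around the detected object.

    keypoints: (N, 2) array of pixel coordinates
    depths:    (N,) array of keypoint depths in meters
    boxes:     list of (x_min, y_min, x_max, y_max) detections
    """
    static = []
    for (u, v), d in zip(keypoints, depths):
        inside = None
        for box in boxes:
            x0, y0, x1, y1 = box
            if x0 <= u <= x1 and y0 <= v <= y1:
                inside = box
                break
        if inside is None:
            static.append(((u, v), d))      # outside all boxes: static
            continue
        # Median depth of keypoints in this box approximates the object
        # surface; points significantly deeper count as background.
        in_box = [dd for (uu, vv), dd in zip(keypoints, depths)
                  if inside[0] <= uu <= inside[2]
                  and inside[1] <= vv <= inside[3]]
        if d > np.median(in_box) + depth_tolerance:
            static.append(((u, v), d))
    return static
```

Retaining these background features is what prevents the feature depletion that can cause lost tracks when a large dynamic object dominates the image.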

Key contributions of this work include:

  1. Robust Dynamic Environment Handling: The approach incorporates a robust keypoint classification algorithm to pre-filter dynamic objects, using an Extended Kalman Filter (EKF) to track the movements of these objects. A novel feature repopulation technique is employed to differentiate valid features from dynamic objects, minimizing feature depletion issues prevalent in previous methods.
  2. Adaptation to Changing Environments: The system builds a semantic map by merging information from detected objects and, over time, updates its beliefs about object poses using Bayesian filtering. Notably, the system operates without assumptions about known camera poses or pre-existing maps.
  3. Introduction of the PUC-USP Dataset: A specialized dataset, PUC-USP, has been developed for evaluating SLAM performance in changing environments. It features six sequences recorded using an RGB-D camera and motion capture system, focusing on scenarios that may cause tracking failures or map corruption. Ground truth is provided through a motion capture system, facilitating accurate assessment of SLAM performance in scenarios such as vanishing and relocating objects.
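The belief update in contribution 2 can be sketched as a simple recursive Bayes filter over whether a mapped object is still at its stored pose; this is a minimal illustration, not the paper's exact filter, and the sensor-model probabilities are assumed values.

```python
def update_belief(prior, detected, p_detect=0.9, p_false=0.05):
    """One recursive Bayes update of P(object still at its mapped pose).

    prior:    belief before this observation
    detected: True if the object was re-observed at its mapped pose
    p_detect: P(detection | object present)  -- assumed sensor model
    p_false:  P(detection | object absent)   -- assumed false-positive rate
    """
    if detected:
        num = p_detect * prior
        den = p_detect * prior + p_false * (1.0 - prior)
    else:
        num = (1.0 - p_detect) * prior
        den = (1.0 - p_detect) * prior + (1.0 - p_false) * (1.0 - prior)
    return num / den
```

Under this scheme, a few consecutive missed observations drive the belief low enough that the object's old pose can be retired from the map, which is the long-term data association behavior a changing environment requires.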

In terms of evaluation, Changing-SLAM demonstrates superior performance against state-of-the-art methods on both the TUM dynamic-environment datasets and the newly introduced PUC-USP dataset. It shows significant improvement in camera localization accuracy under conditions involving both dynamic and changing environments.

Methodology Highlights:

  • Changing-SLAM employs ORB-SLAM3 as its foundation, with modifications across several threads, including tracking, local mapping, object detection, and loop closing.
  • Incorporation of the Atlas framework allows management of multiple disconnected maps, which are merged upon loop detection, enhancing reliability in environments prone to tracking loss.
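The Atlas behavior described above can be sketched as follows; the class and method names are illustrative, not ORB-SLAM3's actual API. On tracking loss a fresh map is started, and when a loop is later detected against an inactive map, the two maps are merged.

```python
class Atlas:
    """Minimal sketch of Atlas-style multi-map management."""

    def __init__(self):
        self.maps = [[]]      # each map is a list of keyframes
        self.active = 0       # index of the currently tracked map

    def add_keyframe(self, kf):
        self.maps[self.active].append(kf)

    def on_tracking_lost(self):
        # Keep the old map as inactive and start tracking a new one.
        self.maps.append([])
        self.active = len(self.maps) - 1

    def on_loop_detected(self, other_idx):
        # Merge the inactive map into the active one and drop it,
        # so localization continues in a single consistent map.
        merged = self.maps[other_idx] + self.maps[self.active]
        self.maps[self.active] = merged
        del self.maps[other_idx]
        self.active = self.maps.index(merged)
```

This is why a temporary tracking failure does not corrupt the existing map: mapping simply resumes in a new map until a loop closure stitches the pieces back together.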

Performance and Limitations:

  • Changing-SLAM achieves real-time processing speeds (up to 23.8 FPS), rendering it suitable for practical applications.
  • It is limited by its reliance on predefined object categories, and it may not handle deformable objects or object categories unseen during training efficiently.

Overall, Changing-SLAM presents a comprehensive approach to the challenges of Visual SLAM in complex environments with both moving objects and long-term scene changes, offering robustness and real-time performance that extend the capabilities of existing systems. The introduction of the PUC-USP dataset provides a valuable resource for continued research and improvement in the field.