VIPS-Odom: Visual-Inertial Odometry Tightly-coupled with Parking Slots for Autonomous Parking (2407.05017v1)

Published 6 Jul 2024 in cs.RO

Abstract: Precise localization is of great importance for the autonomous parking task, since it provides service for the downstream planning and control modules and thus significantly affects system performance. For parking scenarios, dynamic lighting, sparse textures, and the instability of global positioning system (GPS) signals pose challenges for most traditional localization methods. To address these difficulties, we propose VIPS-Odom, a novel semantic visual-inertial odometry framework for underground autonomous parking, which adopts tightly-coupled optimization to fuse measurements from multi-modal sensors and solve the odometry problem. Our VIPS-Odom integrates parking slots detected from the synthesized bird's-eye-view (BEV) image with traditional feature points in the frontend, and conducts tightly-coupled optimization in the backend with joint constraints introduced by measurements from the inertial measurement unit, wheel speed sensor, and parking slots. We develop a multi-object tracking framework to robustly track parking slots' states. To prove the superiority of our method, we equip an electric vehicle with the related sensors and build an experimental platform based on the ROS2 system. Extensive experiments demonstrate the efficacy and advantages of our method compared with other baselines in parking scenarios.

Summary

  • The paper introduces a semantic visual-inertial odometry framework that fuses visual, inertial, and wheel sensor data with parking slot information for enhanced localization.
  • It employs tightly-coupled optimization and joint constraints, significantly improving precision and robustness over baseline methods in real-world tests.
  • The integration of synthesized BEV parking slot detections and multi-object tracking ensures reliable performance even under adverse lighting and GPS-denied conditions.

The paper "VIPS-Odom: Visual-Inertial Odometry Tightly-coupled with Parking Slots for Autonomous Parking" addresses the critical need for precise localization in autonomous parking tasks. This is essential as accurate localization provides essential data for downstream planning and control modules, heavily influencing overall system performance.

In the context of parking, traditional localization methods encounter significant challenges due to factors like dynamic lighting conditions, sparse textures, and unreliable GPS signals, particularly in underground or covered parking facilities. The authors propose VIPS-Odom, a novel semantic visual-inertial odometry framework specifically designed to tackle these challenges in autonomous parking scenarios.

Key Components of VIPS-Odom

  1. Tightly-Coupled Optimization:
    • VIPS-Odom employs a tightly-coupled optimization approach that fuses measurements from multiple sensors—visual, inertial, and wheel speed—to solve the odometry problem.
  2. Integration of Parking Slots:
    • The system integrates parking slots detected from synthesized bird's-eye-view (BEV) images with traditional feature points in the frontend, which enhances the reliability and accuracy of the visual-inertial odometry.
  3. Joint Constraints in the Backend:
    • The backend optimization applies joint constraints derived from IMU (inertial measurement unit) measurements, wheel speed sensor data, and detected parking slots; this multi-sensor fusion strengthens the robustness of localization (a toy illustration of such a joint cost follows this list).
  4. Multi-Object Tracking Framework:
    • A key component of the framework is robust multi-object tracking of parking slots' states, which lets the system consistently track and localize these features even in complex environments (see the tracking sketch after this list).
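
The paper does not include code, but the flavor of the tightly-coupled backend can be illustrated with a toy example. The sketch below jointly refines a single 2D pose against three kinds of residuals, IMU-preintegrated motion, wheel-speed distance, and parking-slot corners observed in the BEV, using plain nonlinear least squares. All measurement values, noise weights, and the 2D simplification are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch (not the authors' implementation) of a tightly-coupled backend:
# one 2D pose (x, y, yaw) is refined by jointly minimizing residuals from
# IMU-preintegrated motion, wheel-speed odometry, and BEV parking-slot corners.
import numpy as np
from scipy.optimize import least_squares

# Illustrative measurements (assumed, not from the paper's dataset).
imu_delta = np.array([1.0, 0.1, 0.05])        # preintegrated dx, dy, dyaw
wheel_distance = 1.01                         # distance integrated from wheel speed
slot_corners_map = np.array([[3.0, 1.0],      # slot corners in the map frame
                             [3.0, 3.5]])
slot_corners_obs = np.array([[1.98, 0.85],    # same corners detected in the BEV (vehicle frame)
                             [1.92, 3.33]])
prev_pose = np.array([0.0, 0.0, 0.0])         # previously optimized pose

def residuals(pose):
    x, y, yaw = pose
    res = []
    # IMU constraint: the new pose should agree with the preintegrated motion.
    res.extend((pose - (prev_pose + imu_delta)) / 0.05)
    # Wheel-speed constraint: travelled distance should match the wheel odometry.
    res.append((np.hypot(x - prev_pose[0], y - prev_pose[1]) - wheel_distance) / 0.02)
    # Parking-slot constraint: map-frame corners projected into the vehicle frame
    # should coincide with the corners detected in the BEV image.
    c, s = np.cos(yaw), np.sin(yaw)
    R_T = np.array([[c, s], [-s, c]])          # rotation from map to vehicle frame
    predicted = (R_T @ (slot_corners_map - np.array([x, y])).T).T
    res.extend(((predicted - slot_corners_obs) / 0.1).ravel())
    return np.asarray(res)

pose_estimate = least_squares(residuals, x0=prev_pose + imu_delta).x
print("optimized pose (x, y, yaw):", pose_estimate)
```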
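The multi-object tracking of parking slots is likewise only described at a high level. A minimal sketch of one plausible realization is given below: each track holds a smoothed 2D slot center, detections from consecutive BEV frames are associated to tracks with the Hungarian algorithm, matches are gated by distance, and stale tracks are dropped. The class names, the exponential-smoothing update (standing in for a Kalman filter), and the thresholds are assumptions for illustration.

```python
# Minimal sketch (assumed structure, not the paper's tracker) of multi-object
# tracking for parking-slot centers detected in consecutive BEV frames.
import numpy as np
from scipy.optimize import linear_sum_assignment

class SlotTrack:
    def __init__(self, track_id, center):
        self.id = track_id
        self.center = np.asarray(center, dtype=float)
        self.missed = 0

    def update(self, detection, alpha=0.3):
        # Exponential smoothing stands in for a (assumed) Kalman filter update.
        self.center = (1 - alpha) * self.center + alpha * np.asarray(detection)
        self.missed = 0

class SlotTracker:
    def __init__(self, gate=1.0, max_missed=5):
        self.tracks, self.next_id = [], 0
        self.gate, self.max_missed = gate, max_missed

    def step(self, detections):
        detections = [np.asarray(d, dtype=float) for d in detections]
        if self.tracks and detections:
            cost = np.array([[np.linalg.norm(t.center - d) for d in detections]
                             for t in self.tracks])
            rows, cols = linear_sum_assignment(cost)
        else:
            rows, cols, cost = [], [], None
        matched_tracks, matched_dets = set(), set()
        for r, c in zip(rows, cols):
            if cost[r, c] < self.gate:           # distance gate rejects bad matches
                self.tracks[r].update(detections[c])
                matched_tracks.add(r); matched_dets.add(c)
        for i, t in enumerate(self.tracks):      # age unmatched tracks
            if i not in matched_tracks:
                t.missed += 1
        self.tracks = [t for t in self.tracks if t.missed <= self.max_missed]
        for j, d in enumerate(detections):       # spawn tracks for newly seen slots
            if j not in matched_dets:
                self.tracks.append(SlotTrack(self.next_id, d)); self.next_id += 1
        return [(t.id, t.center) for t in self.tracks]

# Example: two slots detected in two consecutive BEV frames.
tracker = SlotTracker()
print(tracker.step([[2.0, 1.0], [2.0, 3.5]]))
print(tracker.step([[2.1, 1.05], [2.05, 3.45]]))
```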

Experimental Validation

To demonstrate the effectiveness of VIPS-Odom, the researchers equipped an electric vehicle with the required sensors and built an experimental platform on ROS2 (Robot Operating System 2). Extensive real-world experiments showed that VIPS-Odom significantly outperformed baseline methods across a range of parking scenarios in both accuracy and robustness.
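
As a rough picture of how such a ROS2-based platform ingests its sensor streams, the sketch below shows a minimal rclpy node subscribing to IMU, wheel-speed, and camera topics. The topic names, message types, and node structure are assumptions for illustration; the authors' actual setup is not described at this level of detail.

```python
# Minimal sketch (assumed topics and message types, not the authors' actual
# ROS2 setup) of a node ingesting the sensor streams VIPS-Odom relies on.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu, Image
from geometry_msgs.msg import TwistStamped

class SensorIngest(Node):
    def __init__(self):
        super().__init__('vips_odom_sensor_ingest')
        # Topic names below are placeholders; the real platform may differ.
        self.create_subscription(Imu, '/imu/data', self.on_imu, 100)
        self.create_subscription(TwistStamped, '/wheel_speed', self.on_wheel, 50)
        self.create_subscription(Image, '/camera/front/image_raw', self.on_image, 10)

    def on_imu(self, msg: Imu):
        # Forward angular velocity / linear acceleration to IMU preintegration.
        pass

    def on_wheel(self, msg: TwistStamped):
        # Forward longitudinal speed to the wheel-odometry constraint.
        pass

    def on_image(self, msg: Image):
        # Surround-view images would be stitched into the BEV for slot detection.
        pass

def main():
    rclpy.init()
    rclpy.spin(SensorIngest())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```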

Contributions

  • Resilience to Adverse Conditions:

    The introduction of semantic information from parking slots and tight sensor fusion helps the system remain robust against dynamic lighting conditions and sparse textures.

  • Precision in Localization:

    By optimizing visual-inertial odometry with additional constraints and semantic data, VIPS-Odom achieves the high-precision localization needed for accurate autonomous parking.

  • Robust Multi-Sensor Fusion:

    The approach highlights the advantages of integrating multiple sensor measurements, which enhances the system's ability to function reliably in the GPS-denied environments typical of underground parking facilities.

Overall, VIPS-Odom presents a significant step forward in the development of autonomous parking systems, combining innovative methods in visual-inertial odometry with semantic information from parking slots to achieve precise and reliable localization.