
Gaussian-LIC: Real-Time Photo-Realistic SLAM with Gaussian Splatting and LiDAR-Inertial-Camera Fusion (2404.06926v2)

Published 10 Apr 2024 in cs.RO

Abstract: In this paper, we present a real-time photo-realistic SLAM method that marries Gaussian Splatting with LiDAR-Inertial-Camera SLAM. Most existing radiance-field-based SLAM systems target bounded indoor environments with RGB-D or RGB sensors, and their performance degrades in unbounded scenes or under adverse conditions such as aggressive motion and changing illumination. In contrast, our approach targets general scenarios and tightly fuses LiDAR, IMU, and camera measurements for robust pose estimation and photo-realistic online mapping. To compensate for regions unobserved by the LiDAR, we initialize the 3D Gaussians from both LiDAR points and visual points triangulated from images. We additionally model the sky and varying camera exposure to achieve high-quality rendering. Notably, we implement our system purely in C++ and CUDA, and we design a series of strategies to accelerate the online optimization of the Gaussian-based scene representation. Extensive experiments demonstrate that our method outperforms its counterparts while maintaining real-time capability. Impressively, for photo-realistic mapping, our method with estimated poses even surpasses all compared approaches that use privileged ground-truth poses. Our code will be released on the project page: https://xingxingzuo.github.io/gaussian_lic.
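The initialization step the abstract describes — seeding 3D Gaussians from the union of LiDAR points and visual points triangulated from images — can be sketched as below. This is a minimal illustration, not the paper's C++/CUDA implementation: the function name, the isotropic-scale heuristic (log of the mean distance to the three nearest neighbors, a common 3DGS initialization choice), and the fixed initial opacity are all assumptions.

```python
import numpy as np

def init_gaussians(lidar_pts, visual_pts, colors):
    """Hypothetical sketch: seed Gaussian parameters from fused points.

    lidar_pts:  (M, 3) points from the LiDAR
    visual_pts: (K, 3) points triangulated from images, covering
                regions the LiDAR did not observe
    colors:     (M+K, 3) per-point RGB in [0, 1]
    """
    means = np.vstack([lidar_pts, visual_pts])  # (N, 3) Gaussian centers

    # Isotropic scale from the mean distance to the 3 nearest neighbors,
    # a standard 3DGS initialization heuristic (brute-force for clarity).
    d2 = ((means[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                 # exclude self-distance
    knn = np.sqrt(np.sort(d2, axis=1)[:, :3]).mean(axis=1)
    scales = np.log(np.clip(knn, 1e-7, None))    # stored in log space

    opacities = np.full(len(means), 0.1)         # low initial opacity
    return {"means": means, "scales": scales,
            "opacities": opacities, "colors": colors}
```

A KD-tree (e.g. `scipy.spatial.cKDTree`) would replace the brute-force distance matrix at realistic point counts; the quadratic version above is only for readability.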

