Gaussian-LIC: Real-Time Photo-Realistic SLAM with Gaussian Splatting and LiDAR-Inertial-Camera Fusion (2404.06926v2)
Abstract: In this paper, we present a real-time photo-realistic SLAM method that marries Gaussian Splatting with LiDAR-Inertial-Camera SLAM. Most existing radiance-field-based SLAM systems target bounded indoor environments and rely on RGB-D or RGB sensors; they tend to degrade when extended to unbounded scenes or confronted with adverse conditions such as aggressive motion and changing illumination. In contrast, our approach is oriented toward general scenarios and tightly fuses LiDAR, IMU, and camera measurements for robust pose estimation and photo-realistic online mapping. To compensate for regions unobserved by the LiDAR, we initialize the 3D Gaussians from both LiDAR points and visual points triangulated from images. In addition, we model the sky and varying camera exposure for high-quality rendering. Notably, we implement our system purely in C++ and CUDA and meticulously design a series of strategies to accelerate the online optimization of the Gaussian-based scene representation. Extensive experiments demonstrate that our method outperforms its counterparts while maintaining real-time capability. Impressively, for photo-realistic mapping, our method using our own estimated poses even surpasses all compared approaches that map with privileged ground-truth poses. Our code will be released on the project page: https://xingxingzuo.github.io/gaussian_lic.
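The abstract's key mapping idea is to seed the 3D Gaussians from a merged point cloud of LiDAR returns and visual points triangulated from images, so that image-only regions (e.g., beyond LiDAR range) still receive primitives. The sketch below is a minimal, hypothetical illustration of that initialization step, not the paper's actual implementation: it merges the two point sets and sets each Gaussian's isotropic scale from the mean distance to its `k` nearest neighbours, a common heuristic in 3D Gaussian Splatting pipelines. All function and parameter names here are assumptions.

```python
import numpy as np

def init_gaussians(lidar_pts, visual_pts, lidar_rgb, visual_rgb, k=2):
    """Seed Gaussian means, colors, and isotropic scales from a merged
    LiDAR + triangulated-visual point cloud (illustrative sketch only).

    lidar_pts, visual_pts: (Nl, 3) and (Nv, 3) points in a common frame.
    lidar_rgb, visual_rgb: matching (N, 3) colors in [0, 1].
    k: number of nearest neighbours used for the scale heuristic.
    """
    means = np.vstack([lidar_pts, visual_pts])    # (N, 3) Gaussian centers
    colors = np.vstack([lidar_rgb, visual_rgb])   # (N, 3) initial colors

    # Brute-force pairwise distances; a KD-tree would scale better for large N.
    d = np.linalg.norm(means[:, None, :] - means[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                   # exclude self-distance

    # Isotropic scale = mean distance to the k nearest neighbours.
    knn = np.sort(d, axis=1)[:, :k]
    scales = knn.mean(axis=1, keepdims=True).repeat(3, axis=1)  # (N, 3)
    return means, colors, scales
```

In a full system these means, colors, and scales would become the optimizable parameters (along with rotations and opacities) refined online by the photometric loss; the point of the heuristic is simply to avoid starting Gaussians far too large or too small relative to local point density.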