PLE-SLAM: A Visual-Inertial SLAM Based on Point-Line Features and Efficient IMU Initialization (2401.01081v2)
Abstract: Visual-inertial SLAM is crucial in fields such as aerial vehicles, industrial robots, and autonomous driving. Fusing a camera with an inertial measurement unit (IMU) compensates for the shortcomings of a single sensor and significantly improves the accuracy and robustness of localization in challenging environments. This article presents PLE-SLAM, an accurate and real-time visual-inertial SLAM algorithm based on point-line features and efficient IMU initialization. First, we use parallel computing methods to extract features and compute descriptors to ensure real-time performance; adjacent short line segments are merged into long line segments, and isolated short line segments are deleted outright. Second, a rotation-translation-decoupled initialization method is extended to use both points and lines: gyroscope bias is optimized by tightly coupling IMU measurements and image observations, while accelerometer bias and gravity direction are solved with an analytical method for efficiency. To improve the system's intelligence in handling complex environments, a scheme that leverages semantic information and geometric constraints to eliminate dynamic features, together with a CNN- and GNN-based solution for loop detection and loop-closure frame pose estimation, is integrated into the system. All networks are accelerated to ensure real-time performance. Experimental results on public datasets show that PLE-SLAM is among the state-of-the-art visual-inertial SLAM systems.
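The line-feature pre-processing described above (merging adjacent short segments into longer ones and discarding isolated short segments) can be illustrated with a minimal sketch. The greedy single-pass strategy, the function name `merge_short_segments`, and the angle/gap/length thresholds below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def merge_short_segments(segments, angle_thresh_deg=3.0,
                         gap_thresh=5.0, min_len=15.0):
    """Merge adjacent, nearly collinear segments and drop isolated short
    ones. Each segment is ((x1, y1), (x2, y2)). Thresholds are assumed
    values for illustration, not the paper's tuned parameters."""
    def length(s):
        return np.hypot(s[1][0] - s[0][0], s[1][1] - s[0][1])

    def angle(s):
        # Undirected orientation in [0, 180) degrees.
        return np.degrees(np.arctan2(s[1][1] - s[0][1],
                                     s[1][0] - s[0][0])) % 180.0

    def endpoint_gap(a, b):
        # Smallest distance between any endpoint of a and any endpoint of b.
        return min(np.hypot(p[0] - q[0], p[1] - q[1]) for p in a for q in b)

    merged, used = [], [False] * len(segments)
    for i, s in enumerate(segments):
        if used[i]:
            continue
        cur = s
        for j in range(i + 1, len(segments)):
            if used[j]:
                continue
            t = segments[j]
            d_ang = abs(angle(cur) - angle(t))
            d_ang = min(d_ang, 180.0 - d_ang)  # handle orientation wrap-around
            if d_ang < angle_thresh_deg and endpoint_gap(cur, t) < gap_thresh:
                # Replace the pair by its two farthest-apart endpoints.
                pts = [cur[0], cur[1], t[0], t[1]]
                p, q = max(((p, q) for p in pts for q in pts),
                           key=lambda pq: np.hypot(pq[0][0] - pq[1][0],
                                                   pq[0][1] - pq[1][1]))
                cur = (p, q)
                used[j] = True
        if length(cur) >= min_len:  # isolated short segments are dropped
            merged.append(cur)
    return merged
```

In a real system this step would run on the raw output of a line detector such as LSD or EDLines; a production implementation would typically also check perpendicular (point-to-line) distance before merging, which this sketch omits for brevity.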
Authors: Jiaming He, Mingrui Li, Yangyang Wang, Hongyu Wang