Adaptive Denoising-Enhanced LiDAR Odometry for Degeneration Resilience in Diverse Terrains (2309.14641v2)
Abstract: Making Simultaneous Localization and Mapping (SLAM) algorithms adaptable to diverse environments remains a significant challenge. To reduce LiDAR odometry drift in high-noise settings, clustering methods that filter out unstable features have become an effective module in SLAM frameworks. However, reducing the amount of point cloud data risks losing information and can itself cause degeneration. This work therefore proposes a LiDAR odometry method that dynamically assesses the reliability of the point cloud, improving adaptability across diverse settings by selecting important feature points with sensitivity to the level of environmental degeneration. First, a fast adaptive Euclidean clustering algorithm based on the range image is proposed, which, combined with depth clustering, extracts the primary structural points of the environment, defined as ambient skeleton points. Then, the environmental degeneration level is computed from the dense normal features of the skeleton points, and the point cloud cleaning is adjusted dynamically according to that level. The algorithm is validated on the KITTI benchmark and in real-world environments, demonstrating higher accuracy and robustness across different environments.
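The two core ideas of the abstract — clustering the range image with a distance-adaptive threshold, and scoring degeneration from the distribution of surface normals — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the neighborhood rule, the linear threshold model (`base_thresh`, `scale`), and the eigenvalue-ratio degeneration score are all simplifying assumptions chosen for clarity.

```python
from collections import deque
import numpy as np

def adaptive_range_image_clustering(range_img, base_thresh=0.3, scale=0.01):
    """Cluster a LiDAR range image with a distance-adaptive threshold.

    Neighboring pixels are merged when their range difference is below a
    threshold that grows with distance, so sparse far-field returns are
    not over-segmented. Returns a label image (-1 = no return).
    """
    rows, cols = range_img.shape
    labels = np.full((rows, cols), -1, dtype=int)
    next_label = 0
    for r0 in range(rows):
        for c0 in range(cols):
            if range_img[r0, c0] <= 0 or labels[r0, c0] != -1:
                continue
            # Flood fill (BFS) from this seed pixel.
            queue = deque([(r0, c0)])
            labels[r0, c0] = next_label
            while queue:
                r, c = queue.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rn, cn = r + dr, (c + dc) % cols  # wrap in azimuth
                    if not 0 <= rn < rows:
                        continue
                    if range_img[rn, cn] <= 0 or labels[rn, cn] != -1:
                        continue
                    # Adaptive threshold: farther points tolerate larger gaps.
                    thresh = base_thresh + scale * range_img[r, c]
                    if abs(range_img[rn, cn] - range_img[r, c]) < thresh:
                        labels[rn, cn] = next_label
                        queue.append((rn, cn))
            next_label += 1
    return labels

def degeneration_level(normals):
    """Score geometric degeneration from skeleton-point normals.

    In a degenerate scene (e.g. a long corridor) the normals concentrate
    along few directions, so the covariance of the normal vectors becomes
    anisotropic; the smallest-to-largest eigenvalue ratio approaches 0,
    while a geometrically rich scene yields a ratio closer to 1.
    """
    cov = np.cov(np.asarray(normals, dtype=float).T)
    eig = np.linalg.eigvalsh(cov)  # ascending eigenvalues
    return eig[0] / eig[-1]
```

The eigenvalue ratio is one common proxy for degeneration awareness; the paper's dense-normal formulation may weight or threshold the normals differently.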