LIWO: Lidar-Inertial-Wheel Odometry (2302.14298v2)

Published 28 Feb 2023 in cs.RO

Abstract: LiDAR-inertial odometry (LIO), which fuses complementary information of a LiDAR and an Inertial Measurement Unit (IMU), is an attractive solution for state estimation. In LIO, both pose and velocity are regarded as state variables that need to be solved. However, the widely-used Iterative Closest Point (ICP) algorithm can only provide constraint for pose, while the velocity can only be constrained by IMU pre-integration. As a result, the velocity estimates are inclined to be updated in accordance with the pose results. In this paper, we propose LIWO, an accurate and robust LiDAR-inertial-wheel (LIW) odometry, which fuses the measurements from LiDAR, IMU and wheel encoder in a bundle adjustment (BA) based optimization framework. The involvement of a wheel encoder could provide velocity measurement as an important observation, which assists LIO to provide a more accurate state prediction. In addition, constraining the velocity variable by the observation from wheel encoder in optimization can further improve the accuracy of state estimation. Experimental results on two public datasets demonstrate that our system outperforms all state-of-the-art LIO systems in terms of smaller absolute trajectory error (ATE), and embedding a wheel encoder can greatly improve the performance of LIO based on the BA framework.


Summary

  • The paper introduces LIWO, a novel odometry system that integrates LiDAR, IMU, and wheel encoder data via a tightly coupled bundle adjustment framework.
  • It achieves superior absolute trajectory error performance on the NCLT and KAIST datasets compared to conventional LiDAR-IMU odometry systems.
  • The method enhances robustness in state estimation for autonomous vehicles and paves the way for advanced sensor fusion techniques in mobile robotics.

Overview of LIWO: Lidar-Inertial-Wheel Odometry

The paper introduces LIWO, a novel odometry system that integrates measurements from LiDAR, an Inertial Measurement Unit (IMU), and a wheel encoder using a bundle adjustment (BA) based optimization framework. This integration is designed to provide an advanced approach for precise and robust state estimation in robotics, particularly for applications involving unmanned vehicles and autonomous navigation.

Technical Summary

LiDAR-inertial odometry (LIO) has conventionally relied on fusing LiDAR and IMU data. However, the Iterative Closest Point (ICP) algorithm typically used in these systems constrains only the pose, leaving velocity updates dependent solely on IMU pre-integration. This can introduce inaccuracies when the IMU pre-integration is unreliable, for example under strong measurement noise or poorly estimated biases.
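To make the limitation concrete, consider the standard IMU pre-integration formulation (this is textbook notation, not equations reproduced from the paper): between keyframes $k$ and $k{+}1$, the only constraint touching the velocity state is

```latex
\mathbf{v}_{k+1} = \mathbf{v}_{k} + \mathbf{g}\,\Delta t_{k}
  + \mathbf{R}_{k}\,\Delta\tilde{\mathbf{v}}_{k,k+1}(\mathbf{b}_{a},\mathbf{b}_{g})
```

where $\Delta\tilde{\mathbf{v}}_{k,k+1}$ is the pre-integrated velocity increment, a function of the accelerometer and gyroscope biases $\mathbf{b}_{a}, \mathbf{b}_{g}$. The ICP residual involves only the rotation $\mathbf{R}_{k}$ and position $\mathbf{p}_{k}$, so any bias error leaks directly into the velocity estimate with nothing to correct it.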

LIWO addresses these challenges by adding a wheel encoder to the conventional LiDAR-IMU setup. The wheel encoder provides direct velocity measurements, which are incorporated as observations within the optimization framework, thereby supplementing the existing constraints and improving the overall accuracy of the state predictions. The optimization problem is modeled in a tightly coupled BA framework, integrating pose, velocity, and sensor biases as state variables. The framework also uses pre-integrated IMU measurements and wheel encoder readings to refine these variables iteratively.
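The effect of adding a direct velocity observation can be illustrated with a deliberately tiny least-squares toy problem (a 1-D sketch of the idea, not the paper's actual BA formulation): with only an ICP-style pose observation and an IMU-style link coupling pose, velocity, and bias, the velocity/bias split is ambiguous; a wheel-encoder row resolves it.

```python
import numpy as np

# Toy 1-D illustration of tightly coupled residuals over the state
# x = [p1, v0, b] (next pose, current velocity, IMU bias):
#   r_icp   = p1_meas - p1                 (ICP constrains pose only)
#   r_imu   = (p1 - p0)/dt - (v0 + b)      (pre-integration couples v and bias)
#   r_wheel = v_wheel - v0                 (wheel encoder observes v directly)

def solve(use_wheel: bool) -> np.ndarray:
    dt, p0 = 1.0, 0.0
    p1_meas, v_wheel = 1.0, 1.0
    rows = [
        (np.array([1.0, 0.0, 0.0]), p1_meas),         # ICP pose observation
        (np.array([1.0 / dt, -1.0, -1.0]), p0 / dt),  # IMU pre-integration link
    ]
    if use_wheel:
        rows.append((np.array([0.0, 1.0, 0.0]), v_wheel))  # wheel velocity obs.
    A = np.stack([r[0] for r in rows])
    y = np.array([r[1] for r in rows])
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return x  # [p1, v0, b]
```

Without the wheel row the system is underdetermined and the solver splits the motion arbitrarily between velocity and bias; with it, the true state is uniquely recovered, which is precisely the extra observability the paper attributes to the encoder.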

Results and Implications

Experimentation on public datasets, including the comprehensive NCLT and KAIST datasets, demonstrates that LIWO achieves superior performance in absolute trajectory error (ATE) compared to state-of-the-art LIO systems. The results highlight the robustness and accuracy gains achieved by incorporating wheel encoder data into the odometry process.
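For readers unfamiliar with the metric, the RMSE form of ATE can be sketched as follows. This assumes the two trajectories are already time-synchronized and expressed in the same frame; real evaluations first apply an SE(3) (e.g., Umeyama) alignment step.

```python
import numpy as np

def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """Root-mean-square absolute trajectory error.

    est, gt: (N, 3) arrays of estimated and ground-truth positions
    at matching timestamps, assumed pre-aligned in a common frame.
    """
    diffs = est - gt
    return float(np.sqrt(np.mean(np.sum(diffs ** 2, axis=1))))
```

A constant 0.3 m offset along one axis, for instance, yields an ATE of exactly 0.3 m.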

The paper suggests significant implications for the field of robotics and autonomous vehicles. By integrating wheel encoder data, which is typically underutilized, LIWO presents a more accurate solution for environments where traditional LIO might struggle, such as complex or rapidly changing landscapes. The system's compatibility with 6-axis IMUs enhances its practicality across different hardware platforms, without necessitating additional costly equipment like the Attitude and Heading Reference System (AHRS).

Future Prospects

Looking toward future developments, LIWO opens avenues for improving sensor fusion techniques in mobile robotics. The current model can be further refined to address potential discrepancies such as wheel slippage or speed inconsistencies between left and right wheels during turns. These improvements could enhance the applicability of LIWO in diverse real-world scenarios, including urban navigation and off-road exploration.
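The left/right-wheel issue mentioned above comes from standard differential-drive kinematics (a generic sketch, not a component of LIWO itself): the encoder-derived body velocity and yaw rate depend on both wheels, so slip on either side corrupts both quantities.

```python
def body_twist(v_left: float, v_right: float, track_width: float):
    """Differential-drive kinematics: forward speed (m/s) and yaw rate (rad/s)
    from per-wheel linear speeds and the lateral distance between the wheels."""
    v = 0.5 * (v_left + v_right)              # forward body velocity
    omega = (v_right - v_left) / track_width  # yaw rate
    return v, omega
```

During a turn the two wheel speeds differ by design, so treating their mean as the body velocity without a slip model is exactly the approximation the authors flag for future refinement.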

Furthermore, the paper sets a foundation for ongoing research into more seamless integration of diverse sensors for state estimation, encouraging the exploration of novel algorithms and optimization techniques that can handle complex sensor data interactions more efficiently.

In conclusion, LIWO represents an advancement in odometry systems, contributing significantly to the accuracy and reliability of state estimation in autonomous systems by leveraging an effective combination of multiple sensory inputs in a structured optimization context. Its open-sourced nature promises to accelerate development and collaborative enhancements within the robotics community.