GPS-VIO Fusion with Online Rotational Calibration (2309.12005v2)

Published 21 Sep 2023 in cs.RO

Abstract: Accurate global localization is crucial for autonomous navigation and planning. To this end, various GPS-aided Visual-Inertial Odometry (GPS-VIO) fusion algorithms have been proposed in the literature. This paper presents a novel GPS-VIO system that benefits significantly from online calibration of the rotational extrinsic parameter between the GPS reference frame and the VIO reference frame. The underlying reason is that this parameter is observable, and the paper provides a novel proof of this through nonlinear observability analysis. We also evaluate the proposed algorithm extensively on diverse platforms, including a flying UAV and a driving vehicle. The experimental results support the observability analysis and show improved localization accuracy compared to state-of-the-art (SOTA) tightly-coupled algorithms.
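For orientation, here is a minimal, illustrative sketch of the kind of GPS position measurement model used in tightly-coupled GPS-VIO systems; the frame names, the antenna lever-arm term, and all symbols below are assumptions for illustration, not notation taken from this paper.

```latex
% Illustrative GPS measurement model (assumed notation, not from the paper).
% E : global GPS/ENU reference frame
% W : local VIO world frame
% I : IMU body frame
\[
{}^{E}\mathbf{p}_{\mathrm{GPS}}
  = {}^{E}_{W}\mathbf{R}\,
    \bigl({}^{W}\mathbf{p}_{I} + {}^{W}_{I}\mathbf{R}\,{}^{I}\mathbf{p}_{\mathrm{ant}}\bigr)
  + {}^{E}\mathbf{p}_{W}
  + \mathbf{n}_{\mathrm{GPS}}
\]
% {}^{E}_{W}R   : rotational extrinsic between the GPS and VIO reference frames,
%                 the parameter calibrated online in this work
% {}^{W}p_{I}, {}^{W}_{I}R : VIO position and orientation estimates
% {}^{I}p_{ant} : GPS antenna lever arm in the IMU frame (assumed term)
% n_GPS         : GPS measurement noise
```

Because the GPS measurement constrains the VIO estimate only through the rotation between the two reference frames, the observability of that rotation is exactly the property the paper's nonlinear observability analysis addresses.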

