Joint Spatial-Temporal Calibration for Camera and Global Pose Sensor (2403.00976v1)

Published 1 Mar 2024 in cs.RO and cs.CV

Abstract: In robotics, motion capture systems have been widely used to measure the accuracy of localization algorithms. This infrastructure can also serve other computer vision tasks, such as evaluating Visual(-Inertial) SLAM dynamic initialization, multi-object tracking, or automatic annotation. Yet, to work optimally, these functionalities require accurate and reliable spatial-temporal calibration parameters between the camera and the global pose sensor. In this study, we provide two novel solutions for estimating these calibration parameters. First, we design an offline target-based method with high accuracy and consistency, in which the spatial-temporal parameters, camera intrinsics, and trajectory are optimized simultaneously. We then propose an online target-less method that eliminates the need for a calibration target and enables estimation of time-varying spatial-temporal parameters. Additionally, we perform a detailed observability analysis for the target-less method. Our theoretical findings regarding observability are validated by simulation experiments and provide explainable guidelines for calibration. Finally, the accuracy and consistency of the two proposed methods are evaluated on hand-held real-world datasets where traditional hand-eye calibration methods do not work.
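The calibration model the abstract describes couples a temporal parameter (a clock offset between the camera and the global pose sensor) with a spatial one (a rigid extrinsic between the tracked marker frame and the camera frame). A minimal sketch of how these two parameters map a motion-capture trajectory onto camera poses is shown below; the function and variable names (`predict_camera_pose`, `T_mc`, `time_offset`) are hypothetical illustrations, not the paper's implementation.

```python
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def predict_camera_pose(t_cam, time_offset, mocap_traj, T_mc):
    """Predict the camera pose in the world frame at camera time t_cam.

    mocap_traj: callable mapping a mocap timestamp to the 4x4 pose of the
    marker frame in the world frame. The camera clock is shifted from the
    mocap clock by `time_offset` (temporal parameter), and T_mc is the
    rigid marker-to-camera extrinsic (spatial parameter).
    """
    T_wm = mocap_traj(t_cam + time_offset)  # marker pose at the matching mocap time
    return T_wm @ T_mc                      # compose with the spatial extrinsic

# Toy trajectory: the marker rotates about z at 1 rad/s while translating along x.
def mocap_traj(t):
    return se3(rot_z(t), np.array([t, 0.0, 0.0]))

# Hypothetical calibration values, for illustration only.
T_mc = se3(rot_z(0.1), np.array([0.05, 0.0, 0.02]))
T_wc = predict_camera_pose(t_cam=1.0, time_offset=0.02,
                           mocap_traj=mocap_traj, T_mc=T_mc)
```

A calibration routine in this setting would treat `time_offset` and `T_mc` as unknowns and minimize the discrepancy between such predicted camera poses and observations (e.g., reprojection of a known target), which is what motivates estimating both jointly rather than with a pose-only hand-eye formulation.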

