eWand: A calibration framework for wide baseline frame-based and event-based camera systems (2309.12685v2)
Abstract: Accurate calibration is crucial for using multiple cameras to triangulate the position of objects precisely. However, it is also a time-consuming process that must be repeated every time the cameras are moved. The standard approach is to use a printed pattern with known geometry to estimate the intrinsic and extrinsic parameters of the cameras. The same idea can be applied to event-based cameras, though it requires extra work: a printed pattern can be detected by reconstructing frames from events, or a blinking pattern can be displayed on a screen and detected directly from the events. Such calibration methods provide accurate intrinsic calibration for both frame- and event-based cameras. However, 2D patterns have several limitations for the extrinsic calibration of multi-camera setups in which the cameras have highly different viewpoints and a wide baseline. The 2D pattern can only be detected from one direction and must be large enough to compensate for its distance to the camera. This makes extrinsic calibration time-consuming and cumbersome. To overcome these limitations, we propose eWand, a new method that uses blinking LEDs inside opaque spheres instead of a printed or displayed pattern. Our method provides a faster, easier-to-use extrinsic calibration approach that maintains high accuracy for both event- and frame-based cameras.
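The abstract's key observation is that a blinking light source can be detected directly from an event stream, because each blink cycle produces events at a characteristic rate. The sketch below illustrates the idea under simple assumptions; it is not the authors' implementation, and `estimate_blink_frequency` is a hypothetical helper: it recovers an LED's blink frequency from the timestamps of the ON events it triggers at one pixel, assuming one ON event per cycle.

```python
import numpy as np


def estimate_blink_frequency(timestamps_us):
    """Estimate the blink frequency (Hz) of an LED from the timestamps
    (microseconds) of the ON events it generated at a single pixel.

    Assumption (illustrative, not from the paper): each blink cycle
    produces exactly one ON event, so the median inter-event interval
    approximates the blink period and is robust to a few outliers.
    """
    ts = np.sort(np.asarray(timestamps_us, dtype=np.float64))
    if ts.size < 2:
        return None  # not enough events to estimate a period
    period_us = np.median(np.diff(ts))
    return 1e6 / period_us


# Synthetic example: an LED blinking at 500 Hz (period 2000 us),
# observed for 20 ms with small per-event timing jitter.
rng = np.random.default_rng(0)
ideal = np.arange(0.0, 20_000.0, 2_000.0)
events = ideal + rng.normal(0.0, 20.0, size=ideal.size)
freq = estimate_blink_frequency(events)
```

In a full pipeline, pixels whose estimated frequency matches a known LED blink rate would be clustered into marker detections, which then feed a standard multi-view extrinsic calibration.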