
E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event Cameras (2306.09078v2)

Published 15 Jun 2023 in cs.CV

Abstract: Event cameras triggered a paradigm shift in the computer vision community, delineated by their asynchronous nature, low latency, and high dynamic range. Calibration of event cameras is essential to account for the sensor's intrinsic parameters and to enable 3D perception. However, conventional image-based calibration techniques are not applicable due to the asynchronous, binary output of the sensor. The current standard for calibrating event cameras relies on either blinking patterns or event-based image reconstruction algorithms. These approaches are difficult to deploy in factory settings and are affected by noise and artifacts that degrade calibration performance. To overcome these limitations, we present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras that utilizes the asymmetric circle grid for its robustness to out-of-focus scenes. The proposed method is tested in a variety of rigorous experiments for different event camera models, on circle grids with different geometric properties, and under challenging illumination conditions. The results show that our approach outperforms the state of the art in detection success rate, reprojection error, and estimation accuracy of extrinsic parameters.
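The asymmetric circle grid that E-Calib relies on has a simple planar geometry: circles in every other row are offset by half the column pitch, which removes the pattern's rotational ambiguity. As an illustrative sketch (not the toolbox's own code), the 3D object points for such a grid can be generated as follows, following the convention used by common frame-based calibration libraries such as OpenCV's `findCirclesGrid` with the asymmetric-grid flag; the function name and layout convention are assumptions for illustration:

```python
import numpy as np

def asymmetric_circle_grid_points(rows: int, cols: int, spacing: float) -> np.ndarray:
    """Planar object points for an asymmetric circle grid.

    Odd rows are shifted by half the (2 * spacing) column pitch, the
    layout assumed by typical asymmetric-grid detectors. All points
    lie in the Z = 0 plane of the calibration-target frame.
    """
    pts = []
    for r in range(rows):
        for c in range(cols):
            x = (2 * c + r % 2) * spacing  # offset every other row
            y = r * spacing
            pts.append((x, y, 0.0))
    return np.asarray(pts, dtype=np.float64)

# A common target size: 4 rows x 11 columns, 20 mm circle spacing.
grid = asymmetric_circle_grid_points(rows=4, cols=11, spacing=0.02)
print(grid.shape)  # (44, 3)
```

These object points, paired with detected circle centers in the sensor plane, form the 2D-3D correspondences that a Zhang-style calibration consumes to estimate the intrinsic and extrinsic parameters.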
