IMU-Aided Event-based Stereo Visual Odometry (2405.04071v1)
Abstract: Direct methods for event-based visual odometry solve the mapping and camera pose tracking sub-problems by establishing implicit data association in a way that exploits the generative model of events. The main bottlenecks faced by state-of-the-art work in this field are the high computational complexity of mapping and the limited accuracy of tracking. In this paper, we improve our previous direct pipeline, Event-based Stereo Visual Odometry, in terms of accuracy and efficiency. To speed up the mapping operation, we propose an efficient edge-pixel sampling strategy based on the local dynamics of events. We also improve the completeness and local smoothness of the mapping results by fusing the temporal stereo and static stereo results. To circumvent the degeneracy of camera pose tracking in recovering the yaw component of general 6-DoF motion, we introduce gyroscope measurements as a prior via pre-integration. Experiments on publicly available datasets demonstrate these improvements. We release our pipeline as open-source software for future research in this field.
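The gyroscope prior mentioned in the abstract rests on standard IMU rotation pre-integration: bias-corrected angular-velocity samples are composed into a relative rotation between two camera poses. The sketch below illustrates this idea only; it is not the paper's implementation, and the function names (`exp_so3`, `preintegrate_gyro`) and the constant-bias assumption are ours.

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(phi):
    """SO(3) exponential map (Rodrigues' formula) for an axis-angle vector."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        # First-order approximation near the identity
        return np.eye(3) + skew(phi)
    axis = phi / theta
    K = skew(axis)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate_gyro(omegas, dt, bias):
    """Compose raw gyro samples (rad/s) into the relative rotation
    accumulated over the interval, assuming a constant gyro bias."""
    dR = np.eye(3)
    for w in omegas:
        dR = dR @ exp_so3((np.asarray(w) - bias) * dt)
    return dR
```

The resulting relative rotation can then serve as a prior term on the yaw (and other rotational) components when optimizing the camera pose, which is the role the pre-integrated gyroscope measurements play in the pipeline.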