Precise localization of corneal reflections in eye images using deep learning trained on synthetic data (2304.05673v3)

Published 12 Apr 2023 in cs.CV and cs.AI

Abstract: We present a deep learning method for accurately localizing the center of a single corneal reflection (CR) in an eye image. Unlike previous approaches, we use a convolutional neural network (CNN) that was trained solely on simulated data. Using only simulated data has the benefit of completely sidestepping the time-consuming manual annotation that is required for supervised training on real eye images. To systematically evaluate the accuracy of our method, we first tested it on images with simulated CRs placed on different backgrounds and embedded in varying levels of noise. Second, we tested the method on high-quality videos captured from real eyes. Our method outperformed state-of-the-art algorithmic methods on real eye images with a 35% reduction in spatial precision, and performed on par with the state of the art on simulated images in terms of spatial accuracy. We conclude that our method provides precise CR center localization and offers a solution to the data availability problem, one of the important roadblocks in the development of deep learning models for gaze estimation. Due to its superior CR center localization and ease of application, our method has the potential to improve the accuracy and precision of CR-based eye trackers.
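
The abstract gives no implementation details, but its core idea, a CNN regressor trained only on simulated corneal reflections and then applied to real images, can be illustrated with a minimal sketch. Everything below is an assumption made for illustration: the Gaussian-spot simulator, the CRNet architecture, the 64-pixel crop size, and the training hyperparameters are hypothetical and are not taken from the paper.

```python
# Minimal sketch (assumptions throughout): a tiny CNN regresses the sub-pixel
# (x, y) center of a simulated corneal reflection, so no manually annotated
# real images are needed for training. The simulator, architecture, crop size,
# and hyperparameters are hypothetical, not the authors' implementation.
import numpy as np
import torch
import torch.nn as nn

IMG = 64  # assumed size of the image crop around the corneal reflection

def simulate_cr_batch(batch_size, noise_sigma=0.05):
    """Render Gaussian 'glints' at random sub-pixel positions on a noisy background."""
    centers = np.random.uniform(16, IMG - 16, size=(batch_size, 2)).astype(np.float32)
    xx, yy = np.meshgrid(np.arange(IMG), np.arange(IMG))
    imgs = np.zeros((batch_size, 1, IMG, IMG), dtype=np.float32)
    for i, (cx, cy) in enumerate(centers):
        spot = np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2.0 * 2.0 ** 2))
        imgs[i, 0] = spot + np.random.normal(0.0, noise_sigma, (IMG, IMG))
    return torch.from_numpy(imgs), torch.from_numpy(centers)

class CRNet(nn.Module):
    """Small CNN that outputs the CR center in pixel coordinates."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 2)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = CRNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(1000):  # trained purely on simulated data
    imgs, targets = simulate_cr_batch(32)
    optimizer.zero_grad()
    loss = loss_fn(model(imgs), targets)
    loss.backward()
    optimizer.step()

# Evaluate localization error (in pixels) on a fresh simulated batch.
with torch.no_grad():
    imgs, targets = simulate_cr_batch(256)
    err = (model(imgs) - targets).norm(dim=1).mean()
    print(f"mean center error: {err.item():.3f} px")
```

In the paper's setting, such a network would then be applied directly to real eye videos; the abstract's claim is that training on simulation alone transfers well enough to outperform algorithmic baselines in precision on real images.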

