
Using Deep Learning to Increase Eye-Tracking Robustness, Accuracy, and Precision in Virtual Reality (2403.19768v1)

Published 28 Mar 2024 in cs.CV

Abstract: Algorithms for the estimation of gaze direction from mobile and video-based eye trackers typically involve tracking a feature of the eye, such as the center or boundaries of the pupil, that moves through the eye camera image in a way that covaries with the shifting gaze direction. Tracking these features with traditional computer vision techniques can be difficult due to partial occlusion and environmental reflections. Although recent efforts to use ML for pupil tracking have demonstrated superior results when evaluated using standard measures of segmentation performance, little is known about how these networks affect the quality of the final gaze estimate. This work provides an objective assessment of the impact of several contemporary ML-based methods for eye feature tracking when the subsequent gaze estimate is produced using either feature-based or model-based methods. Metrics include the accuracy and precision of the gaze estimate, as well as drop-out rate.
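
The evaluation metrics named in the abstract have standard definitions in the eye-tracking literature: accuracy is the mean angular offset between estimated gaze and the true target direction, precision is the RMS of sample-to-sample angular differences, and drop-out rate is the fraction of frames with no valid estimate. The paper's exact computation is not given here, so the sketch below is an illustrative implementation of those conventional definitions, not the authors' code; all function names are my own.

```python
import numpy as np

def angular_error_deg(gaze_vecs, target_vec):
    """Angle in degrees between each estimated gaze vector and the target direction."""
    gaze = gaze_vecs / np.linalg.norm(gaze_vecs, axis=1, keepdims=True)
    target = target_vec / np.linalg.norm(target_vec)
    cos = np.clip(gaze @ target, -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def accuracy_deg(gaze_vecs, target_vec):
    """Accuracy: mean angular offset from the known target (systematic error)."""
    return float(np.mean(angular_error_deg(gaze_vecs, target_vec)))

def precision_rms_deg(gaze_vecs):
    """Precision: RMS of successive sample-to-sample angular differences (dispersion)."""
    g = gaze_vecs / np.linalg.norm(gaze_vecs, axis=1, keepdims=True)
    cos = np.clip(np.sum(g[1:] * g[:-1], axis=1), -1.0, 1.0)
    diffs = np.degrees(np.arccos(cos))
    return float(np.sqrt(np.mean(diffs ** 2)))

def dropout_rate(valid_mask):
    """Drop-out rate: fraction of frames where the tracker produced no gaze estimate."""
    return 1.0 - float(np.mean(valid_mask))
```

For example, with gaze samples recorded while a participant fixates a known calibration target, `accuracy_deg` captures the constant bias of the estimate while `precision_rms_deg` captures its frame-to-frame jitter; the two can diverge, which is why both are reported.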

Authors (3)
  1. Kevin Barkevich (1 paper)
  2. Reynold Bailey (6 papers)
  3. Gabriel J. Diaz (6 papers)
Citations (4)
