
Fixation-based Self-calibration for Eye Tracking in VR Headsets (2311.00391v2)

Published 1 Nov 2023 in cs.CV and cs.HC

Abstract: This study proposes a novel self-calibration method for eye tracking in a virtual reality (VR) headset. The proposed method is based on the assumptions that the user's viewpoint can move freely and that the points of regard (PoRs) from different viewpoints are distributed within a small area on an object surface during visual fixation. In the method, fixations are first detected from the time-series data of uncalibrated gaze directions using an extension of the I-VDT (velocity and dispersion threshold identification) algorithm to a three-dimensional (3D) scene. Then, the calibration parameters are optimized by minimizing the sum of a dispersion metric of the PoRs over all fixations. The proposed method can potentially identify the optimal calibration parameters representing the user-dependent offset from the optical axis to the visual axis without explicit user calibration, image processing, or marker-substitute objects. For the gaze data of 18 participants walking in two VR environments with many occlusions, the proposed method achieved an accuracy of 2.1$^\circ$, which was significantly lower than the average offset. Our method is the first self-calibration method with an average error lower than 3$^\circ$ in 3D environments. Further, the accuracy of the proposed method can be improved by up to 1.2$^\circ$ by refining the fixation detection or optimization algorithm.
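The core idea, choosing the angular offset that makes each fixation's PoRs collapse onto a small area, can be sketched in code. The following is a minimal toy illustration under strong simplifying assumptions, not the paper's implementation: the scene is a single plane rather than real geometry, the calibration is reduced to two offset angles (yaw and pitch), the optimizer is a brute-force grid search, and names such as `dispersion_cost` and `por_on_plane` are invented for this sketch.

```python
import numpy as np

def rot(alpha, beta):
    """Rotation mapping an optical-axis direction to a candidate
    visual-axis direction: yaw by alpha about y, pitch by beta about x."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    ry = np.array([[ca, 0.0, sa], [0.0, 1.0, 0.0], [-sa, 0.0, ca]])
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cb, -sb], [0.0, sb, cb]])
    return ry @ rx

def por_on_plane(origin, direction, plane_z=5.0):
    """Point of regard: intersect the gaze ray with the plane z = plane_z
    (a stand-in for ray-casting against real scene geometry)."""
    t = (plane_z - origin[2]) / direction[2]
    return origin + t * direction

def dispersion_cost(params, fixations):
    """Sum over fixations of the mean distance of PoRs to their centroid."""
    r = rot(*params)
    cost = 0.0
    for origins, dirs in fixations:
        pors = np.array([por_on_plane(o, r @ d) for o, d in zip(origins, dirs)])
        cost += np.mean(np.linalg.norm(pors - pors.mean(axis=0), axis=1))
    return cost

# Synthesize fixations: for each one, a fixed 3D target is viewed from
# several moving viewpoints, and a constant angular offset (the unknown
# optical-to-visual-axis offset) is baked into the "measured" directions.
rng = np.random.default_rng(0)
true_offset = (np.deg2rad(5.0), np.deg2rad(-2.0))
r_inv = rot(*true_offset).T  # inverse rotation corrupts the true directions
fixations = []
for _ in range(6):
    target = np.array([rng.uniform(-1, 1), rng.uniform(-1, 1), 5.0])
    origins = [np.array([rng.uniform(-0.5, 0.5),
                         rng.uniform(-0.2, 0.2),
                         rng.uniform(0.0, 1.0)]) for _ in range(8)]
    dirs = []
    for o in origins:
        v = target - o
        dirs.append(r_inv @ (v / np.linalg.norm(v)))
    fixations.append((origins, dirs))

# Brute-force search over the two offset angles on a 0.5-degree grid
# (the paper optimizes properly; this keeps the sketch dependency-free).
grid = np.deg2rad(np.linspace(-8.0, 8.0, 33))
best = min(((a, b) for a in grid for b in grid),
           key=lambda p: dispersion_cost(p, fixations))
```

Because the cost is exactly zero when the candidate offset matches the true one (the corrected rays all reconverge on each fixation target), the grid search recovers the synthetic offset; in practice the paper first needs reliable fixation segments from the 3D-extended I-VDT step before this optimization is meaningful.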

