
Introduction to Eye Tracking: A Hands-On Tutorial for Students and Practitioners (2404.15435v1)

Published 23 Apr 2024 in cs.HC

Abstract: Eye-tracking technology is widely used in application areas such as psychology, neuroscience, marketing, and human-computer interaction, as it is a valuable tool for understanding how people process information and interact with their environment. This tutorial provides a comprehensive introduction to eye tracking, from the basics of eye anatomy and physiology to the principles and applications of different eye-tracking systems. The guide is designed to provide a hands-on learning experience for everyone interested in working with eye-tracking technology; to that end, we include practical case studies that teach students and professionals how to effectively set up and operate an eye-tracking system. The tutorial covers a variety of eye-tracking systems, calibration techniques, data collection, and analysis methods, including fixations, saccades, pupil diameter, and visual scan path analysis. In addition, we emphasize the importance of considering ethical aspects when conducting eye-tracking research and experiments, especially informed consent and participant privacy. We aim to give the reader a solid understanding of basic eye-tracking principles and the practical skills needed to conduct their own experiments. Python-based code snippets and illustrative examples are included in the tutorials and can be downloaded at: https://gitlab.lrz.de/hctl/Eye-Tracking-Tutorial.
