
An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices (2307.00122v1)

Published 30 Jun 2023 in cs.HC and cs.CV

Abstract: In recent years we have witnessed an increasing number of interactive systems on handheld mobile devices which utilise gaze as a single or complementary interaction modality. This trend is driven by the enhanced computational power of these devices, higher resolution and capacity of their cameras, and improved gaze estimation accuracy obtained from advanced machine learning techniques, especially in deep learning. As the literature is fast progressing, there is a pressing need to review the state of the art, delineate the boundary, and identify the key research challenges and opportunities in gaze estimation and interaction. This paper aims to serve this purpose by presenting an end-to-end holistic view in this area, from gaze capturing sensors, to gaze estimation workflows, to deep learning techniques, and to gaze interactive applications.
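
To make the deep-learning side of this pipeline concrete, below is a minimal, illustrative sketch (in PyTorch) of the appearance-based gaze estimation workflow the abstract refers to: a small convolutional network regresses a 2D gaze direction (pitch, yaw) from a normalized face crop taken by a device's front-facing camera. The GazeCNN name, layer sizes, and loss are assumptions for illustration, loosely in the spirit of appearance-based estimators such as iTracker, not a method from the paper.

import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    # Toy appearance-based gaze estimator: face crop -> (pitch, yaw) in radians.
    # The architecture is a hypothetical placeholder, not the survey's model.
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2),   # 224x224 -> 112x112
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),  # 112x112 -> 56x56
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),                                 # global average pool
        )
        self.head = nn.Linear(64, 2)  # regress pitch and yaw

    def forward(self, face):
        x = self.features(face).flatten(1)
        return self.head(x)

model = GazeCNN()
faces = torch.randn(8, 3, 224, 224)   # stand-in for normalized face crops
gaze = model(faces)                   # shape (8, 2): pitch, yaw per image
loss = nn.functional.l1_loss(gaze, torch.zeros_like(gaze))  # a typical angular regression loss
print(gaze.shape, loss.item())

In a deployed system the face crop would first be geometrically normalized (e.g., warped to a canonical head pose), and the predicted angles would be mapped to an on-screen point, typically with a short per-user calibration to correct individual offsets.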
