GazeIntent: Adapting dwell-time selection in VR interaction with real-time intent modeling (2404.13829v1)
Abstract: The use of ML models to predict a user's cognitive state from behavioral data has been studied for various applications, including predicting the intent to perform selections in VR. We developed a novel technique that uses gaze-based intent models to adapt dwell-time thresholds to aid gaze-only selection. A dataset of users performing selections in arithmetic tasks was used to develop intent prediction models (F1 = 0.94). We developed GazeIntent to adapt selection dwell times based on intent model outputs and conducted an end-user study with returning and new users performing additional tasks with varied selection frequencies. Personalized models for returning users effectively accounted for prior experience and were preferred by 63% of users. Our work provides the field with methods to adapt dwell-based selection to users, account for experience over time, and consider tasks that vary by selection frequency.
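To make the core idea concrete, below is a minimal sketch of how a real-time intent score could drive an adaptive dwell-time threshold. This is not the authors' implementation; the function names, the linear mapping, and the threshold bounds (0.3 s to 1.0 s) are illustrative assumptions.

```python
def adapt_dwell_threshold(intent_score: float,
                          min_dwell_s: float = 0.3,
                          max_dwell_s: float = 1.0) -> float:
    """Map an intent probability in [0, 1] to a dwell threshold:
    higher predicted intent shortens the required dwell time,
    lower intent lengthens it (bounds are illustrative)."""
    intent_score = max(0.0, min(1.0, intent_score))
    return max_dwell_s - intent_score * (max_dwell_s - min_dwell_s)


class DwellSelector:
    """Accumulates continuous fixation time on the currently gazed target
    and fires a selection once dwell exceeds the adapted threshold."""

    def __init__(self):
        self.current_target = None
        self.dwell_time = 0.0

    def update(self, gazed_target, intent_score: float, dt: float):
        # Reset the dwell timer whenever gaze moves to a different target.
        if gazed_target != self.current_target:
            self.current_target = gazed_target
            self.dwell_time = 0.0
            return None
        self.dwell_time += dt
        if (gazed_target is not None and
                self.dwell_time >= adapt_dwell_threshold(intent_score)):
            self.dwell_time = 0.0
            return gazed_target  # selection event
        return None
```

In use, the selector would be updated once per frame with the currently gazed object, the latest intent model output, and the frame time; the returned target (if any) indicates a completed dwell selection.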