FocusFlow: 3D Gaze-Depth Interaction in Virtual Reality Leveraging Active Visual Depth Manipulation (2401.12872v3)
Abstract: Gaze interaction presents a promising avenue in Virtual Reality (VR) due to its intuitive and efficient user experience. Yet the depth control inherent in our visual system remains underutilized in current methods. In this study, we introduce FocusFlow, a hands-free interaction method that capitalizes on human visual depth perception within 3D VR scenes. We first develop a binocular visual depth detection algorithm to characterize eye input. We then propose a layer-based user interface and introduce the 'Virtual Window' concept, which offers intuitive and robust gaze-depth interaction despite the limited accuracy and precision of visual depth estimation at farther distances. Finally, to help novice users actively manipulate their visual depth, we propose two learning strategies that use different visual cues to help users master visual depth control. Our user studies with 24 participants demonstrate the usability of the proposed Virtual Window concept as a gaze-depth interaction method. In addition, our findings reveal that the user experience can be enhanced through an effective learning process with adaptive visual cues, helping users develop muscle memory for this novel input mechanism. We conclude by discussing strategies to optimize learning and open research directions for gaze-depth interaction.
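The abstract's binocular depth detection rests on vergence: the two eyes' gaze rays converge more sharply on near targets than on far ones. Below is a minimal sketch of one common way to operationalize this, triangulating fixation depth as the near-intersection of the left- and right-eye gaze rays. This illustrates the general vergence-based technique under stated assumptions (gaze supplied as per-eye origin and direction vectors, a hypothetical `gaze_depth` helper, and an arbitrary parallel-ray threshold); it is not the paper's actual algorithm.

```python
import numpy as np

def gaze_depth(o_l, d_l, o_r, d_r):
    """Estimate fixation depth as the midpoint of the shortest segment
    between the left- and right-eye gaze rays.

    o_l, o_r: 3D eye positions (ray origins); d_l, d_r: gaze directions.
    Returns (fixation_point, depth), with depth measured from the
    midpoint between the eyes. Illustrative sketch, not the paper's method.
    """
    o_l, d_l = np.asarray(o_l, float), np.asarray(d_l, float)
    o_r, d_r = np.asarray(o_r, float), np.asarray(d_r, float)
    d_l /= np.linalg.norm(d_l)
    d_r /= np.linalg.norm(d_r)

    # Minimize |(o_l + t*d_l) - (o_r + s*d_r)| over t, s (closest points
    # on two skew lines; standard closed-form solution).
    w = o_l - o_r
    b = d_l @ d_r
    denom = 1.0 - b * b
    if denom < 1e-9:  # near-parallel rays: vergence ~ 0, depth is unreliable
        return None, np.inf
    t = (b * (d_r @ w) - (d_l @ w)) / denom
    s = ((d_r @ w) - b * (d_l @ w)) / denom
    p_l = o_l + t * d_l  # closest point on left gaze ray
    p_r = o_r + s * d_r  # closest point on right gaze ray
    fixation = 0.5 * (p_l + p_r)
    depth = np.linalg.norm(fixation - 0.5 * (o_l + o_r))
    return fixation, depth

# Example: eyes 64 mm apart, both converging on a point 1 m ahead.
o_l, o_r = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])
_, depth = gaze_depth(o_l, target - o_l, o_r, target - o_r)
print(round(depth, 3))  # ~1.0
```

Because the vergence angle shrinks rapidly with distance, small angular noise in the gaze rays produces large depth errors for far targets; this is the accuracy-and-precision constraint the abstract cites as motivation for the coarse, layer-based Virtual Window interface rather than continuous depth pointing.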