Assessing Augmented Reality Selection Techniques for Passengers in Moving Vehicles: A Real-World User Study (2307.06173v1)

Published 12 Jul 2023 in cs.HC

Abstract: Nowadays, cars offer many possibilities to explore the world around you by providing location-based information displayed on a 2D map. However, this information is typically restricted to in-car displays and therefore available mainly to front-seat passengers. To propose a more natural way of interacting with the environment, we implemented an augmented reality head-mounted display that overlays points of interest onto the real world. We compare multiple selection techniques for digital objects located outside a moving car: head gaze with dwell time, head gaze with hardware button, eye gaze with hardware button, and hand pointing with gesture confirmation. Our study was conducted in a moving car under real-world conditions (N=22), with significant results indicating that hand pointing led to slower and less precise content selection, while eye gaze was preferred by participants and performed on par with the other techniques.
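The head-gaze-with-dwell technique named in the abstract can be sketched as a per-frame timer that fires once gaze has rested on the same target long enough. This is a minimal illustrative sketch, not the authors' implementation: the class and method names, the 0.8 s default threshold, and the frame-based update loop are all assumptions.

```python
class DwellSelector:
    """Illustrative dwell-time selection: a target is selected after the
    gaze ray has rested on it continuously for `dwell_time_s` seconds."""

    def __init__(self, dwell_time_s=0.8):
        self.dwell_time_s = dwell_time_s  # assumed threshold, not from the paper
        self.current_target = None        # target currently under the gaze ray
        self.elapsed_s = 0.0              # accumulated dwell time on that target

    def update(self, gazed_target, dt_s):
        """Call once per frame with the gazed target (or None) and the frame
        duration. Returns the target when a selection fires, else None."""
        if gazed_target != self.current_target:
            # Gaze moved to a different target (or off all targets): restart.
            self.current_target = gazed_target
            self.elapsed_s = 0.0
            return None
        if gazed_target is None:
            return None
        self.elapsed_s += dt_s
        if self.elapsed_s >= self.dwell_time_s:
            # Fire once, then require gaze to leave and return before re-firing.
            selected = self.current_target
            self.current_target = None
            self.elapsed_s = 0.0
            return selected
        return None
```

The button- and gesture-confirmed variants replace the timer with an explicit confirmation event; only the pointing modality (head ray, eye ray, or hand ray) changes.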
