
Evaluation of AI-Supported Input Methods in Augmented Reality Environment (2306.17132v1)

Published 29 Jun 2023 in cs.HC

Abstract: Augmented Reality (AR) solutions provide tools that could improve applications in the medical and industrial fields. Augmentation can supply additional information in training, visualization, and work scenarios to increase efficiency, reliability, and safety, while improving communication with other devices and systems on the network. Unfortunately, tasks in these fields often require both hands to execute, limiting the variety of input methods suitable for controlling AR applications. People with certain physical disabilities that prevent them from using their hands are likewise negatively affected when using these devices. The goal of this work is to provide novel hands-free interfacing methods, using AR technology in combination with AI support approaches, to produce an improved human-computer interaction solution.
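
As a purely illustrative sketch (not the authors' implementation), dwell-based gaze selection is one widely used hands-free AR input technique of the kind this work evaluates: an interface element is triggered once the user's gaze rests on it for a fixed time. The GazeSample and DwellSelector names, the 0.8 s dwell threshold, and the target IDs below are all assumptions made for illustration.

    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        target_id: str | None  # UI element currently under the gaze ray, if any
        timestamp: float       # seconds since the app started

    class DwellSelector:
        """Fires a selection when gaze rests on one target for dwell_time seconds."""

        def __init__(self, dwell_time: float = 0.8):
            self.dwell_time = dwell_time
            self._current: str | None = None   # target the gaze is resting on
            self._since: float = 0.0           # when the gaze arrived there

        def update(self, sample: GazeSample) -> str | None:
            if sample.target_id != self._current:
                # Gaze moved to a new target (or off all targets): restart the timer.
                self._current = sample.target_id
                self._since = sample.timestamp
                return None
            if self._current is not None and sample.timestamp - self._since >= self.dwell_time:
                selected = self._current
                self._current = None  # require the gaze to leave before re-selecting
                return selected
            return None

    # Hypothetical usage: four gaze samples resting on the same (made-up) target.
    selector = DwellSelector(dwell_time=0.8)
    for t in [0.0, 0.3, 0.6, 0.9]:
        if (hit := selector.update(GazeSample("menu_button", t))):
            print(f"selected {hit} at t={t}s")  # fires once, at t=0.9

In practice, an AI component would sit in front of a loop like this, for example smoothing noisy gaze estimates or predicting the intended target, which is the kind of support the abstract refers to.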

