Investigating the Usability of Collaborative Robot Control Through Hands-Free Operation Using Eye Gaze and Augmented Reality (2306.13072v1)
Abstract: This paper proposes a novel hands-free operation for controlling a mobile robot using a head-mounted device. Conventionally, robots are operated with computers or joysticks, which limits usability and flexibility because the control equipment must be held in hand. This lack of flexibility may prevent workers from multitasking or carrying objects while operating the robot. To address this limitation, we propose a hands-free method for operating a mobile robot with human gaze in an Augmented Reality (AR) environment. The proposed approach is demonstrated using the HoloLens 2 to control the mobile robot, Robotnik Summit-XL, through eye gaze in AR. Stable speed control and navigation of the mobile robot were achieved through admittance control, in which the velocity command was computed from the gaze position. An experiment was conducted to compare the usability of the joystick and the proposed operation, and the results were validated through surveys (i.e., the System Usability Scale (SUS) and the Single Ease Question (SEQ)). The post-experiment survey results showed that participants wearing the HoloLens accurately operated the mobile robot in a collaborative manner. Both the joystick and the HoloLens were rated as easy to use, with above-average usability. This suggests that the HoloLens can replace the joystick to enable hands-free robot operation and has the potential to increase the efficiency of human-robot collaboration in situations where hands-free control is needed.
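The abstract describes computing the robot's velocity command from the gaze position via admittance control. A minimal one-dimensional sketch of that idea follows, where the gaze offset from a neutral point is treated as a virtual force driving a virtual mass-damper system; the class name, parameter values, and gaze-to-force mapping are illustrative assumptions, not the paper's actual implementation:

```python
class GazeAdmittanceController:
    """Maps a gaze offset to a smooth velocity command through a
    virtual mass-damper admittance model: M * dv/dt + D * v = F."""

    def __init__(self, mass=2.0, damping=4.0, force_gain=10.0, dt=0.02):
        self.mass = mass              # virtual inertia (kg), assumed value
        self.damping = damping        # virtual damping (N*s/m), assumed value
        self.force_gain = force_gain  # maps gaze offset to virtual force (N)
        self.dt = dt                  # control period (s)
        self.velocity = 0.0           # current commanded velocity (m/s)

    def update(self, gaze_offset):
        """gaze_offset: normalized gaze position relative to a neutral
        point (e.g., in [-1, 1]). Returns the next velocity command."""
        force = self.force_gain * gaze_offset
        # Discretized admittance dynamics: dv/dt = (F - D * v) / M
        accel = (force - self.damping * self.velocity) / self.mass
        self.velocity += accel * self.dt
        return self.velocity
```

Under this model the velocity ramps up smoothly toward a steady state of `force_gain * gaze_offset / damping` rather than jumping, and decays back to zero when the gaze returns to the neutral point, which is one plausible route to the "stable speed control" the abstract reports.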