MRNaB: Mixed Reality-based Robot Navigation Interface using Optical-see-through MR-beacon (2403.19310v1)
Abstract: Recent advancements in robotics have led to the development of numerous interfaces intended to make robot navigation more intuitive. However, the reliance on traditional 2D displays limits how much information can be visualized simultaneously. Mixed Reality (MR) technology addresses this issue by increasing the dimensionality of information visualization, allowing users to perceive multiple pieces of information concurrently. This paper proposes a Mixed Reality-based robot navigation interface using an optical-see-through MR-beacon (MRNaB), a novel approach in which an MR-beacon, situated atop the real-world environment, functions as a signal transmitter for robot navigation. The MR-beacon is designed to be persistent, eliminating the need to re-enter navigation commands for the same location. Our system is built around four primary functions: "Add", "Move", "Delete", and "Select". These allow, respectively, for adding an MR-beacon, moving its location, deleting it, and selecting an MR-beacon as a navigation target. The effectiveness of the proposed method was validated through experiments comparing it with a traditional 2D system. The results show that MRNaB improves user performance, both subjectively and objectively, when navigating a robot to a given location. For additional material, please check: https://mertcookimg.github.io/mrnab
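The four beacon operations described in the abstract could be organized roughly as follows. This is a minimal illustrative sketch, not the authors' implementation; the `Beacon` and `BeaconManager` names, fields, and method signatures are all assumptions introduced here for clarity (in the actual system, selecting a beacon would presumably publish its pose as a ROS navigation goal).

```python
from dataclasses import dataclass


@dataclass
class Beacon:
    """A persistent navigation beacon (hypothetical data model)."""
    beacon_id: int
    position: tuple  # (x, y, z) in the world frame


class BeaconManager:
    """Sketch of the four operations: Add, Move, Delete, Select."""

    def __init__(self):
        self._beacons = {}
        self._next_id = 0
        self.selected = None  # id of the beacon currently used as a goal

    def add(self, position):
        # "Add": place a new persistent beacon at the given position.
        beacon = Beacon(self._next_id, position)
        self._beacons[beacon.beacon_id] = beacon
        self._next_id += 1
        return beacon.beacon_id

    def move(self, beacon_id, new_position):
        # "Move": relocate an existing beacon.
        self._beacons[beacon_id].position = new_position

    def delete(self, beacon_id):
        # "Delete": remove a beacon; clear the selection if it was selected.
        self._beacons.pop(beacon_id)
        if self.selected == beacon_id:
            self.selected = None

    def select(self, beacon_id):
        # "Select": mark a beacon as the navigation target and
        # return its position (the point a planner would be sent to).
        self.selected = beacon_id
        return self._beacons[beacon_id].position
```

Because beacons persist in the manager until explicitly deleted, the same location can be re-selected as a goal without re-entering it, mirroring the persistence property the paper emphasizes.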