Augmented Reality User Interface for Command, Control, and Supervision of Large Multi-Agent Teams (2401.05665v1)
Abstract: Multi-agent human-robot teaming offers the potential to gather information about diverse environments more efficiently by exploiting and combining the strengths of humans and robots. In industries such as defense, search and rescue, and first response, heterogeneous human-robot teams show promise for accelerating data collection and improving team safety by removing humans from unknown and potentially hazardous situations. This work builds upon AugRE, an Augmented Reality (AR) based scalable human-robot teaming framework, which enables users to localize and communicate with 50+ autonomous agents. Through our efforts, users are able to command, control, and supervise agents in large teams, both line-of-sight and non-line-of-sight, without modifying the environment beforehand and without requiring users to operate typical hardware (e.g., joysticks, keyboards, laptops, or tablets) in the field. The demonstrated work shows early indications that combining these AR-HMD-based user interaction modalities for command, control, and supervision will help improve human-robot team collaboration, robustness, and trust.
- F. Regal, C. Petlowany, C. Pehlivanturk, C. Van Sice, C. Suarez, B. Anderson, and M. Pryor, “AugRE: Augmented robot environment to facilitate human-robot teaming and communication,” in 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 2022, pp. 800–805, ISSN: 1944-9437.
- E. Ruffaldi, F. Brizzi, F. Tecchia, and S. Bacinelli, “Third point of view augmented reality for robot intentions visualization,” in Int. Conf. on AR, VR and Computer Graphics. Springer, 2016, pp. 471–478.
- H. Hedayati, M. Walker, and D. Szafir, “Improving collocated robot teleoperation with augmented reality,” in Proceedings of the 2018 ACM/IEEE Int. Conf. on Human-Robot Interaction, 2018, pp. 78–86.
- B. Huang, N. G. Timmons, and Q. Li, “Augmented Reality with Multi-view Merging for Robot Teleoperation,” in Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, ser. HRI ’20. New York, NY, USA: Association for Computing Machinery, Apr. 2020, pp. 260–262. [Online]. Available: https://doi.org/10.1145/3371382.3378336
- M. Gu, E. Croft, and A. Cosgun, “AR Point & Click: An Interface for Setting Robot Navigation Goals,” in Social Robotics, ser. Lecture Notes in Computer Science, F. Cavallo, J.-J. Cabibihan, L. Fiorini, A. Sorrentino, H. He, X. Liu, Y. Matsumoto, and S. S. Ge, Eds. Cham: Springer Nature Switzerland, 2022, pp. 38–49.
- C. Reardon, K. Lee, J. G. Rogers, and J. Fink, “Augmented reality for human-robot teaming in field environments,” in Int. Conf. on Human-Computer Interaction. Springer, 2019, pp. 79–92.
- S. A. Arboleda, F. Rücker, T. Dierks, and J. Gerken, “Assisting manipulation and grasping in robot teleoperation with augmented reality visual cues,” in CHI, 2021, pp. 728–1.
- F. Regal, Y. S. Park, J. Nolan, and M. Pryor, “Augmented Reality Remote Operation of Dual Arm Manipulators in Hot Boxes,” Mar. 2023, arXiv:2303.16055 [cs]. [Online]. Available: http://arxiv.org/abs/2303.16055
- R. Argüelles, M. Anderson, C. Fowler, and R. McAllister, “Azure spatial anchors overview,” Feb. 2022, accessed 2023-06-14. [Online]. Available: https://learn.microsoft.com/en-us/azure/spatial-anchors/overview
- K. S. Sikand, L. Zartman, S. Rabiee, and J. Biswas, “Robofleet: Open Source Communication and Management for Fleets of Autonomous Robots,” in 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sept. 2021, pp. 406–412, ISSN: 2153-0866.
- M. McGill, J. Gugenheimer, and E. Freeman, “A Quest for Co-Located Mixed Reality: Aligning and Assessing SLAM Tracking for Same-Space Multi-User Experiences,” in 26th ACM Symposium on Virtual Reality Software and Technology. Virtual Event Canada: ACM, Nov. 2020, pp. 1–10. [Online]. Available: https://dl.acm.org/doi/10.1145/3385956.3418968
- Microsoft, “Microsoft/MixedReality-UXTools-Unreal: UX tools and components for developing Mixed Reality applications in UE4.” https://github.com/microsoft/MixedReality-UXTools-Unreal, 2022.