Shared Affordance-Awareness via Augmented Reality for Proactive Assistance in Human-Robot Collaboration (2312.13410v1)
Abstract: Enabling humans and robots to collaborate effectively requires purposeful communication and an understanding of each other's affordances. Prior work in human-robot collaboration has incorporated knowledge of human affordances, i.e., their action possibilities in the current context, into autonomous robot decision-making. This "affordance awareness" is especially promising for service robots that need to know when and how to assist a person who cannot independently complete a task. However, robots still fall short of performing many common tasks autonomously. In this work-in-progress paper, we propose an augmented reality (AR) framework that bridges the gap in an assistive robot's capabilities by actively engaging with a human through a shared affordance-awareness representation. By leveraging the complementary perspectives of a human wearing an AR headset and a robot's onboard sensors, we build a perceptual representation of the shared environment and model each agent's affordance regions within it. The AR interface also allows both agents to communicate their affordances to one another and to prompt for assistance when an intended action falls outside their own affordance region. This paper presents the main components of the proposed framework and discusses its potential through a domestic cleaning task experiment.
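To make the affordance-region check concrete, below is a minimal, self-contained Python sketch of the decision the abstract describes: each agent's action possibilities are modeled as a region of the shared map, and an action falling outside an agent's own region triggers a prompt to the other agent. This is an illustration only; the `AffordanceRegion` class, the `resolve_action` function, and the spherical reach model are hypothetical simplifications, not the paper's implementation, which would derive these regions from perception and reachability analysis.

```python
"""Hedged sketch of a shared affordance-awareness check.

All names here are hypothetical illustrations. A real system would build
affordance regions from perception (e.g., reachability maps) rather than
fixed spheres around each agent.
"""
from dataclasses import dataclass
import math


@dataclass
class AffordanceRegion:
    """Approximates an agent's action possibilities as a reachable sphere."""
    center: tuple[float, float, float]  # agent base position in the shared map frame
    reach: float                        # maximum reach radius in meters

    def contains(self, point: tuple[float, float, float]) -> bool:
        # A point is actionable if it lies within the agent's reach sphere.
        return math.dist(self.center, point) <= self.reach


def resolve_action(actor: str, target: tuple[float, float, float],
                   regions: dict[str, AffordanceRegion]) -> str:
    """Decide who acts on `target`: the actor itself, or a prompted partner."""
    if regions[actor].contains(target):
        return f"{actor} acts on {target}"
    # Target is outside this agent's affordance region: find a partner whose
    # region covers it and prompt them (via the AR interface in the framework).
    for other, region in regions.items():
        if other != actor and region.contains(target):
            return f"{actor} prompts {other} for assistance at {target}"
    return f"no agent can currently act on {target}"


# Example: a target near the human is reachable by the human but not the robot.
regions = {
    "human": AffordanceRegion(center=(0.0, 0.0, 0.0), reach=0.9),
    "robot": AffordanceRegion(center=(1.0, 0.0, 0.0), reach=0.6),
}
print(resolve_action("robot", (0.2, 0.1, 0.5), regions))
```

Running the example prints that the robot, unable to reach the target, prompts the human; in the proposed framework that decision would surface as an AR prompt rather than a print statement, and the regions would be updated continuously from both agents' viewpoints.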
Authors: Drake Moore, Mark Zolotas, Taskin Padir