A Modular Framework for Flexible Planning in Human-Robot Collaboration (2406.04907v1)
Abstract: This paper presents a comprehensive framework to enhance Human-Robot Collaboration (HRC) in real-world scenarios. It introduces a formalism that models articulated tasks requiring cooperation between two agents through a small set of primitives. Our implementation leverages Hierarchical Task Network (HTN) planning and a modular multisensory perception pipeline, which includes vision, human activity recognition, and tactile sensing. To showcase the system's scalability, we present an experimental scenario in which two humans alternate in collaborating with a Baxter robot to assemble four pieces of furniture with variable components. This integration highlights promising advancements in HRC, suggesting a scalable approach to complex, cooperative tasks across diverse applications.
- Valerio Belcamino
- Mariya Kilina
- Linda Lastrico
- Alessandro Carfì
- Fulvio Mastrogiovanni
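The HTN planning mentioned in the abstract works by recursively decomposing compound tasks into ordered subtasks until only primitive actions remain. The sketch below illustrates that idea for a furniture-assembly task; all task names, method definitions, and primitives here are hypothetical examples, not the paper's actual formalism or implementation.

```python
# Minimal sketch of HTN-style task decomposition. Task and primitive
# names (e.g., "assemble_leg", "handover") are illustrative only.

# Primitive actions that an agent can execute directly.
PRIMITIVES = {"pick", "handover", "hold", "screw"}

# Methods map each compound task to an ordered list of subtasks.
METHODS = {
    "assemble_leg": ["pick", "handover", "fasten"],
    "fasten": ["hold", "screw"],
}

def decompose(task):
    """Recursively expand a task into a flat sequence of primitives."""
    if task in PRIMITIVES:
        return [task]
    plan = []
    for subtask in METHODS[task]:
        plan.extend(decompose(subtask))
    return plan

print(decompose("assemble_leg"))  # → ['pick', 'handover', 'hold', 'screw']
```

In a collaborative setting like the one described, each primitive would be assigned to either the human or the robot, with the perception pipeline (vision, activity recognition, tactile sensing) signaling when a primitive has completed so the plan can advance.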