Interactive Multi-Robot Flocking with Gesture Responsiveness and Musical Accompaniment (2404.00442v1)
Abstract: For decades, robotics researchers have pursued various tasks for multi-robot systems, from cooperative manipulation to search and rescue. These tasks are multi-robot extensions of classical robotic tasks and are often optimized along dimensions such as speed or efficiency. As robots transition from commercial and research settings into everyday environments, social task aims such as engagement or entertainment become increasingly relevant. This work presents a compelling multi-robot task in which the main aim is to enthrall and interest. In this task, the goal is for a human to be drawn to move alongside and participate in a dynamic, expressive robot flock. Towards this aim, the research team created algorithms for robot movements and engaging interaction modes such as gestures and sound. The contributions are as follows: (1) a novel group navigation algorithm involving human and robot agents, (2) a gesture-responsive algorithm for real-time, human-robot flocking interaction, (3) a weight mode characterization system for modifying flocking behavior, and (4) a method of encoding a choreographer's preferences inside a dynamic, adaptive, learned system. An experiment was performed to understand individual human behavior while interacting with the flock under three conditions: weight modes selected by a human choreographer, a learned model, or a subset list. Results from the experiment showed that the perception of the experience was not influenced by the weight mode selection. This work elucidates how differing task aims such as engagement manifest in multi-robot system design and execution, and broadens the domain of multi-robot tasks.
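The abstract does not spell out the flocking update itself, but the stated contributions (a human-inclusive group navigation algorithm plus tunable "weight modes") suggest a Reynolds-style model whose steering terms carry adjustable gains. The sketch below is a minimal illustration under that assumption only; the function name `flock_step`, the gain names (`cohesion`, `separation`, `alignment`, `human`), and all numeric defaults are hypothetical and not taken from the paper.

```python
import numpy as np

def flock_step(robot_pos, robot_vel, human_pos, weights, dt=0.1,
               neighbor_radius=1.5, sep_radius=0.5, max_speed=0.6):
    """One weighted Reynolds-style flocking update (illustrative sketch).

    robot_pos, robot_vel: (N, 2) arrays of robot positions and velocities.
    human_pos: (2,) position of the tracked human, treated as an extra agent.
    weights: dict of gains, e.g. {"cohesion": 1.0, "separation": 1.5,
             "alignment": 0.8, "human": 1.2}; one such dict plays the role
             of a "weight mode".
    """
    n = len(robot_pos)
    new_vel = robot_vel.copy()
    for i in range(n):
        offsets = robot_pos - robot_pos[i]           # vectors from robot i to all robots
        dists = np.linalg.norm(offsets, axis=1)
        neighbors = (dists > 0) & (dists < neighbor_radius)
        steer = np.zeros(2)
        if neighbors.any():
            # Cohesion: steer toward the local center of mass of neighbors.
            steer += weights["cohesion"] * offsets[neighbors].mean(axis=0)
            # Alignment: match the neighbors' average velocity.
            steer += weights["alignment"] * (robot_vel[neighbors].mean(axis=0) - robot_vel[i])
            # Separation: push away from neighbors that are too close.
            close = neighbors & (dists < sep_radius)
            if close.any():
                steer -= weights["separation"] * offsets[close].mean(axis=0)
        # Human term: draw each robot toward (or, with a negative gain, away from) the person.
        steer += weights["human"] * (human_pos - robot_pos[i])
        new_vel[i] = robot_vel[i] + dt * steer
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:                         # cap speed for safe, legible motion
            new_vel[i] *= max_speed / speed
    return robot_pos + dt * new_vel, new_vel
```

Under this framing, switching weight modes amounts to swapping the `weights` dictionary at runtime, whether the values come from a choreographer, a learned model, or a preset subset list, which matches the three experimental conditions described above.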