Robotic Blended Sonification: Consequential Robot Sound as Creative Material for Human-Robot Interaction (2404.13821v1)
Abstract: Current research in robotic sounds generally focuses either on masking the consequential sound produced by the robot or on sonifying data about the robot to create a synthetic robot sound. We propose to capture, modify, and utilise, rather than mask, the sounds that robots already produce. In short, this approach relies on capturing a robot's sounds, processing them according to contextual information (e.g., collaborators' proximity or particular work sequences), and playing back the modified sound. Previous research indicates the usefulness of non-semantic, and even mechanical, sounds as a communication tool for conveying robotic affect and function. Building on this, the paper presents a novel approach that makes two key contributions: (1) a technique for real-time capture and processing of consequential robot sounds, and (2) an approach to exploring these sounds through direct human-robot interaction. Drawing on methodologies from design, human-robot interaction, and creative practice, the resulting 'Robotic Blended Sonification' concept transforms consequential robot sounds into a creative material that can be explored artistically and within application-based studies.
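The capture-process-playback loop described above can be sketched in code. The paper does not specify a particular processing algorithm, so the function below is a minimal illustrative assumption: it takes a buffer of captured robot sound and one piece of contextual information (collaborator proximity), and makes playback louder and brighter as a collaborator approaches. The function name, the proximity-to-gain mapping, and the one-pole filter are all hypothetical choices, not the authors' implementation.

```python
import math

def blend_consequential_sound(captured, proximity_m, max_range_m=2.0):
    """Process a captured buffer of consequential robot sound by context.

    `captured` is a list of float audio samples; `proximity_m` is the
    distance (metres) to the nearest collaborator. Closer collaborators
    yield louder, brighter playback. All mappings here are illustrative.
    """
    # Map proximity to a 0..1 urgency factor (1.0 = collaborator very close).
    urgency = min(max(1.0 - proximity_m / max_range_m, 0.0), 1.0)
    # Amplitude: quiet when nobody is near, near full level when close.
    gain = 0.2 + 0.8 * urgency
    # One-pole low-pass filter: duller sound when collaborators are far away.
    alpha = 0.1 + 0.9 * urgency  # higher alpha = less smoothing = brighter
    out, prev = [], 0.0
    for x in captured:
        prev = alpha * x + (1.0 - alpha) * prev  # smooth the sample stream
        out.append(gain * prev)                  # apply proximity-driven gain
    return out
```

In a live pipeline, a function like this would run per audio block between microphone capture and loudspeaker playback, with the contextual inputs (proximity, work sequence) updated from the robot's sensors or controller state.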
- 2006. Auditory and other non-verbal expressions of affect for robots. In AAAI Fall Symposium: Aurally Informed Performance, 1–5.
- Cage, J. 1975. Child of Tree. Peters Edition EP 66685. https://www.johncage.org/pp/John-Cage-Work-Detail.cfm?work_ID=40.
- del Castello, G. 2023. RobotExMachina. GitHub repository. https://github.com/RobotExMachina.
- 2018. Perception of mechanical sounds inherent to expressive gestures of a NAO robot: implications for movement sonification of humanoids.
- Giannini, N. 2015. Inner Out. https://www.nicolagiannini.com/portfolio/inner-out-2/.
- 2017. Making noise intentional: A study of servo sound perception. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (HRI '17), 12–21. New York, NY, USA: Association for Computing Machinery.
- 2023. The robot soundscape. In Cultural Robotics: Social Robots and Their Emergent Cultural Ecologies, 35–65. Springer.
- 2021. Smooth operator: Tuning robot perception through artificial movement sound. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (HRI '21), 53–62. New York, NY, USA: Association for Computing Machinery.
- 2018. The sound or silence: Investigating the influence of robot noise on proxemics. In 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 713–718. IEEE.
- 2013. Blended sonification: Sonification for casual interaction. In ICAD 2013: Proceedings of the International Conference on Auditory Display.
- Van Egmond, R. 2008. The experience of product sounds. In Product Experience, 69–89. Elsevier.
- 2020. Robot gesture sonification to enhance awareness of robot status and enjoyment of interaction. In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 978–985. IEEE.