Sound Matters: Auditory Detectability of Mobile Robots (2404.06807v2)
Abstract: Mobile robots are increasingly used in noisy environments for social purposes, e.g., to provide support in healthcare or public spaces. Since these robots also operate beyond human sight, the question arises of how robot type, ambient noise, and cognitive engagement affect the auditory detection of robots. To address this research gap, we conducted a user study measuring auditory detection distances for a wheeled robot (Turtlebot 2i) and a quadruped robot (Unitree Go 1), which emit different consequential sounds when moving. We additionally manipulated background noise levels and participants' engagement in a secondary task during the study. Our results show that the quadruped robot's sound was detected significantly better (i.e., at a larger distance) than the wheeled robot's, demonstrating that the movement mechanism has a meaningful impact on auditory detectability. Detectability of both robots diminished significantly as background noise increased, yet even in high background noise, participants detected the quadruped robot at a significantly larger distance. Engagement in a secondary task had hardly any impact. In essence, these findings highlight the critical role of the distinct auditory characteristics of different robots for smooth, human-centered navigation of mobile robots in noisy environments.
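The core relationship in the abstract, that detection distance grows with the robot's sound level and shrinks with background noise, can be illustrated with a simple acoustic model. The sketch below is not from the paper: it assumes free-field inverse-square spreading (6 dB loss per doubling of distance) and hypothetical source levels, whereas the study measured detection distances empirically with human listeners.

```python
# Illustrative sketch (not from the paper): a simple free-field model of the
# maximum distance at which a robot's consequential sound remains audible
# above background noise. All levels here are hypothetical placeholders.

def detection_distance(source_spl_db: float, noise_spl_db: float,
                       ref_distance_m: float = 1.0,
                       detection_margin_db: float = 0.0) -> float:
    """Distance at which the robot's sound level falls to the noise level
    plus a detection margin, assuming inverse-square spreading:
        SPL(d) = source_spl_db - 20 * log10(d / ref_distance_m)
    """
    headroom_db = source_spl_db - (noise_spl_db + detection_margin_db)
    if headroom_db <= 0:
        return 0.0  # masked even at the reference distance
    return ref_distance_m * 10 ** (headroom_db / 20.0)

# Hypothetical source levels at 1 m: a quadruped's footfalls are assumed
# louder than a wheeled robot's motor hum, so it stays audible farther away.
for noise_db in (45.0, 55.0, 65.0):  # quiet room .. busy public space
    d_quad = detection_distance(source_spl_db=60.0, noise_spl_db=noise_db)
    d_wheel = detection_distance(source_spl_db=52.0, noise_spl_db=noise_db)
    print(f"noise {noise_db:.0f} dB SPL: quadruped ~{d_quad:.1f} m, "
          f"wheeled ~{d_wheel:.1f} m")
```

Under these assumptions, every 10 dB of added background noise cuts the audible distance by a factor of about 3.2, which is consistent with the abstract's finding that detectability of both robots shrinks with noise while their ordering is preserved.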