Emotional Tandem Robots: How Different Robot Behaviors Affect Human Perception While Controlling a Mobile Robot (2403.03746v1)

Published 6 Mar 2024 in cs.RO and cs.HC

Abstract: In human-robot interaction (HRI), we study how humans interact with robots, but also the effects of robot behavior on human perception and well-being. In particular, the influence on humans of tandem robots, in which one robot is human-controlled and one is autonomous, or of semi-autonomous multi-robot systems, is not yet fully understood. Here, we focus on a leader-follower scenario and study how emotionally expressive motion patterns of a small, mobile follower robot affect the perception of a human operator controlling the leading robot. We examined three distinct emotional behaviors for the follower compared to a neutral condition: angry, happy, and sad. We analyzed how participants maneuvered the leader robot along a set path while experiencing each follower behavior in a randomized order. We identified a significant shift in attention toward the follower with emotionally expressive behaviors compared to the neutral condition. For example, the angry behavior significantly heightened participant stress levels and was considered the least preferred behavior. The happy behavior was the most preferred and was associated with increased excitement among participants. Integrating the proposed behaviors in robots can profoundly influence the human operator's attention, emotional state, and overall experience. These insights are valuable for future HRI tandem robot designs.
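The abstract's idea of encoding emotion in a follower's motion can be sketched as a mapping from an emotion label to motion parameters (speed, follow distance, and an oscillatory "wiggle"). The paper does not publish its controller, so the profile values, function names, and the proportional-gap control law below are illustrative assumptions, not the authors' implementation:

```python
import math

# Hypothetical motion-parameter profiles for the follower robot.
# speed: max linear speed (m/s); osc_amp/osc_freq: angular oscillation
# amplitude (rad/s) and frequency (Hz); gap: desired follow distance (m).
# All values are illustrative, not taken from the paper.
EMOTION_PROFILES = {
    "neutral": {"speed": 0.5, "osc_amp": 0.00, "osc_freq": 0.0, "gap": 0.30},
    "happy":   {"speed": 0.7, "osc_amp": 0.15, "osc_freq": 2.0, "gap": 0.25},
    "angry":   {"speed": 0.9, "osc_amp": 0.25, "osc_freq": 4.0, "gap": 0.15},
    "sad":     {"speed": 0.3, "osc_amp": 0.05, "osc_freq": 0.5, "gap": 0.40},
}

def follower_velocity(emotion: str, dist_to_leader: float, t: float):
    """Return a (linear, angular) velocity command for the follower at time t.

    Linear speed is proportional to how far the follower lags behind its
    desired gap (clamped to [0, max speed]); angular velocity adds a
    sinusoidal sway whose amplitude and frequency convey the emotion.
    """
    p = EMOTION_PROFILES[emotion]
    error = dist_to_leader - p["gap"]  # positive => follower is too far behind
    linear = max(0.0, min(p["speed"], p["speed"] * error / p["gap"]))
    angular = p["osc_amp"] * math.sin(2 * math.pi * p["osc_freq"] * t)
    return linear, angular
```

In this sketch, "angry" reads as fast, close, and jittery, while "sad" reads as slow and distant, consistent with the attention and stress effects the abstract reports; a real implementation would feed these commands to the robot's differential-drive motors.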

Authors (3)
  1. Julian Kaduk (3 papers)
  2. Friederike Weilbeer (1 paper)
  3. Heiko Hamann (31 papers)
