How Can Autonomous Vehicles Convey Emotions to Pedestrians? A Review of Emotionally Expressive Non-Humanoid Robots (2403.04930v1)

Published 7 Mar 2024 in cs.HC

Abstract: In recent years, researchers and manufacturers have started to investigate ways to enable autonomous vehicles (AVs) to interact with nearby pedestrians, compensating for the absence of human drivers. The majority of these efforts focus on external human-machine interfaces (eHMIs), which use modalities such as light patterns or on-road projections to communicate the AV's intent and awareness. In this paper, we investigate the potential role of affective interfaces in conveying emotions via eHMIs. To date, little is known about the role that affective interfaces can play in supporting AV-pedestrian interaction. However, emotions have been employed in many smaller social robots, from domestic companions to outdoor aerial robots in the form of drones. To develop a foundation for affective AV-pedestrian interfaces, we reviewed the emotional expressions of non-humanoid robots in 25 articles published between 2011 and 2021. Based on the findings of the review, we present a set of considerations for designing affective AV-pedestrian interfaces and highlight avenues for investigating these opportunities in future studies.

Authors (3)
  1. Yiyuan Wang (11 papers)
  2. Luke Hespanhol (6 papers)
  3. Martin Tomitsch (20 papers)
Citations (15)