Comparing an android head with its digital twin regarding the dynamic expression of emotions (2309.10146v1)
Abstract: Emotions are an important component of social interaction and can be studied with the help of android robots, whose appearance is designed to be as human-like as possible. Because producing and customizing android robots is expensive and time-consuming, using a digital replica may be practical. To investigate whether the difference in appearance leads to perceptual differences in the emotions conveyed, a robot head was digitally replicated. In an experiment, the basic emotions evaluated in a preliminary study were compared under three conditions and then statistically analyzed. It was found that, apart from fear, all emotions were recognized on the real robot head. The digital head with "ideal" emotions performed better than the real head except for the anger representation, which indicates optimization potential for the real head. Contrary to expectations, significant differences between the real head and the replicated head displaying the same emotions were found only for the representation of surprise.