
Computer says "No": The Case Against Empathetic Conversational AI

Published 21 Dec 2022 in cs.CL and cs.HC (arXiv:2212.10983v2)

Abstract: Emotions are an integral part of human cognition and they guide not only our understanding of the world but also our actions within it. As such, whether we soothe or flame an emotion is not inconsequential. Recent work in conversational AI has focused on responding empathetically to users, validating and soothing their emotions without a real basis. This AI-aided emotional regulation can have negative consequences for users and society, tending towards a one-noted happiness defined as only the absence of "negative" emotions. We argue that we must carefully consider whether and how to respond to users' emotions.
