Exploring Gender Biases in Language Patterns of Human-Conversational Agent Conversations (2401.03030v1)

Published 5 Jan 2024 in cs.HC, cs.CL, and cs.CY

Abstract: With the rise of human-machine communication, machines are increasingly designed with humanlike characteristics, such as gender, which can inadvertently trigger cognitive biases. Many conversational agents (CAs), such as voice assistants and chatbots, default to female personas, raising concerns about perpetuating gender stereotypes and inequality. Critiques have emerged regarding the potential objectification of women and the reinforcement of gender stereotypes by these technologies. This research, situated in conversational AI design, aims to delve deeper into the impacts of gender biases in human-CA interactions. From a behavioral and communication research standpoint, this program focuses not only on users' perceptions but also on their linguistic styles when interacting with CAs, an aspect previous research has rarely explored. It aims to understand how pre-existing gender biases might be triggered by CAs' gender designs. It further investigates how CAs' gender designs may reinforce gender biases and extend them to human-human communication. The findings aim to inform the ethical design of conversational agents, addressing whether gender assignment in CAs is appropriate and how gender equality can be promoted in design.
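The abstract does not specify how users' linguistic styles are to be measured. Purely as an illustrative sketch, and not the authors' method, the Python snippet below counts two markers that gender-and-language research commonly examines (tentative language and emotion references) using hypothetical word lists; a real study would rely on validated lexica such as LIWC or on trained annotators.

```python
import re

# Hypothetical marker word lists, for illustration only; the paper's
# actual linguistic measures are not specified in this abstract.
TENTATIVE = {"maybe", "perhaps", "possibly", "guess", "might", "probably"}
EMOTION = {"happy", "sad", "love", "hate", "worried", "excited", "afraid"}


def tokenize(utterance: str) -> list[str]:
    """Lowercase word tokens; a stand-in for a proper tokenizer."""
    return re.findall(r"[a-z']+", utterance.lower())


def style_features(utterance: str) -> dict[str, float]:
    """Rate of tentative and emotion markers per 100 tokens."""
    tokens = tokenize(utterance)
    n = max(len(tokens), 1)  # avoid division by zero on empty input
    return {
        "tentative_per_100": 100 * sum(t in TENTATIVE for t in tokens) / n,
        "emotion_per_100": 100 * sum(t in EMOTION for t in tokens) / n,
    }


if __name__ == "__main__":
    # Toy comparison: the same request phrased two ways.
    for u in ["Maybe you could check the weather, I guess?",
              "Check the weather."]:
        print(f"{u!r} -> {style_features(u)}")
```

In a study like the one described, such per-utterance features could be aggregated by experimental condition (e.g., female- vs. male-voiced CA) to test whether a CA's gender design shifts users' language.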

Authors (1)
  1. Weizi Liu