Gender Biases in Error Mitigation by Voice Assistants (2310.13074v1)
Abstract: Commercial voice assistants are largely feminized and associated with stereotypically feminine traits such as warmth and submissiveness. As these assistants continue to be adopted for everyday uses, it is imperative to understand how their portrayed gender shapes their ability to mitigate errors, which are still common in voice interactions. We report a study (N=40) that examined the effects of voice gender (feminine, ambiguous, masculine), error mitigation strategy (apology, compensation), and participant gender on people's interaction behavior and perceptions of the assistant. Our results show that AI assistants that apologized appeared warmer than those that offered compensation. Moreover, male participants preferred apologetic feminine assistants over apologetic masculine ones. Furthermore, when errors occurred, male participants interrupted AI assistants more frequently than female participants did, regardless of the assistant's perceived gender. Our results suggest that the perceived gender of a voice assistant biases user behavior, especially for male users, and that an ambiguous voice has the potential to reduce biases associated with gender-specific traits.
Authors: Amama Mahmood, Chien-Ming Huang