
Data Feminism for AI (2405.01286v1)

Published 2 May 2024 in cs.CY and cs.AI

Abstract: This paper presents a set of intersectional feminist principles for conducting equitable, ethical, and sustainable AI research. In Data Feminism (2020), we offered seven principles for examining and challenging unequal power in data science. Here, we present a rationale for why feminism remains deeply relevant for AI research, rearticulate the original principles of data feminism with respect to AI, and introduce two potential new principles related to environmental impact and consent. Together, these principles help to 1) account for the unequal, undemocratic, extractive, and exclusionary forces at work in AI research, development, and deployment; 2) identify and mitigate predictable harms in advance of unsafe, discriminatory, or otherwise oppressive systems being released into the world; and 3) inspire creative, joyful, and collective ways to work towards a more equitable, sustainable world in which all of us can thrive.


Summary

  • The paper presents an intersectional feminist framework for AI that challenges corporate influence, biases, and environmental injustices.
  • It rearticulates data feminism principles for AI by emphasizing power analysis, inclusive design, and the visibility of hidden labor.
  • The framework advocates systemic reforms in AI research through interdisciplinary collaboration to ensure consent, accountability, and equity.

Data Feminism for AI: Principles and Implications

The paper "Data Feminism for AI," authored by Lauren Klein and Catherine D'Ignazio, proposes a set of principles for incorporating feminist thought into AI research and practice. Building upon their previous work in "Data Feminism" (2020), Klein and D'Ignazio present an intersectional feminist framework aimed at mitigating the unequal, undemocratic, extractive, and exclusionary dynamics prevalent in AI research, development, and deployment. This paper seeks to adapt the original seven principles of data feminism to the AI context and suggests two additional principles addressing environmental impact and consent.

Overview of Data Feminism Principles in AI

The authors begin by rearticulating the original principles of data feminism for AI research:

  1. Examine Power: An analysis of power structures is critical in AI, where economic and political hierarchies, notably those perpetuated by corporate interests, greatly influence research agendas and outcomes. This principle highlights the need to scrutinize how capitalist and colonialist dynamics perpetuate inequalities within AI systems.
  2. Challenge Power: AI development should aim to counter existing power imbalances. This involves building alternative, justice-oriented AI systems and engaging in collective action to resist corporate capture of AI technology.
  3. Rethink Binaries and Hierarchies: AI often reinforces rigid binaries, such as the gender binary. This principle calls for challenging these constructs and creating inclusive AI systems that do not depend on reductive classifications.
  4. Elevate Emotion and Embodiment: The authors emphasize the importance of integrating emotional and embodied knowledge into AI research, arguing against the dismissal of these forms of knowledge as irrational.
  5. Embrace Pluralism: Incorporating diverse perspectives, particularly those of marginalized communities, into AI research is vital for producing inclusive and representative technologies.
  6. Consider Context: AI systems should account for the socio-political contexts in which they are developed and deployed. The authors call for practices that foreground the origins, intentions, and potential biases of datasets used in AI.
  7. Make Labor Visible: The often-invisible labor undergirding AI systems, particularly that performed by marginalized communities, must be acknowledged and valued appropriately.

New Directions: Environmental Impact and Consent

In addition to recontextualizing these principles, the paper introduces two emerging principles:

  • Environmental Impact: AI research and deployment often lead to uneven environmental burdens, predominantly affecting marginalized communities in the Global South. This principle draws from ecofeminism and Indigenous feminist thought to address these imbalances.
  • Consent: AI systems frequently operate without the explicit consent of individuals whose data they exploit. This principle suggests rethinking traditional notions of consent to include more collective and interdependent frameworks.

Implications and Future Directions

Klein and D'Ignazio call for a significant shift in AI research, arguing that a feminist lens can help address entrenched power imbalances, improve accountability, and foster greater inclusivity and sustainability in technology development. They urge AI researchers to incorporate these principles into their methodologies, which entails not only critical reflection but also proactive engagement with affected communities to ensure equitable and ethical AI practices.

The implications of this work reach beyond academic discourse, calling for systemic changes in how AI and related technologies are conceptualized, developed, and deployed. Strategies include fostering interdisciplinary collaborations, advocating for policy reforms, and supporting organizational practices that align with these feminist principles.

In conclusion, "Data Feminism for AI" serves as a critical addition to AI research, encouraging scholars and practitioners to rethink the ethical dimensions of technology development through a feminist lens. As AI continues to evolve, applying these principles may pave the way for more equitable, ethical, and sustainable technological futures.
