Fairness and Bias in Algorithmic Hiring: A Multidisciplinary Survey (2309.13933v3)
Abstract: Employers are adopting algorithmic hiring technology throughout the recruitment pipeline. Algorithmic fairness is especially applicable in this domain because of its high stakes and structural inequalities. Unfortunately, most work in this space offers only a partial treatment, often constrained by two competing narratives: one optimistically focused on replacing biased recruiter decisions, the other pessimistically pointing to the automation of discrimination. Whether algorithmic hiring, and more importantly which types of it, can be less biased and more beneficial to society than low-tech alternatives currently remains unanswered, to the detriment of trustworthiness. This multidisciplinary survey caters to practitioners and researchers with balanced and integrated coverage of systems, biases, measures, mitigation strategies, datasets, and legal aspects of algorithmic hiring and fairness. Our work supports a contextualized understanding and governance of this technology by highlighting current opportunities and limitations and by providing recommendations for future work to ensure shared benefits for all stakeholders.
- Fairness in representation: quantifying stereotyping as a representational harm. In Proc. of the 2019 SIAM International Conf. on Data Mining, SDM 2019, Calgary, Alberta, Canada, May 2-4, 2019, Tanya Y. Berger-Wolf and Nitesh V. Chawla (Eds.). SIAM, 801–809. https://doi.org/10.1137/1.9781611975673.90
- Directly discriminatory algorithms. The Modern Law Review 86, 1 (2023), 144–175.
- Ifeoma Ajunwa. 2019. The paradox of automation as anti-bias intervention. Cardozo L. Rev. 41 (2019), 1671.
- Discrimination through Optimization: How Facebook’s Ad Delivery Can Lead to Biased Outcomes. Proc. ACM Hum. Comput. Interact. 3, CSCW (2019), 199:1–199:30. https://doi.org/10.1145/3359301
- Are There Gender Differences in Professional Self-Promotion? An Empirical Case Study of LinkedIn Profiles Among Recent MBA Graduates. In Proc. of the Eleventh International Conf. on Web and Social Media, ICWSM 2017, Montréal, Québec, Canada, May 15-18, 2017. AAAI Press, 460–463. https://aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15615
- Jose M Alvarez and Salvatore Ruggieri. 2023. The Initial Screening Order Problem. arXiv preprint arXiv:2307.15398 (2023).
- The disability employment puzzle: A field experiment on employer hiring behavior. ILR Review 71, 2 (2018), 329–364.
- The standards for educational and psychological testing. (2014).
- Lori Andrews and Hannah Bucher. 2022. Automating Discrimination: AI Hiring Practices and Gender Inequality. Cardozo L. Rev. 44 (2022), 145.
- Dozens of Companies Are Using Facebook to Exclude Older Workers From Job Ads. Machine Bias. ProPublica, New York, NY, USA. https://www.propublica.org/article/facebook-ads-age-discrimination-targeting
- End-to-End Bias Mitigation in Candidate Recommender Systems with Fairness Gates. In 2nd Workshop on Recommender Systems for Human Resources, RecSys-in-HR 2022. CEUR-WS, 1–8.
- Data Infrastructure at LinkedIn. In IEEE 28th International Conf. on Data Engineering (ICDE 2012), Washington, DC, USA (Arlington, Virginia), 1-5 April, 2012, Anastasios Kementsietsidis and Marcos Antonio Vaz Salles (Eds.). IEEE Computer Society, 1370–1381. https://doi.org/10.1109/ICDE.2012.147
- Does Artificial Intelligence Help or Hurt Gender Diversity? Evidence from Two Field Experiments on Recruitment in Tech. http://monash-econ-wps.s3.amazonaws.com/RePEc/mos/moswps/2023-09.pdf
- Homophily and the Glass Ceiling Effect in Social Networks. In Proc. of the 2015 Conf. on Innovations in Theoretical Computer Science, ITCS 2015, Rehovot, Israel, January 11-13, 2015, Tim Roughgarden (Ed.). ACM, 41–50. https://doi.org/10.1145/2688073.2688097
- Ghazala Azmat and Barbara Petrongolo. 2014. Gender and the labor market: What have we learned from field and lab experiments? Labour economics 30 (2014), 32–40.
- Ricardo Baeza-Yates. 2018. Bias on the web. Commun. ACM 61, 6 (2018), 54–61. https://doi.org/10.1145/3209581
- Fairness and Machine Learning: Limitations and Opportunities. fairmlbook.org. http://www.fairmlbook.org.
- Solon Barocas and Andrew D Selbst. 2016. Big data’s disparate impact. California law review (2016), 671–732.
- Murray R Barrick and Michael K Mount. 1991. The big five personality dimensions and job performance: a meta-analysis. Personnel psychology 44, 1 (1991), 1–26.
- Age, tenure, and job satisfaction: A tale of two perspectives. Journal of Vocational behavior 40, 1 (1992), 33–48.
- AI Fairness 360: An Extensible Toolkit for Detecting, Understanding, and Mitigating Unwanted Algorithmic Bias. https://arxiv.org/abs/1810.01943
- Jason R Bent. 2019. Is algorithmic affirmative action legal. Geo. LJ 108 (2019), 803.
- Marianne Bertrand and Sendhil Mullainathan. 2004. Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination. American economic review 94, 4 (2004), 991–1013.
- Fairness in recommendation ranking through pairwise comparisons. In Proc. of the 25th ACM SIGKDD international conference on knowledge discovery & data mining. 2212–2220.
- Data donation as a model for citizen science health research. Citizen Science: Theory and Practice 4, 1 (2019).
- Hitting the “Grass Ceiling”: Golfing CEOs, Exclusionary Schema, and Career Outcomes for Female Executives. Journal of Management (2023), 01492063231161342.
- Demographic Dialectal Variation in Social Media: A Case Study of African-American English. In Proc. of the 2016 Conf. on Empirical Methods in Natural Language Processing, EMNLP 2016, Austin, Texas, USA, November 1-4, 2016, Jian Su, Xavier Carreras, and Kevin Duh (Eds.). The Association for Computational Linguistics, 1119–1130. https://doi.org/10.18653/v1/d16-1120
- Lotte Bloksgaard. 2011. Masculinities, femininities and work–the horizontal gender segregation in the Danish Labour market. Nordic journal of working life studies 1, 2 (2011), 5–21.
- Donna Bobbitt-Zeher. 2011. Gender discrimination at work: Connecting gender stereotypes, institutional policies, and gender composition of workplace. Gender & Society 25, 6 (2011), 764–786.
- Miranda Bogen and Aaron Rieke. 2018. Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias. Technical Report. Upturn.
- Jasmijn C Bol. 2011. The determinants and performance effects of managers’ performance evaluation biases. The Accounting Review 86, 5 (2011), 1549–1575.
- Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings. In Advances in Neural Information Processing Systems 29: Annual Conf. on Neural Information Processing Systems 2016, December 5-10, 2016, Barcelona, Spain, Daniel D. Lee, Masashi Sugiyama, Ulrike von Luxburg, Isabelle Guyon, and Roman Garnett (Eds.). 4349–4357. https://proceedings.neurips.cc/paper/2016/hash/a486cd07e4ac3d270571622f4f316ec5-Abstract.html
- Bias and Fairness in Multimodal Machine Learning: A Case Study of Automated Video Interviews. In ICMI ’21: International Conf. on Multimodal Interaction, Montréal, QC, Canada, October 18-22, 2021, Zakia Hammal, Carlos Busso, Catherine Pelachaud, Sharon L. Oviatt, Albert Ali Salah, and Guoying Zhao (Eds.). ACM, 268–277. https://doi.org/10.1145/3462244.3479897
- Frederik Zuiderveen Borgesius. 2020. Price discrimination, algorithmic decision-making, and European non-discrimination law. European Business Law Review 31, 3 (2020).
- Sparks of Artificial General Intelligence: Early experiments with GPT-4. arXiv:2303.12712 [cs.CL]
- Artificial intelligence–challenges and opportunities for international HRM: a review and research agenda. The InTernaTIonal Journal of human resource managemenT 33, 6 (2022), 1065–1097.
- Joy Buolamwini and Timnit Gebru. 2018. Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. In Conf. on Fairness, Accountability and Transparency, FAT 2018, 23-24 February 2018, New York, NY, USA (Proc. of Machine Learning Research, Vol. 81), Sorelle A. Friedler and Christo Wilson (Eds.). PMLR, 77–91. http://proceedings.mlr.press/v81/buolamwini18a.html
- Fair candidate ranking with spatial partitioning: Lessons from the SIOP ML competition. In Proc. of the First Workshop on Recommender Systems for Human Resources (RecSys in HR 2021) co-located with the 15th ACM Conf. on Recommender Systems (RecSys 2021), Vol. 2967.
- Disentangling and Operationalizing AI Fairness at LinkedIn. In Proc. of the 2023 ACM Conf. on Fairness, Accountability, and Transparency, FAccT 2023, Chicago, IL, USA, June 12-15, 2023. ACM, 1213–1228. https://doi.org/10.1145/3593013.3594075
- Bargaining, sorting, and the gender wage gap: Quantifying the impact of firms on the relative pay of women. The Quarterly journal of economics 131, 2 (2016), 633–686.
- Ben Carterette. 2011. System effectiveness, user models, and user utility: a conceptual framework for investigation. In Proceeding of the 34th International ACM SIGIR Conf. on Research and Development in Information Retrieval, SIGIR 2011, Beijing, China, July 25-29, 2011, Wei-Ying Ma, Jian-Yun Nie, Ricardo Baeza-Yates, Tat-Seng Chua, and W. Bruce Croft (Eds.). ACM, 903–912. https://doi.org/10.1145/2009916.2010037
- Census Bureau. 2023. Current Population Survey. https://stats.bls.gov/news.release/empsit.toc.htm
- What 5 million job advertisements tell us about testing: a preliminary empirical investigation. In SAC ’20: The 35th ACM/SIGAPP Symposium on Applied Computing, online event, [Brno, Czech Republic], March 30 - April 3, 2020, Chih-Cheng Hung, Tomás Cerný, Dongwan Shin, and Alessio Bechini (Eds.). ACM, 1586–1594. https://doi.org/10.1145/3341105.3373961
- Simon Chandler. 2018. The AI Chatbot Will Hire You Now.
- Unequal pain: a sketch of the impact of the Covid-19 pandemic on migrants’ employment in China. Eurasian Geography and Economics 61, 4-5 (2020), 448–463.
- Fairness-aware graph neural networks: A survey. ACM Transactions on Knowledge Discovery from Data (2023).
- A two-step resume information extraction algorithm. Mathematical Problems in Engineering 2018 (2018).
- Investigating the Impact of Gender on Rank in Resume Search Engines. In Proc. of the 2018 CHI Conf. on Human Factors in Computing Systems, CHI 2018, Montreal, QC, Canada, April 21-26, 2018, Regan L. Mandryk, Mark Hancock, Mark Perry, and Anna L. Cox (Eds.). ACM, 651. https://doi.org/10.1145/3173574.3174225
- Correcting for Recency Bias in Job Recommendation. In Proc. of the 28th ACM International Conf. on Information and Knowledge Management, CIKM 2019, Beijing, China, November 3-7, 2019, Wenwu Zhu, Dacheng Tao, Xueqi Cheng, Peng Cui, Elke A. Rundensteiner, David Carmel, Qi He, and Jeffrey Xu Yu (Eds.). ACM, 2185–2188. https://doi.org/10.1145/3357384.3358131
- Cultural stereotypes as gatekeepers: Increasing girls’ interest in computer science and engineering by diversifying stereotypes. Frontiers in psychology (2015), 49.
- Diversifying Society’s Leaders? The Causal Effects of Admission to Highly Selective Private Colleges. Working Paper 31492. National Bureau of Economic Research. https://doi.org/10.3386/w31492
- T Anne Cleary. 1968. Test bias: Prediction of grades of Negro and white students in integrated colleges. Journal of Educational Measurement 5, 2 (1968), 115–124.
- The glass ceiling effect. Social forces 80, 2 (2001), 655–681.
- Council of the European Union. 2000a. Council Directive 2000/43/EC Implementing the Principle of Equal Treatment Between Persons Irrespective of Racial or Ethnic Origin. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32000L0043
- Council of the European Union. 2000b. Council Directive 2000/78/EC of 27 November 2000 establishing a general framework for equal treatment in employment and occupation. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32000L0078
- Council of the European Union. 2004. Council Directive 2004/113/EC of 13 December 2004 implementing the principle of equal treatment between men and women in the access to and supply of goods and services. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32004L0113
- Court of Justice of the European Union. 2008. Centrum voor gelijkheid van kansen en voor racismebestrijding v Firma Feryn NV. https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A62007CJ0054
- Bo Cowgill. 2018. Bias and productivity in humans and algorithms: Theory and evidence from resume screening. Columbia Business School, Columbia University 29 (2018).
- Terry-Ann Craigie. 2020. Ban the box, convictions, and public employment. Economic Inquiry 58, 1 (2020), 425–445.
- An experimental comparison of click position-bias models. In Proc. of the International Conf. on Web Search and Web Data Mining, WSDM 2008, Palo Alto, California, USA, February 11-12, 2008, Marc Najork, Andrei Z. Broder, and Soumen Chakrabarti (Eds.). ACM, 87–94. https://doi.org/10.1145/1341531.1341545
- Automated Experiments on Ad Privacy Settings. Proc. Priv. Enhancing Technol. 2015, 1 (2015), 92–112. https://doi.org/10.1515/popets-2015-0007
- Skylar Davidson. 2016. Gender inequality: Nonbinary transgender people in the workplace. Cogent Social Sciences 2, 1 (2016), 1236511.
- Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting. In Proc. of the Conf. on Fairness, Accountability, and Transparency, FAT* 2019, Atlanta, GA, USA, January 29-31, 2019, danah boyd and Jamie H. Morgenstern (Eds.). ACM, 120–128. https://doi.org/10.1145/3287560.3287572
- Jenny Yang Deirdre Mulligan. 2023. Hearing from the American People: How Are Automated Tools Being Used to Surveil, Monitor, and Manage Workers?
- ArcFace: Additive Angular Margin Loss for Deep Face Recognition. In IEEE Conf. on Computer Vision and Pattern Recognition, CVPR 2019, Long Beach, CA, USA, June 16-20, 2019. Computer Vision Foundation / IEEE, 4690–4699. https://doi.org/10.1109/CVPR.2019.00482
- Megan Denver. 2020. Criminal records, positive credentials and recidivism: Incorporating evidence of rehabilitation into criminal background check employment decisions. Crime & Delinquency 66, 2 (2020), 194–218.
- Anne-Sophie Deprez-Sims and Scott B Morris. 2010. Accents in the workplace: Their effects during a job interview. International Journal of Psychology 45, 6 (2010), 417–426.
- Wealth of two nations: The US racial wealth gap, 1860-2020. Technical Report. National Bureau of Economic Research.
- Mitigating Demographic Bias in AI-based Resume Filtering. In Adjunct Publication of the 28th ACM Conf. on User Modeling, Adaptation and Personalization, UMAP 2020, Genoa, Italy, July 12-18, 2020, Tsvi Kuflik, Ilaria Torre, Robin Burke, and Cristina Gena (Eds.). ACM, 268–275. https://doi.org/10.1145/3386392.3399569
- Bert: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018).
- Fairness in Graph Mining: A Survey. IEEE Trans. Knowl. Data Eng. 35, 10 (2023), 10583–10602. https://doi.org/10.1109/TKDE.2023.3265598
- Gender stereotypes have changed: A cross-temporal meta-analysis of US public opinion polls from 1946 to 2018. American psychologist 75, 3 (2019), 301–315.
- Harrison Edwards and Amos J. Storkey. 2016. Censoring Representations with an Adversary. In 4th International Conf. on Learning Representations, ICLR 2016, San Juan, Puerto Rico, May 2-4, 2016, Conf. Track Proc., Yoshua Bengio and Yann LeCun (Eds.). http://arxiv.org/abs/1511.05897
- EEOC - US Equal Employment Opportunity Commission. 2023. Select Issues: Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964. https://www.eeoc.gov/select-issues-assessing-adverse-impact-software-algorithms-and-artificial-intelligence-used
- Paul Ekman and Wallace V Friesen. 2003. Unmasking the face: A guide to recognizing emotions from facial clues. Vol. 10. Ishk.
- Naomi Ellemers. 2018. Gender Stereotypes. Annual Review of Psychology 69, 1 (2018), 275–298.
- Equal Employment Opportunity Commission. 2015. Uniform guidelines on employment selection procedures.
- Stefan Eriksson and Jonas Lagerström. 2012. The labor market consequences of gender differences in job search. Journal of Labor Research 33 (2012), 303–327.
- Modeling, recognizing, and explaining apparent personality from videos. IEEE Transactions on Affective Computing 13, 2 (2020), 894–911.
- Ben Eubanks. 2022. Artificial intelligence for HR: Use AI to support and develop a successful workforce. Kogan Page Publishers.
- European Commission. 2020. The gender pay gap situation in the EU. https://commission.europa.eu/strategy-and-policy/policies/justice-and-fundamental-rights/gender-equality/equal-pay/gender-pay-gap-situation-eu_en
- European Institute for Gender Equality. 2020. Gender Equality Index 2020. https://eige.europa.eu/publications/gender-equality-index-2020-key-findings-eu
- European Institute for Gender Equality. 2023. Gender Equality Index. https://eige.europa.eu/gender-equality-index/2022/domain/work
- European Parliament. 2021. Proposal for a regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (artificial intelligence act) and amending certain union legislative acts. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A52021PC0206
- European Parliament. 2023. Artificial Intelligence Act: Amendments adopted by the European Parliament on 14 June 2023 on the proposal for a regulation of the European Parliament and of the Council on laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts. https://www.europarl.europa.eu/doceo/document/TA-9-2023-0236_EN.pdf
- European Parliament and Council of the European Union. 2004. Directive 2006/54/EC of the European Parliament and of the Council of 5 July 2006 on the implementation of the principle of equal opportunities and equal treatment of men and women in matters of employment and occupation (recast). https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32006L0054
- Christine L Exley and Judd B Kessler. 2022. The gender gap in self-promotion. The Quarterly Journal of Economics 137, 3 (2022), 1345–1381.
- Opensmile: the munich versatile and fast open-source audio feature extractor. In Proc. of the 18th ACM international conference on Multimedia. 1459–1462.
- Measuring Fairness Under Unawareness of Sensitive Attributes: A Quantification-Based Approach. J. Artif. Intell. Res. 76 (2023), 1117–1180. https://doi.org/10.1613/jair.1.14033
- Algorithmic fairness datasets: the story so far. Data Min. Knowl. Discov. 36, 6 (2022), 2074–2152. https://doi.org/10.1007/s10618-022-00854-z
- Tackling Documentation Debt: A Survey on Algorithmic Fairness Datasets. In Equity and Access in Algorithms, Mechanisms, and Optimization, EAAMO 2022, Arlington, VA, USA, October 6-9, 2022. ACM, 2:1–2:13. https://doi.org/10.1145/3551624.3555286
- Gender stereotype reinforcement: Measuring the gender bias conveyed by ranking algorithms. Inf. Process. Manag. 57, 6 (2020), 102377. https://doi.org/10.1016/j.ipm.2020.102377
- Lídia Farré. 2016. Parental leave policies and gender equality: a survey of the literature. Studies of Applied Economics 34, 1 (2016), 45–60.
- Certifying and Removing Disparate Impact. In Proc. of the 21th ACM SIGKDD International Conf. on Knowledge Discovery and Data Mining, Sydney, NSW, Australia, August 10-13, 2015, Longbing Cao, Chengqi Zhang, Thorsten Joachims, Geoffrey I. Webb, Dragos D. Margineantu, and Graham Williams (Eds.). ACM, 259–268. https://doi.org/10.1145/2783258.2783311
- Assessing job performance using brief self-report scales: The case of the individual work performance questionnaire. Revista de Psicología del Trabajo y de las Organizaciones 35, 3 (2019), 195–205.
- Gaps in Information Access in Social Networks?. In The World Wide Web Conf., WWW 2019, San Francisco, CA, USA, May 13-17, 2019, Ling Liu, Ryen W. White, Amin Mantrach, Fabrizio Silvestri, Julian J. McAuley, Ricardo Baeza-Yates, and Leila Zia (Eds.). ACM, 480–490. https://doi.org/10.1145/3308558.3313680
- On the Validity of Arrest as a Proxy for Offense: Race and the Likelihood of Arrest for Violent Crimes. In AIES ’21: AAAI/ACM Conf. on AI, Ethics, and Society, Virtual Event, USA, May 19-21, 2021, Marion Fourcade, Benjamin Kuipers, Seth Lazar, and Deirdre K. Mulligan (Eds.). ACM, 100–111. https://doi.org/10.1145/3461702.3462538
- Leeftijdsdiscriminatie in vacatureteksten: Een geautomatiseerde inhoudsanalyse naar verboden leeftijd-gerelateerd taalgebruik in vacatureteksten: Rapport in opdracht van het College voor de Rechten van de Mens.
- World Economic Forum. 2021. Human-Centred Artificial Intelligence for Human Resources: A Toolkit for Human Resources Professionals. https://www3.weforum.org/docs/WEF_Human_Centred_Artificial_Intelligence_for_Human_Resources_2021.pdf
- On the (im) possibility of fairness. arXiv preprint arXiv:1609.07236 (2016).
- Hidden Workers: Untapped Talent. Technical Report. Harvard Business School.
- Evidence that gendered wording in job advertisements exists and sustains gender inequality. Journal of personality and social psychology 101, 1 (2011), 109.
- Fairness-Aware Ranking in Search & Recommendation Systems with Application to LinkedIn Talent Search. In Proc. of the 25th ACM SIGKDD International Conf. on Knowledge Discovery & Data Mining, KDD 2019, Anchorage, AK, USA, August 4-8, 2019, Ankur Teredesai, Vipin Kumar, Ying Li, Rómer Rosales, Evimaria Terzi, and George Karypis (Eds.). ACM, 2221–2231. https://doi.org/10.1145/3292500.3330691
- Talent Search and Recommendation Systems at LinkedIn: Practical Challenges and Lessons Learned. In The 41st International ACM SIGIR Conf. on Research & Development in Information Retrieval, SIGIR 2018, Ann Arbor, MI, USA, July 08-12, 2018, Kevyn Collins-Thompson, Qiaozhu Mei, Brian D. Davison, Yiqun Liu, and Emine Yilmaz (Eds.). ACM, 1353–1354. https://doi.org/10.1145/3209978.3210205
- Measuring Fairness of Rankings under Noisy Sensitive Information. In FAccT ’22: 2022 ACM Conf. on Fairness, Accountability, and Transparency, Seoul, Republic of Korea, June 21 - 24, 2022. ACM, 2263–2279. https://doi.org/10.1145/3531146.3534641
- Automation bias: a systematic review of frequency, effect mediators, and mitigators. J. Am. Medical Informatics Assoc. 19, 1 (2012), 121–127. https://doi.org/10.1136/amiajnl-2011-000089
- A Taxonomy of Team-Assembly Systems: Understanding How People Use Technologies to Form Teams. Proc. ACM Hum. Comput. Interact. 4, CSCW2 (2020), 181:1–181:36. https://doi.org/10.1145/3415252
- Hila Gonen and Yoav Goldberg. 2019. Lipstick on a Pig: Debiasing Methods Cover up Systematic Gender Biases in Word Embeddings But do not Remove Them. In Proc. of the 2019 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2019, Minneapolis, MN, USA, June 2-7, 2019, Volume 1 (Long and Short Papers), Jill Burstein, Christy Doran, and Thamar Solorio (Eds.). Association for Computational Linguistics, 609–614. https://doi.org/10.18653/v1/n19-1061
- Carlos Gradín. 2021. Occupational gender segregation in post-apartheid South Africa. Feminist Economics 27, 3 (2021), 102–133.
- Career goals, salary expectations, and salary negotiation among male and female general surgery residents. JAMA surgery 154, 11 (2019), 1023–1029.
- The case for process fairness in learning: Feature selection for fair decision making. In NIPS symposium on machine learning and the law, Vol. 1. Barcelona, Spain, 11.
- Philipp Hacker. 2018. Teaching fairness to artificial intelligence: Existing and novel strategies against algorithmic discrimination under EU law. Common Market Law Review 55, 4 (2018).
- Bias in Online Freelance Marketplaces: Evidence from TaskRabbit and Fiverr. In Proc. of the 2017 ACM Conf. on Computer Supported Cooperative Work and Social Computing, CSCW 2017, Portland, OR, USA, February 25 - March 1, 2017, Charlotte P. Lee, Steven E. Poltrock, Louise Barkhuus, Marcos Borges, and Wendy A. Kellogg (Eds.). ACM, 1914–1933. https://doi.org/10.1145/2998181.2998327
- Equality of opportunity in supervised learning. Advances in neural information processing systems 29 (2016).
- Joyce C He and Sonia K Kang. 2021. Covering in cover letters: Gender and self-presentation in job applications. Academy of Management Journal 64, 4 (2021), 1097–1126.
- Stereotypes at work: Occupational stereotypes predict race and gender segregation in the workforce. Journal of Vocational Behavior 115 (2019), 103318.
- Race, Glass Ceilings, and Lower Pay for Equal Work. Swedish House of Finance Research Paper 21-09 (2022).
- Madeline E Heilman. 2012. Gender stereotypes and workplace bias. Research in organizational Behavior 32 (2012), 113–135.
- Léo Hemamou and William Coleman. 2022. Delivering Fairness in Human Resources AI: Mutual Information to the Rescue. In Proc. of the 2nd Conf. of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conf. on Natural Language Processing, AACL/IJCNLP 2022 - Volume 1: Long Papers, Online Only, November 20-23, 2022, Yulan He, Heng Ji, Yang Liu, Sujian Li, Chia-Hui Chang, Soujanya Poria, Chenghua Lin, Wray L. Buntine, Maria Liakata, Hanqi Yan, Zonghan Yan, Sebastian Ruder, Xiaojun Wan, Miguel Arana-Catania, Zhongyu Wei, Hen-Hsen Huang, Jheng-Long Wu, Min-Yuh Day, Pengfei Liu, and Ruifeng Xu (Eds.). Association for Computational Linguistics, 867–882. https://aclanthology.org/2022.aacl-main.64
- Don’t Judge Me by My Face: An Indirect Adversarial Approach to Remove Sensitive Information From Multimodal Neural Representation in Asynchronous Job Video Interviews. In 9th International Conf. on Affective Computing and Intelligent Interaction, ACII 2021, Nara, Japan, September 28 - Oct. 1, 2021. IEEE, 1–8. https://doi.org/10.1109/ACII52823.2021.9597443
- Bargaining while Black: The role of race in salary negotiations. Journal of Applied Psychology 104, 4 (2019), 581.
- Iñigo Hernandez-Arenaz and Nagore Iriberri. 2019. A review of gender differences in negotiation. Oxford Research Encyclopedia of Economics and Finance (2019).
- On the Moral Justification of Statistical Parity. In FAccT ’21: 2021 ACM Conf. on Fairness, Accountability, and Transparency, Virtual Event / Toronto, Canada, March 3-10, 2021, Madeleine Clare Elish, William Isaac, and Richard S. Zemel (Eds.). ACM, 747–757. https://doi.org/10.1145/3442188.3445936
- HireVue. 2022. Explainability Statement. https://hirevue-api.dev-directory.com/wp-content/uploads/2022/04/HV_AI_Short-Form_Explainability_1pager.pdf
- Will artificial intelligence take over human resources recruitment and selection. Network Intelligence Studies 7, 13 (2019), 21–30.
- Discrimination for the Sake of Fairness: Fairness by Design and Its Legal Framework. Available at SSRN 3773766 (2021).
- Social networking information and pre-employment background check: mediating effects of perceived benefit and organizational branding. International Journal of Manpower (2021).
- Balancing Gender Bias in Job Advertisements With Text-Level Bias Mitigation. Frontiers Big Data 5 (2022), 805713. https://doi.org/10.3389/fdata.2022.805713
- David J Hughes and Mark Batey. 2017. Using personality questionnaires for selection. The Wiley Blackwell handbook of the psychology of recruitment, selection and employee retention (2017), 151–181.
- Ben Hutchinson and Margaret Mitchell. 2019. 50 Years of Test (Un)fairness: Lessons for Machine Learning. In Proc. of the Conf. on Fairness, Accountability, and Transparency, FAT* 2019, Atlanta, GA, USA, January 29-31, 2019, danah boyd and Jamie H. Morgenstern (Eds.). ACM, 49–58. https://doi.org/10.1145/3287560.3287600
- Illinois General Assembly. 2020. Artificial Intelligence Video Interview Act, 820 ILCS 42. https://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=4015&ChapterID=68
- Auditing for Discrimination in Algorithms Delivering Job Ads. In WWW ’21: The Web Conf. 2021, Virtual Event / Ljubljana, Slovenia, April 19-23, 2021, Jure Leskovec, Marko Grobelnik, Marc Najork, Jie Tang, and Leila Zia (Eds.). ACM / IW3C2, 3767–3778. https://doi.org/10.1145/3442381.3450077
- Abigail Z. Jacobs and Hanna Wallach. 2021. Measurement and Fairness. In Proc. of the 2021 ACM Conf. on Fairness, Accountability, and Transparency (Virtual Event, Canada) (FAccT ’21). Association for Computing Machinery, New York, NY, USA, 375–385. https://doi.org/10.1145/3442188.3445901
- The acquisition of gender stereotypes about intellectual ability: Intersections with race. Journal of Social Issues 75, 4 (2019), 1192–1215.
- The global landscape of AI ethics guidelines. Nature Machine Intelligence 1, 9 (01 Sep 2019), 389–399. https://doi.org/10.1038/s42256-019-0088-2
- Jobvite. 2021. 2021 Recruiter Nation Report. Technical Report. https://www.jobvite.com/lp/2021-recruiter-nation-report/
- Marc Juarez and Aleksandra Korolova. 2023. “You Can’t Fix What You Can’t Measure”: Privately Measuring Demographic Performance Disparities in Federated Learning. In Workshop on Algorithmic Fairness through the Lens of Causality and Privacy. PMLR, 67–85.
- Noise: a flaw in human judgment. Hachette UK.
- Mitchel Kappen and Marnix Naber. 2021. Objective and bias-free measures of candidate motivation during job applications. Scientific Reports 11, 1 (2021), 21254.
- Navroop Kaur and Sandeep K. Sood. 2017. A Game Theoretic Approach for an IoT-Based Automated Employee Performance Evaluation. IEEE Syst. J. 11, 3 (2017), 1385–1394. https://doi.org/10.1109/JSYST.2015.2469102
- Multi-modal Score Fusion and Decision Trees for Explainable Automatic Job Candidate Screening from Video CVs. In 2017 IEEE Conf. on Computer Vision and Pattern Recognition Workshops, CVPR Workshops 2017, Honolulu, HI, USA, July 21-26, 2017. IEEE Computer Society, 1651–1659. https://doi.org/10.1109/CVPRW.2017.210
- Nicolas Kayser-Bril. 2023. LinkedIn automatically rates “out-of-country” candidates as “not fit” in job applications. Technical Report. AlgorithmWatch.
- Blind Justice: Fairness with Encrypted Sensitive Attributes. In Proc. of the 35th International Conf. on Machine Learning, ICML 2018, Stockholmsmässan, Stockholm, Sweden, July 10-15, 2018 (Proc. of Machine Learning Research, Vol. 80), Jennifer G. Dy and Andreas Krause (Eds.). PMLR, 2635–2644. http://proceedings.mlr.press/v80/kilbertus18a.html
- Pauline T Kim. 2016. Data-driven discrimination at work. Wm. & Mary L. Rev. 58 (2016), 857.
- Pauline T Kim. 2022. Race-aware algorithms: Fairness, nondiscrimination and affirmative action. Cal. L. Rev. 110 (2022), 1539.
- Nicholas J Klein and Michael J Smart. 2017. Car today, gone tomorrow: The ephemeral car in low-income, immigrant and minority families. Transportation 44 (2017), 495–510.
- Systemic discrimination among large US employers. The Quarterly Journal of Economics 137, 4 (2022), 1963–2036.
- Highly accurate, but still discriminatory: A fairness evaluation of algorithmic video analysis in the recruitment context. Business & Information Systems Engineering 63 (2021), 39–54.
- Alina Köchling and Marius Claus Wehner. 2020. Discriminated by an algorithm: a systematic review of discrimination and fairness by algorithmic decision-making in the context of HR recruitment and HR development. Business Research 13, 3 (2020), 795–848.
- BIAS Word inventory for work and employment diversity,(in) equality and inclusivity (Version 1.0). SocArXiv (2022).
- Margaret Bull Kovera. 2019. Racial disparities in the criminal justice system: Prevalence, causes, and a search for solutions. Journal of Social Issues 75, 4 (2019), 1139–1164.
- Kurt Kraiger and J Kevin Ford. 1985. A meta-analysis of ratee race effects in performance ratings. Journal of applied psychology 70, 1 (1985), 56.
- Jasper Krommendijk and Frederik Zuiderveen Borgesius. 2023. EU Law Analysis ‘How to read EU legislation?’. http://eulawanalysis.blogspot.com/p/how-to-read-eu-legislation.html
- Fairness of recommender systems in the recruitment domain: an analysis from technical and legal perspectives. Frontiers in Big Data 6 (2023).
- Astrid Kunze and Amalia R Miller. 2017. Women helping women? Evidence from private sector data on workplace hierarchies. Review of Economics and Statistics 99, 5 (2017), 769–775.
- Check the box! How to deal with automation bias in AI-based personnel selection. Frontiers in Psychology 14 (2023), 1118723.
- Digitizing and disclosing personal data: The proliferation of state criminal records on the internet. Law & Social Inquiry 46, 3 (2021), 635–665.
- Anja Lambrecht and Catherine Tucker. 2019. Algorithmic Bias? An Empirical Study of Apparent Gender-Based Discrimination in the Display of STEM Career Ads. Manag. Sci. 65, 7 (2019), 2966–2981. https://doi.org/10.1287/mnsc.2018.3093
- Applicants’ fairness perceptions of algorithm-driven hiring procedures. Journal of Business Ethics (2023), 1–26.
- Hiring women into senior leadership positions is associated with a reduction in gender stereotypes in organizational language. Proc. of the National Academy of Sciences 119, 9 (2022), e2026443119.
- Gender differences in job search: Trading off commute against wage. The Quarterly Journal of Economics 136, 1 (2021), 381–426.
- Yeonjung Lee and Fengyan Tang. 2015. More caregiving, less working: Caregiving roles and gender difference. Journal of Applied Gerontology 34, 4 (2015), 465–483.
- Andreas Leibbrandt and John A List. 2015. Do women avoid salary negotiations? Evidence from a large-scale natural field experiment. Management Science 61, 9 (2015), 2016–2024.
- Are Humans Biased in Assessment of Video Interviews?. In Adjunct of the 2019 International Conf. on Multimodal Interaction, ICMI 2019, Suzhou, China, October 14-18, 2019. ACM, 9:1–9:5. https://doi.org/10.1145/3351529.3360653
- Eve A Levin. 2018. Gender-normed physical-ability tests under Title VII. Columbia Law Review 118, 2 (2018), 567–604.
- Algorithmic hiring in practice: Recruiter and HR Professional’s perspectives on AI use in hiring. In Proc. of the 2021 AAAI/ACM Conf. on AI, Ethics, and Society. 166–176.
- A Survey on Fairness in Large Language Models. arXiv:2308.10149 [cs.CL]
- Trustworthy AI: A Computational Perspective. ACM Trans. Intell. Syst. Technol. 14, 1 (2023), 4:1–4:59. https://doi.org/10.1145/3546872
- Prediction of Employee Promotion Based on Personal Basic Features and Post Features. In Proc. of the International Conf. on Data Processing and Applications, ICDPA 2018, Guangdong, China, May 12-14, 2018. ACM, 5–10. https://doi.org/10.1145/3224207.3224210
- May the bots be with you! Delivering HR cost-effectiveness and individualised employee experiences in an MNE. The International Journal of Human Resource Management 33, 6 (2022), 1148–1178.
- Racial and ethnic disparities in who receives unemployment benefits during COVID-19. SN Business & Economics 2, 8 (2022), 102.
- Fairness in Regression – Analysing a Job Candidates Ranking System. In INFORMATIK 2022, Daniel Demmler, Daniel Krupka, and Hannes Federrath (Eds.). Gesellschaft für Informatik, Bonn, 1275–1285. https://doi.org/10.18420/inf2022_109
- A challenge-based survey of e-recruitment recommendation systems. arXiv preprint arXiv:2209.05112 (2022).
- David A Matsa and Amalia R Miller. 2011. Chipping away at the glass ceiling: Gender spillovers in corporate leadership. American Economic Review 101, 3 (2011), 635–639.
- Roy Maurer. 2021. HireVue Discontinues Facial Analysis Screening. https://www.shrm.org/resourcesandtools/hr-topics/talent-acquisition/pages/hirevue-discontinues-facial-analysis-screening.aspx
- Fine-Grained Job Salary Benchmarking with a Nonparametric Dirichlet Process-Based Latent Factor Model. INFORMS J. Comput. 34, 5 (2022), 2443–2463. https://doi.org/10.1287/ijoc.2022.1182
- Alex Miller. 2018. Want Less-Biased Decisions? Use Algorithms. https://hbr.org/2018/07/want-less-biased-decisions-use-algorithms
- Algorithmic fairness: Choices, assumptions, and definitions. Annual Review of Statistics and Its Application 8 (2021), 141–163.
- Tara Sophia Mohr. 2014. Why Women Don’t Apply for Jobs Unless They’re 100% Qualified. https://hbr.org/2014/08/why-women-dont-apply-for-jobs-unless-theyre-100-qualified
- SensitiveNets: Learning agnostic representations with application to face images. IEEE Transactions on Pattern Analysis and Machine Intelligence 43, 6 (2020), 2158–2164.
- Implicit stereotyping and medical decisions: unconscious stereotype activation in practitioners’ thoughts about African Americans. American Journal of Public Health 102, 5 (2012), 996–1001.
- Corinne A Moss-Racusin and Laurie A Rudman. 2010. Disruptions in women’s self-promotion: The backlash avoidance model. Psychology of Women Quarterly 34, 2 (2010), 186–202.
- Dena F Mujtaba and Nihar R Mahapatra. 2021. Multi-Task Deep Neural Networks for Multimodal Personality Trait Prediction. In 2021 International Conf. on Computational Science and Computational Intelligence (CSCI). IEEE, 85–91.
- Ann L Mullen. 2009. Elite destinations: Pathways to attending an Ivy League university. British Journal of Sociology of Education 30, 1 (2009), 15–27.
- Achieving Fairness via Post-Processing in Web-Scale Recommender Systems. In FAccT ’22: 2022 ACM Conf. on Fairness, Accountability, and Transparency, Seoul, Republic of Korea, June 21 - 24, 2022. ACM, 715–725. https://doi.org/10.1145/3531146.3533136
- New York City Council. 2021. Automated employment decision tools, 144. https://legistar.council.nyc.gov/LegislationDetail.aspx?ID=4344524&GUID=B051915D-A9AC-451E-81F8-6596032FA3F9&Options=Advanced&Search
- Making AI explainable in the global south: A systematic review. In ACM SIGCAS/SIGCHI Conf. on Computing and Sustainable Societies (COMPASS). 439–452.
- Oracle. 2023. Welcome to Oracle AI Apps for Talent Management. https://docs.oracle.com/en/cloud/saas/talent-management/22d/faimh/welcome-to-ai-apps-for-talent-management.html#u30010414
- ORCAA. 2020. Description of Algorithmic Audit: Pre-built Assessments. Technical Report. https://techinquiry.org/HireVue-ORCAA.pdf
- Training language models to follow instructions with human feedback. Advances in Neural Information Processing Systems 35 (2022), 27730–27744.
- Emir Ozeren. 2014. Sexual orientation discrimination in the workplace: A systematic review of literature. Procedia-Social and Behavioral Sciences 109 (2014), 1203–1215.
- Nudging toward diversity: Applying behavioral design to faculty hiring. Review of Educational Research 90, 3 (2020), 311–348.
- Religious identity and workplace discrimination: A national survey of American Muslim physicians. AJOB Empirical Bioethics 7, 3 (2016), 149–159.
- Prasanna Parasurama and João Sedoc. 2021. Degendering Resumes for Fair Algorithmic Resume Screening. arXiv preprint arXiv:2112.08910 (2021).
- Prasanna Parasurama and João Sedoc. 2022. Gendered information in resumes and its role in algorithmic and human hiring bias. In Academy of Management Proc., Vol. 2022. Academy of Management Briarcliff Manor, NY 10510, 17133.
- Gendered Information in Resumes and Hiring Bias: A Predictive Modeling Approach. Available at SSRN 4074976 (2022).
- The Synthetic data vault. In IEEE International Conference on Data Science and Advanced Analytics (DSAA). 399–410. https://doi.org/10.1109/DSAA.2016.49
- Text mining of industry 4.0 job advertisements. Int. J. Inf. Manag. 50 (2020), 416–431. https://doi.org/10.1016/j.ijinfomgt.2019.07.014
- Bias in Multimodal AI: Testbed for Fair Automatic Recruitment. In 2020 IEEE/CVF Conf. on Computer Vision and Pattern Recognition, CVPR Workshops 2020, Seattle, WA, USA, June 14-19, 2020. Computer Vision Foundation / IEEE, 129–137. https://doi.org/10.1109/CVPRW50498.2020.00022
- What You See Is What You Get? The Impact of Representation Criteria on Human Bias in Hiring. In Proc. of the Seventh AAAI Conf. on Human Computation and Crowdsourcing, HCOMP 2019, Stevenson, WA, USA, October 28-30, 2019, Edith Law and Jennifer Wortman Vaughan (Eds.). AAAI Press, 125–134. https://ojs.aaai.org/index.php/HCOMP/article/view/5281
- Anders Persson. 2016. Implicit Bias in Predictive Data Profiling Within Recruitments. In Privacy and Identity Management. Facing up to Next Steps - 11th IFIP WG 9.2, 9.5, 9.6/11.7, 11.4, 11.6/SIG 9.2.2 International Summer School, Karlstad, Sweden, August 21-26, 2016, Revised Selected Papers (IFIP Advances in Information and Communication Technology, Vol. 498), Anja Lehmann, Diane Whitehouse, Simone Fischer-Hübner, Lothar Fritsch, and Charles D. Raab (Eds.). 212–230. https://doi.org/10.1007/978-3-319-55783-0_15
- Employees recruitment: A prescriptive analytics approach via machine learning and mathematical programming. Decis. Support Syst. 134 (2020), 113290. https://doi.org/10.1016/j.dss.2020.113290
- A large-scale analysis of racial disparities in police stops across the United States. Nature Human Behaviour 4, 7 (2020), 736–745.
- Rohit Punnoose and Pankaj Ajit. 2016. Prediction of Employee Turnover in Organizations using Machine Learning Algorithms. International Journal of Advanced Research in Artificial Intelligence 5, 9 (2016). https://doi.org/10.14569/IJARAI.2016.050904
- Implicit sources of bias in employment interview judgments and decisions. Organizational Behavior and Human Decision Processes 101, 2 (2006), 152–167.
- PwC. 2017. Artificial Intelligence in HR: a No-brainer. https://www.pwc.nl/nl/assets/documents/artificial-intelligence-in-hr-a-no-brainer.pdf
- Towards Automatic Job Description Generation With Capability-Aware Neural Networks. IEEE Trans. Knowl. Data Eng. 35, 5 (2023), 5341–5355. https://doi.org/10.1109/TKDE.2022.3145396
- Meta-analysis of field experiments shows no change in racial discrimination in hiring over time. Proc. of the National Academy of Sciences 114, 41 (2017), 10870–10875.
- Mitigating bias in algorithmic hiring: evaluating claims and practices. In FAT* ’20: Conf. on Fairness, Accountability, and Transparency, Barcelona, Spain, January 27-30, 2020, Mireille Hildebrandt, Carlos Castillo, L. Elisa Celis, Salvatore Ruggieri, Linnet Taylor, and Gabriela Zanfir-Fortuna (Eds.). ACM, 469–481. https://doi.org/10.1145/3351095.3372828
- Fairness in Language Models Beyond English: Gaps and Challenges. In Findings of the Association for Computational Linguistics: EACL 2023. Association for Computational Linguistics, Dubrovnik, Croatia, 2106–2119. https://aclanthology.org/2023.findings-eacl.157
- Blame it on hip-hop: Anti-rap attitudes as a proxy for prejudice. Group Processes & Intergroup Relations 12, 3 (2009), 361–380.
- Cecil R Reynolds and Lisa A Suzuki. 2012. Bias in psychological assessment: An empirical review and recommendations. Handbook of Psychology, Second Edition 10 (2012).
- Resume Format, LinkedIn URLs and Other Unexpected Influences on AI Personality Prediction in Hiring: Results of an Audit. In AIES ’22: AAAI/ACM Conference on AI, Ethics, and Society, Oxford, United Kingdom, May 19 - 21, 2021, Vincent Conitzer, John Tasioulas, Matthias Scheutz, Ryan Calo, Martina Mara, and Annette Zimmermann (Eds.). ACM, 572–587. https://doi.org/10.1145/3514094.3534189
- An external stability audit framework to test the validity of personality prediction in AI hiring. Data Mining and Knowledge Discovery 36, 6 (2022), 2153–2193.
- Peter A Riach and Judith Rich. 2002. Field experiments of discrimination in the market place. The Economic Journal 112, 483 (2002), F480–F518.
- Judith Rich. 2014. What do field experiments of discrimination in markets tell us? A meta-analysis of studies conducted since 2000. (2014).
- Approaches to Improve Fairness when Deploying AI-based Algorithms in Hiring - Using a Systematic Literature Review to Guide Future Research. In 56th Hawaii International Conf. on System Sciences, HICSS 2023, Maui, Hawaii, USA, January 3-6, 2023, Tung X. Bui (Ed.). ScholarSpace, 216–225. https://hdl.handle.net/10125/102654
- Lauren A Rivera. 2011. Ivies, extracurriculars, and exclusion: Elite employers’ use of educational credentials. Research in Social Stratification and Mobility 29, 1 (2011), 71–90.
- Lauren A Rivera. 2012. Hiring as cultural matching: The case of elite professional service firms. American Sociological Review 77, 6 (2012), 999–1022.
- David Robotham and Richard Jubb. 1996. Competences: measuring the unmeasurable. Management Development Review 9, 5 (1996), 25–29.
- Artificial Intelligence Ethics: An Inclusive Global Discourse? arXiv preprint arXiv:2108.09959 (2021).
- Closing the Gender Wage Gap: Adversarial Fairness in Job Recommendation. (September 2022), 10 pages.
- Mary-Ann Russon. 2020. Uber sued by drivers over ‘automated robo-firing’. BBC News 26 (2020).
- The Unequal Opportunities of Large Language Models: Examining Demographic Biases in Job Recommendations by ChatGPT and LLaMA. In Proceedings of the 3rd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (EAAMO ’23). Association for Computing Machinery. https://doi.org/10.1145/3617694.3623257
- Job-related and psychological effects of sexual harassment in the workplace: empirical evidence from two organizations. Journal of Applied Psychology 82, 3 (1997), 401.
- Frederike Scholz. 2020. Taken for granted: ableist norms embedded in the design of online recruitment practices. The Palgrave Handbook of Disability at Work (2020), 451–469.
- Almudena Sevilla and Sarah Smith. 2020. Baby steps: The gender division of childcare during the COVID-19 pandemic. Oxford Review of Economic Policy 36, Supplement_1 (2020), S169–S186.
- Michael A Shields and Stephen Wheatley Price. 2002. Racial harassment, job satisfaction and intentions to quit: evidence from the British nursing profession. Economica 69, 274 (2002), 295–326.
- Jim Sidanius and Marie Crane. 1989. Job evaluation and gender: The case of university faculty. Journal of Applied Social Psychology 19, 2 (1989), 174–197.
- Grading video interviews with fairness considerations. arXiv preprint arXiv:2007.05461 (2020).
- Clea Skopeliti. 2023. ‘I feel constantly watched’: the employees working under surveillance. https://www.theguardian.com/money/2023/may/30/i-feel-constantly-watched-employees-working-under-surveillance-monitorig-software-productivity
- The power of language: Gender, status, and agency in performance evaluations. Sex Roles 80 (2019), 159–171.
- Interrupting the usual: Successful strategies for hiring diverse faculty. The Journal of Higher Education 75, 2 (2004), 133–160.
- Lawrence B Solum. 2004. Procedural justice. S. CAl. l. reV. 78 (2004), 181.
- The Promise and The Peril: Artificial Intelligence and Employment Discrimination. U. Miami L. Rev. 77 (2022), 1.
- Karen Sparck Jones. 1972. A statistical interpretation of term specificity and its application in retrieval. Journal of Documentation 28, 1 (1972), 11–21.
- Facial expression analysis with AFFDEX and FACET: A validation study. Behavior research methods 50 (2018), 1446–1460.
- Algorithmic Glass Ceiling in Social Networks: The effects of social recommendations on network diversity. In Proc. of the 2018 World Wide Web Conf. on World Wide Web, WWW 2018, Lyon, France, April 23-27, 2018, Pierre-Antoine Champin, Fabien Gandon, Mounia Lalmas, and Panagiotis G. Ipeirotis (Eds.). ACM, 923–932. https://doi.org/10.1145/3178876.3186140
- Does Fair Ranking Improve Minority Outcomes? Understanding the Interplay of Human and Algorithmic Biases in Online Hiring. In AIES ’21: AAAI/ACM Conf. on AI, Ethics, and Society, Virtual Event, USA, May 19-21, 2021, Marion Fourcade, Benjamin Kuipers, Seth Lazar, and Deirdre K. Mulligan (Eds.). ACM, 989–999. https://doi.org/10.1145/3461702.3462602
- Rachael Tatman. 2017. Gender and Dialect Bias in YouTube’s Automatic Captions. In Proc. of the First ACL Workshop on Ethics in Natural Language Processing, EthNLP@EACL, Valencia, Spain, April 4, 2017, Dirk Hovy, Shannon L. Spruit, Margaret Mitchell, Emily M. Bender, Michael Strube, and Hanna M. Wallach (Eds.). Association for Computational Linguistics, 53–59. https://doi.org/10.18653/v1/w17-1606
- Gender differences and bias in open source: pull request acceptance of women versus men. PeerJ Comput. Sci. 3 (2017), e111. https://doi.org/10.7717/peerj-cs.111
- Rebbeca Tesfai and Kevin JA Thomas. 2020. Dimensions of inequality: Black immigrants’ occupational segregation in the United States. Sociology of Race and Ethnicity 6, 1 (2020), 1–21.
- Kerri A Thompson. 2020. Countenancing Employment Discrimination: Facial Recognition in Background Checks. Tex. A&M L. Rev. 8 (2020), 63.
- Nicholas Tilmes. 2022. Disability, fairness, and algorithmic bias in AI recruitment. Ethics Inf. Technol. 24, 2 (2022), 21. https://doi.org/10.1007/s10676-022-09633-2
- Considerations for AI fairness for people with disabilities. AI Matters 5, 3 (2019), 40–63. https://doi.org/10.1145/3362077.3362086
- James M Tyler and Jennifer Dane McCullough. 2009. Violating prescriptive stereotypes on job resumes: A self-presentational perspective. Management Communication Quarterly 23, 2 (2009), 272–287.
- UNDP - United Nations Development Programme. 2023. Breaking Down Gender Biases: Shifting social norms towards gender equality. https://hdr.undp.org/system/files/documents/hdp-document/gsni202302pdf.pdf
- U.S. Supreme Court. 1971. Griggs v. Duke Power Co., 401 U.S. 424. https://supreme.justia.com/cases/federal/us/401/424/
- U.S. Supreme Court. 1973. McDonnell Douglas Corp. v. Green, 411 U.S. 792. https://supreme.justia.com/cases/federal/us/411/792/
- U.S. Supreme Court. 1989. Price Waterhouse v. Hopkins, 490 U.S. 228. https://supreme.justia.com/cases/federal/us/490/228/
- U.S. Supreme Court. 2009. Ricci v. DeStefano, 557 U.S. 557. https://supreme.justia.com/cases/federal/us/557/557/
- Chris Vallance. 2023. TUC: Government failing to protect workers from AI. https://www.bbc.com/news/technology-65301630
- Marvin Van Bekkum and Frederik Zuiderveen Borgesius. 2023. Using sensitive data to prevent discrimination by artificial intelligence: Does the GDPR need a new exception? Computer Law & Security Review 48 (2023), 105770.
- Hiring Algorithms: An Ethnography of Fairness in Practice. In Proc. of the 40th International Conf. on Information Systems, ICIS 2019, Munich, Germany, December 15-18, 2019, Helmut Krcmar, Jane Fedorowicz, Wai Fong Boh, Jan Marco Leimeister, and Sunil Wattal (Eds.). Association for Information Systems. https://aisel.aisnet.org/icis2019/future_of_work/future_work/6
- Improving Fairness Assessments with Synthetic Data: a Practical Use Case with a Recommender System for Human Resources. In CompJobs ’22: The First International Workshop on Computational Jobs Marketplace (Online). 5 pages.
- Sriram Vasudevan and Krishnaram Kenthapadi. 2020. LiFT: A Scalable Framework for Measuring Fairness in ML Applications. In CIKM ’20: The 29th ACM International Conf. on Information and Knowledge Management, Virtual Event, Ireland, October 19-23, 2020, Mathieu d’Aquin, Stefan Dietze, Claudia Hauff, Edward Curry, and Philippe Cudré-Mauroux (Eds.). ACM, 2773–2780. https://doi.org/10.1145/3340531.3412705
- Giridhari Venkatadri and Alan Mislove. 2020. On the Potential for Discrimination via Composition. In IMC ’20: ACM Internet Measurement Conf., Virtual Event, USA, October 27-29, 2020. ACM, 333–344. https://doi.org/10.1145/3419394.3423641
- Pranshu Verma. 2023. AI is starting to pick who gets laid off. https://www.washingtonpost.com/technology/2023/02/20/layoff-algorithms/
- Why fairness cannot be automated: Bridging the gap between EU non-discrimination law and AI. Computer Law & Security Review 41 (2021), 105567.
- Sean Waite. 2021. Should I stay or should I go? Employment discrimination and workplace harassment against transgender and other minority employees in Canada’s federal public service. Journal of Homosexuality 68, 11 (2021), 1833–1859.
- Joseph Walker. 2012. Meet the New Boss: Big Data.
- Against Predictive Optimization: On the Legitimacy of Decision-Making Algorithms that Optimize Predictive Accuracy. Available at SSRN (2022).
- Yu Wang and Tyler Derr. 2022. Degree-Related Bias in Link Prediction. In IEEE International Conference on Data Mining Workshops, ICDM 2022 - Workshops, Orlando, FL, USA, November 28 - Dec. 1, 2022, K. Selçuk Candan, Thang N. Dinh, My T. Thai, and Takashi Washio (Eds.). IEEE, 757–758. https://doi.org/10.1109/ICDMW58026.2022.00103
- A Survey on the Fairness of Recommender Systems. ACM Trans. Inf. Syst. 41, 3 (2023), 52:1–52:43. https://doi.org/10.1145/3547333
- Multi-Target Multiplicity: Flexibility and Fairness in Target Specification under Resource Constraints. In Proc. of the 2023 ACM Conf. on Fairness, Accountability, and Transparency, FAccT 2023, Chicago, IL, USA, June 12-15, 2023. ACM, 297–311. https://doi.org/10.1145/3593013.3593998
- Amy L Wax. 2011. Disparate Impact Realism. Wm. & Mary L. Rev. 53 (2011), 621.
- Fairlearn: Assessing and Improving Fairness of AI Systems. Journal of Machine Learning Research 24 (2023), 8 pages. http://jmlr.org/papers/v24/23-0389.html
- Doris Weichselbaumer and Rudolf Winter-Ebmer. 2005. A meta-analysis of the international gender wage gap. Journal of Economic Surveys 19, 3 (2005), 479–511.
- Motion Tracker: Camera-based monitoring of bodily movements using motion silhouettes. PloS one 10, 6 (2015), e0130293.
- Building and Auditing Fair Algorithms: A Case Study in Candidate Screening. In FAccT ’21: 2021 ACM Conf. on Fairness, Accountability, and Transparency, Virtual Event / Toronto, Canada, March 3-10, 2021, Madeleine Clare Elish, William Isaac, and Richard S. Zemel (Eds.). ACM, 666–677. https://doi.org/10.1145/3442188.3445928
- Claes Wohlin. 2014. Guidelines for snowballing in systematic literature studies and a replication in software engineering. In Proc. of the 18th international conference on evaluation and assessment in software engineering. 1–10.
- The presence of ethnic minority and disabled men in feminised work: Intersectionality, vertical segregation and the glass escalator. Sex Roles 72 (2015), 277–293.
- Alison T Wynn and Shelley J Correll. 2018. Puncturing the pipeline: Do technology companies alienate women in recruiting sessions? Social studies of science 48, 1 (2018), 149–164.
- Algorithmic Decision Making with Conditional Fairness. In KDD ’20: The 26th ACM SIGKDD Conf. on Knowledge Discovery and Data Mining, Virtual Event, CA, USA, August 23-27, 2020. ACM, 2125–2135. https://doi.org/10.1145/3394486.3403263
- Mitigating Biases in Multimodal Personality Assessment. In ICMI ’20: International Conf. on Multimodal Interaction, Virtual Event, The Netherlands, October 25-29, 2020. ACM, 361–369. https://doi.org/10.1145/3382507.3418889
- Maya Yaneva. 2018. Employee satisfaction vs. employee engagement vs. employee NPS. European Journal of Economics and Business Studies 4, 1 (2018), 221–227.
- Fairness Constraints: Mechanisms for Fair Classification. In AISTATS 2017, 20-22 April 2017, Fort Lauderdale, FL, USA (Proc. of Machine Learning Research, Vol. 54). PMLR, 962–970. http://proceedings.mlr.press/v54/zafar17a.html
- Matching code and law: achieving algorithmic fairness with optimal transport. Data Min. Knowl. Discov. 34, 1 (2020), 163–200. https://doi.org/10.1007/s10618-019-00658-8
- Fairness in Ranking, Part I: Score-Based Ranking. ACM Comput. Surv. 55, 6 (2023), 118:1–118:36. https://doi.org/10.1145/3533379
- Lixuan Zhang and Christopher Yencha. 2022. Examining perceptions towards hiring algorithms. Technology in Society 68 (2022), 101848.
- Shuo Zhang and Peter Kuhn. 2022. Understanding Algorithmic Bias in Job Recommender Systems: An Audit Study Approach. (2022).
- Are Male Candidates Better than Females? Debiasing BERT Resume Retrieval System. In IEEE International Conf. on Systems, Man, and Cybernetics, SMC 2022, Prague, Czech Republic, October 9-12, 2022. IEEE, 616–621. https://doi.org/10.1109/SMC53654.2022.9945184
- Yiguang Zhang and Augustin Chaintreau. 2021. Unequal Opportunities in Multi-hop Referral Programs. arXiv preprint arXiv:2112.00269 (2021).
- Dave Zielinski. 2023. Should Algorithms Make Layoff Decisions? https://www.shrm.org/hr-today/news/hr-magazine/summer-2023/pages/should-algorithms-make-layoff-decisions-.aspx
- Indre Zliobaite. 2015. A survey on measuring indirect discrimination in machine learning. arXiv preprint arXiv:1511.00148 (2015).
Authors: Alessandro Fabris, Nina Baranowska, Matthew J. Dennis, David Graus, Philipp Hacker, Jorge Saldivar, Frederik Zuiderveen Borgesius, and Asia J. Biega.