
Fairness Index Measures to Evaluate Bias in Biometric Recognition (2306.10919v1)

Published 19 Jun 2023 in cs.CV, cs.CY, and cs.LG

Abstract: The demographic disparity of biometric systems has led to serious concerns regarding their societal impact as well as the applicability of such systems in private and public domains. A quantitative evaluation of demographic fairness is an important step towards understanding, assessing, and mitigating demographic bias in biometric applications. While the few existing fairness measures are based on post-decision data (such as verification accuracy) of biometric systems, we discuss how pre-decision data (score distributions) provide useful insights towards demographic fairness. In this paper, we introduce multiple measures, based on the statistical characteristics of score distributions, for evaluating the demographic fairness of a generic biometric verification system. We also propose different variants of each fairness measure depending on how the contributions from the constituent demographic groups are combined into the final measure. In each case, the behavior of the measure is illustrated numerically and graphically on synthetic data. The demographic imbalance in benchmarking datasets is often overlooked during fairness assessment; we provide a novel weighting strategy that reduces the effect of such imbalance through a non-linear function of the sample sizes of the demographic groups. The proposed measures are independent of the biometric modality and are thus applicable across commonly used biometric modalities (e.g., face, fingerprint, etc.).
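The abstract's central idea, assessing fairness from pre-decision score distributions rather than post-decision accuracies, can be illustrated with a small sketch. This is not the paper's actual formulation: the measure below (one minus the worst pairwise Jensen-Shannon divergence between per-group genuine score histograms) and the square-root sample-size weighting are hypothetical stand-ins chosen purely for illustration.

```python
import math

def histogram(scores, bins=20, lo=0.0, hi=1.0):
    """Normalized histogram of similarity scores over [lo, hi]."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for s in scores:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    total = sum(counts)
    return [c / total for c in counts]

def kl_divergence(p, q):
    """Kullback-Leibler divergence in bits; 0*log(0) terms are skipped."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence, bounded in [0, 1] when using base 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def fairness_index(group_scores, bins=20):
    """Illustrative pre-decision measure: 1 minus the largest pairwise JS
    divergence between per-group genuine score histograms. A value of 1.0
    means the score distributions of all demographic groups coincide."""
    hists = {g: histogram(s, bins) for g, s in group_scores.items()}
    names = list(hists)
    worst = max(js_divergence(hists[a], hists[b])
                for i, a in enumerate(names) for b in names[i + 1:])
    return 1.0 - worst

def sample_size_weights(group_sizes):
    """Hypothetical non-linear (square-root) weighting that damps the
    influence of over-represented demographic groups in a benchmark."""
    raw = {g: math.sqrt(n) for g, n in group_sizes.items()}
    total = sum(raw.values())
    return {g: w / total for g, w in raw.items()}

# Hypothetical genuine-score samples for two demographic groups.
scores = {
    "group_A": [0.82, 0.91, 0.78, 0.88, 0.95, 0.84],
    "group_B": [0.62, 0.71, 0.58, 0.69, 0.75, 0.66],
}
print(round(fairness_index(scores), 3))
print(sample_size_weights({"group_A": 9000, "group_B": 1000}))
```

Operating on histograms of raw comparison scores, as sketched here, makes the measure threshold-free, which is one reason pre-decision data can reveal disparities that accuracy-based (post-decision) measures miss.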

Authors (2)
  1. Ketan Kotwal
  2. Sebastien Marcel
