Fairness in Socio-technical Systems: a Case Study of Wikipedia

Published 15 Feb 2023 in cs.CY (arXiv:2302.07787v1)

Abstract: Problems broadly known as algorithmic bias frequently occur in the context of complex socio-technical systems (STS), where observed biases may not be directly attributable to a single automated decision algorithm. As a first investigation of fairness in STS, we focus on the case of Wikipedia. We systematically review 75 papers describing different types of bias in Wikipedia, which we classify and relate to established notions of harm from algorithmic fairness research. By analysing causal relationships between the observed phenomena, we demonstrate the complexity of the socio-technical processes causing harm. Finally, we identify the normative expectations of fairness associated with the different problems and discuss the applicability of existing criteria proposed for machine learning-driven decision systems.


Authors (2)
