
A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk (1712.05796v2)

Published 14 Dec 2017 in cs.CY and cs.HC

Abstract: A growing number of people are working as part of on-line crowd work, which has been characterized by its low wages; yet, we know little about wage distribution and causes of low/high earnings. We recorded 2,676 workers performing 3.8 million tasks on Amazon Mechanical Turk. Our task-level analysis revealed that workers earned a median hourly wage of only ~\$2/h, and only 4% earned more than \$7.25/h. The average requester pays more than \$11/h, although lower-paying requesters post much more work. Our wage calculations are influenced by how unpaid work is accounted for, e.g., time spent searching for tasks, working on tasks that are rejected, and working on tasks that are ultimately not submitted. We further explore the characteristics of tasks and working patterns that yield higher hourly wages. Our analysis informs future platform design and worker tools to create a more positive future for crowd work.

Citations (423)

Summary

  • The paper provides a comprehensive data-driven analysis of AMT workers' earnings, revealing a median hourly wage of approximately $2 and the impact of unpaid work.
  • The analysis uses logs from 2,676 workers completing 3.8 million HITs to quantify wage disparities and the burden of unpaid tasks.
  • The study highlights the need for redesigned platform features and enhanced worker tools to address the challenges of unpaid labor and earnings inequality.

An Analytical Inquiry into Workers’ Earnings on Amazon Mechanical Turk

This paper provides a comprehensive data-driven analysis of earnings among crowd workers on Amazon Mechanical Turk (AMT), focusing on the distribution of hourly wages and the factors that shape them. The analysis covers 2,676 workers who completed 3.8 million Human Intelligence Tasks (HITs) between September 2014 and January 2017, using logs collected via the Crowd Workers Chrome plugin. The findings corroborate earlier qualitative studies: the median hourly wage is approximately $2, and only 4% of workers earned above the federal minimum wage of $7.25 per hour. The paper argues that the gap between average requester pay rates and worker earnings is driven primarily by unpaid, task-related work, and that both platform design and worker tools need to change to improve earnings.

Key Findings

The paper quantifies earnings on AMT, revealing that the median task-level hourly wage falls far below what the average requester pays. The core factors influencing worker earnings include:

  • Unpaid Work Activities: Time spent searching for tasks, dealing with rejections, and working on tasks that are ultimately returned constitutes a substantial share of unpaid labor and significantly depresses effective hourly wages (see the sketch after this list).
  • Disparity Among Requesters: Although the average requester pays above minimum wage, a small subset of requesters posts a large volume of low-reward HITs, pulling the earnings distribution downward.
  • Task and Worker Characteristics: The paper documents variation in earnings across task types, requesters, and required qualifications. For instance, although qualifications typically signal higher skill, they did not consistently translate into higher wages for workers.
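
To make the unpaid-work accounting concrete, below is a minimal sketch (not the paper's actual analysis code) of how an effective hourly rate changes once unpaid time is included. The field names `earnings_usd`, `paid_seconds`, and `unpaid_seconds` are illustrative assumptions, not the study's log schema.

```python
# A minimal sketch of the effective-wage idea; field names are
# hypothetical, not the paper's actual log schema.

def effective_hourly_wage(earnings_usd: float,
                          paid_seconds: float,
                          unpaid_seconds: float = 0.0) -> float:
    """Earnings divided by total time spent on the platform.

    Setting unpaid_seconds to 0 recovers the naive task-level rate;
    including search time, rejected work, and returned HITs lowers it.
    """
    total_hours = (paid_seconds + unpaid_seconds) / 3600.0
    if total_hours == 0:
        raise ValueError("no recorded working time")
    return earnings_usd / total_hours

# Example: $1.50 earned over 20 min of paid work looks like $4.50/h,
# but adding 25 min of unpaid searching/rejected work drops it to $2.00/h.
naive = effective_hourly_wage(1.50, paid_seconds=20 * 60)
real = effective_hourly_wage(1.50, paid_seconds=20 * 60, unpaid_seconds=25 * 60)
print(f"naive: ${naive:.2f}/h, with unpaid time: ${real:.2f}/h")
```

Setting `unpaid_seconds` to zero recovers the naive task-level rate, which is one way to read the paper's point that wage estimates depend on how unpaid work is counted.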

Implications for Design and Future Research

Through its thorough quantitative analysis, this paper highlights several implications for the design of crowdsourcing platforms and worker support tools:

  • Platform Design: The results underline the necessity for platforms like AMT to reconsider their design policies to better accommodate fair pay for crowd workers. Ensuring transparency in task quality and allowing workers to gauge genuine opportunities are essential steps toward leveling the playing field.
  • Worker Tools Development: Worker interfaces and tools should be enhanced to surface task selection strategies, prioritize high-paying requesters, and make potential unpaid work visible before a task is accepted; a sketch of one such heuristic follows this list. Such tools could help mitigate losses from rejected and abandoned tasks.
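
As one concrete reading of the tooling suggestion above, the hedged sketch below ranks requesters by the effective hourly wage observed in a worker's own history, charging rejected work its time but no reward. The record fields (`requester`, `reward_usd`, `seconds_worked`, `accepted`) are hypothetical and stand in for whatever a logging tool such as the Crowd Workers plugin actually records.

```python
# Hedged sketch of a worker-side tool: rank requesters by the
# effective hourly wage observed in a worker's own history.
# Record fields are illustrative, not the paper's schema.
from collections import defaultdict

def rank_requesters(history: list[dict]) -> list[tuple[str, float]]:
    """Return (requester, observed $/h) pairs sorted best-first.

    Rejected work counts its time but not its reward, so requesters
    who generate unpaid labor sink in the ranking.
    """
    earned = defaultdict(float)
    seconds = defaultdict(float)
    for hit in history:
        seconds[hit["requester"]] += hit["seconds_worked"]
        if hit["accepted"]:
            earned[hit["requester"]] += hit["reward_usd"]
    rates = {r: earned[r] / (seconds[r] / 3600.0)
             for r in seconds if seconds[r] > 0}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)

history = [
    {"requester": "A", "reward_usd": 0.50, "seconds_worked": 120, "accepted": True},
    {"requester": "A", "reward_usd": 0.50, "seconds_worked": 150, "accepted": False},
    {"requester": "B", "reward_usd": 0.10, "seconds_worked": 30, "accepted": True},
]
print(rank_requesters(history))
# Requester A: 0.50 / (270/3600) ≈ $6.67/h; B: 0.10 / (30/3600) = $12.00/h.
```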

The theoretical implications extend to future explorations of platform economy labor models. Understanding the effects of invisible (unpaid) labor on contemporary digital workforce dynamics might inform fair labor practices and assist in shaping equitable economic models within gig economies. Emerging research could leverage these insights to develop real-time, adaptive recommendation engines for workers to improve their task selection process, thereby optimizing their earnings.

Conclusion

By furnishing a detailed examination of earnings on AMT, this paper provides a valuable empirical basis for understanding crowd work economics. Such insights are crucial for stakeholders looking to balance technological innovation with ethical labor practices and for policymakers aiming to protect the rights of a burgeoning gig workforce. Future research should continue to interrogate the nuances of microtask platforms and address the broader socio-economic ramifications of digital labor ecosystems.