Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets

Published 3 Oct 2012 in stat.OT and cs.HC | (1210.0962v1)

Abstract: We conduct the first natural field experiment to explore the relationship between the "meaningfulness" of a task and worker effort. We employed about 2,500 workers from Amazon's Mechanical Turk (MTurk), an online labor market, to label medical images. Although given an identical task, we experimentally manipulated how the task was framed. Subjects in the meaningful treatment were told that they were labeling tumor cells in order to assist medical researchers; subjects in the zero-context condition (the control group) were not told the purpose of the task; and, in stark contrast, subjects in the shredded treatment were not given context and were additionally told that their work would be discarded. We found that when a task was framed more meaningfully, workers were more likely to participate. We also found that the meaningful treatment increased the quantity of output (with an insignificant change in quality) while the shredded treatment decreased the quality of output (with no change in quantity). We believe these results will generalize to other short-term labor markets. Our study also discusses MTurk as an exciting platform for running natural field experiments in economics.

Citations (414)

Summary

  • The paper finds that framing tasks as meaningful boosts worker participation: 80.6% in the meaningful condition versus 76.2% in the zero-context condition and 72.3% in the shredded condition.
  • The paper employs a natural field experiment with 2,500 MTurk workers divided into meaningful, zero-context, and shredded conditions to assess output quality and quantity.
  • The paper reveals that meaningful task framing increases high-output workers by 23% while modestly lowering the effective hourly wage, highlighting non-pecuniary motivators.

The paper entitled "Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets" by Dana Chandler and Adam Kapelner offers a comprehensive examination of how the perceived meaningfulness of a task can influence worker behavior in online labor markets, specifically through a natural field experiment conducted on Amazon's Mechanical Turk (MTurk). This study strategically manipulated the context given to workers regarding the task of labeling medical images to investigate its impact on task participation, quantity, and quality of output.

Experimental Design and Methodology

The researchers conducted the first field experiment of its kind, recruiting approximately 2,500 MTurk workers to label images of tumor cells. The task was presented under three different conditions: meaningful, zero-context, and shredded. In the meaningful condition, participants were informed that their work would assist cancer research. The zero-context (control) condition provided no contextual information, while the shredded condition added that their work would be discarded. This methodological approach allowed the researchers to isolate the influence of perceived task meaningfulness on worker motivation.

Key Findings

The study found several compelling results:

  1. Task Participation: A greater proportion of workers were willing to engage with the task when it was framed as being meaningful. Specifically, the meaningful condition saw an 80.6% participation rate compared to 76.2% in the zero-context condition and 72.3% in the shredded condition.
  2. Quantity of Output: Workers in the meaningful condition were more likely to produce a higher quantity of labeled images, with a noted 23% increase in workers classified as "high-output" (those labeling five or more images).
  3. Quality of Output: Quality was measured by labeling accuracy. The shredded condition reduced accuracy by about 7%, while the apparent quality improvement in the meaningful condition was statistically inconclusive.
  4. Economic Implications: The researchers observed that workers in the meaningful condition were willing to work for a slightly lower effective hourly wage compared to those in other conditions, emphasizing the practicality of leveraging task meaningfulness as a motivational tool in crowdsourcing environments.

Implications and Future Directions

The findings have broad implications in the domain of labor economics and organizational behavior, especially in understanding how non-pecuniary incentives can be strategically utilized to enhance worker productivity and satisfaction. The study provides empirical evidence supporting the inclusion of contextual meaning as an incentive in short-term labor markets. These insights could profoundly impact how online platforms design tasks to maximize worker engagement and output quality.

This research opens doors for further exploration into the generalization of these results across different domains and tasks. Future research might examine varying levels of task complexity, cultural differences in perceived task meaningfulness, or long-term impacts of such motivational techniques on employee retention and satisfaction. Additionally, the paper highlights the potential of MTurk as a powerful platform for conducting natural field experiments, suggesting an intersection between labor market studies and behavioral economics.

In conclusion, this paper contributes significant insights into the mechanics of motivation in online labor markets, offering practical applications for employers seeking to optimize task design and motivational strategies. As the gig economy continues to grow, understanding worker motivations will be crucial for developing effective and engaging work environments.

