
Crowdsourcing Paper Screening in Systematic Literature Reviews (1709.05168v1)

Published 15 Sep 2017 in cs.IR and cs.HC

Abstract: Literature reviews allow scientists to stand on the shoulders of giants, showing promising directions, summarizing progress, and pointing out existing challenges in research. At the same time, conducting a systematic literature review is a laborious and consequently expensive process. In the last decade, there have been a few studies on crowdsourcing in literature reviews. This paper explores the feasibility of crowdsourcing for facilitating the literature review process in terms of results, time, and effort, and identifies which crowdsourcing strategies provide the best results for a given budget. In particular, we focus on the screening phase of the literature review process and we contribute and assess methods for identifying the size of tests, the number of labels required per paper, and classification functions, as well as methods to split the crowdsourcing process into phases to improve results. Finally, we present our findings based on experiments run on Crowdflower.
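To make the screening setup concrete, below is a minimal, hypothetical Python sketch of the kind of per-paper classification function the abstract describes: crowd workers supply include/exclude labels, and a decision rule either classifies the paper or defers it to a later phase for more labels. The function name, the majority-vote rule, and the `min_labels`/`agreement` parameters are illustrative assumptions, not the authors' actual method; the paper itself studies how to choose such test sizes, per-paper label counts, and classification functions under a budget.

```python
# Illustrative sketch (not the authors' implementation) of a crowdsourced
# screening step: aggregate crowd labels for one paper and decide whether
# to include it, exclude it, or defer it to a later phase for more labels.
from collections import Counter

def classify_paper(labels, min_labels=3, agreement=0.7):
    """Return 'include', 'exclude', or 'undecided' for one paper.

    labels: list of crowd votes, each 'include' or 'exclude'.
    min_labels and agreement are hypothetical parameters standing in
    for the quantities the paper optimizes (labels per paper, decision
    thresholds).
    """
    if len(labels) < min_labels:
        return "undecided"  # collect more labels in a later phase
    top_label, count = Counter(labels).most_common(1)[0]
    if count / len(labels) >= agreement:
        return top_label    # enough crowd agreement to decide
    return "undecided"      # escalate: more labels or expert review

# Example: four workers, three agree the paper meets the criteria.
print(classify_paper(["include", "include", "exclude", "include"]))  # include
```

A phased strategy in this spirit would run such a rule after each batch of labels and spend the remaining budget only on the papers still marked undecided.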

Authors (4)
  1. Evgeny Krivosheev (11 papers)
  2. Fabio Casati (35 papers)
  3. Valentina Caforio (2 papers)
  4. Boualem Benatallah (36 papers)
Citations (15)
