
Crowdsourcing Information Extraction for Biomedical Systematic Reviews (1609.01017v1)

Published 5 Sep 2016 in cs.HC

Abstract: Information extraction is a critical step in conducting biomedical systematic literature reviews. The extracted structured data can be aggregated via methods such as statistical meta-analysis. Typically, highly trained domain experts extract data for systematic reviews. The high expense of conducting biomedical systematic reviews has motivated researchers to explore lower-cost methods that achieve similar rigor without compromising quality. Crowdsourcing is one such promising approach. In this work-in-progress study, we designed a crowdsourcing task for biomedical information extraction. We briefly report the iterative design process and the results of two pilot tests. We found that providing more concrete examples in the task instructions helps workers better understand the task, especially for concepts that are abstract or confusing. We also found that a few workers completed most of the work, and that our payment level appeared more attractive to workers from low-income countries. In future work, we will evaluate our results against gold-standard extractions, thus assessing the feasibility of tasking crowd workers with extracting biomedical intervention information for systematic reviews.
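The planned evaluation, scoring crowd extractions against gold-standard extractions, reduces to aggregating per-worker outputs and computing precision and recall over the extracted items. The sketch below illustrates that pipeline under assumptions of our own, not details from the paper: extractions are represented as sets of normalized intervention strings, and workers are combined by a simple majority vote. All function names and data are illustrative.

```python
from collections import Counter

def majority_vote(worker_sets, threshold=0.5):
    """Keep items extracted by more than `threshold` of the workers."""
    counts = Counter(item for s in worker_sets for item in s)
    cutoff = threshold * len(worker_sets)
    return {item for item, c in counts.items() if c > cutoff}

def precision_recall(extracted, gold):
    """Set-level precision/recall of aggregated extractions vs. gold standard."""
    if not extracted or not gold:
        return 0.0, 0.0
    true_pos = len(extracted & gold)
    return true_pos / len(extracted), true_pos / len(gold)

# Hypothetical extractions for one abstract, normalized to lowercase strings.
workers = [
    {"metformin", "placebo"},
    {"metformin", "lifestyle counseling"},  # one stray extraction
    {"metformin", "placebo"},
]
gold = {"metformin", "placebo"}

# The vote drops "lifestyle counseling" (named by only 1 of 3 workers).
aggregated = majority_vote(workers)
p, r = precision_recall(aggregated, gold)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=1.00 recall=1.00
```

One design choice worth noting: a higher vote threshold trades recall for precision, which matters when the downstream meta-analysis is sensitive to spurious extractions.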

Authors (7)
  1. Yalin Sun
  2. Pengxiang Cheng
  3. Shengwei Wang
  4. Hao Lyu
  5. Matthew Lease
  6. Iain Marshall
  7. Byron C. Wallace
Citations (14)