Blocking Methods Applied to Casualty Records from the Syrian Conflict (1510.07714v1)

Published 26 Oct 2015 in stat.AP and cs.DB

Abstract: Estimation of death counts and associated standard errors is of great importance in armed conflicts such as the ongoing violence in Syria, as well as historical conflicts in Guatemala, Perú, Colombia, Timor Leste, and Kosovo. For example, statistical estimates of death counts were cited as important evidence in the trial of General Efraín Ríos Montt for acts of genocide in Guatemala. Estimation relies on both record linkage and multiple systems estimation. A key first step in this process is identifying ways to partition the records such that they are computationally manageable. This step is referred to as blocking, and it is a major challenge for the Syrian database since it is sparse in the number of duplicate records and feature-poor in its attributes. As a consequence, we propose locality sensitive hashing (LSH) methods to overcome these challenges. We demonstrate the computational superiority and error rates of these methods by comparing our proposed approach with others in the literature. We conclude with a discussion of the many challenges of merging LSH with record linkage to achieve an estimate of the number of uniquely documented deaths in the Syrian conflict.
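
The abstract proposes locality sensitive hashing (LSH) to block sparse, feature-poor records before record linkage. The sketch below illustrates one common way such blocking is done in practice, via MinHash signatures over character shingles of a name field, split into bands so that similar records land in a shared block. The record fields, shingle size, signature length, and band count are assumptions for illustration, not the paper's configuration.

```python
# A minimal, self-contained sketch of MinHash-style LSH blocking on string
# fields. This is an illustrative reading of "LSH methods for blocking", not
# the paper's actual pipeline; all parameters below are assumed.
import random
import zlib
from itertools import combinations

random.seed(0)
NUM_HASHES = 32           # MinHash signature length (assumed)
BANDS = 8                 # LSH bands; rows per band = NUM_HASHES // BANDS
PRIME = (1 << 61) - 1     # large prime modulus for the hash family

# Random affine hash functions h(x) = (a*x + b) mod PRIME
HASH_PARAMS = [(random.randrange(1, PRIME), random.randrange(PRIME))
               for _ in range(NUM_HASHES)]

def shingles(text, k=3):
    """Character k-shingles of a normalized string."""
    text = text.lower().strip()
    return {text[i:i + k] for i in range(max(len(text) - k + 1, 1))}

def minhash_signature(tokens):
    """MinHash signature: per hash function, the minimum over all shingles."""
    ids = [zlib.crc32(t.encode("utf-8")) for t in tokens]
    return tuple(min((a * x + b) % PRIME for x in ids) for a, b in HASH_PARAMS)

def block_records(records):
    """Assign each record to one block per band, keyed by its banded signature."""
    rows = NUM_HASHES // BANDS
    blocks = {}
    for rec_id, name in records:
        sig = minhash_signature(shingles(name))
        for b in range(BANDS):
            key = (b, sig[b * rows:(b + 1) * rows])
            blocks.setdefault(key, []).append(rec_id)
    return blocks

def candidate_pairs(blocks):
    """Record pairs sharing at least one block; these go on to record linkage."""
    pairs = set()
    for members in blocks.values():
        pairs.update(combinations(sorted(members), 2))
    return pairs

# Toy records standing in for casualty entries (hypothetical names).
records = [(1, "Ahmad Khalil"), (2, "Ahmed Khalil"), (3, "Sara Haddad")]
print(candidate_pairs(block_records(records)))
```

Records that collide in at least one band become candidate pairs for the downstream record linkage and multiple systems estimation steps; the number of bands and rows per band trades off recall of true duplicates against block size and hence computational cost.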

Authors (4)
  1. Peter Sadosky (1 paper)
  2. Anshumali Shrivastava (102 papers)
  3. Megan Price (3 papers)
  4. Rebecca C. Steorts (28 papers)
Citations (15)
