Fundamental Limits on Data Acquisition: Trade-offs between Sample Complexity and Query Difficulty (1712.00157v2)

Published 1 Dec 2017 in cs.IT, eess.SP, and math.IT

Abstract: We consider query-based data acquisition and the corresponding information recovery problem, where the goal is to recover $k$ binary variables (information bits) from parity measurements of those variables. The queries and the corresponding parity measurements are designed using the encoding rule of Fountain codes. By using Fountain codes, we can design a potentially limitless number of queries and corresponding parity measurements, and guarantee that the original $k$ information bits can be recovered with high probability from any sufficiently large set of measurements of size $n$. In the query design, the average number of information bits associated with a single parity measurement is called the query difficulty ($\bar{d}$), and the minimum number of measurements required to recover the $k$ information bits for a fixed $\bar{d}$ is called the sample complexity ($n$). We analyze the fundamental trade-off between query difficulty and sample complexity, and show that a sample complexity of $n = c\max\{k, (k\log k)/\bar{d}\}$ for some constant $c > 0$ is both necessary and sufficient to recover the $k$ information bits with high probability as $k \to \infty$.
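To make the quantities concrete, below is a minimal Python sketch of the query/measurement model and the sample-complexity bound. The function names (`sample_complexity`, `make_queries`, `peel_decode`), the exponential stand-in for the degree distribution, and the constant `c = 2.0` are illustrative assumptions, not the paper's construction: the paper designs queries with a Fountain-code degree distribution, which is what yields the high-probability recovery guarantee, so this crude sketch may occasionally fail to decode.

```python
import math
import random

def sample_complexity(k, d_bar, c=1.0):
    # Order-wise bound from the abstract: n = c * max(k, k*log(k)/d_bar).
    # The constant c is unspecified in the abstract; the value used here
    # is illustrative only.
    return math.ceil(c * max(k, k * math.log(k) / d_bar))

def make_queries(k, n, d_bar, rng):
    # Each query is a random subset of bit indices; its measurement is the
    # parity (XOR) of those bits. Subset sizes are drawn from a simple
    # exponential stand-in with mean d_bar (the query difficulty); the
    # paper's Fountain-code encoding uses a designed degree distribution.
    queries = []
    for _ in range(n):
        d = max(1, min(k, round(rng.expovariate(1.0 / d_bar))))
        queries.append(set(rng.sample(range(k), d)))
    return queries

def measure(bits, queries):
    # Parity measurement for each query.
    return [sum(bits[i] for i in q) % 2 for q in queries]

def peel_decode(queries, parities, k):
    # Peeling decoder: repeatedly find a query with exactly one unresolved
    # bit, solve for that bit, and substitute it into all other queries.
    # Returns the recovered bits, or None if decoding stalls.
    qs = [set(q) for q in queries]
    ps = list(parities)
    bits = [None] * k
    progress = True
    while progress:
        progress = False
        for j, q in enumerate(qs):
            if len(q) == 1:
                i = q.pop()
                bits[i] = ps[j]
                for j2, q2 in enumerate(qs):
                    if i in q2:
                        q2.discard(i)
                        ps[j2] ^= bits[i]
                progress = True
    return bits if all(b is not None for b in bits) else None

# Example: k bits, query difficulty d_bar, n from the bound with c = 2.0.
rng = random.Random(0)
k, d_bar = 200, 8
n = sample_complexity(k, d_bar, c=2.0)
bits = [rng.randint(0, 1) for _ in range(k)]
queries = make_queries(k, n, d_bar, rng)
decoded = peel_decode(queries, measure(bits, queries), k)
print(f"n = {n}, recovered: {decoded == bits}")
```

Note the shape of the trade-off the formula captures: once $\bar{d}$ exceeds $\log k$, harder queries stop helping and the sample complexity saturates at order $k$; for easier queries ($\bar{d} < \log k$), the $(k\log k)/\bar{d}$ term dominates and more measurements are needed.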

Citations (8)
