Collaborative Distributed Hypothesis Testing (1604.01292v2)

Published 5 Apr 2016 in cs.IT, math.IT, and math.PR

Abstract: A collaborative distributed binary decision problem is considered. Two statisticians are required to declare the correct probability measure of two jointly distributed memoryless processes, denoted by $X^n=(X_1,\dots,X_n)$ and $Y^n=(Y_1,\dots,Y_n)$, out of two possible probability measures on finite alphabets, namely $P_{XY}$ and $P_{\bar{X}\bar{Y}}$. The marginal samples given by $X^n$ and $Y^n$ are assumed to be available at different locations. The statisticians are allowed to exchange a limited amount of data over multiple rounds of interaction, in contrast to previous work, which deals mainly with unidirectional communication. A single round of interaction is considered before the result is generalized to any finite number of communication rounds. A feasibility result guaranteeing an achievable error exponent for general hypotheses is established through information-theoretic methods. The special case of testing against independence is revisited as an instance of this result, for which an infeasibility result is also proven. A second special case is studied in which zero-rate communication is imposed (data exchanges grow sub-exponentially with $n$); for this case it is shown that interaction does not improve asymptotic performance.
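To make the setting concrete, below is a minimal simulation sketch of rate-limited distributed testing against independence. It is not the paper's coding scheme: the message here is a naive prefix of the X-node's samples rather than a quantize-and-bin code, and the function names, threshold rule, and parameters (`corr`, `k`, `n`) are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pair(n, corr=None, p=0.5):
    """Draw n i.i.d. bit pairs (X_i, Y_i).

    corr is None  -> H1: X and Y are independent Bernoulli(p) (testing against independence)
    corr in (0,1) -> H0: Y_i equals X_i with probability corr
    """
    x = (rng.random(n) < p).astype(int)
    if corr is None:
        y = (rng.random(n) < p).astype(int)
    else:
        agree = rng.random(n) < corr
        y = np.where(agree, x, 1 - x)
    return x, y

def one_round_decision(x, y, k, corr=0.9, p=0.5):
    """Toy one-way, rate-limited scheme (NOT the paper's scheme):
    the X-node forwards only its first k raw samples (k bits, rate k/n),
    and the Y-node thresholds the empirical agreement on those positions.
    Returns 0 for H0 (correlated) and 1 for H1 (independent)."""
    msg = x[:k]                          # the k-bit message received by the Y-node
    agreement = np.mean(msg == y[:k])    # agreement rate on the shared indices
    threshold = (corr + p) / 2           # midway between the two expected rates
    return 0 if agreement >= threshold else 1

# Monte Carlo estimate of the type-II error (declaring H0 when H1 is true)
n, k, trials = 2000, 200, 2000
errors = 0
for _ in range(trials):
    x, y = sample_pair(n, corr=None)     # data generated under H1
    errors += (one_round_decision(x, y, k) == 0)
print(f"estimated type-II error at rate k/n = {k/n:.2f}: {errors / trials:.4f}")
```

The paper's results concern how fast this type-II error can be driven to zero exponentially in $n$ under rate constraints, and whether multiple rounds of interaction can improve the exponent; the sketch only illustrates the communication-constrained setup itself.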

Citations (27)
