
The maximum mutual information between the output of a discrete symmetric channel and several classes of Boolean functions of its input (1701.05014v2)

Published 18 Jan 2017 in cs.IT and math.IT

Abstract: We prove the Courtade-Kumar conjecture for several classes of n-dimensional Boolean functions, for all $n \geq 2$ and for all values of the error probability of the binary symmetric channel, $0 \leq p \leq 1/2$. This conjecture states that the mutual information between any Boolean function of an n-dimensional vector of independent and identically distributed inputs to a memoryless binary symmetric channel and the corresponding vector of outputs is upper-bounded by $1-\operatorname{H}(p)$, where $\operatorname{H}(p)$ denotes the binary entropy function. That is, let $\mathbf{X}=[X_1 \ldots X_n]$ be a vector of independent and identically distributed Bernoulli(1/2) random variables that form the input to a memoryless binary symmetric channel with error probability $p \in [0, 1/2]$, and let $\mathbf{Y}=[Y_1 \ldots Y_n]$ be the corresponding output. Let $f:\{0,1\}^n \rightarrow \{0,1\}$ be an n-dimensional Boolean function. Then $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y}) \leq 1-\operatorname{H}(p)$. Our proof employs Karamata's theorem, concepts from probability theory, transformations of random variables and vectors, and algebraic manipulations.
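For small n the bound can be checked numerically by exhaustive enumeration of the joint distribution of $f(\mathbf{X})$ and $\mathbf{Y}$. The sketch below is not from the paper; it is a minimal illustration (the helper names `binary_entropy` and `mutual_information` are my own) that verifies the known extremal case: a dictator function $f(\mathbf{x}) = x_1$ achieves $1-\operatorname{H}(p)$ with equality, since the channel is memoryless and the other coordinates carry no information about $X_1$.

```python
import itertools
import math

def binary_entropy(p):
    """H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(f, n, p):
    """I(f(X); Y) in bits, with X uniform on {0,1}^n and Y the output
    of a memoryless BSC(p) applied coordinate-wise to X."""
    joint = {}  # (f(x), y) -> probability
    px = 2.0 ** -n
    for x in itertools.product((0, 1), repeat=n):
        fx = f(x)
        for y in itertools.product((0, 1), repeat=n):
            flips = sum(xi != yi for xi, yi in zip(x, y))
            p_y_given_x = p ** flips * (1 - p) ** (n - flips)
            joint[(fx, y)] = joint.get((fx, y), 0.0) + px * p_y_given_x
    # Marginals of f(X) and Y.
    pf, py = {}, {}
    for (fx, y), pr in joint.items():
        pf[fx] = pf.get(fx, 0.0) + pr
        py[y] = py.get(y, 0.0) + pr
    return sum(pr * math.log2(pr / (pf[fx] * py[y]))
               for (fx, y), pr in joint.items() if pr > 0)

n, p = 3, 0.1
bound = 1 - binary_entropy(p)
mi_dictator = mutual_information(lambda x: x[0], n, p)          # equality
mi_majority = mutual_information(lambda x: int(sum(x) >= 2), n, p)  # strict
```

For n = 3 and p = 0.1 the dictator function matches the bound $1-\operatorname{H}(0.1) \approx 0.531$ up to floating-point error, while the majority function falls strictly below it, consistent with the conjectured inequality.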
