Minimum-Entropy Couplings and their Applications (1901.07530v1)

Published 19 Jan 2019 in cs.IT, cs.DS, and math.IT

Abstract: Given two discrete random variables $X$ and $Y$, with probability distributions ${\bf p}=(p_1, \ldots , p_n)$ and ${\bf q}=(q_1, \ldots , q_m)$, respectively, denote by ${\cal C}({\bf p}, {\bf q})$ the set of all couplings of ${\bf p}$ and ${\bf q}$, that is, the set of all bivariate probability distributions that have ${\bf p}$ and ${\bf q}$ as marginals. In this paper, we study the problem of finding a joint probability distribution in ${\cal C}({\bf p}, {\bf q})$ of \emph{minimum entropy} (equivalently, a coupling that \emph{maximizes} the mutual information between $X$ and $Y$), and we discuss several situations where the need for this kind of optimization naturally arises. Since the optimization problem is known to be NP-hard, we give an efficient algorithm to find a joint probability distribution in ${\cal C}({\bf p}, {\bf q})$ with entropy exceeding the minimum possible by at most 1 bit, thus providing an approximation algorithm with an additive gap of at most 1 bit. Leveraging this algorithm, we extend our result to the problem of finding a minimum-entropy joint distribution of arbitrary $k\geq 2$ discrete random variables $X_1, \ldots , X_k$, consistent with the known $k$ marginal distributions of the individual random variables $X_1, \ldots , X_k$. In this case, our algorithm has an additive gap of at most $\log k$ from optimum. We also discuss several related applications of our findings and extensions of our results to entropies different from the Shannon entropy.
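
To make the coupling notion concrete, here is a minimal Python sketch that builds a valid coupling of two marginals with a simple greedy rule and compares its Shannon entropy with that of the independent (product) coupling. The function names (`greedy_coupling`, `entropy`) and the greedy pairing rule are illustrative assumptions of this sketch; it is not the paper's 1-bit-gap approximation algorithm, only a heuristic of the same flavor.

```python
import numpy as np

def greedy_coupling(p, q):
    """Build a coupling (joint distribution) with marginals p and q.

    Illustrative heuristic: repeatedly place as much mass as possible on a
    single cell by pairing the currently largest residual marginal entries.
    NOT the paper's algorithm, just a sketch of a greedy coupling.
    """
    p = np.array(p, dtype=float).copy()
    q = np.array(q, dtype=float).copy()
    M = np.zeros((len(p), len(q)))
    for _ in range(len(p) + len(q)):        # at most n+m placements are needed
        i, j = int(np.argmax(p)), int(np.argmax(q))
        m = min(p[i], q[j])
        if m <= 1e-12:                      # all mass has been assigned
            break
        M[i, j] += m
        p[i] -= m
        q[j] -= m
    return M

def entropy(joint):
    """Shannon entropy in bits, ignoring zero cells."""
    flat = joint[joint > 0]
    return -np.sum(flat * np.log2(flat))

# Example: couple p = (0.5, 0.3, 0.2) with q = (0.6, 0.4)
p = [0.5, 0.3, 0.2]
q = [0.6, 0.4]
M = greedy_coupling(p, q)
print(M)                          # a joint distribution whose marginals are p and q
print(entropy(M))                 # entropy of the greedy coupling
print(entropy(np.outer(p, q)))    # independent coupling, entropy H(p) + H(q)
```

Any coupling has entropy at least $\max(H({\bf p}), H({\bf q}))$, since the joint entropy is never below either marginal entropy; the paper's algorithm is guaranteed to land within 1 bit of the true minimum, whereas the product coupling can be as large as $H({\bf p}) + H({\bf q})$.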

Citations (23)
