
On the Hardness of Entropy Minimization and Related Problems (1207.1238v1)

Published 5 Jul 2012 in cs.IT, cs.CC, and math.IT

Abstract: We investigate certain optimization problems for Shannon information measures, namely, minimization of joint and conditional entropies $H(X,Y)$, $H(X|Y)$, $H(Y|X)$, and maximization of mutual information $I(X;Y)$, over convex regions. When restricted to the so-called transportation polytopes (sets of distributions with fixed marginals), very simple proofs of NP-hardness are obtained for these problems because in that case they are all equivalent, and their connection to the well-known Subset Sum and Partition problems is revealed. The computational intractability of the more general problems over arbitrary polytopes is then a simple consequence. Further, a simple class of polytopes is shown over which the above problems are not equivalent and their complexity differs sharply, namely, minimization of $H(X,Y)$ and $H(Y|X)$ is trivial, while minimization of $H(X|Y)$ and maximization of $I(X;Y)$ are strongly NP-hard problems. Finally, two new (pseudo)metrics on the space of discrete probability distributions are introduced, based on the so-called variation of information quantity, and NP-hardness of their computation is shown.
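The abstract's claim that the four problems coincide on transportation polytopes follows from the fact that the marginal entropies $H(X)$ and $H(Y)$ are constant when the marginals are fixed, so $H(X,Y)$, $H(X|Y)$, $H(Y|X)$, and $I(X;Y)$ differ only by constants and sign. The following minimal sketch (not from the paper; the function names and example distributions are chosen here for illustration) computes these quantities, along with the variation of information $VI(X;Y) = H(X|Y) + H(Y|X)$, for two joint distributions lying in the same transportation polytope.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a (flattened) probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def joint_measures(P):
    """Joint/conditional entropies, mutual information, and variation of
    information VI = H(X|Y) + H(Y|X) for a joint pmf P[x, y]."""
    px, py = P.sum(axis=1), P.sum(axis=0)
    H_xy = entropy(P)
    H_x, H_y = entropy(px), entropy(py)
    return {
        "H(X,Y)": H_xy,
        "H(X|Y)": H_xy - H_y,
        "H(Y|X)": H_xy - H_x,
        "I(X;Y)": H_x + H_y - H_xy,
        "VI(X;Y)": 2 * H_xy - H_x - H_y,
    }

# Two points of the same transportation polytope: both have uniform
# marginals (1/2, 1/2) on X and on Y.
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])   # maximal H(X,Y), zero I(X;Y)
deterministic = np.array([[0.5, 0.0],
                          [0.0, 0.5]])   # minimal H(X,Y), maximal I(X;Y)

for name, P in [("independent", independent), ("deterministic", deterministic)]:
    print(name, {k: round(v, 3) for k, v in joint_measures(P).items()})
```

On this polytope the distribution minimizing any of the entropies is exactly the one maximizing $I(X;Y)$, which is why a single hardness reduction (via Subset Sum / Partition) covers all four objectives there.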

Citations (17)
