
Strengthening the Entropy Power Inequality (1602.03033v1)

Published 9 Feb 2016 in cs.IT and math.IT

Abstract: We tighten the Entropy Power Inequality (EPI) when one of the random summands is Gaussian. Our strengthening is closely connected to the concept of strong data processing for Gaussian channels and generalizes the (vector extension of) Costa's EPI. This leads to a new reverse entropy power inequality and, as a corollary, sharpens Stam's inequality relating entropy power and Fisher information. Applications to network information theory are given, including a short self-contained proof of the rate region for the two-encoder quadratic Gaussian source coding problem. Our argument is based on weak convergence and a technique employed by Geng and Nair for establishing Gaussian optimality via rotational invariance, which traces its roots to a 'doubling trick' that has been successfully used in the study of functional inequalities.
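
For context, the classical inequalities the paper strengthens can be stated as follows (standard textbook forms, not quoted from the paper itself). For independent random vectors $X, Y \in \mathbb{R}^n$ with densities, the EPI asserts

$$
N(X+Y) \;\ge\; N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2h(X)/n},
$$

where $h(\cdot)$ denotes differential entropy and $N(\cdot)$ is the entropy power. Stam's inequality, which the paper sharpens as a corollary, relates entropy power to Fisher information $J(\cdot)$ via $N(X)\, J(X) \ge n$, with equality for Gaussian $X$.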

Citations (26)
