
A lower bound on the differential entropy of log-concave random vectors with applications (1704.07766v3)

Published 25 Apr 2017 in cs.IT and math.IT

Abstract: We derive a lower bound on the differential entropy of a log-concave random variable $X$ in terms of the $p$-th absolute moment of $X$. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources under the distortion measure $|x - \hat x|^r$, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most $\log(\sqrt{\pi e}) \approx 1.5$ bits, independently of $r$ and the target distortion $d$. For mean-square error distortion, the difference is at most $\log(\sqrt{\frac{\pi e}{2}}) \approx 1$ bit, regardless of $d$. We also provide bounds on the capacity of memoryless additive-noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most $\log(\sqrt{\frac{\pi e}{2}}) \approx 1$ bit. Our results generalize to the case of a vector $X$ with possibly dependent coordinates, and to $\gamma$-concave random variables. Our proof technique leverages tools from convex geometry.
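The constant gaps quoted in the abstract can be sanity-checked numerically. The sketch below (Python) evaluates the two constants $\log_2\sqrt{\pi e} \approx 1.55$ and $\log_2\sqrt{\pi e/2} \approx 1.05$ bits, then checks the mean-square-error gap for a Laplace source; the Laplace test case is our illustrative choice, not one taken from the paper. It relies only on two standard facts: the Gaussian source is hardest under MSE, so $R(d) \le \frac{1}{2}\log_2(\sigma^2/d)$, and the Shannon lower bound is $R_{\mathrm{SLB}}(d) = h(X) - \frac{1}{2}\log_2(2\pi e d)$.

```python
import math

# Constants quoted in the abstract (log base 2, i.e., bits).
gap_general = 0.5 * math.log2(math.pi * math.e)        # log2 sqrt(pi e)   ~ 1.55 bits
gap_mse     = 0.5 * math.log2(math.pi * math.e / 2.0)  # log2 sqrt(pi e/2) ~ 1.05 bits
print(f"general-distortion gap bound: {gap_general:.3f} bits")
print(f"MSE gap bound:                {gap_mse:.3f} bits")

# Sanity check with a Laplace source (our test case, not from the paper).
# Laplace(b) is log-concave, has variance 2*b^2 and differential
# entropy log2(2*b*e) bits.
b = 1.0
var = 2.0 * b * b
h_bits = math.log2(2.0 * b * math.e)

# For any source with variance var, R(d) <= 0.5*log2(var/d) (the Gaussian
# source is hardest under MSE), while the Shannon lower bound is
# R_SLB(d) = h(X) - 0.5*log2(2*pi*e*d).  The d-dependent terms cancel,
# leaving a d-free upper bound on the gap R(d) - R_SLB(d):
gap_laplace = 0.5 * math.log2(2.0 * math.pi * math.e * var) - h_bits
print(f"Laplace gap upper bound:      {gap_laplace:.3f} bits  (<= {gap_mse:.3f})")
assert gap_laplace <= gap_mse
```

For this Laplace example the gap evaluates to $\frac{1}{2}\log_2(\pi/e) \approx 0.10$ bits, comfortably inside the $\approx 1$ bit guarantee stated in the abstract.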

Citations (43)
