On the monotonicity of discrete entropy for log-concave random vectors on $\mathbb{Z}^d$ (2401.15462v2)

Published 27 Jan 2024 in math.PR, cs.IT, and math.IT

Abstract: We prove the following type of discrete entropy monotonicity for sums of isotropic, log-concave, independent and identically distributed random vectors $X_1,\dots,X_{n+1}$ on $\mathbb{Z}^d$: $$ H(X_1+\cdots+X_{n+1}) \geq H(X_1+\cdots+X_{n}) + \frac{d}{2}\log{\Bigl(\frac{n+1}{n}\Bigr)} + o(1), $$ where $o(1)$ vanishes as $H(X_1) \to \infty$. Moreover, for the $o(1)$-term, we obtain a rate of convergence $O\bigl(H(X_1)\,e^{-\frac{1}{d}H(X_1)}\bigr)$, where the implied constants depend on $d$ and $n$. This generalizes to $\mathbb{Z}^d$ the one-dimensional result of the second named author (2023). As in dimension one, our strategy is to establish that the discrete entropy $H(X_1+\cdots+X_{n})$ is close to the differential (continuous) entropy $h(X_1+U_1+\cdots+X_{n}+U_{n})$, where $U_1,\dots, U_n$ are independent and identically distributed uniform random vectors on $[0,1]^d$, and to apply the theorem of Artstein, Ball, Barthe and Naor (2004) on the monotonicity of differential entropy. In fact, we show this result under more general assumptions than log-concavity, which are preserved up to constants under convolution. In order to show that log-concave distributions satisfy our assumptions in dimension $d\ge2$, more involved tools from convex geometry are needed, because a suitable position is required. We show that, for a log-concave function on $\mathbb{R}^d$ in isotropic position, its integral, barycenter and covariance matrix are close to their discrete counterparts. Moreover, in the log-concave case, we weaken the isotropicity assumption to what we call almost isotropicity. One of our technical tools is a discrete analogue of the upper bound on the isotropic constant of a log-concave function, which extends to dimensions $d\ge1$ a result of Bobkov, Marsiglietti and Melbourne (2022).
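The stated inequality can be probed numerically in dimension $d=1$. The sketch below is an illustration, not part of the paper: it picks a geometric distribution on $\mathbb{Z}_{\ge 0}$ (a discrete log-concave law with large entropy when the parameter is close to 1), forms the law of the partial sums $S_n = X_1+\cdots+X_n$ by repeated convolution, and prints the excess of $H(S_{n+1}) - H(S_n)$ over the increment $\tfrac{1}{2}\log\tfrac{n+1}{n}$, which the theorem predicts to be nonnegative up to an $o(1)$ error.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# A discrete log-concave pmf on Z (d = 1): geometric with parameter q.
# q close to 1 makes H(X1) large, so the o(1) term should be small.
q = 0.98
support = np.arange(2000)          # truncation error ~ q^2000, negligible
p = (1 - q) * q**support

gaps = []
conv = p.copy()                    # law of S_1
H_prev = entropy(conv)
for n in range(1, 4):
    conv = np.convolve(conv, p)    # law of S_{n+1}
    H_next = entropy(conv)
    gap = H_next - H_prev - 0.5 * np.log((n + 1) / n)
    gaps.append(gap)
    print(f"n={n}: H(S_{n+1}) - H(S_{n}) - (1/2)log((n+1)/n) = {gap:.4f}")
    H_prev = H_next
```

In this run the excess stays positive and shrinks with $n$, consistent with the monotonicity statement; the geometric law and the truncation length are illustrative choices only.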

References (27)
  1. S. Artstein, K. M. Ball, F. Barthe, and A. Naor. Solution of Shannon’s problem on the monotonicity of entropy. Journal of the American Mathematical Society, 17(4):975–982, 2004.
  2. K. Ball. Logarithmically concave functions and sections of convex sets in $\mathbb{R}^n$. Studia Math., 88(1):69–84, 1988.
  3. A. R. Barron. Entropy and the central limit theorem. The Annals of Probability, pages 336–342, 1986.
  4. S. G. Bobkov, A. Marsiglietti, and J. Melbourne. Concentration functions and entropy bounds for discrete log-concave distributions. Combinatorics, Probability and Computing, 31(1):54–72, 2022.
  5. S. Brazitikos, A. Giannopoulos, P. Valettas, and B.-H. Vritsiou. Geometry of isotropic convex bodies, volume 196. American Mathematical Soc., 2014.
  6. M. Fradelizi. Sections of convex bodies through their centroid. Archiv der Mathematik, 69(6):515–522, 1997.
  7. M. Fradelizi. Hyperplane sections of convex bodies in isotropic position. Beiträge Algebra Geom., 40(1):163–183, 1999.
  8. L. Gavalakis. Approximate discrete entropy monotonicity for log-concave sums. Combinatorics, Probability and Computing, page 1–14, 2023.
  9. L. Gavalakis. Discrete generalised entropy power inequalities for log-concave random variables. In 2023 IEEE International Symposium on Information Theory (ISIT), pages 42–47, 2023.
  10. L. Gavalakis and I. Kontoyiannis. Entropy and the discrete central limit theorem. Stochastic Processes and their Applications, page 104294, 2024.
  11. Adaptive sensing using deterministic partial Hadamard matrices. In 2012 IEEE International Symposium on Information Theory Proceedings, pages 1842–1846. IEEE, 2012.
  12. A new entropy power inequality for integer-valued random variables. IEEE Transactions on Information Theory, 60(7):3787–3796, 2014.
  13. An entropy power inequality for the binomial family. JIPAM. J. Inequal. Pure Appl. Math, 4(5):93, 2003.
  14. S. Hoggar. Chromatic polynomials and logarithmic concavity. Journal of Combinatorial Theory, Series B, 16(3):248–254, 1974.
  15. R. Kannan, L. Lovász, and M. Simonovits. Isoperimetric problems for convex bodies and a localization lemma. Discrete & Computational Geometry, 13:541–559, 1995.
  16. B. Klartag. Logarithmic bounds for isoperimetry and slices of convex sets. arXiv preprint arXiv:2303.14938, 2023.
  17. M. Madiman and A. Barron. Generalized entropy power inequalities and monotonicity properties of information. IEEE Transactions on Information Theory, 53(7):2317–2329, 2007.
  18. Bernoulli sums and Rényi entropy inequalities. Bernoulli, 29(2):1578–1599, 2023.
  19. Majorization and Rényi entropy inequalities via Sperner theory. Discrete Mathematics, 342(10):2911–2923, 2019.
  20. Entropy inequalities for sums in prime cyclic groups. SIAM Journal on Discrete Mathematics, 35(3):1628–1649, 2021.
  21. K. Murota. Discrete convex analysis: monographs on discrete mathematics and applications. Computing Reviews, 45(6):339, 2004.
  22. C. E. Shannon. A mathematical theory of communication. The Bell system technical journal, 27(3):379–423, 1948.
  23. D. Shlyakhtenko. A free analogue of Shannon’s problem on monotonicity of entropy. Adv. Math., 208(2):824–833, 2007.
  24. A. J. Stam. Some inequalities satisfied by the quantities of information of Fisher and Shannon. Information and Control, 2(2):101–112, 1959.
  25. T. Tao. Sumset and inverse sumset theory for Shannon entropy. Combinatorics, Probability and Computing, 19:603–639, 2010.
  26. A. M. Tulino and S. Verdú. Monotonic decrease of the non-Gaussianness of the sum of independent random variables: a simple proof. IEEE Trans. Inform. Theory, 52(9):4295–4297, 2006.
  27. J. O. Woo and M. Madiman. A discrete entropy power inequality for uniform distributions. In 2015 IEEE International Symposium on Information Theory (ISIT), pages 1625–1629. IEEE, 2015.
