On the entropy and information of Gaussian mixtures (2308.15997v3)

Published 30 Aug 2023 in cs.IT, math.IT, and math.PR

Abstract: We establish several convexity properties for the entropy and Fisher information of mixtures of centered Gaussian distributions. First, we prove that if $X_1, X_2$ are independent scalar Gaussian mixtures, then the entropy of $\sqrt{t}X_1 + \sqrt{1-t}X_2$ is concave in $t \in [0,1]$, thus confirming a conjecture of Ball, Nayar and Tkocz (2016) for this class of random variables. In fact, we prove a generalisation of this assertion which also strengthens a result of Eskenazis, Nayar and Tkocz (2018). For the Fisher information, we extend a convexity result of Bobkov (2022) by showing that the Fisher information matrix is operator convex as a matrix-valued function acting on densities of mixtures in $\mathbb{R}^d$. As an application, we establish rates for the convergence of the Fisher information matrix of the sum of weighted i.i.d. Gaussian mixtures in the operator norm along the central limit theorem under mild moment assumptions.
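The first result of the abstract can be checked numerically in a simple case. The sketch below (the mixture parameters are hypothetical, chosen only for illustration, and the concavity check is our own, not the paper's method) uses the fact that if $X_1, X_2$ are independent centered scalar Gaussian mixtures, then $\sqrt{t}X_1 + \sqrt{1-t}X_2$ is again a Gaussian mixture whose component variances interpolate linearly in $t$, and evaluates its differential entropy on a grid:

```python
import numpy as np

def entropy_of_mixture(weights, variances, grid):
    """Differential entropy of a centered scalar Gaussian mixture
    sum_i w_i N(0, v_i), via a Riemann sum on a fine uniform grid."""
    w = np.asarray(weights)[:, None]
    v = np.asarray(variances)[:, None]
    f = (w / np.sqrt(2 * np.pi * v) * np.exp(-grid ** 2 / (2 * v))).sum(axis=0)
    dx = grid[1] - grid[0]
    f = np.clip(f, 1e-300, None)  # guard against log(0) in the far tails
    return -np.sum(f * np.log(f)) * dx

# Toy mixtures (hypothetical parameters, for illustration only):
# X1 ~ 0.5 N(0,1) + 0.5 N(0,4),   X2 ~ 0.3 N(0,0.25) + 0.7 N(0,9)
p, s2 = [0.5, 0.5], [1.0, 4.0]
q, r2 = [0.3, 0.7], [0.25, 9.0]

grid = np.linspace(-30.0, 30.0, 20001)
ts = np.linspace(0.0, 1.0, 21)
hs = []
for t in ts:
    # sqrt(t) X1 + sqrt(1-t) X2 has component weights p_i q_j and
    # component variances t * s_i^2 + (1 - t) * r_j^2
    w = [pi * qj for pi in p for qj in q]
    v = [t * si + (1.0 - t) * rj for si in s2 for rj in r2]
    hs.append(entropy_of_mixture(w, v, grid))

# Concavity in t: discrete second differences of a concave function
# are non-positive (up to numerical integration error).
second_diffs = np.diff(hs, 2)
print(bool(second_diffs.max() <= 1e-6))
```

This only probes one pair of mixtures on a finite grid; the theorem of course asserts concavity for all independent scalar Gaussian mixtures.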

References (32)
  1. On the rate of convergence in the entropic central limit theorem. Probability Theory and Related Fields, 129(3):381–390, 2004.
  2. Sharp uniform convexity and smoothness inequalities for trace norms. Invent. Math., 115(3):463–482, 1994.
  3. A reverse entropy power inequality for log-concave random vectors. Studia Mathematica, 235:17–30, 2016.
  4. Andrew R. Barron. Entropy and the central limit theorem. The Annals of Probability, pages 336–342, 1986.
  5. Interpolation spaces. An introduction. Springer-Verlag, Berlin-New York, 1976. Grundlehren der Mathematischen Wissenschaften, No. 223.
  6. Rajendra Bhatia. Matrix analysis, volume 169 of Graduate Texts in Mathematics. Springer-Verlag, New York, 1997.
  7. Mathematical statistics—basic ideas and selected topics. Vol. 1. Texts in Statistical Science Series. CRC Press, Boca Raton, FL, second edition, 2015.
  8. Nelson M. Blachman. The convolution inequality for entropy powers. IEEE Trans. Inform. Theory, IT-11:267–271, 1965.
  9. Reverse Brunn–Minkowski and reverse entropy power inequalities for convex measures. Journal of Functional Analysis, 262(7):3309–3339, 2012.
  10. Sergey G. Bobkov. Upper bounds for Fisher information. Electronic Journal of Probability, 27:1–44, 2022.
  11. Entropy power inequality for the Rényi entropy. IEEE Trans. Inform. Theory, 61(2):708–714, February 2015.
  12. Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem. The Annals of Probability, pages 2479–2512, 2013.
  13. Berry–Esseen bounds in the entropic central limit theorem. Probability Theory and Related Fields, 159(3-4):435–478, 2014.
  14. Fisher information and the central limit theorem. Probability Theory and Related Fields, 159(1-2):1–59, 2014.
  15. On the problem of reversibility of the entropy power inequality. In Limit Theorems in Probability, Statistics and Number Theory: In Honor of Friedrich Götze, pages 61–74. Springer, 2013.
  16. Max Costa. A new entropy power inequality. IEEE Transactions on Information Theory, 31(6):751–760, 1985.
  17. Imre Csiszár. Information-type measures of difference of probability distributions and indirect observation. Studia Scientiarum Mathematicarum Hungarica, 2:229–318, 1967.
  18. Gaussian mixtures: entropy and geometric inequalities. The Annals of Probability, 46(5):2908–2945, 2018.
  19. Intrinsic dimensional functional inequalities on model spaces. Preprint available at https://arxiv.org/abs/2303.00784, 2023.
  20. The dimension of almost spherical sections of convex bodies. Acta Math., 139(1-2):53–94, 1977.
  21. Leonard Gross. Logarithmic Sobolev inequalities. American Journal of Mathematics, 97(4):1061–1083, 1975.
  22. Fisher information inequalities and the central limit theorem. Probability Theory and Related Fields, 129:391–409, 2004.
  23. Solomon Kullback. A lower bound for discrimination information in terms of variation (corresp.). IEEE Transactions on Information Theory, 13(1):126–127, 1967.
  24. Probability in Banach spaces, volume 23 of Ergebnisse der Mathematik und ihrer Grenzgebiete (3) [Results in Mathematics and Related Areas (3)]. Springer-Verlag, Berlin, 1991. Isoperimetry and processes.
  25. Elliott H. Lieb. Proof of an entropy conjecture of Wehrl. Comm. Math. Phys., 62(1):35–41, 1978.
  26. Forward and reverse entropy power inequalities in convex geometry. In Convexity and concentration, pages 427–485. Springer, 2017.
  27. Two remarks on generalized entropy power inequalities. In Geometric aspects of functional analysis. Vol. II, volume 2266 of Lecture Notes in Math., pages 169–185. Springer, Cham, 2020.
  28. Inequalities: theory of majorization and its applications. Springer Series in Statistics. Springer, New York, second edition, 2011.
  29. Mark S. Pinsker. Information and information stability of random variables and processes. Holden-Day, Inc., San Francisco, Calif.-London-Amsterdam, 1964.
  30. Claude E. Shannon. A mathematical theory of communication. The Bell System Technical Journal, 27(3):379–423, 1948.
  31. Aart J. Stam. Some inequalities satisfied by the quantities of information of Fisher and Shannon. Information and Control, 2:101–112, 1959.
  32. Nicole Tomczak-Jaegermann. The moduli of smoothness and convexity and the Rademacher averages of trace classes $S_p$ $(1 \leq p < \infty)$. Studia Math., 50:163–182, 1974.
