
The entropic doubling constant and robustness of Gaussian codebooks for additive-noise channels (2403.07209v2)

Published 11 Mar 2024 in cs.IT, math.IT, and math.PR

Abstract: Entropy comparison inequalities are obtained for the differential entropy $h(X+Y)$ of the sum of two independent random vectors $X,Y$, when one is replaced by a Gaussian. For identically distributed random vectors $X,Y$, these are closely related to bounds on the entropic doubling constant, which quantifies the entropy increase when adding an independent copy of a random vector to itself. Consequences of both large and small doubling are explored. For the former, lower bounds are deduced on the entropy increase when adding an independent Gaussian, while for the latter, a qualitative stability result for the entropy power inequality is obtained. In the more general case of non-identically distributed random vectors $X,Y$, a Gaussian comparison inequality with interesting implications for channel coding is established: For additive-noise channels with a power constraint, Gaussian codebooks come within a $\frac{{\sf snr}}{3{\sf snr}+2}$ factor of capacity. In the low-SNR regime this improves the half-a-bit additive bound of Zamir and Erez (2004). Analogous results are obtained for additive-noise multiple access channels, and for linear, additive-noise MIMO channels.
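To make the channel-coding claim concrete, here is a minimal numerical sketch (not from the paper) comparing the two guarantees for a power-constrained additive-noise channel with Gaussian-noise capacity C = (1/2) log2(1 + snr) bits per channel use. It reads "come within a snr/(3 snr + 2) factor of capacity" as a guaranteed Gaussian-codebook rate of at least (1 - snr/(3 snr + 2)) C; that reading and the function names below are illustrative assumptions, not the paper's notation.

```python
import math

def capacity(snr: float) -> float:
    """Gaussian-noise capacity in bits per channel use: C = (1/2) log2(1 + snr)."""
    return 0.5 * math.log2(1.0 + snr)

def rate_multiplicative(snr: float) -> float:
    """Rate guaranteed by the paper's multiplicative bound, reading
    'within a snr/(3 snr + 2) factor of capacity' as losing at most
    that fraction of C (an interpretation of the abstract)."""
    loss_fraction = snr / (3.0 * snr + 2.0)
    return (1.0 - loss_fraction) * capacity(snr)

def rate_half_bit(snr: float) -> float:
    """Zamir-Erez (2004) additive guarantee: C minus half a bit,
    clipped at zero since achievable rates are nonnegative."""
    return max(0.0, capacity(snr) - 0.5)

for snr in [0.01, 0.1, 1.0, 10.0, 100.0]:
    print(f"snr={snr:7.2f}  C={capacity(snr):7.4f}  "
          f"multiplicative>={rate_multiplicative(snr):7.4f}  "
          f"half-bit>={rate_half_bit(snr):7.4f}")
```

At snr = 0.01, for example, the additive guarantee is vacuous (C is about 0.0072 bits, so C - 1/2 clips to zero), while the multiplicative bound still promises roughly 99.5% of capacity, illustrating the low-SNR improvement claimed in the abstract.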

References (61)
  1. S. Artstein, K.M. Ball, F. Barthe, and A. Naor. Solution of Shannon’s problem on the monotonicity of entropy. J. Amer. Math. Soc., 17(4):975–982, April 2004.
  2. Universal sparse superposition codes with spatial coupling and GAMP decoding. IEEE Trans. Inform. Theory, 65(9):5618–5642, September 2019.
  3. S.G. Bobkov and G.P. Chistyakov. Entropy power inequality for the Rényi entropy. IEEE Trans. Inform. Theory, 61(2):708–714, February 2015.
  4. S.G. Bobkov, G.P. Chistyakov, and F. Götze. Stability problems in Cramér-type characterization in case of i.i.d. summands. Theory Probab. Appl., 57(4):568–588, 2013.
  5. S.G. Bobkov, G.P. Chistyakov, and F. Götze. Regularized distributions and entropic stability of Cramér’s characterization of the normal law. Stochastic Process. Appl., 126(12):3865–3887, December 2016.
  6. S.G. Bobkov, G.P. Chistyakov, and F. Götze. Stability of Cramér’s characterization of normal laws in information distances. In High dimensional probability VII, volume 71 of Progr. Probab., pages 3–35. Springer, 2016.
  7. S.G. Bobkov and M. Madiman. On the problem of reversibility of the entropy power inequality. In Limit theorems in probability, statistics and number theory, volume 42 of Springer Proc. Math. Stat., pages 61–74. Springer, Heidelberg, 2013.
  8. S.G. Bobkov, M. Madiman, and L. Wang. Fractional generalizations of Young and Brunn-Minkowski inequalities. In C. Houdré, M. Ledoux, E. Milman, and M. Milman, editors, Concentration, Functional Inequalities and Isoperimetry, volume 545 of Contemp. Math., pages 35–53. Amer. Math. Soc., 2011.
  9. E.A. Carlen and A. Soffer. Entropy production by block variable summation and central limit theorems. Comm. Math. Phys., 140(2):339–371, 1991.
  10. T.A. Courtade, M. Fathi, and A. Pananjady. Quantitative stability of the entropy power inequality. IEEE Trans. Inform. Theory, 64(8):5691–5703, August 2018.
  11. T.M. Cover and J.A. Thomas. Elements of information theory. John Wiley & Sons, New York, NY, second edition, 2012.
  12. H. Cramér. Über eine Eigenschaft der normalen Verteilungsfunktion. Math. Z., 41(1):405–414, 1936.
  13. A. Dembo, T.M. Cover, and J.A. Thomas. Information-theoretic inequalities. IEEE Trans. Inform. Theory, 37(6):1501–1518, November 1991.
  14. S.N. Diggavi and T.M. Cover. The worst additive noise under a covariance constraint. IEEE Trans. Inform. Theory, 47(7):3072–3081, November 2001.
  15. A. Dytso, S. Yagli, H.V. Poor, and S. Shamai. The capacity achieving distribution for the amplitude constrained additive Gaussian channel: An upper bound on the number of mass points. IEEE Trans. Inform. Theory, 66(4):2006–2022, April 2020.
  16. E. Ekrem and S. Ulukus. On the secrecy of multiple access wiretap channel. In Proceedings of the 46th Annual Allerton Conference on Communication, Control, and Computing, pages 1014–1021, Monticello, IL, September 2008.
  17. A. El Gamal and Y-H. Kim. Network information theory. Cambridge University Press, Cambridge, U.K., 2011.
  18. R. Eldan and D. Mikulincer. Stability of the Shannon-Stam inequality via the Föllmer process. Probab. Theory Related Fields, 177(3-4):891–922, 2020.
  19. A. Eskenazis, P. Nayar, and T. Tkocz. Gaussian mixtures: Entropy and geometric inequalities. Ann. Probab., 46(5):2908–2945, September 2018.
  20. M. Fradelizi, M. Madiman, A. Marsiglietti, and A. Zvavitch. The convexification effect of Minkowski summation. EMS Surv. Math. Sci., 5(1/2):1–64, 2019.
  21. On the volume of the Minkowski sum of zonoids. J. Funct. Anal., 286(3), February 2024.
  22. Sumset estimates in convex geometry. arXiv e-prints, 2204.01565 [math.MG], June 2022.
  23. T.S. Han and S. Verdú. Approximation theory of output statistics. IEEE Trans. Inform. Theory, 39(3):752–772, May 1993.
  24. X. He and A. Yener. The Gaussian many-to-one interference channel with confidential messages. IEEE Trans. Inform. Theory, 57(5):2730–2745, May 2011.
  25. Entropy, inference, and channel coding. In Wireless communications, volume 143 of IMA Vol. Math. Appl., pages 99–124. Springer, New York, NY, 2007.
  26. S. Ihara. On the capacity of channels with additive non-Gaussian noise. Inf. Contr., 37(1):34–39, April 1978.
  27. A. Joseph and A.R. Barron. Least squares superposition codes of moderate dictionary size are reliable at rates up to capacity. IEEE Trans. Inform. Theory, 58(5):2541–2557, May 2012.
  28. Contribution to the theory of Pitman estimators. Zap. Nauch. Sem. P.O.M.I., 408:245–267, 2012. Special issue in honour of I.A. Ibragimov’s 80th birthday.
  29. V.A. Kaimanovich and A.M. Vershik. Random walks on discrete groups: Boundary and entropy. Ann. Probab., 11(3):457–490, August 1983.
  30. I. Kontoyiannis and M. Madiman. Sumset and inverse sumset inequalities for differential entropy and mutual information. IEEE Trans. Inform. Theory, 60(8):4503–4514, August 2014.
  31. I. Kontoyiannis and R. Zamir. Mismatched codebooks and the role of entropy coding in lossy data compression. IEEE Trans. Inform. Theory, 52(5):1922–1938, May 2006.
  32. A. Lapidoth. Nearest neighbor decoding for additive non-Gaussian noise channels. IEEE Trans. Inform. Theory, 42(5):1520–1529, September 1996.
  33. P. Lévy. Théorie de l’addition des variables aléatoires. Gauthier-Villars, Paris, France, 1937.
  34. Yu.V. Linnik and I.V. Ostrovskii. Decomposition of random variables and vectors, volume 48 of Translations of Mathematical Monographs. American Mathematical Society, Providence, RI, 1977. Translated from the Russian.
  35. D.J. Love. Duplex distortion models for limited feedback MIMO communication. IEEE Trans. Signal Process., 54(2):766–774, February 2006.
  36. M. Madiman and A.R. Barron. Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inform. Theory, 53(7):2317–2329, July 2007.
  37. M. Madiman and F. Ghassemi. Combinatorial entropy power inequalities: A preliminary study of the Stam region. IEEE Trans. Inform. Theory, 65(3):1375–1386, March 2019.
  38. M. Madiman and I. Kontoyiannis. The entropies of the sum and the difference of two IID random variables are not too different. In 2010 IEEE International Symposium on Information Theory (ISIT), pages 1369–1372, Austin, TX, June 2010.
  39. M. Madiman and I. Kontoyiannis. Entropy bounds on abelian groups and the Ruzsa divergence. IEEE Trans. Inform. Theory, 64(1):77–92, January 2018.
  40. M. Madiman, A.W. Marcus, and P. Tetali. Entropy and set cardinality inequalities for partition-determined functions. Random Struct. Algorithms, 40(4):399–424, 2012.
  41. M. Madiman, J. Melbourne, and P. Xu. Forward and reverse entropy power inequalities in convex geometry. In E. Carlen, M. Madiman, and E. M. Werner, editors, Convexity and Concentration, volume 161 of IMA Volumes in Mathematics and its Applications, pages 427–485. Springer, 2017.
  42. M. Madiman, J. Melbourne, and P. Xu. Two remarks on generalized entropy power inequalities. In B. Klartag and E. Milman, editors, Geometric aspects of functional analysis: Israel Seminar (GAFA) 2017-2019, volume 2266 of Lecture Notes in Mathematics, pages 169–185. Springer, 2020.
  43. Sharp moment-entropy inequalities and capacity bounds for log-concave distributions. IEEE Trans. Inform. Theory, 67(1):81–94, January 2021.
  44. M.M. Mahvari and G. Kramer. Stability of Bernstein’s theorem and soft doubling for vector Gaussian channels. IEEE Trans. Inform. Theory, 69(10), October 2023.
  45. T. Philosof and R. Zamir. The cost of uncorrelation and noncooperation in MIMO channels. IEEE Trans. Inform. Theory, 53(11):3904–3920, November 2007.
  46. Y. Polyanskiy and S. Verdú. Empirical distribution of good channel codes with nonvanishing error probability. IEEE Trans. Inform. Theory, 60(1):5–21, January 2014.
  47. N.A. Sapogov. The problem of stability for a theorem of Cramér’s. Vestnik Leningrad. Univ., 10(11):61–64, 1955.
  48. The dispersion of nearest-neighbor decoding for additive non-Gaussian channels. IEEE Trans. Inform. Theory, 63(1):81–92, January 2017.
  49. S. Shamai and S. Verdú. The empirical distribution of good codes. IEEE Trans. Inform. Theory, 43(3):836–846, May 1997.
  50. C.E. Shannon. A mathematical theory of communication. Bell System Tech. J., 27(3):379–423, 623–656, 1948.
  51. D. Shlyakhtenko. A free analogue of Shannon’s problem on monotonicity of entropy. Adv. Math., 208(2):824–833, January 2007.
  52. N. Shulman and M. Feder. The uniform distribution as a universal prior. IEEE Trans. Inform. Theory, 50(6):1356–1362, June 2004.
  53. Interference assisted secret communication. IEEE Trans. Inform. Theory, 57(5):3153–3167, May 2011.
  54. T. Tao. Sumset and inverse sumset theory for Shannon entropy. Combin. Probab. Comput., 19(4):603–639, July 2010.
  55. A. Tchamkerten. On the discreteness of capacity-achieving distributions. IEEE Trans. Inform. Theory, 50(11):2773–2778, November 2004.
  56. A.M. Tulino and S. Verdú. Monotonic decrease of the non-Gaussianness of the sum of independent random variables: A simple proof. IEEE Trans. Inform. Theory, 52(9):4295–4297, September 2006.
  57. R. Venkataramanan, S. Tatikonda, and A. Barron. Sparse regression codes. Foundations and Trends in Communications and Information Theory, 15(1-2):1–195, June 2019.
  58. S. Verdú. On channel capacity per unit cost. IEEE Trans. Inform. Theory, 36(5):1019–1030, September 1990.
  59. R. Zamir. The half a bit loss of robust source/channel codebooks. In 2002 IEEE Workshop on Information Theory (ITW), pages 123–126, Bangalore, India, October 2002.
  60. R. Zamir and U. Erez. A Gaussian input is not too bad. IEEE Trans. Inform. Theory, 50(6):1362–1367, June 2004.
  61. R. Zamir and M. Feder. On universal quantization by randomized uniform/lattice quantizers. IEEE Trans. Inform. Theory, 38(2):428–436, March 1992.