Lower Bounds on Mutual Information for Linear Codes Transmitted over Binary Input Channels, and for Information Combining (2401.14710v1)
Abstract: It has been known for a long time that the mutual information between the input sequence and the output of a binary symmetric channel (BSC) is upper bounded by the mutual information between the same input sequence and the output of a binary erasure channel (BEC) with the same capacity. Recently, Samorodnitsky discovered that one may also lower bound the BSC mutual information in terms of the mutual information between the same input sequence and the output of a more capable BEC. In this paper, we strengthen Samorodnitsky's bound for the special case where the input to the channel is distributed uniformly over a linear code. Furthermore, for a general (not necessarily binary) input distribution $P_X$ and channel $W_{Y|X}$, we derive a new lower bound on the mutual information $I(X;Y^n)$ for $n$ transmissions of $X\sim P_X$ through the channel $W_{Y|X}$.
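The classical upper bound mentioned in the abstract can be checked numerically. The sketch below, which is not taken from the paper, brute-forces $I(X^n;Y^n)$ for a uniform input over a small linear code (here an arbitrary $[3,2]$ single parity-check code) sent through a BSC with crossover probability $p$, and through a BEC whose erasure probability $\varepsilon = h_2(p)$ matches the BSC capacity; the BEC value should come out at least as large. The code, parameters, and helper names (`mutual_information`, `h2`) are illustrative assumptions, not the paper's construction.

```python
# Numerical sanity check (illustrative, not from the paper): compare I(X^n;Y^n)
# for a uniform input over a small linear code through a BSC(p) and through a
# BEC with the same capacity, i.e. erasure probability eps = h2(p).
import itertools
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(code, channel, output_alphabet):
    """I(X^n;Y^n) in bits, X^n uniform over `code`, memoryless channel W(y|x)=channel[(y,x)]."""
    n = len(code[0])
    p_x = 1.0 / len(code)
    mi = 0.0
    for y in itertools.product(output_alphabet, repeat=n):
        # Joint probabilities P(x^n, y^n) = P(x^n) * prod_i W(y_i | x_i)
        joint = {x: p_x * math.prod(channel[(yi, xi)] for yi, xi in zip(y, x))
                 for x in code}
        p_y = sum(joint.values())
        for p_xy in joint.values():
            if p_xy > 0:
                mi += p_xy * math.log2(p_xy / (p_x * p_y))
    return mi

# [3,2] single parity-check code: an arbitrary small linear code for the demo.
code = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

p = 0.11            # BSC crossover probability
eps = h2(p)         # BEC erasure probability giving the same capacity 1 - h2(p)

bsc = {(y, x): (1 - p if y == x else p) for y in (0, 1) for x in (0, 1)}
bec = {(y, x): 0.0 for y in (0, 1, '?') for x in (0, 1)}
for x in (0, 1):
    bec[(x, x)] = 1 - eps
    bec[('?', x)] = eps

i_bsc = mutual_information(code, bsc, (0, 1))
i_bec = mutual_information(code, bec, (0, 1, '?'))
print(f"I(X^n;Y^n) over BSC(p={p}):        {i_bsc:.4f} bits")
print(f"I(X^n;Y^n) over BEC(eps={eps:.4f}): {i_bec:.4f} bits (expected >= BSC value)")
```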
- Y. Polyanskiy and Y. Wu, “Information theory: From coding to learning,” Book draft, 2022.
- ——, “Strong data-processing inequalities for channels and Bayesian networks,” in Convexity and Concentration. Springer, 2017, pp. 211–249.
- A. Makur and Y. Polyanskiy, “Comparison of channels: Criteria for domination by a symmetric channel,” IEEE Transactions on Information Theory, vol. 64, no. 8, pp. 5704–5725, 2018.
- O. Ordentlich and Y. Polyanskiy, “Strong data processing constant is achieved by binary inputs,” IEEE Transactions on Information Theory, vol. 68, no. 3, pp. 1480–1481, 2021.
- E. Sasoglu, “Polar coding theorems for discrete systems,” EPFL, Tech. Rep., 2011.
- A. Samorodnitsky, “On the entropy of a noisy function,” IEEE Transactions on Information Theory, vol. 62, no. 10, pp. 5446–5464, 2016.
- I. Land and J. Huber, “Information combining,” Foundations and Trends® in Communications and Information Theory, vol. 3, no. 3, pp. 227–330, 2006.
- I. Sutskover, S. Shamai, and J. Ziv, “Extremes of information combining,” IEEE Transactions on Information Theory, vol. 51, no. 4, pp. 1313–1325, 2005.
- R. Ahlswede and P. Gács, “Spreading of sets in product spaces and hypercontraction of the Markov operator,” The Annals of Probability, pp. 925–939, 1976.
- O. Ordentlich, “Novel lower bounds on the entropy rate of binary hidden Markov processes,” in 2016 IEEE International Symposium on Information Theory (ISIT). IEEE, 2016, pp. 690–694.
- A. Wyner and J. Ziv, “A theorem on the entropy of certain binary sequences and applications–I,” IEEE Transactions on Information Theory, vol. 19, no. 6, pp. 769–772, 1973.
- N. Tishby, F. C. Pereira, and W. Bialek, “The information bottleneck method,” arXiv preprint physics/0004057, 2000.
- H. Witsenhausen and A. Wyner, “A conditional entropy bound for a pair of discrete random variables,” IEEE Transactions on Information Theory, vol. 21, no. 5, pp. 493–501, 1975.
- V. Anantharam, A. Gohari, S. Kamath, and C. Nair, “On maximal correlation, hypercontractivity, and the data processing inequality studied by Erkip and Cover,” arXiv preprint arXiv:1304.6133, 2013.
- A. Makur, “Information contraction and decomposition,” Ph.D. dissertation, Massachusetts Institute of Technology, 2019.
- E. Erkip and T. M. Cover, “The efficiency of investment information,” IEEE Transactions on Information Theory, vol. 44, no. 3, pp. 1026–1040, 1998.
- J. Hązła, A. Samorodnitsky, and O. Sberlo, “On codes decoding a constant fraction of errors on the BSC,” in Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing, 2021, pp. 1479–1488.
- M. Pathegama and A. Barg, “Smoothing of binary codes, uniform distributions, and applications,” Entropy, vol. 25, no. 11, p. 1515, 2023.