Alternating Optimization Approach for Computing $α$-Mutual Information and $α$-Capacity (2404.10950v5)
Abstract: This study presents alternating optimization (AO) algorithms for computing $\alpha$-mutual information ($\alpha$-MI) and $\alpha$-capacity based on variational characterizations of $\alpha$-MI using a reverse channel. Specifically, we derive several variational characterizations of the Sibson, Arimoto, Augustin–Csiszár, and Lapidoth–Pfister MI, introduce novel AO algorithms for computing $\alpha$-MI and $\alpha$-capacity, and compare their performance for computing $\alpha$-capacity. The comparison shows that the AO algorithm based on the characterization of the Sibson MI converges fastest.
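The paper's AO algorithms themselves are not reproduced on this page. As illustrative context only, the classical Blahut–Arimoto iteration (the $\alpha \to 1$ special case of this AO family, per the Arimoto 1972 and Blahut 1972 references below) alternates between a reverse channel $q(x|y)$ and the input distribution $p(x)$. The sketch below is a minimal NumPy implementation under standard assumptions; the function name and tolerance are illustrative, not the paper's.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-12, max_iter=10_000):
    """Classical Blahut-Arimoto iteration for the (alpha = 1) capacity
    of a discrete memoryless channel, W[x, y] = W(y|x). Illustrative sketch."""
    n_x = W.shape[0]
    p = np.full(n_x, 1.0 / n_x)            # input distribution, start uniform
    for _ in range(max_iter):
        # Reverse channel: q(x|y) proportional to p(x) W(y|x)
        q = p[:, None] * W
        q /= q.sum(axis=0, keepdims=True)
        # Update: p(x) proportional to exp( sum_y W(y|x) log q(x|y) )
        with np.errstate(divide="ignore"):
            logq = np.where(W > 0, np.log(q), 0.0)
        r = np.exp((W * logq).sum(axis=1))
        p_new = r / r.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # Mutual information I(p, W) in nats at the fixed point
    py = (p[:, None] * W).sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(W > 0, W / py[None, :], 1.0)
    capacity = (p[:, None] * W * np.log(ratio)).sum()
    return capacity, p

# Binary symmetric channel with crossover probability 0.1:
W = np.array([[0.9, 0.1], [0.1, 0.9]])
C, p_opt = blahut_arimoto(W)
# C -> ln 2 - h(0.1) ~ 0.368 nats, with p_opt uniform
```

The same alternating structure (fix one argument of the variational expression, optimize the other in closed form) underlies the Sibson-, Arimoto-, Augustin–Csiszár-, and Lapidoth–Pfister-based AO algorithms that the paper compares; only the update formulas change with the characterization.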
- C. E. Shannon, “A mathematical theory of communication,” The Bell System Technical Journal, vol. 27, pp. 379–423, 1948.
- R. Sibson, “Information radius,” Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete, vol. 14, pp. 149–160, 1969.
- S. Arimoto, “Information measures and capacity of order α for discrete memoryless channels,” in Topics in Information Theory (2nd Colloquium, Keszthely, Hungary, 1975), I. Csiszár and P. Elias, Eds., Colloquia Mathematica Societatis János Bolyai, vol. 16. Amsterdam, Netherlands: North-Holland, 1977, pp. 41–52.
- U. Augustin, “Noisy channels,” Habilitation thesis, Universität Erlangen–Nürnberg, 1978.
- I. Csiszár, “Generalized cutoff rates and Rényi's information measures,” IEEE Transactions on Information Theory, vol. 41, no. 1, pp. 26–34, 1995.
- A. Lapidoth and C. Pfister, “Two measures of dependence,” Entropy, vol. 21, no. 8, 2019. [Online]. Available: https://www.mdpi.com/1099-4300/21/8/778
- M. Tomamichel and M. Hayashi, “Operational interpretation of Rényi information measures via composite hypothesis testing against product and Markov distributions,” IEEE Transactions on Information Theory, vol. 64, no. 2, pp. 1064–1082, 2018.
- S. Verdú, “α-mutual information,” in 2015 Information Theory and Applications Workshop (ITA), 2015, pp. 1–6.
- S. Verdú, “Error exponents and α-mutual information,” Entropy, vol. 23, no. 2, 2021. [Online]. Available: https://www.mdpi.com/1099-4300/23/2/199
- S. Ho and S. Verdú, “Convexity/concavity of Rényi entropy and α-mutual information,” in 2015 IEEE International Symposium on Information Theory (ISIT), 2015, pp. 745–749.
- C. Cai and S. Verdú, “Conditional Rényi divergence saddlepoint and the maximization of α-mutual information,” Entropy, vol. 21, no. 10, 2019. [Online]. Available: https://www.mdpi.com/1099-4300/21/10/969
- G. Aishwarya and M. Madiman, “Conditional Rényi entropy and the relationships between Rényi capacities,” Entropy, vol. 22, no. 5, 2020. [Online]. Available: https://www.mdpi.com/1099-4300/22/5/526
- B. Nakiboğlu, “The Augustin capacity and center,” Problems of Information Transmission, vol. 55, no. 4, pp. 299–342, 2019. [Online]. Available: https://doi.org/10.1134/S003294601904001X
- O. Shayevitz, “On Rényi measures and hypothesis testing,” in 2011 IEEE International Symposium on Information Theory Proceedings, 2011, pp. 894–898.
- G. Aishwarya and M. Madiman, “Remarks on Rényi versions of conditional entropy and mutual information,” in 2019 IEEE International Symposium on Information Theory (ISIT), 2019, pp. 1117–1121.
- B. Nakiboğlu, “The Rényi capacity and center,” IEEE Transactions on Information Theory, vol. 65, no. 2, pp. 841–860, 2019.
- I. Sason and S. Verdú, “Arimoto–Rényi conditional entropy and Bayesian hypothesis testing,” in 2017 IEEE International Symposium on Information Theory (ISIT), 2017, pp. 2965–2969.
- J. Liao, O. Kosut, L. Sankar, and F. du Pin Calmon, “Tunable measures for information leakage and applications to privacy-utility tradeoffs,” IEEE Transactions on Information Theory, vol. 65, no. 12, pp. 8043–8066, 2019.
- V. M. Ilić and I. B. Djordjević, “On the α-q-mutual information and the α-q-capacities,” Entropy, vol. 23, no. 6, 2021. [Online]. Available: https://www.mdpi.com/1099-4300/23/6/702
- D. Karakos, S. Khudanpur, and C. E. Priebe, “Computation of Csiszár's mutual information of order α,” in 2008 IEEE International Symposium on Information Theory, 2008, pp. 2106–2110.
- S. Arimoto, “An algorithm for computing the capacity of arbitrary discrete memoryless channels,” IEEE Transactions on Information Theory, vol. 18, no. 1, pp. 14–20, 1972.
- R. Blahut, “Computation of channel capacity and rate-distortion functions,” IEEE Transactions on Information Theory, vol. 18, no. 4, pp. 460–473, 1972.
- A. Rényi, “On measures of entropy and information,” in Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics, vol. 4. University of California Press, 1961, pp. 547–562.
- S. Arimoto, “Computation of random coding exponent functions,” IEEE Transactions on Information Theory, vol. 22, no. 6, pp. 665–671, 1976.
- A. Kamatsuka, Y. Ishikawa, K. Kazama, and T. Yoshida, “New algorithms for computing Sibson capacity and Arimoto capacity,” 2024. [Online]. Available: https://arxiv.org/abs/2401.14241
- T. van Erven and P. Harremoës, “Rényi divergence and Kullback–Leibler divergence,” IEEE Transactions on Information Theory, vol. 60, no. 7, pp. 3797–3820, 2014.
- M. Sion, “On general minimax theorems,” Pacific Journal of Mathematics, vol. 8, no. 1, pp. 171–176, 1958.