Alternating Optimization Approach for Computing $\alpha$-Mutual Information and $\alpha$-Capacity (2404.10950v5)

Published 16 Apr 2024 in cs.IT and math.IT

Abstract: This study presents alternating optimization (AO) algorithms for computing $\alpha$-mutual information ($\alpha$-MI) and $\alpha$-capacity based on variational characterizations of $\alpha$-MI using a reverse channel. Specifically, we derive several variational characterizations of Sibson, Arimoto, Augustin–Csiszár, and Lapidoth–Pfister MI and introduce novel AO algorithms for computing $\alpha$-MI and $\alpha$-capacity; their performances for computing $\alpha$-capacity are also compared. The comparison results show that the AO algorithm based on the Sibson MI's characterization has the fastest convergence speed.
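To make the quantities concrete, the sketch below computes Sibson $\alpha$-MI directly from its closed form, $I_\alpha^S(P,W) = \frac{\alpha}{\alpha-1}\log\sum_y\big(\sum_x P(x)W(y|x)^\alpha\big)^{1/\alpha}$, and approximates the $\alpha$-capacity $\max_P I_\alpha^S(P,W)$ with an Arimoto-style alternating update between a reverse channel and the input distribution, in the spirit of the classical algorithms in references 21 and 24 below. This is an illustrative sketch only, not the paper's specific AO algorithms; the function names, the uniform initialization, and the binary symmetric channel example are assumptions, and the channel matrix is taken to be strictly positive.

import numpy as np

def sibson_mi(alpha, P, W):
    # Sibson mutual information of order alpha (alpha > 0, alpha != 1).
    # P: input distribution, shape (|X|,); W: channel, W[x, y] = W(y|x), rows sum to 1.
    inner = (P[:, None] * W**alpha).sum(axis=0)        # sum_x P(x) W(y|x)^alpha, for each y
    return alpha / (alpha - 1.0) * np.log((inner**(1.0 / alpha)).sum())

def alpha_capacity_ao(alpha, W, n_iter=500, tol=1e-12):
    # Arimoto-style alternating optimization for C_alpha = max_P I_alpha^S(P, W).
    # Alternates (i) reverse channel phi(x|y) proportional to P(x) W(y|x)^alpha with
    # (ii) input update P(x) proportional to (sum_y W(y|x) phi(x|y)^((alpha-1)/alpha))^(alpha/(alpha-1)).
    # Assumes W has strictly positive entries (avoids 0 raised to a negative power).
    nx, _ = W.shape
    P = np.full(nx, 1.0 / nx)                          # uniform initialization (assumption)
    Wa = W**alpha
    prev = -np.inf
    val = prev
    for _ in range(n_iter):
        # (i) reverse-channel step
        phi = P[:, None] * Wa
        phi /= phi.sum(axis=0, keepdims=True)
        # (ii) input-distribution step
        c = (W * phi**((alpha - 1.0) / alpha)).sum(axis=1)
        P = c**(alpha / (alpha - 1.0))
        P /= P.sum()
        val = sibson_mi(alpha, P, W)
        if abs(val - prev) < tol:
            break
        prev = val
    return val, P

if __name__ == "__main__":
    # Binary symmetric channel with crossover probability 0.1 (illustrative example)
    W = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
    for a in (0.5, 2.0):
        C, P_opt = alpha_capacity_ao(a, W)
        print(f"alpha = {a}: C_alpha about {C:.6f} nats, P* about {P_opt}")

As $\alpha \to 1$, the input update above reduces to the classical Blahut–Arimoto step $P(x) \propto \exp\big(\sum_y W(y|x)\log \phi(x|y)\big)$, so the Shannon capacity is recovered in the limit.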

References (27)
  1. C. E. Shannon, “A mathematical theory of communication,” The Bell System Technical Journal, vol. 27, pp. 379–423, 1948.
  2. R. Sibson, “Information radius,” Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete, vol. 14, pp. 149–160, 1969.
  3. S. Arimoto, “Information measures and capacity of order α for discrete memoryless channels,” in 2nd Colloquium, Keszthely, Hungary, 1975, ser. Colloquia Mathematica Societatis János Bolyai, vol. 16, I. Csiszár and P. Elias, Eds. Amsterdam, The Netherlands: North-Holland, 1977, pp. 41–52.
  4. U. Augustin, “Noisy channels,” Habilitation thesis, Universität Erlangen-Nürnberg, 1978.
  5. I. Csiszár, “Generalized cutoff rates and Rényi's information measures,” IEEE Transactions on Information Theory, vol. 41, no. 1, pp. 26–34, 1995.
  6. A. Lapidoth and C. Pfister, “Two measures of dependence,” Entropy, vol. 21, no. 8, 2019. [Online]. Available: https://www.mdpi.com/1099-4300/21/8/778
  7. M. Tomamichel and M. Hayashi, “Operational interpretation of Rényi information measures via composite hypothesis testing against product and Markov distributions,” IEEE Transactions on Information Theory, vol. 64, no. 2, pp. 1064–1082, 2018.
  8. S. Verdú, “α-mutual information,” in 2015 Information Theory and Applications Workshop (ITA), 2015, pp. 1–6.
  9. S. Verdú, “Error exponents and α-mutual information,” Entropy, vol. 23, no. 2, 2021. [Online]. Available: https://www.mdpi.com/1099-4300/23/2/199
  10. S. Ho and S. Verdú, “Convexity/concavity of Rényi entropy and α-mutual information,” in 2015 IEEE International Symposium on Information Theory (ISIT), 2015, pp. 745–749.
  11. C. Cai and S. Verdú, “Conditional Rényi divergence saddlepoint and the maximization of α-mutual information,” Entropy, vol. 21, no. 10, 2019. [Online]. Available: https://www.mdpi.com/1099-4300/21/10/969
  12. G. Aishwarya and M. Madiman, “Conditional Rényi entropy and the relationships between Rényi capacities,” Entropy, vol. 22, no. 5, 2020. [Online]. Available: https://www.mdpi.com/1099-4300/22/5/526
  13. B. Nakiboğlu, “The Augustin capacity and center,” Problems of Information Transmission, vol. 55, no. 4, pp. 299–342, 2019. [Online]. Available: https://doi.org/10.1134/S003294601904001X
  14. O. Shayevitz, “On Rényi measures and hypothesis testing,” in 2011 IEEE International Symposium on Information Theory Proceedings, 2011, pp. 894–898.
  15. G. Aishwarya and M. Madiman, “Remarks on Rényi versions of conditional entropy and mutual information,” in 2019 IEEE International Symposium on Information Theory (ISIT), 2019, pp. 1117–1121.
  16. B. Nakiboğlu, “The Rényi capacity and center,” IEEE Transactions on Information Theory, vol. 65, no. 2, pp. 841–860, 2019.
  17. I. Sason and S. Verdú, “Arimoto–Rényi conditional entropy and Bayesian hypothesis testing,” in 2017 IEEE International Symposium on Information Theory (ISIT), 2017, pp. 2965–2969.
  18. J. Liao, O. Kosut, L. Sankar, and F. du Pin Calmon, “Tunable measures for information leakage and applications to privacy-utility tradeoffs,” IEEE Transactions on Information Theory, vol. 65, no. 12, pp. 8043–8066, 2019.
  19. V. M. Ilić and I. B. Djordjević, “On the α-q-mutual information and the α-q-capacities,” Entropy, vol. 23, no. 6, 2021. [Online]. Available: https://www.mdpi.com/1099-4300/23/6/702
  20. D. Karakos, S. Khudanpur, and C. E. Priebe, “Computation of Csiszár's mutual information of order α,” in 2008 IEEE International Symposium on Information Theory, 2008, pp. 2106–2110.
  21. S. Arimoto, “An algorithm for computing the capacity of arbitrary discrete memoryless channels,” IEEE Transactions on Information Theory, vol. 18, no. 1, pp. 14–20, 1972.
  22. R. Blahut, “Computation of channel capacity and rate-distortion functions,” IEEE Transactions on Information Theory, vol. 18, no. 4, pp. 460–473, 1972.
  23. A. Rényi, “On measures of entropy and information,” in Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics, vol. 4. University of California Press, 1961, pp. 547–562.
  24. S. Arimoto, “Computation of random coding exponent functions,” IEEE Transactions on Information Theory, vol. 22, no. 6, pp. 665–671, 1976.
  25. A. Kamatsuka, Y. Ishikawa, K. Kazama, and T. Yoshida, “New algorithms for computing Sibson capacity and Arimoto capacity,” 2024. [Online]. Available: https://arxiv.org/abs/2401.14241
  26. T. van Erven and P. Harremoës, “Rényi divergence and Kullback–Leibler divergence,” IEEE Transactions on Information Theory, vol. 60, no. 7, pp. 3797–3820, 2014.
  27. M. Sion, “On general minimax theorems,” Pacific Journal of Mathematics, vol. 8, no. 1, pp. 171–176, 1958.