Shared Information for a Markov Chain on a Tree (2307.15844v2)

Published 29 Jul 2023 in cs.IT and math.IT

Abstract: Shared information is a measure of mutual dependence among multiple jointly distributed random variables with finite alphabets. For a Markov chain on a tree with a given joint distribution, we give a new proof of an explicit characterization of shared information. The Markov chain on a tree is shown to possess a global Markov property based on graph separation; this property plays a key role in our proofs. When the underlying joint distribution is not known, we exploit the special form of this characterization to provide a multiarmed bandit algorithm for estimating shared information, and analyze its error performance.
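
For context (these formulas are background from the related literature, not restated in the abstract): the shared information, in the sense used by Csiszár and Narayan, of jointly distributed random variables $X_V = (X_v)_{v \in V}$ with finite alphabets is

$$ \mathrm{SI}(X_V) \;=\; \min_{\pi \,:\, |\pi| \ge 2} \; \frac{1}{|\pi| - 1} \Big[ \sum_{A \in \pi} H(X_A) \;-\; H(X_V) \Big], $$

where the minimum runs over partitions $\pi$ of $V$ into at least two nonempty parts. The "explicit characterization" the abstract refers to, as reported in the authors' ISIT 2022 conference version of this work, is that for a Markov chain on a tree $T = (V, E)$ the minimum is attained on an edge cut, so that

$$ \mathrm{SI}(X_V) \;=\; \min_{(i,j) \in E} I(X_i \wedge X_j). $$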
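
Under that min-over-edges form, estimating shared information from samples reduces to a best-arm identification problem: each tree edge is an arm, a pull yields a fresh observation of the edge's endpoint pair, and the target is the arm with the smallest mean (mutual information). The sketch below illustrates the reduction only and is not the paper's algorithm: it combines plug-in mutual information estimates with a generic lower-confidence-bound allocation on a hypothetical three-node binary chain, and every name and constant in it is invented for the example.

import numpy as np

rng = np.random.default_rng(0)

# Toy Markov chain on the path 0 - 1 - 2 over binary alphabets:
# X0 is uniform; each successor flips the previous bit with the
# edge's crossover probability (0.1 and 0.3, chosen arbitrarily).
FLIP = [0.1, 0.3]

def sample_chain():
    x = [int(rng.integers(2))]
    for p in FLIP:
        x.append(x[-1] ^ int(rng.random() < p))
    return x

def plugin_mi(joint_counts):
    # Plug-in estimate of I(X; Y) in nats from a 2x2 count table.
    joint = joint_counts / joint_counts.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px * py)[nz])).sum())

EDGES = [(0, 1), (1, 2)]            # one bandit arm per tree edge
counts = {e: np.zeros((2, 2)) for e in EDGES}
T = 5000

for t in range(T):
    if t < 100 * len(EDGES):
        e = EDGES[t % len(EDGES)]   # warm-up: round-robin pulls
    else:
        # Pull the edge whose lower confidence bound on MI is smallest,
        # since the target is the minimizing (not maximizing) arm.
        e = min(EDGES, key=lambda a: plugin_mi(counts[a])
                - np.sqrt(np.log(T) / counts[a].sum()))
    x = sample_chain()
    counts[e][x[e[0]], x[e[1]]] += 1.0

print("estimated shared information (nats):",
      min(plugin_mi(counts[e]) for e in EDGES))

On this toy chain the minimum is attained on the noisier edge (flip probability 0.3), where $I(X_1 \wedge X_2) = \ln 2 - h_b(0.3) \approx 0.082$ nats, so the printed estimate should land near that value.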
