Optimum Achievable Rates in Two Random Number Generation Problems with $f$-Divergences Using Smooth Rényi Entropy

Published 17 Apr 2024 in cs.IT and math.IT | (arXiv:2404.11097v2)

Abstract: Two typical fixed-length random number generation problems in information theory are considered for general sources: the source resolvability problem and the intrinsic randomness problem. In each problem, a central concern is the optimum achievable rate with respect to a given approximation measure, which has been characterized using two different information quantities: the information spectrum and the smooth Rényi entropy. Recently, optimum achievable rates with respect to $f$-divergences have been characterized using the information spectrum quantity. The $f$-divergence is a general non-negative measure between two probability distributions defined by a convex function $f$; the class of $f$-divergences includes several important measures such as the variational distance, the KL divergence, and the Hellinger distance. Hence, it is meaningful to consider the random number generation problems with respect to $f$-divergences. However, optimum achievable rates with respect to $f$-divergences have not yet been characterized using the smooth Rényi entropy in either problem. In this paper, we analyze the optimum achievable rates using the smooth Rényi entropy and extend the class of admissible $f$-divergences. To do so, we first derive general formulas for the first-order optimum achievable rates with respect to $f$-divergences in both problems, under the same conditions as imposed by previous studies. Next, we relax the conditions on the $f$-divergence and generalize the obtained formulas. We then particularize the general formulas to several specific functions $f$, showing that optimum achievable rates for several important measures follow easily from them. Furthermore, a kind of duality between resolvability and intrinsic randomness is revealed in terms of the smooth Rényi entropy.
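The $f$-divergence named in the abstract, $D_f(P\|Q)=\sum_x Q(x)\,f(P(x)/Q(x))$ for a convex $f$ with $f(1)=0$, can be sketched in a few lines of Python. This is a minimal illustration rather than anything from the paper, and conventions for the generators vary between authors (e.g. some omit the factor $1/2$ in the variational distance):

```python
import math

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)), assuming q(x) > 0 for all x."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q))

# Convex generators for three measures named in the abstract:
kl = lambda t: t * math.log(t) if t > 0 else 0.0   # KL divergence (nats)
tv = lambda t: 0.5 * abs(t - 1)                    # variational (total variation) distance
hellinger2 = lambda t: (math.sqrt(t) - 1) ** 2     # squared Hellinger distance

p = [0.5, 0.5]
q = [0.75, 0.25]
print(f_divergence(p, q, tv))  # equals (1/2) * sum |p - q| = 0.25
```

Each choice of $f$ recovers the corresponding measure exactly; for instance the total variation generator above reproduces $\tfrac12\sum_x|P(x)-Q(x)|$.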

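The abstract's other central quantity, the smooth Rényi entropy, can be illustrated for the order-$\infty$ (min-entropy) case: smoothing over a variational-distance ball of radius $\delta$ amounts to capping the largest probabilities. The sketch below is an illustration under stated assumptions, not the paper's construction; it assumes the trimmed mass can be absorbed by atoms below the cap, finds the smallest cap $c$ with $\sum_x \max(P(x)-c,0) \le \delta$ by bisection, and returns $-\log_2 c$:

```python
import math

def smooth_min_entropy(p, delta, tol=1e-12):
    """delta-smooth min-entropy (bits) of a finite distribution p,
    smoothing over a variational-distance ball of radius delta."""
    lo, hi = 1.0 / len(p), max(p)  # the optimal cap c lies in [1/|X|, max p]
    while hi - lo > tol:
        c = (lo + hi) / 2
        excess = sum(max(pi - c, 0.0) for pi in p)  # mass trimmed above the cap
        if excess <= delta:
            hi = c  # cap is feasible; try a lower one
        else:
            lo = c
    return -math.log2(hi)

print(smooth_min_entropy([0.5, 0.25, 0.125, 0.125], 0.0))  # 1.0 = -log2(max p)
```

At $\delta=0$ this reduces to the ordinary min-entropy $-\log_2\max_x P(x)$; increasing $\delta$ lets the peak be shaved down, which is what makes the smooth quantity the right yardstick for fixed-length random number generation.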
