The Rate-Distortion-Perception Trade-off: The Role of Private Randomness (2404.01111v1)

Published 1 Apr 2024 in cs.IT, math.IT, and stat.ML

Abstract: In image compression, with recent advances in generative modeling, the existence of a trade-off between the rate and the perceptual quality (realism) has been brought to light, where realism is measured by the closeness of the output distribution to the source. It has been shown that randomized codes can be strictly better under a number of formulations. In particular, the role of common randomness has been well studied. We elucidate the role of private randomness in the compression of a memoryless source $X^n=(X_1,...,X_n)$ under two kinds of realism constraints. The near-perfect realism constraint requires the joint distribution of the output symbols $(Y_1,...,Y_n)$ to be arbitrarily close to the distribution of the source in total variation distance (TVD). The per-symbol near-perfect realism constraint requires that the TVD between the distribution of each output symbol $Y_t$ and the source distribution be arbitrarily small, uniformly in the index $t$. We characterize the corresponding asymptotic rate-distortion trade-off and show that encoder private randomness is not useful if the compression rate is lower than the entropy of the source, however limited the resources in terms of common randomness and decoder private randomness may be.
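Both realism constraints in the abstract are stated in terms of total variation distance. As a minimal illustrative sketch (not taken from the paper), the TVD between two discrete distributions over the same alphabet is half the $\ell_1$ distance between their probability vectors:

```python
import numpy as np

def total_variation_distance(p, q):
    """TVD between two discrete distributions on the same alphabet:
    0.5 * sum_i |p_i - q_i|, which lies in [0, 1]."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return 0.5 * np.abs(p - q).sum()

# Hypothetical example: a Bernoulli(0.5) source vs. a slightly
# biased reconstruction distribution; a near-perfect realism
# constraint would require this value to be arbitrarily small.
source = [0.5, 0.5]
output = [0.48, 0.52]
tvd = total_variation_distance(source, output)  # = 0.02
```

The per-symbol constraint applies this quantity to each marginal distribution of $Y_t$ separately, while the (joint) near-perfect realism constraint applies it to the full $n$-letter output distribution.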

Authors (3)
  1. Yassine Hamdi (3 papers)
  2. Aaron B. Wagner (49 papers)
  3. Deniz Gündüz (144 papers)
Citations (3)