Output-Constrained Lossy Source Coding With Application to Rate-Distortion-Perception Theory (2403.14849v1)
Published 21 Mar 2024 in cs.IT, cs.LG, and math.IT
Abstract: The distortion-rate function of output-constrained lossy source coding with limited common randomness is analyzed for the special case of squared error distortion measure. An explicit expression is obtained when both source and reconstruction distributions are Gaussian. This further leads to a partial characterization of the information-theoretic limit of quadratic Gaussian rate-distortion-perception coding with the perception measure given by Kullback-Leibler divergence or squared quadratic Wasserstein distance.
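The abstract's Gaussian setting builds on standard closed forms: the classical quadratic Gaussian distortion-rate function, and the squared quadratic Wasserstein distance and KL divergence between Gaussians used as perception measures. As a minimal sketch (these are the textbook scalar formulas, not the paper's new constrained characterization; all function names are illustrative):

```python
import math

def gaussian_distortion_rate(sigma2, rate):
    """Classical quadratic Gaussian distortion-rate function D(R) = sigma^2 * 2^(-2R).

    This is the unconstrained baseline; adding an output-distribution
    (perception) constraint with limited common randomness, as in the paper,
    can only increase distortion at a given rate.
    """
    return sigma2 * 2 ** (-2 * rate)

def w2_squared_gaussian(mu1, s1, mu2, s2):
    """Squared quadratic Wasserstein distance between N(mu1, s1^2) and
    N(mu2, s2^2) in one dimension: (mu1 - mu2)^2 + (s1 - s2)^2."""
    return (mu1 - mu2) ** 2 + (s1 - s2) ** 2

def kl_gaussian(mu1, s1, mu2, s2):
    """KL divergence D(N(mu1, s1^2) || N(mu2, s2^2)) in nats."""
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2) - 0.5
```

For a unit-variance source at rate R = 1 bit, the unconstrained baseline gives D = 0.25, and both perception measures vanish when source and reconstruction distributions coincide, which is the perfect-realism regime the paper's constrained problem interpolates toward.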
- Y. Blau and T. Michaeli, “The perception-distortion tradeoff,” in Proc. IEEE Conf. Comp. Vision and Pattern Recog. (CVPR), 2018, pp. 6228–6237.
- Y. Blau and T. Michaeli, “Rethinking lossy compression: The rate-distortion-perception tradeoff,” in International Conference on Machine Learning, pp. 675–685, 2019.
- R. Matsumoto, “Introducing the perception-distortion tradeoff into the rate-distortion theory of general information sources,” IEICE Comm. Express, vol. 7, no. 11, pp. 427–431, 2018.
- R. Matsumoto, “Rate-distortion-perception tradeoff of variable-length source coding for general information sources,” IEICE Comm. Express, vol. 8, no. 2, pp. 38–42, 2019.
- L. Theis and A. B. Wagner, “A coding theorem for the rate-distortion-perception function,” in ICLR 2021 Neural Compression Workshop.
- J. Chen, L. Yu, J. Wang, W. Shi, Y. Ge, and W. Tong, “On the rate-distortion-perception function,” IEEE Journal on Selected Areas in Information Theory, vol. 3, no. 4, pp. 664–673, Dec. 2022.
- A. B. Wagner, “The rate-distortion-perception tradeoff: The role of common randomness,” 2022, arXiv:2202.04147. [Online]. Available: https://arxiv.org/abs/2202.04147
- M. Li, J. Klejsa, and W. B. Kleijn, “Distribution preserving quantization with dithering and transformation,” IEEE Signal Process. Lett., vol. 17, no. 12, pp. 1014–1017, Dec. 2010.
- M. Li, J. Klejsa, and W. B. Kleijn, “On distribution preserving quantization,” 2011, arXiv:1108.3728. [Online]. Available: http://arxiv.org/abs/1108.3728
- J. Klejsa, G. Zhang, M. Li, and W. B. Kleijn, “Multiple description distribution preserving quantization,” IEEE Trans. Signal Process., vol. 61, no. 24, pp. 6410–6422, Dec. 2013.
- N. Saldi, T. Linder, and S. Yüksel, “Randomized quantization and source coding with constrained output distribution,” IEEE Trans. Inf. Theory, vol. 61, no. 1, pp. 91–106, Jan. 2015.
- N. Saldi, T. Linder, and S. Yüksel, “Output constrained lossy source coding with limited common randomness,” IEEE Trans. Inf. Theory, vol. 61, no. 9, pp. 4984–4998, Sep. 2015.
- D. C. Dowson and B. V. Landau, “The Fréchet distance between multivariate normal distributions,” J. Multivariate Anal., vol. 12, no. 3, pp. 450–455, 1982.
- C. R. Givens and R. M. Shortt, “A class of Wasserstein metrics for probability,” Michigan Math. J., vol. 31, no. 2, pp. 231–240, 1984.
- T. A. Atif, M. A. Sohail, and S. S. Pradhan, “Lossy quantum source coding with a global error criterion based on a posterior reference map,” 2023, arXiv:2302.00625. [Online]. Available: https://arxiv.org/abs/2302.00625
- S. Salehkalaibar, J. Chen, A. Khisti, and W. Yu, “Rate-distortion-perception tradeoff based on the conditional-distribution perception measure,” 2024, arXiv:2401.12207. [Online]. Available: https://arxiv.org/abs/2401.12207
- L. Theis and E. Agustsson, “On the advantages of stochastic encoders,” in ICLR 2021 Neural Compression Workshop.
- Z. Yan, F. Wen, R. Ying, C. Ma, and P. Liu, “On perceptual lossy compression: The cost of perceptual reconstruction and an optimal training framework,” in International Conference on Machine Learning, 2021.
- H. Liu, G. Zhang, J. Chen, and A. Khisti, “Lossy compression with distribution shift as entropy constrained optimal transport,” in International Conference on Learning Representations, 2022.
- H. Liu, G. Zhang, J. Chen, and A. Khisti, “Cross-domain lossy compression as entropy constrained optimal transport,” IEEE Journal on Selected Areas in Information Theory, vol. 3, no. 3, pp. 513–527, Sep. 2022.
- O. Rioul, “Information theoretic proofs of entropy power inequalities,” IEEE Trans. Inf. Theory, vol. 57, no. 1, pp. 33–55, Jan. 2011.
- S. Salehkalaibar, B. Phan, J. Chen, W. Yu, and A. Khisti, “On the choice of perception loss function for learned video compression,” in Conference on Neural Information Processing Systems, 2023.
- J. Qian, S. Salehkalaibar, J. Chen, A. Khisti, W. Yu, W. Shi, Y. Ge, and W. Tong, “Rate-distortion-perception tradeoff for vector Gaussian sources,” IEEE Journal on Selected Areas in Information Theory, submitted for publication.