On the Rényi Rate-Distortion-Perception Function and Functional Representations
Abstract: We extend the Rate-Distortion-Perception (RDP) framework to the Rényi information-theoretic regime, using Sibson's $\alpha$-mutual information to characterize the fundamental limits of lossy compression under joint distortion and perception constraints. For scalar Gaussian sources, we derive closed-form expressions for the Rényi RDP function, showing that the perception constraint restricts the reproduction variance to a feasible interval. Furthermore, we establish a Rényi-generalized version of the Strong Functional Representation Lemma. Our analysis reveals a phase transition in the complexity of optimal functional representations: for $0.5 < \alpha < 1$, the coding cost is bounded by the $\alpha$-divergence of order $\alpha + 1$, necessitating a codebook with heavy-tailed polynomial decay; conversely, for $\alpha > 1$, the representation collapses to one with finite support. These results offer new insights into the compression of shared randomness under generalized notions of mutual information.