Staggered Quantizers for Perfect Perceptual Quality: A Connection between Quantizers with Common Randomness and Without (2406.19248v1)
Published 27 Jun 2024 in cs.IT and math.IT
Abstract: The rate-distortion-perception (RDP) framework has attracted significant recent attention due to its application in neural compression. It is important to understand the underlying mechanism connecting procedures with common randomness and those without. Different from previous efforts, we study this problem from a quantizer design perspective. By analyzing an idealized setting, we provide an interpretation of the advantage of dithered quantization in the RDP setting, which further allows us to make a conceptual connection between randomized (dithered) quantizers and quantizers without common randomness. This new understanding leads to a new procedure for RDP coding based on staggered quantizers.
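The abstract's central contrast is between dithered quantizers, which rely on common randomness shared by encoder and decoder, and deterministic alternatives such as staggered (offset) quantizers. The following sketch illustrates that contrast in its simplest scalar form. It is not the paper's construction: the function names, the Gaussian test source, and the fixed-offset "staggered" quantizer here are illustrative assumptions; the paper's staggered-quantizer procedure for RDP coding is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, delta, u):
    """Subtractive dithered quantization with step `delta`.

    Encoder and decoder share the dither u ~ Uniform(-delta/2, delta/2)
    (common randomness). The encoder quantizes x + u; the decoder
    subtracts u, so the reconstruction error is uniform on
    [-delta/2, delta/2] and independent of the source.
    """
    q = delta * np.round((x + u) / delta)  # quantize the dithered input
    return q - u                           # decoder removes the dither

def staggered_quantize(x, delta, offset):
    """A deterministic quantizer whose lattice is shifted by a fixed
    offset instead of shared randomness (illustrative sketch only)."""
    return delta * np.round((x - offset) / delta) + offset

# Toy comparison on a Gaussian source (assumed for illustration).
x = rng.normal(size=100_000)
delta = 0.5
u = rng.uniform(-delta / 2, delta / 2)

err_dither = dithered_quantize(x, delta, u) - x
err_stagger = staggered_quantize(x, delta, offset=delta / 4) - x

# Both quantizers have error bounded by delta/2; the difference is
# statistical: the dithered error is source-independent noise.
assert np.all(np.abs(err_dither) <= delta / 2 + 1e-12)
assert np.all(np.abs(err_stagger) <= delta / 2 + 1e-12)
```

The point of the comparison is that both schemes achieve the same worst-case error, but only the dithered version makes the error statistically independent of the input, which is the property the RDP (perception-constrained) setting exploits.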
- Ruida Zhou
- Chao Tian