
A Framework for Robust Lossy Compression of Heavy-Tailed Sources (2411.08549v1)

Published 13 Nov 2024 in cs.IT, eess.SP, and math.IT

Abstract: We study the rate-distortion problem for both scalar and vector memoryless heavy-tailed $\alpha$-stable sources ($0 < \alpha < 2$). Using a recently defined notion of "strength" as a power measure, we derive the rate-distortion function for $\alpha$-stable sources subject to a constraint on the strength of the error, and show it to be logarithmic in the strength-to-distortion ratio. We showcase how our framework paves the way to finding optimal quantizers for $\alpha$-stable sources and more generally to heavy-tailed ones. In addition, we study high-rate scalar quantizers and show that uniform ones are asymptotically optimal under the strength measure. We compare uniform Gaussian and Cauchy quantizers and show that more representation points for the Cauchy source are required to guarantee the same quantization quality. Our findings generalize the well-known rate-distortion and quantization results of Gaussian sources ($\alpha = 2$) under a quadratic distortion measure.
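The abstract's claim that a uniform quantizer needs more representation points for a Cauchy source ($\alpha = 1$) than for a Gaussian one ($\alpha = 2$) can be illustrated numerically. The sketch below is an assumption-laden toy, not the paper's method: it uses a plain mid-rise uniform quantizer and simply counts how many cells of a fixed width are needed to cover i.i.d. samples from each source, rather than evaluating the paper's "strength" distortion measure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of i.i.d. samples per source

# Gaussian is the alpha = 2 stable law; standard Cauchy is alpha = 1.
sources = {
    "Gaussian": rng.standard_normal(n),
    "Cauchy": rng.standard_cauchy(n),
}

def uniform_quantize(x, step):
    """Mid-rise uniform quantizer: map x to the center of its width-`step` cell."""
    return step * (np.floor(x / step) + 0.5)

step = 0.25  # illustrative cell width, same for both sources
levels_used = {}
for name, x in sources.items():
    xq = uniform_quantize(x, step)
    # Number of distinct representation points actually hit by the samples.
    levels_used[name] = np.unique(xq).size

print(levels_used)
```

Because the Cauchy density has heavy power-law tails, its samples land in far more quantization cells than the light-tailed Gaussian samples do, so covering the Cauchy source at the same cell width requires many more representation points — consistent with the comparison stated in the abstract.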
