
Lower bounds for the error decay incurred by coarse quantization schemes (1004.3517v1)

Published 20 Apr 2010 in cs.IT and math.IT

Abstract: Several analog-to-digital conversion methods for bandlimited signals used in applications, such as Sigma Delta quantization schemes, employ coarse quantization coupled with oversampling. The standard mathematical model for the error accrued from such methods measures the performance of a given scheme by the rate at which the associated reconstruction error decays as a function of the oversampling ratio L > 1. It was recently shown that exponential accuracy of the form O(2^{-rL}) can be achieved by appropriate one-bit Sigma Delta modulation schemes. However, the best known achievable rate constants r in this setting differ significantly from the general information theoretic lower bound. In this paper, we provide the first lower bound specific to coarse quantization, thus narrowing the gap between existing upper and lower bounds. In particular, our results imply a quantitative correspondence between the maximal signal amplitude and the best possible error decay rate. Our method draws from the theory of large deviations.
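To illustrate the coarse-quantization-plus-oversampling setup the abstract describes, the sketch below implements the classical first-order one-bit Sigma Delta modulator with the greedy quantization rule, applied to a constant input. This is not the paper's construction: a first-order scheme only achieves error decay on the order of 1/L in the oversampling ratio L (via the standard bound |u_n| ≤ 1 on the internal state), not the exponential O(2^{-rL}) rates discussed above, but it shows concretely how one-bit outputs plus averaging over a window of length L trade oversampling for accuracy. All function names here are illustrative.

```python
import numpy as np

def sigma_delta_1bit(x):
    """First-order one-bit Sigma Delta modulation with the greedy rule:
    q_n = sign(u_{n-1} + x_n),  u_n = u_{n-1} + x_n - q_n.
    For |x_n| <= 1 the state satisfies |u_n| <= 1 for all n."""
    u = 0.0
    q = np.empty_like(x)
    for n, xn in enumerate(x):
        q[n] = 1.0 if u + xn >= 0 else -1.0
        u = u + xn - q[n]
    return q

def reconstruction_error(L, c=0.3):
    """Worst-case error of recovering the constant signal x ≡ c from the
    one-bit output by a moving average over a window of length L.
    Telescoping the state recursion gives error <= 2/L."""
    N = 200 * L
    x = np.full(N, c)
    q = sigma_delta_1bit(x)
    est = np.convolve(q, np.ones(L) / L, mode="valid")
    return np.max(np.abs(est - c))

for L in (8, 32, 128):
    print(f"L = {L:4d}   max reconstruction error = {reconstruction_error(L):.5f}")
```

Summing the recursion over a window shows the windowed average of q differs from c by (u_n - u_{n-L})/L, hence the 2/L bound; the exponential rates in the paper require higher-order families of schemes with carefully chosen quantization rules.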

Citations (20)
