Quantization for Low-Rank Matrix Recovery (1709.09803v2)

Published 28 Sep 2017 in cs.IT and math.IT

Abstract: We study Sigma-Delta quantization methods coupled with appropriate reconstruction algorithms for digitizing randomly sampled low-rank matrices. We show that the reconstruction error associated with our methods decays polynomially with the oversampling factor, and we leverage our results to obtain root-exponential accuracy by optimizing over the choice of quantization scheme. Additionally, we show that a random encoding scheme, applied to the quantized measurements, yields a near-optimal exponential bit-rate. As an added benefit, our schemes are robust both to noise and to deviations from the low-rank assumption. In short, we provide a full generalization of analogous results, obtained in the classical setup of bandlimited function acquisition and more recently in the finite frame and compressed sensing setups, to the case of low-rank matrices sampled with sub-Gaussian linear operators. Finally, we believe our techniques for generalizing results from the compressed sensing setup to the analogous low-rank matrix setup are applicable to other quantization schemes.
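To make the measurement-and-quantization setup concrete, below is a minimal sketch (not from the paper) of greedy first-order Sigma-Delta quantization applied to sub-Gaussian linear measurements of a low-rank matrix. The Gaussian choice of operator, the alphabet, the normalization, and all variable names are illustrative assumptions; the paper's reconstruction algorithms and encoding scheme are not shown.

```python
import numpy as np

def sigma_delta_quantize(y, levels):
    """Greedy first-order Sigma-Delta quantization of a measurement vector y.

    q[i] = nearest alphabet element to y[i] + u[i-1];  u[i] = u[i-1] + y[i] - q[i].
    The state u carries the accumulated quantization error forward (noise shaping),
    so the error is pushed into differences rather than lost per sample.
    """
    u = 0.0
    q = np.empty_like(y)
    for i, yi in enumerate(y):
        # quantize the measurement plus the carried error to the nearest level
        q[i] = levels[np.argmin(np.abs(levels - (yi + u)))]
        u = u + yi - q[i]
    return q

# Hypothetical toy instance: a rank-1 matrix X measured by a sub-Gaussian
# (here Gaussian) linear operator A acting on vec(X), then Sigma-Delta quantized.
rng = np.random.default_rng(0)
n, m, r = 8, 200, 1                           # dimension, measurements, rank
X = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
X /= np.linalg.norm(X)                        # unit Frobenius norm, so y entries are O(1/sqrt(m))
A = rng.standard_normal((m, n * n)) / np.sqrt(m)
y = A @ X.ravel()                             # linear measurements y = A(vec(X))
alphabet = np.arange(-2, 2.01, 0.25)          # finite quantization alphabet
q = sigma_delta_quantize(y, alphabet)
print("max per-measurement error:", np.max(np.abs(y - q)))
```

In this sketch the per-measurement error is only bounded by the alphabet step; the polynomial error decay in the oversampling factor that the paper proves comes from coupling such quantized measurements with reconstruction algorithms tailored to the Sigma-Delta noise-shaping structure.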

Citations (3)
