Error Decay of (almost) Consistent Signal Estimations from Quantized Gaussian Random Projections (1406.0022v3)
Abstract: This paper provides new error bounds on "consistent" reconstruction methods for signals observed from quantized random projections. These signal estimation techniques guarantee a perfect match between the available quantized data and a new observation of the estimated signal under the same sensing model. Focusing on dithered uniform scalar quantization of resolution $\delta>0$, we prove first that, given a Gaussian random frame of $\mathbb R^N$ with $M$ vectors, the worst-case $\ell_2$-error of consistent signal reconstruction decays with high probability as $O(\frac{N}{M}\log\frac{M}{\sqrt N})$ uniformly for all signals of the unit ball $\mathbb B^N \subset \mathbb R^N$. Up to a log factor, this matches a known lower bound in $\Omega(N/M)$ and earlier empirical validations in $O(N/M)$. Equivalently, if $M$ exceeds a minimal number of frame coefficients growing like $O(\frac{N}{\epsilon_0}\log \frac{\sqrt N}{\epsilon_0})$, any two vectors in $\mathbb B^N$ with $M$ identical quantized projections are at most $\epsilon_0$ apart with high probability. Second, in the context of Quantized Compressed Sensing with $M$ Gaussian random measurements and under the same scalar quantization scheme, consistent reconstructions of $K$-sparse signals of $\mathbb R^N$ have a worst-case error that decreases with high probability as $O(\tfrac{K}{M}\log\tfrac{MN}{\sqrt{K^3}})$ uniformly for all such signals. Finally, we show that the proximity of vectors whose quantized random projections are only approximately consistent can still be bounded with high probability. A certain level of corruption is thus allowed in the quantization process, up to the appearance of a systematic bias in the reconstruction error of (almost) consistent signal estimates.
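To make the sensing model and the notion of "consistency" concrete, the following is a minimal numerical sketch in Python. It assumes a standard dithered uniform scalar quantizer $Q_\delta(t) = \delta\lfloor t/\delta \rfloor$ with dither drawn uniformly in $[0,\delta)$; the exact quantizer offset and dither convention used in the paper may differ, and the dimensions, seed, and helper names (`quantize`, `is_consistent`) are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and quantizer resolution
N, M, delta = 16, 512, 0.1

# Gaussian random frame / sensing matrix of R^N with M vectors
Phi = rng.standard_normal((M, N))

# Dither, drawn uniformly in [0, delta)
xi = rng.uniform(0.0, delta, size=M)

def quantize(t, delta):
    """Uniform scalar quantizer of resolution delta (floor convention)."""
    return delta * np.floor(t / delta)

# A signal in the unit ball B^N
x = rng.standard_normal(N)
x /= max(1.0, np.linalg.norm(x))

# Quantized dithered random projections: the only data available to the decoder
q = quantize(Phi @ x + xi, delta)

def is_consistent(x_hat, Phi, xi, q, delta):
    """x_hat is 'consistent' if re-sensing and re-quantizing it
    reproduces exactly the observed quantized codes q."""
    return np.array_equal(quantize(Phi @ x_hat + xi, delta), q)

# The true signal is trivially consistent; the paper bounds how far any
# other consistent vector of B^N can lie from x, i.e. O((N/M) log(M/sqrt(N))).
print(is_consistent(x, Phi, xi, q, delta))
```

In this picture, a consistent decoder searches for any vector of the unit ball (or any $K$-sparse vector, in the compressed sensing setting) satisfying `is_consistent`; the paper's results quantify the worst-case distance between two such vectors as $M$ grows.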