Compressed sensing with sparse, structured matrices (1207.2853v2)

Published 12 Jul 2012 in cs.IT, cond-mat.dis-nn, cond-mat.stat-mech, and math.IT

Abstract: In the context of the compressed sensing problem, we propose a new ensemble of sparse random matrices which allow one (i) to acquire and compress a ρ₀-sparse signal of length N in a time linear in N and (ii) to perfectly recover the original signal, compressed at a rate α, by using a message passing algorithm (Expectation Maximization Belief Propagation) that runs in a time linear in N. In the large N limit, the scheme proposed here closely approaches the theoretical bound ρ₀ = α, and so it is both optimal and efficient (linear time complexity). More generally, we show that several ensembles of dense random matrices can be converted into ensembles of sparse random matrices, having the same thresholds, but much lower computational complexity.
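The measurement side of this setup can be illustrated with a minimal sketch. The example below uses a generic sparse random Gaussian matrix with a fixed number of nonzeros per row; it is not the paper's specific structured/seeded ensemble, and the EM-BP recovery algorithm is not reproduced. It only shows why acquisition of a ρ₀-sparse signal at rate α costs O(N) operations when the matrix has O(N) nonzeros. The parameter names (N, rho0, alpha, L) follow the abstract; the choice L = 10 nonzeros per row is purely illustrative.

```python
# Sketch: acquiring a rho0-sparse signal with a sparse random measurement matrix.
# Not the paper's structured ensemble; a generic sparse Gaussian matrix is assumed.
import numpy as np
from scipy.sparse import csr_matrix

rng = np.random.default_rng(0)

N = 10_000          # signal length
rho0 = 0.1          # fraction of nonzero entries in the signal
alpha = 0.5         # compression rate: M = alpha * N measurements
M = int(alpha * N)
L = 10              # nonzeros per matrix row (illustrative choice, not from the paper)

# rho0-sparse signal: a fraction rho0 of entries are Gaussian, the rest are zero
x = np.zeros(N)
support = rng.choice(N, size=int(rho0 * N), replace=False)
x[support] = rng.standard_normal(support.size)

# Sparse measurement matrix: each of the M rows has L Gaussian entries at random
# columns, so the matrix holds M*L = alpha*L*N nonzeros -- linear in N for fixed alpha, L
rows = np.repeat(np.arange(M), L)
cols = rng.integers(0, N, size=M * L)
vals = rng.standard_normal(M * L)
F = csr_matrix((vals, (rows, cols)), shape=(M, N))

# Acquisition/compression: y = F x costs O(number of nonzeros) = O(N) operations
y = F @ x
print(y.shape)  # (M,) = (alpha * N,) compressed measurements
```

Recovering x from y would then require a decoder such as the message-passing (EM-BP) scheme described in the paper, which also runs in time linear in N on such sparse matrices; that step is not sketched here.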

Citations (10)