On the Fundamental Recovery Limit of Orthogonal Least Squares (1910.10976v2)

Published 24 Oct 2019 in cs.IT and math.IT

Abstract: Orthogonal least squares (OLS) is a classic algorithm for sparse recovery, function approximation, and subset selection. In this paper, we analyze the performance guarantee of the OLS algorithm. Specifically, we show that OLS guarantees the exact reconstruction of any $K$-sparse vector in $K$ iterations, provided that the sensing matrix has unit $\ell_{2}$-norm columns and satisfies the restricted isometry property (RIP) of order $K+1$ with
\begin{align*}
\delta_{K+1} < C_{K} =
\begin{cases}
\frac{1}{\sqrt{K}}, & K = 1, \\
\frac{1}{\sqrt{K + \frac{1}{4}}}, & K = 2, \\
\frac{1}{\sqrt{K + \frac{1}{16}}}, & K = 3, \\
\frac{1}{\sqrt{K}}, & K \ge 4.
\end{cases}
\end{align*}
Furthermore, we show that the proposed guarantee is optimal in the sense that if $\delta_{K+1} \ge C_{K}$, then there exists a counterexample for which OLS fails to recover the vector.
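
To make the setting concrete, below is a minimal NumPy sketch of the standard OLS greedy iteration together with the threshold $C_K$ stated in the abstract. The function names `ols_recover` and `recovery_threshold` are illustrative choices, not code accompanying the paper, and the Gaussian sensing matrix in the example is generic rather than one certified to satisfy the RIP condition.

```python
import numpy as np

def recovery_threshold(K):
    """C_K from the paper's main result: OLS recovers any K-sparse vector
    in K iterations when delta_{K+1} < C_K (unit l2-norm columns assumed)."""
    if K == 2:
        return 1.0 / np.sqrt(K + 1.0 / 4.0)
    if K == 3:
        return 1.0 / np.sqrt(K + 1.0 / 16.0)
    return 1.0 / np.sqrt(K)  # K = 1 and K >= 4

def ols_recover(A, y, K):
    """Standard greedy OLS iteration (illustrative sketch): at each step,
    add the column whose inclusion minimizes the residual of the
    least-squares fit on the enlarged support."""
    n = A.shape[1]
    support = []
    for _ in range(K):
        best_j, best_res = None, np.inf
        for j in range(n):
            if j in support:
                continue
            cols = A[:, support + [j]]
            coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
            res = np.linalg.norm(y - cols @ coef)
            if res < best_res:
                best_j, best_res = j, res
        support.append(best_j)
    # Final least-squares estimate restricted to the selected support.
    x_hat = np.zeros(n)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x_hat[support] = coef
    return x_hat, sorted(support)

# Example: recover a 3-sparse vector from Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 128))
A /= np.linalg.norm(A, axis=0)  # unit l2-norm columns, as in the paper
x = np.zeros(128)
x[[5, 40, 99]] = [1.0, -2.0, 0.5]
x_hat, support = ols_recover(A, A @ x, K=3)
print(support, recovery_threshold(3))
```

Note the design choice that distinguishes OLS from orthogonal matching pursuit: OLS selects the column that minimizes the post-projection residual, rather than the column most correlated with the current residual.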
