On the Fundamental Recovery Limit of Orthogonal Least Squares (1910.10976v2)
Abstract: Orthogonal least squares (OLS) is a classic algorithm for sparse recovery, function approximation, and subset selection. In this paper, we analyze the performance guarantee of the OLS algorithm. Specifically, we show that OLS guarantees the exact reconstruction of any $K$-sparse vector in $K$ iterations, provided that the sensing matrix has unit $\ell_{2}$-norm columns and satisfies the restricted isometry property (RIP) of order $K+1$ with
\begin{align*}
\delta_{K+1} &< C_{K} =
\begin{cases}
\frac{1}{\sqrt{K}}, & K = 1, \\
\frac{1}{\sqrt{K + \frac{1}{4}}}, & K = 2, \\
\frac{1}{\sqrt{K + \frac{1}{16}}}, & K = 3, \\
\frac{1}{\sqrt{K}}, & K \ge 4.
\end{cases}
\end{align*}
Furthermore, we show that the proposed guarantee is optimal in the sense that if $\delta_{K+1} \ge C_{K}$, then there exists a counterexample for which OLS fails to achieve exact recovery.
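For concreteness, the following is a minimal NumPy sketch of the standard OLS selection rule analyzed in the abstract: at each iteration, the algorithm adds the column whose inclusion minimizes the residual of the least-squares fit of $y$ on the enlarged support. This is an illustrative sketch under our own naming and design choices (the function name \texttt{ols\_recover} and the brute-force candidate search are assumptions), not the authors' implementation.
\begin{verbatim}
import numpy as np

def ols_recover(A, y, K):
    # Greedy OLS: at each step, add the column whose inclusion minimizes the
    # residual norm of the least-squares fit of y on the enlarged support.
    m, n = A.shape
    support = []
    for _ in range(K):
        best_j, best_res = None, np.inf
        for j in range(n):
            if j in support:
                continue
            S = support + [j]
            coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
            res = np.linalg.norm(y - A[:, S] @ coef)
            if res < best_res:
                best_j, best_res = j, res
        support.append(best_j)
    # Final least-squares estimate restricted to the selected support.
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat, sorted(support)
\end{verbatim}
In terms of the paper's result, if the columns of $A$ have unit $\ell_{2}$-norm and $A$ satisfies the RIP of order $K+1$ with $\delta_{K+1} < C_{K}$, this procedure recovers any $K$-sparse $x$ exactly from $y = Ax$ in $K$ iterations.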