
Sparse Recovery using Smoothed $\ell^0$ (SL0): Convergence Analysis

Published 28 Jan 2010 in cs.IT and math.IT (arXiv:1001.5073v1)

Abstract: Finding the sparse solution of an underdetermined system of linear equations has many applications, notably in Compressed Sensing (CS), Sparse Component Analysis (SCA), and the sparse decomposition of signals on overcomplete dictionaries. We have recently proposed a fast algorithm, called Smoothed $\ell^0$ (SL0), for this task. Unlike many other sparse recovery algorithms, SL0 is not based on minimizing the $\ell^1$ norm; instead, it directly targets the $\ell^0$ norm of the solution. The basic idea of SL0 is to optimize a sequence of continuous cost functions that approximate the $\ell^0$ norm of a vector. However, our previous papers did not provide a complete convergence proof for SL0. In this paper, we study the convergence properties of SL0 and show that, under a certain sparsity constraint expressed in terms of the Asymmetric Restricted Isometry Property (ARIP), and with a suitable choice of parameters, convergence of SL0 to the sparsest solution is guaranteed. Moreover, we study the complexity of SL0 and show that, as the dimension of the dictionary grows, its complexity increases at the same order as that of Matching Pursuit (MP), one of the fastest existing sparse recovery methods; unlike MP, however, convergence of SL0 to the sparsest solution is guaranteed under conditions that can be satisfied through the choice of parameters.

Citations (40)
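The core idea described in the abstract, maximizing a smooth surrogate of sparsity over the constraint set while gradually tightening the approximation, can be illustrated with a short sketch. The version below assumes the Gaussian surrogate $F_\sigma(s) = \sum_i \exp(-s_i^2 / 2\sigma^2)$ commonly associated with SL0, alternating gradient steps with projection back onto $\{s : As = x\}$; the function name, parameter names, and default values are illustrative rather than the paper's prescribed settings.

```python
import numpy as np

def sl0_sketch(A, x, sigma_min=1e-3, sigma_decrease=0.5, mu=2.0, inner_iters=3):
    """Minimal sketch of the smoothed-l0 idea: ascend a Gaussian surrogate of
    sparsity over the affine set {s : A s = x} while shrinking the smoothing
    parameter sigma. Defaults here are illustrative, not the authors' settings.
    """
    A_pinv = np.linalg.pinv(A)            # Moore-Penrose pseudoinverse of the dictionary
    s = A_pinv @ x                         # start from the minimum-l2-norm solution
    sigma = 2.0 * np.max(np.abs(s))        # initial smoothing scale

    while sigma > sigma_min:
        for _ in range(inner_iters):
            # Ascent step on F_sigma(s) = sum_i exp(-s_i^2 / (2 sigma^2)),
            # whose maximizer approaches the sparsest point as sigma -> 0
            # (step size mu * sigma^2 absorbed into the update).
            delta = s * np.exp(-s**2 / (2.0 * sigma**2))
            s = s - mu * delta
            # Project back onto the feasible set {s : A s = x}.
            s = s - A_pinv @ (A @ s - x)
        sigma *= sigma_decrease            # tighten the approximation

    return s
```

For an underdetermined A of size m-by-n (m < n) generated with a sufficiently sparse solution, the returned vector should approach that sparsest solution when conditions of the kind analyzed in the paper hold; the sketch is meant only to make the "sequence of smoothed cost functions" idea concrete.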
