
A Fast Active Set Block Coordinate Descent Algorithm for $\ell_1$-regularized least squares (1403.1738v5)

Published 7 Mar 2014 in math.OC, cs.IT, and math.IT

Abstract: The problem of finding sparse solutions to underdetermined systems of linear equations arises in several applications (e.g. signal and image processing, compressive sensing, statistical inference). A standard tool for sparse recovery is the $\ell_1$-regularized least-squares approach, which has recently attracted the attention of many researchers. In this paper, we describe an active set estimate (i.e. an estimate of the indices of the zero variables in the optimal solution) for the considered problem that tries to quickly identify as many active variables as possible at a given point, while guaranteeing that some approximate optimality conditions are satisfied. A key feature of the estimate is that setting all the estimated active variables to zero yields a significant reduction of the objective function, which makes it easy to embed the estimate into a given globally convergent algorithmic framework. In particular, we include our estimate in a block coordinate descent algorithm for $\ell_1$-regularized least squares, analyze the convergence properties of this new active set method, and prove that its basic version converges at a linear rate. Finally, we report numerical results showing the effectiveness of the approach.
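
To make the setting concrete, here is a minimal Python sketch of an active-set block coordinate descent loop for $\min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1$. The gradient-based active-set test and the coordinate updates below are simplified illustrations of the general idea, not the paper's specific estimate or algorithm; the function names (`soft_threshold`, `active_set_bcd`) and all parameter values are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Soft-thresholding operator: proximal map of t * |.|
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def active_set_bcd(A, b, lam, n_iter=100, eps=1e-6):
    """Illustrative active-set coordinate descent for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    The active-set test is a simplified stand-in for the paper's estimate."""
    m, n = A.shape
    x = np.zeros(n)
    col_norm2 = np.sum(A * A, axis=0)  # squared column norms of A
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)          # gradient of the smooth term
        # Simplified active-set estimate: near-zero coordinates whose
        # gradient already satisfies the subdifferential condition |g_j| <= lam
        active = (np.abs(x) <= eps) & (np.abs(g) <= lam)
        x[active] = 0.0                # fix estimated-active variables at zero
        # Coordinate descent sweep over the remaining (non-active) variables
        for j in np.flatnonzero(~active):
            if col_norm2[j] == 0.0:
                continue
            # Correlation of column j with the residual excluding coordinate j,
            # then the closed-form one-dimensional minimizer
            rho = A[:, j] @ (b - A @ x + A[:, j] * x[j])
            x[j] = soft_threshold(rho, lam) / col_norm2[j]
    return x
```

A practical implementation would cache and incrementally update the residual instead of recomputing $Ax$ per coordinate, stop when approximate optimality conditions hold, and replace the simple threshold test above with the paper's active set estimate.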

Citations (35)
