
A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems

Published 19 Jul 2016 in math.OC (arXiv:1607.05428v3)

Abstract: We develop a fast and robust algorithm for solving large-scale convex composite optimization models, with an emphasis on the $\ell_1$-regularized least squares regression (Lasso) problem. Although a large number of solvers for Lasso problems exist in the literature, we found that none can efficiently handle difficult large-scale regression problems with real data. By leveraging available error bound results to realize the asymptotic superlinear convergence property of the augmented Lagrangian algorithm, and by exploiting the second-order sparsity of the problem through the semismooth Newton method, we propose an algorithm, called {\sc Ssnal}, that efficiently solves these difficult problems. Under very mild conditions, which hold automatically for Lasso problems, both the primal and the dual iteration sequences generated by {\sc Ssnal} possess a fast linear convergence rate, which can even be asymptotically superlinear. Numerical comparisons on real data sets between our approach and a number of state-of-the-art solvers demonstrate the high efficiency and robustness of the proposed algorithm in solving difficult large-scale Lasso problems.
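The abstract's key computational point, that the second-order sparsity exposed by the proximal mapping of the $\ell_1$ norm makes the semismooth Newton subproblems cheap, can be illustrated with a small sketch. The Python snippet below is not the authors' {\sc Ssnal} code; it is a minimal illustration, assuming the standard Lasso model $\min_x \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$, of the soft-thresholding operator, a diagonal 0/1 element of its generalized Jacobian, and a Newton-type matrix that only involves the columns of $A$ on the current active set. The names soft_threshold, prox_jacobian_diag, newton_system_on_support and the parameter sigma are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal mapping of lam * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_jacobian_diag(v, lam):
    """Diagonal of one element of the Clarke generalized Jacobian of
    soft_threshold at v: entries are 1 where |v_i| > lam and 0 elsewhere."""
    return (np.abs(v) > lam).astype(float)

def newton_system_on_support(A, v, lam, sigma):
    """Form the Newton-type matrix I + sigma * A_J A_J^T restricted to the
    active set J = {i : |v_i| > lam}; only these columns of A are touched."""
    d = prox_jacobian_diag(v, lam)
    J = np.flatnonzero(d)          # current active set
    AJ = A[:, J]                   # second-order sparsity: few columns enter
    H = np.eye(A.shape[0]) + sigma * (AJ @ AJ.T)
    return H, J

# Tiny usage example with random data (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 2000))
v = rng.standard_normal(2000)
H, J = newton_system_on_support(A, v, lam=2.5, sigma=1.0)
print(H.shape, len(J))             # (50, 50) and a small active set
```

When the Lasso solution is sparse, the active set $J$ is small, so forming and factorizing a matrix of the form $I + \sigma A_J A_J^{\top}$ is inexpensive compared with first-order methods that touch all of $A$ at every iteration; this is the structure the abstract refers to as second-order sparsity.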
