On the complexity of proximal gradient and proximal Newton-CG methods for \(\ell_1\)-regularized Optimization (2504.15752v1)

Published 22 Apr 2025 in math.OC

Abstract: In this paper, we propose two second-order methods for solving the \(\ell_1\)-regularized composite optimization problem, developed from two distinct definitions of approximate second-order stationary points. We introduce a hybrid proximal gradient and negative curvature method, as well as a proximal Newton-CG method, to find a strong approximate second-order stationary point and a weak approximate second-order stationary point of \(\ell_1\)-regularized optimization problems, respectively. Comprehensive analyses are provided of the iteration complexity, the computational complexity, and the local superlinear convergence rates of the first phases of these two methods under specific error bound conditions. In particular, we show that the proximal Newton-CG method achieves the best-known iteration complexity for attaining the proposed weak approximate second-order stationary point, matching the results for finding an approximate second-order stationary point in unconstrained optimization. Through a toy example, we show that the proposed methods can effectively escape a first-order approximate solution. Numerical experiments on the \(\ell_1\)-regularized Student's t-regression problem validate the effectiveness of both methods.
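The first-order building block referenced in the abstract is the proximal gradient iteration, whose proximal operator for the \(\ell_1\) term is componentwise soft-thresholding. The following is a minimal sketch of that classical baseline iteration, not the authors' hybrid negative-curvature or Newton-CG phases; the function names, step-size choice, and problem data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(grad_f, x0, lam, step, n_iters=500):
    """Basic proximal gradient iteration for min_x f(x) + lam * ||x||_1.

    grad_f : callable returning the gradient of the smooth part f
    step   : fixed step size, e.g. 1/L for an L-Lipschitz gradient
    """
    x = x0.copy()
    for _ in range(n_iters):
        # Gradient step on f, then prox step on the l1 term.
        x = soft_threshold(x - step * grad_f(x), step * lam)
    return x

# Illustrative instance: l1-regularized least squares, f(x) = 0.5 * ||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.concatenate([rng.standard_normal(5), np.zeros(95)])  # sparse signal
b = A @ x_true
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad f (spectral norm squared)
x_hat = proximal_gradient(lambda x: A.T @ (A @ x - b),
                          np.zeros(100), lam=0.1, step=1.0 / L)
```

With the fixed step size 1/L, each iteration is guaranteed to decrease the composite objective for L-smooth f; such an iteration converges to a first-order stationary point, which is exactly the kind of point the paper's second-order phases are designed to escape when it is not second-order stationary.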
