A FISTA-type accelerated gradient algorithm for solving smooth nonconvex composite optimization problems (1905.07010v3)

Published 16 May 2019 in math.OC

Abstract: In this paper, we describe and establish the iteration-complexity of two accelerated composite gradient (ACG) variants for solving a smooth nonconvex composite optimization problem whose objective function is the sum of a nonconvex differentiable function $f$ with a Lipschitz continuous gradient and a simple nonsmooth closed convex function $h$. When $f$ is convex, the first ACG variant reduces to the well-known FISTA for a specific choice of the input, and hence it can be viewed as a natural extension of FISTA to the nonconvex setting. The first variant requires an input pair $(M,m)$ such that $f$ is $m$-weakly convex, $\nabla f$ is $M$-Lipschitz continuous, and $m \le M$ (possibly $m<M$); such a pair is usually hard to obtain or poorly estimated. The second variant, on the other hand, can start from an arbitrary input pair $(M,m)$ of positive scalars, and its complexity is shown to be no worse than, and in some cases better than, that of the first variant for a large range of input pairs. Finally, numerical results are provided to illustrate the efficiency of the two ACG variants.
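
As context for the abstract, the sketch below shows the classical convex FISTA to which, per the abstract, the first ACG variant reduces for a specific input choice; it is not the paper's nonconvex variants, and the $(M,m)$ curvature-pair handling is omitted. The function names (`fista`, `soft_threshold`), the choice $h = \lambda\|x\|_1$, and the LASSO test instance are illustrative assumptions, not taken from the paper.

```python
# Minimal FISTA sketch for the composite problem min_x f(x) + h(x),
# where f is smooth with an L-Lipschitz gradient and h is a simple
# closed convex function (here h = lam * ||x||_1 for illustration).
# NOTE: this is the classical convex FISTA, not the paper's ACG variants.
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (illustrative choice of h)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def fista(grad_f, prox_h, L, x0, n_iter=500):
    """Accelerated proximal gradient (FISTA) with fixed step size 1/L."""
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        x = prox_h(y - grad_f(y) / L, 1.0 / L)        # proximal gradient step
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum extrapolation
        x_prev, t = x, t_next
    return x_prev

# Example (hypothetical instance): LASSO with f(x) = 0.5*||Ax - b||^2,
# h(x) = lam * ||x||_1, so grad f is Lipschitz with constant ||A||_2^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
b = rng.standard_normal(50)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2
grad_f = lambda x: A.T @ (A @ x - b)
prox_h = lambda z, tau: soft_threshold(z, lam * tau)
x_star = fista(grad_f, prox_h, L, np.zeros(100))
```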
