
Nonconvex Proximal Incremental Aggregated Gradient Method with Linear Convergence

Published 7 Apr 2018 in math.OC (arXiv:1804.02571v1)

Abstract: In this paper, we study the proximal incremental aggregated gradient (PIAG) algorithm for minimizing the sum of L-smooth nonconvex component functions and a proper closed convex function. By exploiting the L-smoothness of the components, together with an error bound condition, we show that the PIAG method retains linear convergence guarantees even in the nonconvex setting. Specifically, we first prove that the generated sequence converges globally to the set of stationary points. We then establish the existence of a stepsize threshold below which both the objective value sequence and the iterate sequence converge R-linearly.
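The abstract describes a method in which only one component gradient is refreshed per iteration, while a (possibly stale) aggregated gradient drives a proximal step on the convex term. The sketch below illustrates this structure; the cyclic refresh order, the function and parameter names, and the simple quadratic/L1 example are illustrative assumptions, not the paper's exact scheme or stepsize analysis.

```python
import numpy as np

def piag(x0, grad_fns, prox_g, stepsize, n_iters):
    """Sketch of a proximal incremental aggregated gradient (PIAG) loop.

    Minimizes sum_i f_i(x) + g(x). Each iteration refreshes one stored
    component gradient and takes a proximal step using the aggregate of
    the (possibly outdated) stored gradients.
    """
    x = np.asarray(x0, dtype=float)
    # Initialize the gradient table at the starting point.
    grads = [g(x) for g in grad_fns]
    agg = np.sum(grads, axis=0)          # aggregated gradient
    n = len(grad_fns)
    for k in range(n_iters):
        i = k % n                        # refresh one component (cyclic order)
        new_gi = grad_fns[i](x)
        agg += new_gi - grads[i]         # update the aggregate cheaply
        grads[i] = new_gi
        # Proximal step on the convex regularizer g.
        x = prox_g(x - stepsize * agg, stepsize)
    return x

# Toy example (convex for easy checking): f_i(x) = 0.5*(x - a_i)^2,
# g(x) = lam*|x|_1 with soft-thresholding as its proximal map.
a = [1.0, 2.0, 3.0]
grad_fns = [lambda x, ai=ai: x - ai for ai in a]
lam = 0.1
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)
x_star = piag(np.zeros(1), grad_fns, prox_g, stepsize=0.1, n_iters=500)
```

In this toy problem the optimality condition 3x - 6 + lam*sign(x) = 0 gives x ≈ 1.9667, which the iterates approach; the paper's contribution is that, under an error bound condition, such R-linear behavior persists for nonconvex components when the stepsize is below a threshold.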


Authors (3)
