A Note on Nesterov's Accelerated Method in Nonconvex Optimization: a Weak Estimate Sequence Approach

Published 15 Jun 2020 in math.OC (arXiv:2006.08548v1)

Abstract: We present a variant of accelerated gradient descent, adapted from Nesterov's optimal first-order methods, for weakly-quasi-convex and weakly-quasi-strongly-convex functions. We show that by tweaking the so-called estimate sequence technique, the derived algorithm achieves the optimal convergence rate for both function classes in terms of oracle complexity. In particular, for a weakly-quasi-convex function with Lipschitz continuous gradient, $O(\frac{1}{\sqrt{\varepsilon}})$ iterations suffice to obtain an $\varepsilon$-solution; for weakly-quasi-strongly-convex functions, the iteration complexity is $O\left( \ln\left(\frac{1}{\varepsilon}\right) \right)$. Furthermore, we discuss the implications of these algorithms for the linear quadratic optimal control problem.
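For orientation, the baseline the paper adapts is Nesterov's accelerated gradient descent. The sketch below is the standard textbook variant for an $L$-smooth convex objective, not the paper's weak-estimate-sequence algorithm; the step size $1/L$, the $t_k$ momentum sequence, and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def nesterov_agd(grad, x0, L, n_iters):
    """Classical Nesterov accelerated gradient descent for an L-smooth
    objective, achieving the O(1/k^2) rate (i.e. O(1/sqrt(eps)) iterations
    for an eps-solution). Textbook variant, not the paper's algorithm."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        x_next = y - grad(y) / L                        # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2  # momentum sequence
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # extrapolation (momentum)
        x, t = x_next, t_next
    return x

# Illustrative test problem: f(x) = 0.5 x^T A x, minimizer x* = 0,
# with A = diag(1, 100) so the gradient is L-Lipschitz with L = 100.
A = np.diag([1.0, 100.0])
x = nesterov_agd(lambda v: A @ v, x0=[10.0, 10.0], L=100.0, n_iters=2000)
```

Note that the momentum coefficient $(t_k - 1)/t_{k+1}$ tends to 1, which is what distinguishes the accelerated scheme from plain gradient descent with fixed momentum.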

Authors (2)
Citations (11)
