
Generalizing acceleration analyses from convex to nonconvex settings

Develop rigorous generalizations of the analytical frameworks for accelerated gradient methods from convex to nonconvex objectives, establishing conditions under which accelerated methods retain their convergence guarantees in nonconvex settings.
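For concreteness, the iteration at issue is Nesterov's accelerated gradient method. The following minimal Python sketch is illustrative only; the function name nesterov_agd, the quadratic test objective, and the chosen constants are assumptions for this example, not taken from the paper:

    import numpy as np

    def nesterov_agd(grad, x0, L, num_steps):
        # Nesterov's accelerated gradient method with constant step size 1/L,
        # where L bounds the Lipschitz constant of the gradient.
        x_prev = x0.copy()
        y = x0.copy()
        for k in range(1, num_steps + 1):
            x = y - grad(y) / L                        # gradient step at the extrapolated point
            y = x + (k - 1) / (k + 2) * (x - x_prev)   # momentum extrapolation
            x_prev = x
        return x_prev

    # Example on a convex quadratic f(x) = 0.5 * x^T A x, minimized at the origin.
    A = np.diag([1.0, 10.0])
    x_final = nesterov_agd(lambda x: A @ x, np.array([5.0, 5.0]), L=10.0, num_steps=200)
    print(x_final)  # approaches the minimizer at the origin

On smooth convex objectives this iteration attains the O(1/k^2) rate; the open question concerns what the same update can be shown to guarantee when the objective is nonconvex.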


Background

A substantial literature interprets accelerated gradient descent through Lyapunov functions, differential equations, and geometric viewpoints, but these treatments predominantly rely on convexity or on knowledge of a global minimizer, which limits their applicability to nonconvex problems.
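A representative instance of this convex-setting machinery (given here for illustration, and not necessarily the analysis the source has in mind) is the ODE view of Su, Boyd, and Candès (2014), which models Nesterov's method by the continuous-time limit

    \ddot{X}(t) + (3/t)\,\dot{X}(t) + \nabla f(X(t)) = 0,

and certifies its rate through the energy (Lyapunov) function

    \mathcal{E}(t) = t^2 \bigl( f(X(t)) - f(x^\star) \bigr) + 2 \,\bigl\| X(t) + (t/2)\,\dot{X}(t) - x^\star \bigr\|^2,

which is nonincreasing when f is convex and yields f(X(t)) - f(x^\star) \le 2\|x_0 - x^\star\|^2 / t^2. Note that \mathcal{E} is built around the global minimizer x^\star, precisely the quantity that is unavailable, and possibly not even meaningful, in nonconvex problems; this is the obstruction the question asks to overcome.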

The authors explicitly note uncertainty about extending these acceleration analyses to nonconvex settings, motivating future work to develop nonconvex counterparts that provide computable progress measures and convergence guarantees.

References

"Most of this work is tailored to the convex setting, and it is unclear and nontrivial to generalize the results to a nonconvex setting."

— Jin et al., Accelerated Gradient Descent Escapes Saddle Points Faster than Gradient Descent (arXiv:1711.10456, 2017), Subsection: Related Work (Acceleration)