Generalizing acceleration analyses from convex to nonconvex settings
Develop rigorous generalizations of analytical frameworks for accelerated gradient methods from convex to nonconvex objectives, establishing conditions under which accelerated methods retain their convergence guarantees in nonconvex settings.
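For concreteness, the accelerated method in question is Nesterov-style momentum. Below is a minimal, hedged sketch of that iteration applied to a simple nonconvex objective with a saddle point at the origin; the step size `eta`, momentum `theta`, and the test function are illustrative choices, not parameters from the cited paper.

```python
import numpy as np

def nesterov_agd(grad, x0, eta=0.01, theta=0.9, iters=500):
    """Nesterov-style accelerated gradient descent.

    Illustrative sketch only: `eta` and `theta` are hypothetical,
    untuned choices, not the schedule analyzed in the paper.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(iters):
        y = x + theta * (x - x_prev)        # momentum (look-ahead) point
        x_prev, x = x, y - eta * grad(y)    # gradient step taken at y
    return x

# Nonconvex objective f(x) = (||x||^2 - 1)^2: a saddle point at the
# origin, with the minimum set on the unit sphere.
f = lambda x: (x @ x - 1.0) ** 2
grad_f = lambda x: 4.0 * (x @ x - 1.0) * x

x_star = nesterov_agd(grad_f, x0=[0.3, -0.2])
print(f(x_star))  # near 0: the iterates reach the unit-sphere minima
```

The nonconvex difficulty the problem statement refers to is visible even here: convex analyses of this iteration rely on global properties (e.g. a potential function built from convexity) that fail once saddle points exist, so convergence must instead be argued locally.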
References
"Most of this work is tailored to the convex setting, and it is unclear and nontrivial to generalize the results to a nonconvex setting."
— Accelerated Gradient Descent Escapes Saddle Points Faster than Gradient Descent
(arXiv:1711.10456, Jin et al., 2017), Subsection: Related Work (Acceleration)