Accelerated Methods for $\alpha$-Weakly-Quasi-Convex Problems (1710.00797v7)
Abstract: We provide a quick overview of the class of $\alpha$-weakly-quasi-convex problems and its relationships with other problem classes. We show that the previously known Sequential Subspace Optimization method retains its optimal convergence rate when applied to minimization problems with smooth $\alpha$-weakly-quasi-convex objectives. We also show that Nemirovski's conjugate gradient method for strongly convex minimization achieves its optimal convergence rate under the weaker conditions of $\alpha$-weak-quasi-convexity and quadratic growth. Previously known results only capture the special case of 1-weak-quasi-convexity or give convergence rates with worse dependence on the parameter $\alpha$.
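For reference, the two conditions named in the abstract have standard formulations in the literature; the following is a sketch in conventional notation ($f$ the objective, $x^*$ a minimizer, $\alpha \in (0,1]$, $\mu > 0$), which may differ from the paper's exact statement. A differentiable function $f$ is $\alpha$-weakly-quasi-convex with respect to $x^*$ if for all $x$
\[
\alpha \bigl( f(x) - f(x^*) \bigr) \le \langle \nabla f(x),\, x - x^* \rangle,
\]
and $f$ has $\mu$-quadratic growth around $x^*$ if for all $x$
\[
f(x) - f(x^*) \ge \frac{\mu}{2}\, \lVert x - x^* \rVert^2 .
\]
Setting $\alpha = 1$ in the first inequality yields a condition satisfied by every differentiable convex function, which is the special case of 1-weak-quasi-convexity mentioned above.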