
Tight convergence rates of the gradient method on smooth hypoconvex functions (2203.00775v2)

Published 1 Mar 2022 in math.OC

Abstract: We perform the first tight convergence analysis of the gradient method with varying step sizes when applied to smooth hypoconvex (weakly convex) functions. Hypoconvex functions are smooth nonconvex functions whose curvature is bounded and assumed to belong to the interval $[\mu, L]$, with $\mu<0$. Our convergence rates improve and extend the existing analysis for smooth nonconvex functions with $L$-Lipschitz gradient (which corresponds to the case $\mu=-L$), and smoothly interpolates between that class and the class of smooth convex functions. We obtain our results using the performance estimation framework adapted to hypoconvex functions, for which new interpolation conditions are derived. We derive explicit upper bounds on the minimum gradient norm of the iterates for a large range of step sizes, explain why all such rates share a common structure, and prove that these rates are tight when step sizes are smaller or equal to $1/L$. Finally, we identify the optimal constant step size that minimizes the worst-case of the gradient method applied to hypoconvex functions.
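As a concrete numeric sketch of the setting the abstract describes (the function, parameters, and step-size choice below are our own illustration, not taken from the paper), one can build a 1-smooth hypoconvex function whose curvature lies in $[\mu, L] = [-0.5, 1]$ and run the gradient method with the constant step size $h = 1/L$, tracking the minimum gradient norm over the iterates — the quantity the paper bounds:

```python
import math

# Illustrative hypoconvex function (our own example, not from the paper):
# f(x) = (a/2) x^2 + b (1 - cos x), so f''(x) = a + b cos(x).
# With a = 0.25, b = 0.75, the curvature lies in [mu, L] = [-0.5, 1.0]:
# the gradient is 1-Lipschitz and the function is (-0.5)-hypoconvex.
a, b = 0.25, 0.75
L = a + b  # upper curvature bound

def grad(x):
    return a * x + b * math.sin(x)

# Gradient method with constant step size h = 1/L, which falls in the
# range (h <= 1/L) where the paper proves its rates are tight.
h = 1.0 / L
x = 3.0
min_grad_norm = abs(grad(x))
for _ in range(50):
    x -= h * grad(x)
    min_grad_norm = min(min_grad_norm, abs(grad(x)))

print(min_grad_norm)  # minimum gradient norm observed along the trajectory
```

On this toy example the iterates approach the stationary point at the origin and the minimum gradient norm decays rapidly; the paper's contribution is a tight worst-case bound on this quantity over the whole function class, not any single instance.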
