
General Holder Smooth Convergence Rates Follow From Specialized Rates Assuming Growth Bounds

Published 20 Apr 2021 in math.OC (arXiv:2104.10196v2)

Abstract: In the analysis of first-order methods for both smooth and nonsmooth optimization, assuming the existence of a growth/error bound or a KL condition often enables much stronger convergence guarantees. Hence separate analyses are typically needed for the general case and for the growth-bounded case. We give meta-theorems for deriving general convergence rates from rates that assume a growth lower bound. Applying this simple but conceptually powerful tool to the proximal point, subgradient, bundle, dual averaging, gradient descent, Frank-Wolfe, and universal accelerated methods immediately recovers their known convergence rates for general convex optimization problems from their specialized rates. New convergence results follow for bundle methods, dual averaging, and Frank-Wolfe. Our results can lift any rate based on Hölder continuous gradients and Hölder growth bounds. Moreover, our theory provides simple proofs of optimal convergence lower bounds under Hölder growth from textbook examples without growth bounds.
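For context, the two Hölder conditions named in the abstract are typically stated as follows. These are the standard textbook definitions, not formulas taken from the paper itself; the constants L, ν, μ, and p are generic.

```latex
% Hölder continuous gradient (exponent \nu \in (0,1], constant L > 0):
\[
  \|\nabla f(x) - \nabla f(y)\| \le L \|x - y\|^{\nu}
  \quad \text{for all } x, y .
\]
% Hölder growth (error) bound (exponent p \ge 1, constant \mu > 0),
% where X^* is the set of minimizers and f^* the minimum value of f:
\[
  f(x) - f^* \ge \frac{\mu}{p}\, \mathrm{dist}(x, X^*)^{p}
  \quad \text{for all } x \text{ near } X^* .
\]
% Special cases: \nu = 1 recovers Lipschitz-continuous gradients,
% and p = 2 recovers quadratic growth (implied by strong convexity).
```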

