An optimal gradient method for smooth strongly convex minimization (2101.09741v3)
Published 24 Jan 2021 in math.OC, cs.NA, and math.NA
Abstract: We present an optimal gradient method for smooth strongly convex optimization. The method is optimal in the sense that its worst-case bound on the distance to an optimal point exactly matches the lower bound on the oracle complexity for this problem class, meaning that no black-box first-order method can achieve a better worst-case guarantee without further assumptions on the problems at hand. In addition, we provide a constructive recipe for obtaining the algorithmic parameters of the method and illustrate that the recipe can also be used to derive methods for other optimality criteria.
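To make the problem class concrete, the following is a minimal sketch (not the paper's method) of a standard constant-momentum accelerated gradient scheme for minimizing an L-smooth, mu-strongly convex function; such schemes attain a fast linear rate, but their worst-case guarantee does not exactly match the lower bound that the paper's optimal method achieves. The function names and parameters here are illustrative assumptions.

```python
import numpy as np

def accelerated_gradient(grad_f, x0, L, mu, num_iters=100):
    """Constant-momentum accelerated gradient method (sketch).

    grad_f : callable returning the gradient of f
    x0     : starting point (numpy array)
    L, mu  : smoothness and strong-convexity constants, assumed known

    Note: this is a generic accelerated scheme for the problem class
    discussed in the abstract, not the optimal method from the paper.
    """
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(num_iters):
        y = x + beta * (x - x_prev)           # momentum (extrapolation) step
        x_prev, x = x, y - grad_f(y) / L      # gradient step at the extrapolated point
    return x

if __name__ == "__main__":
    # Example: strongly convex quadratic f(x) = 0.5 * x^T A x, with mu = 1, L = 10
    A = np.diag([1.0, 10.0])
    grad_f = lambda x: A @ x
    x_final = accelerated_gradient(grad_f, np.array([5.0, -3.0]), L=10.0, mu=1.0)
    print(x_final)  # should be close to the minimizer at the origin
```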