
Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods

Published 3 Feb 2021 in math.OC (arXiv:2102.02045v2)

Abstract: For solving strongly convex optimization problems, we propose and study the global convergence of variants of the A-HPE and large-step A-HPE algorithms of Monteiro and Svaiter. We prove linear and superlinear $\mathcal{O}\left(k^{-k\left(\frac{p-1}{p+1}\right)}\right)$ global rates for the proposed variants of the A-HPE and large-step A-HPE methods, respectively. The parameter $p\geq 2$ appears in the (high-order) large-step condition of the new large-step A-HPE algorithm. We apply our results to high-order tensor methods, obtaining a new inexact (relative-error) tensor method for (smooth) strongly convex optimization with iteration-complexity $\mathcal{O}\left(k^{-k\left(\frac{p-1}{p+1}\right)}\right)$. In particular, for $p=2$, we obtain an inexact Newton-proximal algorithm with a fast global $\mathcal{O}\left(k^{-k/3}\right)$ convergence rate.
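
The $p=2$ case mentioned in the abstract corresponds to a Newton-proximal scheme: each outer iteration approximately computes a proximal point of the objective, with the regularized subproblem handled by a Newton-type solve. The sketch below is only a minimal illustration of that idea on a toy strongly convex quadratic, not the paper's A-HPE variant; the step size `lam`, the use of a single Newton step, and all function names are assumptions made for this demo.

```python
# A minimal sketch, assuming a toy quadratic objective; this is NOT the paper's
# A-HPE variant, only an illustration of the p = 2 "Newton-proximal" idea:
# each iteration approximates the proximal point argmin_y f(y) + ||y - x||^2/(2*lam)
# by a Newton step on the regularized subproblem. The names f, grad, hess, lam
# and the choice of a single Newton step are assumptions made for this demo.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)            # positive definite, so f is strongly convex
b = rng.standard_normal(5)

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
hess = lambda x: A                  # constant Hessian for a quadratic

def newton_prox_step(x, lam):
    """One Newton step on y -> f(y) + ||y - x||^2 / (2*lam), started at y = x.
    For a quadratic f this solves the proximal subproblem exactly; for general
    smooth strongly convex f it would yield an inexact (relative-error) prox."""
    H = hess(x) + np.eye(len(x)) / lam
    return x - np.linalg.solve(H, grad(x))

x = np.zeros(5)
for k in range(20):
    x = newton_prox_step(x, lam=1.0)

x_star = np.linalg.solve(A, b)      # exact minimizer of the quadratic
print("distance to minimizer:", np.linalg.norm(x - x_star))
```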
