
Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods (2102.02045v2)

Published 3 Feb 2021 in math.OC

Abstract: For solving strongly convex optimization problems, we propose and study the global convergence of variants of the A-HPE and large-step A-HPE algorithms of Monteiro and Svaiter. We prove linear and superlinear $\mathcal{O}\left(k^{-k\left(\frac{p-1}{p+1}\right)}\right)$ global rates for the proposed variants of the A-HPE and large-step A-HPE methods, respectively. The parameter $p\geq 2$ appears in the (high-order) large-step condition of the new large-step A-HPE algorithm. We apply our results to high-order tensor methods, obtaining a new inexact (relative-error) tensor method for (smooth) strongly convex optimization with iteration-complexity $\mathcal{O}\left(k^{-k\left(\frac{p-1}{p+1}\right)}\right)$. In particular, for $p=2$, we obtain an inexact Newton-proximal algorithm with fast global $\mathcal{O}\left(k^{-k/3}\right)$ convergence rate.
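To make the superlinear decay concrete, here is a minimal sketch (the function name and printed values are illustrative, not from the paper) that evaluates the bound factor $k^{-k(p-1)/(p+1)}$ appearing in the abstract:

```python
def superlinear_rate_bound(k: int, p: int) -> float:
    """Bound factor k^{-k(p-1)/(p+1)} from the paper's superlinear
    rate for the large-step A-HPE variant; p >= 2 is the order in
    the large-step condition."""
    return k ** (-k * (p - 1) / (p + 1))

# For p = 2 the exponent reduces to -k/3, matching the
# inexact Newton-proximal rate quoted in the abstract.
for k in (1, 5, 10, 20):
    print(f"k={k:2d}  bound={superlinear_rate_bound(k, p=2):.3e}")
```

Note the exponent itself grows with $k$, so the factor shrinks faster than any geometric (linear) rate, which is what distinguishes the large-step variant from the plain A-HPE variant's linear rate.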
