Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives (1904.12559v4)

Published 29 Apr 2019 in math.OC

Abstract: In this paper we study $p$-order methods for unconstrained minimization of convex functions that are $p$-times differentiable ($p\geq 2$) with $\nu$-H\"{o}lder continuous $p$th derivatives. We propose tensor schemes with and without acceleration. For the schemes without acceleration, we establish iteration complexity bounds of $\mathcal{O}\left(\epsilon^{-1/(p+\nu-1)}\right)$ for reducing the functional residual below a given $\epsilon\in (0,1)$. Assuming that $\nu$ is known, we obtain an improved complexity bound of $\mathcal{O}\left(\epsilon^{-1/(p+\nu)}\right)$ for the corresponding accelerated scheme. For the case in which $\nu$ is unknown, we present a universal accelerated tensor scheme with iteration complexity of $\mathcal{O}\left(\epsilon^{-p/[(p+1)(p+\nu-1)]}\right)$. A lower complexity bound of $\mathcal{O}\left(\epsilon^{-2/[3(p+\nu)-2]}\right)$ is also obtained for this problem class.
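
As an illustrative specialization (not stated in the abstract, obtained by plugging values into the bounds above): taking $p=2$ and $\nu=1$, i.e. convex functions with Lipschitz continuous Hessians, the non-accelerated bound reads $\mathcal{O}\left(\epsilon^{-1/2}\right)$, the accelerated bound $\mathcal{O}\left(\epsilon^{-1/3}\right)$, the universal accelerated bound $\mathcal{O}\left(\epsilon^{-1/3}\right)$ as well, and the lower bound $\mathcal{O}\left(\epsilon^{-2/7}\right)$.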
