
Computationally Faster Newton Methods by Lazy Evaluations (2501.17488v1)

Published 29 Jan 2025 in math.OC

Abstract: This paper studies second-order optimization methods for solving monotone nonlinear equation problems (MNE) and minimization problems (Min) in a $d$-dimensional vector space $\mathbb{R}^d$. In their seminal work, Monteiro and Svaiter (SIOPT 2012, 2013) proposed the Newton Proximal Extragradient (NPE) method for MNE and its accelerated variant (A-NPE) for Min, which find an $\epsilon$-solution in $\mathcal{O}(\epsilon^{-2/3})$ and $\tilde{\mathcal{O}}(\epsilon^{-2/7})$ iterations, respectively. Subsequent work proved that these results are (near-)optimal and match the lower bounds up to logarithmic factors. However, the existing lower bound only applies to algorithms that query gradients and Hessians simultaneously. This paper improves the computational cost of Monteiro and Svaiter's methods by reusing the Hessian across iterations. We propose the Lazy Extra Newton (LEN) method for MNE and its acceleration (A-LEN) for Min. The computational complexity bounds of our proposed methods match those of the optimal second-order methods in $\epsilon$ while reducing the dependency on the dimension by factors of $d^{(\omega-2)/3}$ and $d^{2(\omega-2)/7}$ for MNE and Min, respectively, where $\mathcal{O}(d^\omega)$ is the computational complexity of matrix inversion. We further generalize these methods to the strongly monotone case and show that similar improvements still hold by using a restart strategy.
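The key computational idea, reusing one expensive Hessian (or Jacobian) factorization across several iterations, can be illustrated in isolation. The sketch below is not the paper's LEN or A-LEN method (those combine lazy updates with extragradient and proximal steps); it is a minimal lazy-Newton loop under assumed inputs: a hypothetical map F, its Jacobian jac, and a reuse period m. One $\mathcal{O}(d^\omega)$ factorization is amortized over m cheap $\mathcal{O}(d^2)$ triangular solves.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def lazy_newton(F, jac, x0, m=5, tol=1e-8, max_iters=200):
    """Newton-type iteration that refactorizes the Jacobian only once
    every m steps; the cached LU factors are reused in between, so only
    one step per block of m pays the O(d^omega) factorization cost."""
    x = np.asarray(x0, dtype=float).copy()
    factors = None
    for k in range(max_iters):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        if k % m == 0:
            # Lazy update: snapshot the Jacobian and factorize it once.
            factors = lu_factor(jac(x))
        # Reuse the cached factorization: an O(d^2) triangular solve.
        x -= lu_solve(factors, Fx)
    return x

# Hypothetical usage on a small monotone system F(x) = x + tanh(x) = 0:
F = lambda x: x + np.tanh(x)
jac = lambda x: np.eye(len(x)) + np.diag(1.0 - np.tanh(x) ** 2)
print(lazy_newton(F, jac, x0=np.ones(3)))  # converges to the root at 0
```

Between refactorizations this behaves like the classical chord method; the paper's contribution is showing how a schedule of such lazy updates can be combined with extragradient steps while retaining the optimal $\epsilon$-dependence.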
