
An Enhanced Levenberg--Marquardt Method via Gram Reduction (2412.08561v1)

Published 11 Dec 2024 in math.OC

Abstract: This paper studies the problem of solving the system of nonlinear equations ${\bf F}({\bf x})={\bf 0}$, where ${\bf F}:{\mathbb R}^{d}\to{\mathbb R}^{d}$. We propose the Gram-Reduced Levenberg--Marquardt method, which updates the Gram matrix ${\bf J}(\cdot)^\top{\bf J}(\cdot)$ only once every $m$ iterations, where ${\bf J}(\cdot)$ is the Jacobian of ${\bf F}(\cdot)$. Our method has a global convergence guarantee without relying on line search or on solving sub-problems. We prove that our method takes at most $\mathcal{O}(m^2+m^{-0.5}\epsilon^{-2.5})$ iterations to find an $\epsilon$-stationary point of $\frac{1}{2}\|{\bf F}(\cdot)\|^2$, which leads to an overall computational cost of $\mathcal{O}(d^3\epsilon^{-1}+d^2\epsilon^{-2})$ by taking $m=\Theta(\epsilon^{-1})$. This result is strictly better than the $\mathcal{O}(d^3\epsilon^{-2})$ cost of existing Levenberg--Marquardt methods. We also show that the proposed method enjoys a local superlinear convergence rate under a non-degeneracy assumption. We provide experiments on real-world applications in scientific computing and machine learning to validate the efficiency of the proposed method.
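The stated cost follows from a short accounting argument: with $m=\Theta(\epsilon^{-1})$, the iteration bound $\mathcal{O}(m^2+m^{-0.5}\epsilon^{-2.5})$ becomes $\mathcal{O}(\epsilon^{-2})$; each iteration costs $\mathcal{O}(d^2)$, while the $\mathcal{O}(d^3)$ Gram computation is paid only once every $m$ iterations, i.e. $\mathcal{O}(\epsilon^{-1})$ times in total, giving $\mathcal{O}(d^3\epsilon^{-1}+d^2\epsilon^{-2})$.

Below is a minimal Python sketch of the lazy-Gram idea. It assumes a plain damped update $x \leftarrow x - ({\bf G}+\lambda{\bf I})^{-1}{\bf J}^\top{\bf F}$ with a fixed damping parameter and a cached Cholesky factorization; the paper's actual regularization schedule and step rule are not given in the abstract, so those details are illustrative only.

import numpy as np
from scipy.linalg import cho_factor, cho_solve

def gram_reduced_lm(F, J, x0, m=10, lam=1e-3, tol=1e-8, max_iter=1000):
    """Levenberg--Marquardt sketch that refreshes the Gram matrix
    J(x)^T J(x) (and its Cholesky factor) only once every m iterations,
    so the intermediate steps cost O(d^2) instead of O(d^3)."""
    x = np.asarray(x0, dtype=float)
    d = x.size
    chol = None
    for k in range(max_iter):
        Fx = F(x)
        Jx = J(x)                    # fresh d x d Jacobian each step
        g = Jx.T @ Fx                # gradient of 0.5 * ||F(x)||^2, O(d^2)
        if np.linalg.norm(g) < tol:  # epsilon-stationary point reached
            return x
        if k % m == 0:               # O(d^3) work, amortized over m steps
            G = Jx.T @ Jx            # Gram matrix J^T J
            chol = cho_factor(G + lam * np.eye(d))
        x = x - cho_solve(chol, g)   # O(d^2) triangular solves
    return x

# Hypothetical toy system for illustration: F(x) = (x_0^2 - 1, x_1 - x_0) = 0
F = lambda x: np.array([x[0]**2 - 1.0, x[1] - x[0]])
J = lambda x: np.array([[2.0 * x[0], 0.0], [-1.0, 1.0]])
root = gram_reduced_lm(F, J, x0=np.array([2.0, 2.0]))

Note that caching the factorization, not merely the Gram matrix, is what keeps the per-iteration linear-algebra cost at $\mathcal{O}(d^2)$: with a fixed $\lambda$, the factor of ${\bf G}+\lambda{\bf I}$ remains valid for the whole $m$-step window.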
