
Flexible Modification of Gauss-Newton Method and Its Stochastic Extension (2102.00810v2)

Published 1 Feb 2021 in math.OC

Abstract: This work presents a novel version of the recently developed Gauss-Newton method for solving systems of nonlinear equations, based on an upper bound of the solution residual and quadratic regularization ideas. We obtain global convergence bounds for this method and, under natural non-degeneracy assumptions, establish local quadratic convergence. We develop stochastic optimization algorithms for the presented Gauss-Newton method and justify sub-linear and linear convergence rates for these algorithms under a weak growth condition (WGC) and the Polyak-Lojasiewicz (PL) inequality. We show that, in the stochastic setting, the Gauss-Newton method can effectively find a solution under the WGC and the PL condition, matching the convergence rate of the deterministic method. The suggested method unifies the most practically used modifications of the Gauss-Newton method and can easily interpolate between them, providing a flexible and convenient scheme that is readily implementable using standard techniques of convex optimization.
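To make the quadratic-regularization idea concrete, the following is a minimal sketch of a regularized Gauss-Newton iteration for a nonlinear system F(x) = 0. Each step minimizes the local model ||F(x) + J(x)d||^2 + lam ||d||^2, i.e. solves (J^T J + lam I) d = -J^T F. This is a generic illustration only: the function names are hypothetical, the regularization weight `lam` is fixed here, whereas the paper's method chooses it adaptively from an upper bound on the solution residual.

```python
import numpy as np

def gauss_newton_regularized(F, J, x0, lam=1e-3, tol=1e-10, max_iter=100):
    """Solve F(x) = 0 by Gauss-Newton steps with quadratic regularization.

    Each step solves (J^T J + lam*I) d = -J^T F, the normal equations of
    the regularized local model. NOTE: a fixed `lam` is a simplification;
    the paper's method adapts the regularization from a residual bound.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx, Jx = F(x), J(x)
        if np.linalg.norm(Fx) < tol:  # residual small enough: stop
            break
        d = np.linalg.solve(Jx.T @ Jx + lam * np.eye(x.size), -Jx.T @ Fx)
        x = x + d
    return x

# Example system: intersect the circle x^2 + y^2 = 4 with the line x = y.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
root = gauss_newton_regularized(F, J, np.array([1.0, 0.5]))
```

With a small `lam` the iteration behaves like plain Gauss-Newton near the solution (fast local convergence), while a larger `lam` shortens the step, which is how the regularization interpolates between aggressive and conservative variants of the method.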
