
Nonlinear Schwarz preconditioning for Quasi-Newton methods (2211.14403v1)

Published 25 Nov 2022 in math.NA and cs.NA

Abstract: We propose the nonlinear restricted additive Schwarz (RAS) preconditioning strategy to improve the convergence speed of limited memory quasi-Newton (QN) methods. We consider both "left-preconditioning" and "right-preconditioning" strategies. As the application of the nonlinear preconditioning changes the standard gradients and Hessians to their preconditioned counterparts, the standard secant pairs cannot be used to approximate the preconditioned Hessians. We discuss how to construct the secant pairs in the preconditioned QN framework. Finally, we demonstrate the robustness and efficiency of the preconditioned QN methods using numerical experiments.
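To make the abstract's point about secant pairs concrete, below is a minimal sketch of a left-preconditioned limited-memory BFGS loop. It is an illustration under assumptions, not the paper's algorithm: `precondition(x, g)` is a hypothetical placeholder standing in for a nonlinear (e.g. RAS-type) preconditioner, the globalization is a crude Armijo backtracking, and only the left-preconditioned reading is shown. The detail the abstract highlights is that both gradients entering the secant pair must be the preconditioned ones, so that the pairs approximate the preconditioned Hessian rather than the original one.

```python
import numpy as np

def two_loop(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns H_k @ g, where H_k is the
    limited-memory inverse-Hessian approximation built from the secant pairs."""
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # first loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        q -= a * y
        alphas.append(a)
    # scale by gamma_k = (s^T y) / (y^T y) as the initial inverse Hessian
    gamma = (np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
             if s_list else 1.0)
    r = gamma * q
    # second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return r

def preconditioned_lbfgs(f, grad, precondition, x0, m=5, max_iter=200, tol=1e-8):
    """Left-preconditioned L-BFGS sketch.  `precondition(x, g)` is a
    PLACEHOLDER for a nonlinear preconditioner (e.g. subdomain solves in a
    RAS-type scheme); it is not the algorithm from the paper.  Note that
    both entries of the secant pair below use the *preconditioned*
    gradient, since the pairs now approximate the preconditioned Hessian."""
    x = x0.astype(float)
    s_list, y_list = [], []
    g = precondition(x, grad(x))              # preconditioned gradient
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -two_loop(g, s_list, y_list)      # quasi-Newton direction
        # crude Armijo backtracking on f (a real code would globalize more carefully)
        t, fx, slope = 1.0, f(x), np.dot(grad(x), d)
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = precondition(x_new, grad(x_new))
        s, y = x_new - x, g_new - g           # secant pair in the preconditioned setting
        if np.dot(s, y) > 1e-12:              # curvature safeguard keeps H_k SPD
            s_list.append(s); y_list.append(y)
            if len(s_list) > m:               # limited memory: keep only the last m pairs
                s_list.pop(0); y_list.pop(0)
        x, g = x_new, g_new
    return x

# Example: with the identity as "preconditioner" this reduces to plain L-BFGS
A = np.diag(np.arange(1.0, 11.0))             # SPD test matrix
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
identity = lambda x, g: g                     # trivial preconditioner (assumption)
x_opt = preconditioned_lbfgs(f, grad, identity, np.ones(10))
print(np.linalg.norm(x_opt))                  # ~0: the quadratic's minimizer
```

With the identity preconditioner the sketch reduces to plain L-BFGS, which is a convenient sanity check before substituting an actual nonlinear preconditioner.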

Citations (2)
