
Accelerating the convergence of Newton's method for nonlinear elliptic PDEs using Fourier neural operators (2403.03021v2)

Published 5 Mar 2024 in math.NA and cs.NA

Abstract: It is well known that Newton's method can have trouble converging if the initial guess is too far from the solution. Such a problem particularly occurs when this method is used to solve nonlinear elliptic partial differential equations (PDEs) discretized via finite differences. This work focuses on accelerating Newton's method convergence in this context. We seek to construct a mapping from the parameters of the nonlinear PDE to an approximation of its discrete solution, independently of the mesh resolution. This approximation is then used as an initial guess for Newton's method. To achieve these objectives, we elect to use a Fourier neural operator (FNO). The loss function is the sum of a data term (i.e., the comparison between known solutions and outputs of the FNO) and a physical term (i.e., the residual of the PDE discretization). Numerical results, in one and two dimensions, show that the proposed initial guess accelerates the convergence of Newton's method by a large margin compared to a naive initial guess, especially for highly nonlinear and anisotropic problems, with larger gains on coarse grids.
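The core idea above — a better initial guess cuts Newton iterations when solving a finite-difference discretization of a nonlinear elliptic PDE — can be illustrated with a minimal sketch. This is not the paper's setup (no FNO, no anisotropy): it solves the hypothetical 1D model problem -u'' + u^3 = f with homogeneous Dirichlet conditions, and a scaled copy of the manufactured solution stands in for the operator's learned output.

```python
import numpy as np

def newton_fd(u0, f, h, tol=1e-10, max_iter=50):
    """Newton's method for the centered-difference discretization of
    -u'' + u^3 = f on (0, 1) with u(0) = u(1) = 0.
    Returns the solution and the number of Newton iterations used."""
    n = len(u0)
    u = u0.copy()
    # Dense tridiagonal matrix for -u'' at the interior grid points
    A = (np.diag(2.0 * np.ones(n))
         + np.diag(-np.ones(n - 1), 1)
         + np.diag(-np.ones(n - 1), -1)) / h**2
    for k in range(max_iter):
        F = A @ u + u**3 - f              # residual of the discretization
        if np.linalg.norm(F) < tol:
            return u, k
        J = A + np.diag(3.0 * u**2)       # Jacobian of the residual
        u -= np.linalg.solve(J, F)
    return u, max_iter

# Manufactured solution u*(x) = sin(pi x), so f = pi^2 u* + (u*)^3
n = 99
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
u_star = np.sin(np.pi * x)
f = np.pi**2 * u_star + u_star**3

# Naive zero initial guess vs. a warm start near u*
# (the scaled exact solution stands in for an FNO-produced approximation)
_, iters_naive = newton_fd(np.zeros(n), f, h)
_, iters_warm = newton_fd(0.9 * u_star, f, h)
print("iterations from zero guess:", iters_naive)
print("iterations from warm start:", iters_warm)
```

On this mildly nonlinear problem both starts converge, but the warm start needs no more iterations than the zero guess; the paper's point is that the gap widens sharply for highly nonlinear and anisotropic problems, where a naive guess may leave Newton's method slow or divergent.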

Citations (1)
