
A parameter-dependent smoother for the multigrid method (2008.00927v1)

Published 3 Aug 2020 in math.NA and cs.NA

Abstract: Solving parameter-dependent linear systems by classical methods leads to an arithmetic effort that grows exponentially in the number of parameters. This renders the multigrid method, which has a well-understood convergence theory, infeasible. A parameter-dependent representation, e.g., a low-rank tensor format, can avoid this exponential dependence, but it is not known how to compute the inverse directly within such a representation. Combining these representations with the multigrid method requires a parameter-dependent version of the classical multigrid theory and parameter-dependent representations of the linear system, the smoother, the prolongation, and the restriction. A derived parameter-dependent version of the smoothing property, fulfilled by parameter-dependent versions of the Richardson and Jacobi methods, together with the approximation property proves the convergence of the multigrid method for arbitrary parameter-dependent representations. For a model problem, low-rank tensor formats represent the parameter-dependent linear system, prolongation, and restriction. The smoother, a damped Jacobi method, is approximated directly in the low-rank tensor format using exponential sums. Proving the smoothing property for this approximation guarantees the convergence of the parameter-dependent method. Numerical experiments for the parameter-dependent model problem, with a bounded parameter value range, indicate a grid-size-independent convergence rate.
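The abstract's core ingredients (a damped Jacobi smoother, prolongation/restriction, and a coarse-grid correction, with a grid-size-independent convergence rate) can be illustrated by a minimal two-grid sketch for the classical 1D Poisson problem. This is not the paper's parameter-dependent, low-rank tensor method; it is a plain dense-matrix toy (matrix, grid size, damping factor 2/3, and three smoothing steps are all illustrative choices) showing how the smoother and the grid-transfer operators combine into one cycle.

```python
# Minimal two-grid sketch for -u'' = f on (0,1) with a damped Jacobi smoother.
# Illustrative only: dense matrices, exact coarse solve, no parameter dependence.
import numpy as np

def poisson_matrix(n):
    # Standard second-order finite-difference matrix, mesh width h = 1/(n+1).
    h = 1.0 / (n + 1)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return A

def damped_jacobi(A, b, x, omega=2.0/3.0, steps=3):
    # Damped Jacobi: x <- x + omega * D^{-1} (b - A x); damps high-frequency error.
    d = np.diag(A)
    for _ in range(steps):
        x = x + omega * (b - A @ x) / d
    return x

def two_grid_cycle(A, b, x):
    n = A.shape[0]            # assume n odd so the coarse grid has (n-1)//2 points
    x = damped_jacobi(A, b, x)             # pre-smoothing
    r = b - A @ x                          # fine-grid residual
    nc = (n - 1) // 2
    # Linear-interpolation prolongation P; full-weighting restriction R = P^T / 2.
    P = np.zeros((n, nc))
    for j in range(nc):
        P[2*j,     j] = 0.5
        P[2*j + 1, j] = 1.0
        P[2*j + 2, j] = 0.5
    R = 0.5 * P.T
    Ac = R @ A @ P                         # Galerkin coarse-grid operator
    e = np.linalg.solve(Ac, R @ r)         # exact coarse-grid solve
    x = x + P @ e                          # coarse-grid correction
    return damped_jacobi(A, b, x)          # post-smoothing

n = 63
A = poisson_matrix(n)
x_true = np.random.default_rng(0).random(n)
b = A @ x_true
x = np.zeros(n)
for _ in range(10):
    x = two_grid_cycle(A, b, x)
print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Rerunning with a finer grid (e.g. `n = 127` or `n = 255`) reduces the error by roughly the same factor per cycle, the grid-size-independent behavior the abstract reports; the paper's contribution is making this machinery work when `A`, the grid-transfer operators, and the smoother are all represented in a parameter-dependent low-rank tensor format, with the Jacobi smoother's diagonal inverse approximated by exponential sums.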
