
Preconditioners for model order reduction by interpolation and random sketching of operators (2104.12177v1)

Published 25 Apr 2021 in math.NA and cs.NA

Abstract: The performance of projection-based model order reduction methods for solving parameter-dependent systems of equations depends strongly on the properties of the operator, which can be improved by preconditioning. In this paper we present strategies to construct a parameter-dependent preconditioner by interpolation of the operator's inverse. The interpolation is obtained by minimizing a discrepancy between the (preconditioned) operator and the matrix defining the metric of interest. The discrepancy measure is chosen such that its minimization can be performed efficiently online, for each parameter value, by solving a small least-squares problem. Furthermore, we show how to tune the discrepancy measure to improve the quality of the Petrov-Galerkin projection or of residual-based error estimation. This paper also addresses preconditioning for the randomized model order reduction methods from [Balabanov and Nouy 2019, Part I]. Our methodology can be readily used for the efficient and stable solution of ill-conditioned parametric systems and for effective error estimation/certification, without the need to estimate expensive stability constants. The proposed approach involves heavy computations in both the offline and online stages, which are circumvented by random sketching: the norms of high-dimensional matrices and vectors are estimated by the ℓ2-norms of their low-dimensional images, called sketches, obtained through random embeddings. For this we extend the framework from [Balabanov and Nouy 2019, Part I] to random embeddings of operators.
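The core idea can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual algorithm: it assumes a toy affine operator A(mu) = A0 + mu*A1 (all matrices and parameter values are invented for the example), precomputes inverses at a few interpolation points offline, and online fits the interpolation coefficients by minimizing the Frobenius discrepancy || sum_i lambda_i A(mu_i)^{-1} A(mu) - I ||_F, which reduces to a small least-squares problem. A plain Gaussian embedding stands in for the paper's random sketching of operators, replacing the high-dimensional least-squares data by low-dimensional sketches.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # toy dimension; the method targets much larger n

# Hypothetical affine parameter-dependent operator A(mu) = A0 + mu * A1.
A0 = np.diag(np.linspace(1.0, 10.0, n))
A1 = rng.standard_normal((n, n)) / n

def A(mu):
    return A0 + mu * A1

# Offline: factorize/invert the operator at a few interpolation points mu_i.
mus = [0.0, 0.5, 1.0]
invs = [np.linalg.inv(A(m)) for m in mus]

# Online: for a new mu, minimize || sum_i lambda_i A(mu_i)^{-1} A(mu) - I ||_F
# over the lambda_i -- a least-squares problem with as many unknowns as
# interpolation points.
mu = 0.3
cols = [(P @ A(mu)).ravel() for P in invs]  # vec(A(mu_i)^{-1} A(mu))
V = np.column_stack(cols)                   # (n^2) x (#points)
b = np.eye(n).ravel()                       # vec(I)
lam, *_ = np.linalg.lstsq(V, b, rcond=None)

P = sum(l * Pi for l, Pi in zip(lam, invs))
cond_before = np.linalg.cond(A(mu))
cond_after = np.linalg.cond(P @ A(mu))

# Sketched variant: estimate the Frobenius discrepancy via the l2-norm of a
# low-dimensional random image (a Gaussian embedding here; the paper's
# framework covers more general/structured embeddings of operators).
k = 300
Theta = rng.standard_normal((k, n * n)) / np.sqrt(k)
lam_s, *_ = np.linalg.lstsq(Theta @ V, Theta @ b, rcond=None)

P_s = sum(l * Pi for l, Pi in zip(lam_s, invs))
cond_sketched = np.linalg.cond(P_s @ A(mu))
```

In this toy setting both the full and the sketched least-squares fits yield a preconditioned operator P @ A(mu) whose condition number is well below that of A(mu) itself; the sketched fit only ever touches k-dimensional vectors online, which is the point of the random embedding.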
