
Domain-decomposition least-squares Petrov-Galerkin (DD-LSPG) nonlinear model reduction (2007.11835v2)

Published 23 Jul 2020 in math.NA and cs.NA

Abstract: A novel domain-decomposition least-squares Petrov-Galerkin (DD-LSPG) model-reduction method applicable to parameterized systems of nonlinear algebraic equations (e.g., arising from discretizing a parameterized partial-differential-equations problem) is proposed. In contrast with previous works, we adopt an algebraically non-overlapping decomposition strategy rather than a spatial-decomposition strategy, which facilitates application to different spatial-discretization schemes. Rather than constructing a low-dimensional subspace for the entire state space in a monolithic fashion, the methodology constructs separate subspaces for the different subdomains/components characterizing the original model. In the offline stage, the method constructs low-dimensional bases for the interior and interface of components. In the online stage, the approach constructs an LSPG ROM for each component and enforces strong or weak compatibility on the 'ports' connecting them. We propose four different ways to construct reduced bases on the interface/ports of subdomains and several ways to enforce compatibility across connecting ports. We derive a posteriori and a priori error bounds for the DD-LSPG solutions. Numerical results performed on nonlinear benchmark problems in heat transfer and fluid dynamics demonstrate that the proposed method performs well in terms of both accuracy and computational cost, with different choices of basis and compatibility constraints yielding different performance profiles.
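The offline/online workflow described in the abstract (per-component reduced bases built offline, a constrained LSPG solve online) can be illustrated with a minimal sketch. Everything concrete below is an illustrative assumption rather than the paper's setup: a 1D Bratu-type test problem, a two-component algebraic split, a single lumped POD basis per component (the paper separates interior and port bases and proposes four port-basis constructions), strong port compatibility only, and SciPy's generic SLSQP solver in place of the paper's tailored solver and error analysis.

```python
import numpy as np
from scipy.linalg import svd
from scipy.optimize import least_squares, minimize

# Full-order model (hypothetical test case): -u'' = mu*exp(u), u(0)=u(1)=0
n = 40                       # interior grid points
h = 1.0 / (n + 1)

def residual_full(u, mu):
    """Full-order nonlinear residual r(u; mu) = 0."""
    up = np.concatenate(([0.0], u, [0.0]))            # pad with boundary values
    return (up[:-2] - 2.0 * up[1:-1] + up[2:]) / h**2 + mu * np.exp(u)

def solve_full(mu):
    return least_squares(residual_full, np.zeros(n), args=(mu,)).x

# --- Algebraically non-overlapping decomposition of the residual rows ---
split = n // 2
rows = [np.arange(0, split), np.arange(split, n)]      # residual rows per component
# State DOFs each component's rows depend on (tridiagonal stencil => one extra node)
dofs = [np.arange(0, split + 1), np.arange(split - 1, n)]
# Port DOFs (local indices): nodes duplicated across the two components
ports = [np.array([split - 1, split]) - dofs[0][0],
         np.array([split - 1, split]) - dofs[1][0]]

def residual_comp(i, x_i, mu):
    """Residual rows owned by component i, evaluated on its local state copy x_i."""
    u = np.zeros(n)
    u[dofs[i]] = x_i            # owned rows only touch dofs[i]; zero fill elsewhere is unused
    return residual_full(u, mu)[rows[i]]

# --- Offline: POD basis per component from full-order snapshots ---
mus_train = np.linspace(0.5, 2.0, 8)
snaps = np.column_stack([solve_full(mu) for mu in mus_train])
k = 4                                                   # reduced dimension per component
bases = [svd(snaps[dofs[i], :], full_matrices=False)[0][:, :k] for i in range(2)]

# --- Online: DD-LSPG as a port-constrained least-squares problem ---
def dd_lspg_solve(mu):
    def objective(xhat):                                # sum of squared component residuals
        x0h, x1h = xhat[:k], xhat[k:]
        r0 = residual_comp(0, bases[0] @ x0h, mu)
        r1 = residual_comp(1, bases[1] @ x1h, mu)
        return 0.5 * (r0 @ r0 + r1 @ r1)

    def compatibility(xhat):                            # strong compatibility at the ports
        x0h, x1h = xhat[:k], xhat[k:]
        return (bases[0] @ x0h)[ports[0]] - (bases[1] @ x1h)[ports[1]]

    res = minimize(objective, np.zeros(2 * k), method="SLSQP",
                   constraints={"type": "eq", "fun": compatibility})
    return res.x

mu_test = 1.3
xhat = dd_lspg_solve(mu_test)
u_ref = solve_full(mu_test)
u_rom = np.zeros(n)
u_rom[dofs[0]] = bases[0] @ xhat[:k]
u_rom[dofs[1]] = bases[1] @ xhat[k:]    # port copies agree up to the constraint tolerance
print("relative error:", np.linalg.norm(u_rom - u_ref) / np.linalg.norm(u_ref))
```

The online solve is cheap because only the 2k generalized coordinates are unknowns; the compatibility constraint couples the two otherwise independent component ROMs through their shared port DOFs, which is the structural idea the paper develops with weak/strong variants and a posteriori and a priori error bounds.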

Citations (60)
