Localized model reduction for nonlinear elliptic partial differential equations: localized training, partition of unity, and adaptive enrichment (2202.09872v1)

Published 20 Feb 2022 in math.NA and cs.NA

Abstract: We propose a component-based (CB) parametric model order reduction (pMOR) formulation for parameterized nonlinear elliptic partial differential equations (PDEs). CB-pMOR is designed to deal with large-scale problems for which full-order solves are not affordable in a reasonable time frame or parameters' variations induce topology changes that prevent the application of monolithic pMOR techniques. We rely on the partition-of-unity method (PUM) to devise global approximation spaces from local reduced spaces, and on Galerkin projection to compute the global state estimate. We propose a randomized data compression algorithm based on oversampling for the construction of the components' reduced spaces: the approach exploits random boundary conditions of controlled smoothness on the oversampling boundary. We further propose an adaptive residual-based enrichment algorithm that exploits global reduced-order solves on representative systems to update the local reduced spaces. We prove exponential convergence of the enrichment procedure for linear coercive problems; we further present numerical results for a two-dimensional nonlinear diffusion problem to illustrate the many features of our proposal and demonstrate its effectiveness.
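
For orientation, a generic partition-of-unity construction of the global space from local reduced spaces can be sketched as follows (the notation here is illustrative and not taken from the paper): given a partition of unity \{\varphi_i\}_{i=1}^{N_c} subordinate to the overlapping components, the global approximation reads

  \hat{u}_\mu(x) = \sum_{i=1}^{N_c} \varphi_i(x) \sum_{n=1}^{m_i} \alpha_{i,n}(\mu) \, \zeta_{i,n}(x), \qquad \sum_{i=1}^{N_c} \varphi_i \equiv 1 \ \text{on } \Omega,

where \{\zeta_{i,n}\}_{n=1}^{m_i} spans the i-th local reduced space and the coefficients \alpha_{i,n}(\mu) are determined by Galerkin projection of the nonlinear PDE onto the resulting global space.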

Citations (23)
