
A new certified hierarchical and adaptive RB-ML-ROM surrogate model for parametrized PDEs (2204.13454v2)

Published 28 Apr 2022 in math.NA and cs.NA

Abstract: We present a new surrogate modeling technique for efficient approximation of input-output maps governed by parametrized PDEs. The model is hierarchical as it is built on a chain of a full order model (FOM), a reduced order model (ROM), and a machine-learning (ML) model. The model is adaptive in the sense that the ROM and ML model are adapted on the fly during a sequence of parametric requests to the model. To allow for a certification of the model hierarchy, as well as to control the adaptation process, we employ rigorous a posteriori error estimates for the ROM and ML models. In particular, we provide an example of an ML-based model that allows for rigorous analytical quality statements. We demonstrate the efficiency of the modeling chain on a Monte Carlo and a parameter-optimization example. Here, the ROM is instantiated by reduced basis methods and the ML model is given by a neural network or a VKOGA kernel model.
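The adaptive fallback logic described in the abstract (try the cheap ML model first, fall back to the ROM, and only solve the FOM when neither certified error estimate meets the tolerance, enriching the cheaper models along the way) can be sketched as follows. All class and method names here are hypothetical illustrations, and the toy "error estimates" are crude parameter-distance indicators, not the rigorous RB/VKOGA estimators of the paper:

```python
class ToyFOM:
    # Stands in for the expensive full-order PDE solve; here just a scalar map.
    def solve(self, mu):
        return mu * mu


class ToyROM:
    # Stores FOM snapshots; the "a posteriori estimate" is the parameter
    # distance to the nearest snapshot (a crude stand-in for an RB estimator).
    def __init__(self):
        self.snapshots = {}

    def solve_with_estimate(self, mu):
        if not self.snapshots:
            return None, float("inf")
        nearest = min(self.snapshots, key=lambda m: abs(m - mu))
        return self.snapshots[nearest], abs(nearest - mu)

    def extend_basis(self, mu, value):
        self.snapshots[mu] = value


class ToyML:
    # Nearest-neighbor regressor with the same crude error indicator.
    def __init__(self):
        self.data = {}

    def predict_with_bound(self, mu):
        if not self.data:
            return None, float("inf")
        nearest = min(self.data, key=lambda m: abs(m - mu))
        return self.data[nearest], abs(nearest - mu)

    def update(self, mu, value):
        self.data[mu] = value


class HierarchicalSurrogate:
    # Adaptive ML -> ROM -> FOM fallback, controlled by the error estimates.
    def __init__(self, fom, rom, ml, tol):
        self.fom, self.rom, self.ml, self.tol = fom, rom, ml, tol

    def evaluate(self, mu):
        # 1. Try the cheapest model; accept if its error bound passes.
        out, err = self.ml.predict_with_bound(mu)
        if err <= self.tol:
            return out
        # 2. Fall back to the ROM; on success, adapt the ML model on the fly.
        out, err = self.rom.solve_with_estimate(mu)
        if err <= self.tol:
            self.ml.update(mu, out)
            return out
        # 3. Last resort: exact FOM solve, then enrich both reduced models.
        out = self.fom.solve(mu)
        self.rom.extend_basis(mu, out)
        self.ml.update(mu, out)
        return out
```

With this structure, the first query at a new parameter falls through to the FOM, while repeated or nearby queries are served by the adapted ML or ROM model, which is the source of the speedup in the Monte Carlo and parameter-optimization examples.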

Citations (16)
