
An Online Efficient Two-Scale Reduced Basis Approach for the Localized Orthogonal Decomposition (2111.08643v1)

Published 16 Nov 2021 in math.NA and cs.NA

Abstract: We are concerned with employing Model Order Reduction (MOR) to efficiently solve parameterized multiscale problems using the Localized Orthogonal Decomposition (LOD) multiscale method. Like many multiscale methods, the LOD follows the idea of separating the problem into localized fine-scale subproblems and an effective coarse-scale system derived from the solutions of the local problems. While the Reduced Basis (RB) method has already been used to speed up the solution of the fine-scale problems, the resulting coarse system remained untouched, thus limiting the achievable speed-up. In this work, we address this issue by applying the RB methodology to a new two-scale formulation of the LOD. By reducing the entire two-scale system, this two-scale Reduced Basis LOD (TSRBLOD) approach yields reduced order models that are completely independent of the size of the coarse mesh of the multiscale approach, allowing an efficient approximation of the solutions of parameterized multiscale problems even for very large domains. A rigorous and efficient a posteriori estimator bounds the model reduction error, taking into account the approximation error for both the local fine-scale problems and the global coarse-scale system.
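The core RB idea the abstract builds on can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the paper's code, and with illustrative parameter and system choices): a parameterized linear system A(mu) u = f is solved for training parameters offline, the snapshots are compressed by a singular value decomposition (POD), and the online solve is a Galerkin projection onto the resulting small basis, so its cost is independent of the full dimension n.

```python
import numpy as np

def assemble(mu, n=50):
    """Toy parameterized operator A(mu) = K + mu * M (illustrative choice:
    K is a 1D finite-difference Laplacian, M the identity)."""
    K = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1))
    return K + mu * np.eye(n)

n = 50
f = np.ones(n)

# Offline stage: solve the full system for training parameters and
# compress the snapshot matrix via SVD (Proper Orthogonal Decomposition).
snapshots = np.column_stack(
    [np.linalg.solve(assemble(mu, n), f) for mu in np.linspace(0.1, 10.0, 20)]
)
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :5]  # orthonormal reduced basis, dimension 5 << n

# Online stage: Galerkin projection gives a 5x5 system whose size does not
# depend on n -- the same principle the TSRBLOD applies to the full
# two-scale LOD system, so that the online cost is independent of the
# coarse mesh size as well.
mu_test = 3.7
A = assemble(mu_test, n)
u_rb = V @ np.linalg.solve(V.T @ A @ V, V.T @ f)

# Compare against the full-order solution.
u_full = np.linalg.solve(A, f)
rel_err = np.linalg.norm(u_full - u_rb) / np.linalg.norm(u_full)
```

Because the solution manifold of this toy problem is smooth in mu, a handful of POD modes already reproduces the full-order solution to small relative error; the paper's contribution is to apply this kind of reduction to the entire two-scale system rather than only to the localized fine-scale problems.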

Citations (4)
