
Pointwise error bounds in POD methods without difference quotients (2407.17159v1)

Published 24 Jul 2024 in math.NA and cs.NA

Abstract: In this paper we consider proper orthogonal decomposition (POD) methods that do not include difference quotients (DQs) of snapshots in the data set. The inclusion of DQs has been shown in the literature to be a key element in obtaining error bounds that do not degrade with the number of snapshots. More recently, the inclusion of DQs has made it possible to obtain pointwise (as opposed to averaged) error bounds that decay with the same convergence rate (in terms of the POD singular values) as averaged ones. In the present paper, for POD methods that do not include DQs in their data set, we obtain error bounds that do not degrade with the number of snapshots, provided the function from which the snapshots are taken has a certain degree of smoothness. Moreover, the rate of convergence is as close to that of methods including DQs as the smoothness of the function providing the snapshots allows. We do this by obtaining discrete counterparts of the Agmon and interpolation inequalities in Sobolev spaces. Numerical experiments validating these estimates are also presented.
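To make the distinction in the abstract concrete, the following is a minimal sketch (not the paper's code) of how a POD basis is typically computed from a snapshot matrix, with an optional flag to append difference quotients of consecutive snapshots to the data set. The snapshot function, grid, time step, and number of retained modes are hypothetical choices for illustration only.

```python
import numpy as np

def pod_basis(snapshots, r, dt=None, include_dq=False):
    """Return the first r POD modes and the singular values of the data set.

    snapshots : (n, N+1) array whose columns are y(t_0), ..., y(t_N)
    r         : number of POD modes to keep
    dt        : time step, required if include_dq=True
    include_dq: if True, append difference quotients (y(t_{k+1}) - y(t_k)) / dt
    """
    data = snapshots
    if include_dq:
        # DQ variant: enlarge the data set with difference quotients of the snapshots
        dq = (snapshots[:, 1:] - snapshots[:, :-1]) / dt
        data = np.hstack([snapshots, dq])
    # POD modes are the left singular vectors of the data matrix
    U, S, _ = np.linalg.svd(data, full_matrices=False)
    return U[:, :r], S

# Hypothetical example: snapshots of a smooth space-time function on a 1D grid
x = np.linspace(0.0, 1.0, 200)
t = np.linspace(0.0, 1.0, 51)
Y = np.sin(np.pi * x[:, None]) * np.exp(-t[None, :])   # columns are y(t_k)

modes_no_dq, s_no_dq = pod_basis(Y, r=5)                              # without DQs
modes_dq, s_dq = pod_basis(Y, r=5, dt=t[1] - t[0], include_dq=True)   # with DQs
print(s_no_dq[:5], s_dq[:5])
```

The paper's contribution concerns the first case (no DQs in the data set): its error bounds are stated in terms of the singular values of the plain snapshot set, with the smoothness of the snapshot-generating function determining how close the convergence rate gets to that of the DQ-augmented case.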

Citations (3)
