A modified discrepancy principle to attain optimal convergence rates under unknown noise (2103.03545v3)

Published 5 Mar 2021 in math.NA and cs.NA

Abstract: We consider a linear ill-posed equation in the Hilbert space setting. Multiple independent unbiased measurements of the right-hand side are available. A natural approach is to take the average of the measurements as an approximation of the right-hand side and to estimate the data error as the inverse of the square root of the number of measurements. We calculate the optimal convergence rate (as the number of measurements tends to infinity) under classical source conditions and introduce a modified discrepancy principle, which asymptotically attains this rate.
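The setup in the abstract can be illustrated numerically. The sketch below is an assumption-laden toy example, not the paper's method: it uses a hypothetical discretized smoothing operator as the ill-posed problem, averages m noisy measurements of the right-hand side, estimates the noise level as O(1/sqrt(m)), and applies the classical (unmodified) discrepancy principle to Tikhonov regularization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ill-posed problem: a discretized Gaussian smoothing operator
# (an illustrative choice, not taken from the paper)
n = 50
t = np.linspace(0.0, 1.0, n)
A = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.01) / n
x_true = np.sin(np.pi * t)
y_exact = A @ x_true

# m independent unbiased measurements of the right-hand side
sigma, m = 0.05, 100
Y = y_exact[None, :] + sigma * rng.standard_normal((m, n))
y_bar = Y.mean(axis=0)            # average of the measurements
delta = sigma * np.sqrt(n / m)    # estimated data-error level, O(1/sqrt(m))

# Classical discrepancy principle for Tikhonov regularization:
# shrink alpha until the residual falls below tau * delta
tau = 1.2
alpha = 1.0
while alpha > 1e-16:
    x_alpha = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y_bar)
    if np.linalg.norm(A @ x_alpha - y_bar) <= tau * delta:
        break
    alpha *= 0.5
```

Averaging reduces the effective noise standard deviation by a factor of 1/sqrt(m), so the estimated error level delta shrinks as more measurements arrive; the paper's contribution is a modified stopping rule that attains the optimal rate in this asymptotic regime, which the classical rule above does not guarantee.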

Citations (6)
