
On the asymptotical regularization for linear inverse problems in presence of white noise (2004.04451v1)

Published 9 Apr 2020 in math.NA, cs.NA, math.ST, and stat.TH

Abstract: We interpret steady linear statistical inverse problems as artificial dynamic systems with white noise and introduce a stochastic differential equation (SDE) system in which the inverse of the ending time $T$ naturally plays the role of the squared noise level. The time-continuous framework then allows us to apply classical methods from data assimilation, namely the Kalman-Bucy filter and 3DVAR, and to analyze their behaviour as regularization methods for the original problem. Such a treatment offers some connections to the famous asymptotical regularization method, which has not yet been analyzed in the context of random noise. We derive error bounds for both methods in terms of the mean-squared error under standard assumptions and discuss commonalities and differences between the two approaches. If an additional tuning parameter $\alpha$ for the initial covariance is chosen appropriately in terms of the ending time $T$, one of the proposed methods attains order optimality. Our results extend the theoretical findings in the discrete setting given in Iglesias et al. (2017). Numerical examples confirm our theoretical results.
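
The abstract does not spell out the paper's precise SDE formulation, so the following Python sketch is only a rough illustration of the two time-continuous approaches it mentions: a fixed-gain (3DVAR-style) and a covariance-driven (Kalman-Bucy-style) Euler iteration on a toy linear problem $y = Ax + \text{noise}$, with the noise variance set to $1/T$. The operator A, the step size h, the gain choice, and the tuning parameter alpha below are all hypothetical and not taken from the paper.

```python
import numpy as np

# Illustrative sketch only: A, the noise scaling, the step size h, and the
# gain/covariance choices are assumptions, not the paper's exact setup.
rng = np.random.default_rng(0)

n, m = 20, 30                                   # dimensions of unknown x and data y
A = rng.standard_normal((m, n)) / np.sqrt(m)    # toy forward operator
x_true = np.sin(np.linspace(0, np.pi, n))       # toy ground truth

T = 100.0                          # ending time; 1/T plays the role of the squared noise level
sigma = 1.0 / np.sqrt(T)           # noise standard deviation implied by that scaling
y = A @ x_true + sigma * rng.standard_normal(m)

h = 1e-2                           # explicit Euler step for the artificial dynamics
steps = int(T / h)
alpha = 1.0                        # tuning parameter for the gain / initial covariance

# 3DVAR-style iteration: fixed gain K = alpha * A^T,  dx/dt = K (y - A x).
x_3dvar = np.zeros(n)
K_fixed = alpha * A.T
for _ in range(steps):
    x_3dvar = x_3dvar + h * K_fixed @ (y - A @ x_3dvar)

# Kalman-Bucy-style iteration: gain C(t) A^T with covariance C(t),
# dx/dt = C A^T (y - A x),  dC/dt = -C A^T A C (Riccati equation).
x_kb = np.zeros(n)
C = alpha * np.eye(n)              # initial covariance alpha * I
for _ in range(steps):
    gain = C @ A.T
    x_kb = x_kb + h * gain @ (y - A @ x_kb)
    C = C - h * gain @ (A @ C)

for name, x_hat in [("3DVAR-style", x_3dvar), ("Kalman-Bucy-style", x_kb)]:
    err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(f"{name:>18s} relative error: {err:.3f}")
```

Running the sketch prints the relative reconstruction errors of the two iterations at the ending time T; increasing T (and hence lowering the implied noise level) should reduce both errors, which is the qualitative behaviour the paper's error bounds quantify.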

Citations (8)
