Preconditioning the prior to overcome saturation in Bayesian inverse problems (1409.6496v1)

Published 23 Sep 2014 in math.ST and stat.TH

Abstract: We study Bayesian inference in statistical linear inverse problems with Gaussian noise and priors in Hilbert space. We focus on the posterior contraction rate in the small-noise limit. Existing results suffer from a saturation phenomenon when the data-generating element is too smooth relative to the smoothness inherent in the prior. We show how to overcome this saturation in an empirical Bayesian framework by using a non-centered, data-dependent prior. The center is obtained from a preconditioning regularization step, which provides additional information to be used in the Bayesian framework. We use general techniques known from regularization theory. To highlight the significance of the findings, we provide several examples. In particular, our approach allows us to recover the minimax contraction rates established in previous studies and, via preconditioning, to improve on them beyond the saturation point. We also establish minimax contraction rates in cases that have not been considered so far.
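The effect the abstract describes can be illustrated in the simplest diagonal (sequence-space) setting. The sketch below is not the paper's construction: the singular-value decay, the very smooth truth, the Tikhonov preconditioning step, and the oracle grid choice of the prior-scaling parameter rho are all illustrative assumptions. It only shows the qualitative mechanism: with a zero-centered Gaussian prior the posterior mean acts like Tikhonov regularization and its error stalls once the truth is smoother than the prior can exploit, whereas centering the prior at a preliminary regularized estimate yields a posterior mean of iterated-Tikhonov type, whose error keeps improving for such smooth truths.

# Illustrative sketch only (not the paper's exact construction or parameter
# choices): diagonal model Y_k = kappa_k * theta_k + eps * xi_k with
# kappa_k = 1/k, Gaussian prior N(m_k, eps^2 / rho) per coefficient, and a
# truth much smoother than the prior. Zero-centered prior -> Tikhonov-type
# posterior mean (saturates); prior centered at a preliminary Tikhonov
# estimate -> iterated-Tikhonov-type posterior mean (higher qualification).
import numpy as np

rng = np.random.default_rng(0)

K = 2000                       # number of SVD components kept
eps = 1e-4                     # noise level (small-noise regime)
k = np.arange(1, K + 1, dtype=float)
kappa = 1.0 / k                # singular values of the forward operator (assumed)
theta = k ** (-8.0)            # very smooth data-generating element (assumed)

Y = kappa * theta + eps * rng.standard_normal(K)

def posterior_mean(rho, center):
    # Posterior mean for the prior N(center, eps^2 / rho) on each coefficient.
    gain = kappa / (kappa ** 2 + rho)
    return center + gain * (Y - kappa * center)

def best_error(center_fn):
    # Oracle choice of the scaling rho minimizing the true error (illustration only).
    rhos = np.logspace(-6, 0, 200)
    errs = [np.linalg.norm(posterior_mean(r, center_fn(r)) - theta) for r in rhos]
    return min(errs)

# Zero-centered prior: posterior mean equals the Tikhonov filter.
err_centered = best_error(lambda r: np.zeros(K))

# Preconditioned prior: center at a preliminary Tikhonov estimate with the same
# regularization parameter; the resulting posterior mean coincides with two-step
# iterated Tikhonov regularization.
err_precond = best_error(lambda r: kappa * Y / (kappa ** 2 + r))

print(f"best error, zero-centered prior : {err_centered:.4f}")
print(f"best error, preconditioned prior: {err_precond:.4f}")

In this toy setup the preconditioned center typically attains a noticeably smaller best achievable error; the paper works with general regularization schemes and data-driven parameter choices rather than the oracle grid used here.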
