
Low Rank Independence Samplers in Bayesian Inverse Problems (1609.07180v3)

Published 22 Sep 2016 in math.NA and stat.CO

Abstract: In Bayesian inverse problems, the posterior distribution is used to quantify uncertainty about the reconstructed solution. In practice, Markov chain Monte Carlo algorithms are often used to draw samples from the posterior distribution. However, implementations of such algorithms can be computationally expensive. We present a computationally efficient scheme for sampling high-dimensional Gaussian distributions in ill-posed Bayesian linear inverse problems. Our approach uses Metropolis-Hastings independence sampling with a proposal distribution based on a low-rank approximation of the prior-preconditioned Hessian. We show the dependence of the acceptance rate on the number of eigenvalues retained and discuss conditions under which the acceptance rate is high. We demonstrate our proposed sampler by using it with Metropolis-Hastings-within-Gibbs sampling in numerical experiments in image deblurring, computerized tomography, and NMR relaxometry.
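The abstract's core idea, an independence sampler whose Gaussian proposal comes from a rank-k truncation of the prior-preconditioned data-misfit Hessian, can be sketched in a few lines. The following is a minimal illustrative toy (not the paper's implementation): it assumes an identity prior covariance, a synthetic forward operator with rapidly decaying singular values, and hypothetical dimensions, so that the proposal with only k retained eigenpairs stays close to the exact Gaussian posterior and the acceptance rate is high.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: linear model y = A x + noise, identity prior
# covariance, noise variance sigma2. Sizes and decay rate are illustrative.
n, m, k = 40, 20, 5                      # state dim, data dim, eigenpairs kept
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((n, m)))
s = np.exp(-0.5 * np.arange(m))         # fast singular-value decay (ill-posed)
A = U @ np.diag(s) @ V.T
sigma2 = 0.1
y = A @ rng.standard_normal(n) + np.sqrt(sigma2) * rng.standard_normal(m)

# With an identity prior, the prior-preconditioned data-misfit Hessian is
# H = A^T A / sigma2; its dominant eigenpairs drive the low-rank proposal.
H = A.T @ A / sigma2
evals, evecs = np.linalg.eigh(H)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

# Exact Gaussian posterior (the target): precision H + I.
post_prec = H + np.eye(n)
post_mean = np.linalg.solve(post_prec, A.T @ y / sigma2)

# Low-rank proposal: replace H by its rank-k eigen-truncation.
Vk, lk = evecs[:, :k], evals[:k]
prop_prec = Vk @ np.diag(lk) @ Vk.T + np.eye(n)
prop_cov = np.linalg.inv(prop_prec)
prop_mean = prop_cov @ (A.T @ y / sigma2)
prop_chol = np.linalg.cholesky(prop_cov)

def log_q(x, mean, prec):
    # Unnormalized Gaussian log-density; constants cancel in the MH ratio.
    d = x - mean
    return -0.5 * d @ prec @ d

def independence_sampler(n_iter=2000):
    x, accepted = prop_mean.copy(), 0
    for _ in range(n_iter):
        z = prop_mean + prop_chol @ rng.standard_normal(n)  # propose
        log_alpha = (log_q(z, post_mean, post_prec) - log_q(x, post_mean, post_prec)
                     + log_q(x, prop_mean, prop_prec) - log_q(z, prop_mean, prop_prec))
        if np.log(rng.uniform()) < log_alpha:
            x, accepted = z, accepted + 1
    return accepted / n_iter

rate = independence_sampler()
print(f"acceptance rate with k={k}: {rate:.2f}")
```

Because the discarded eigenvalues of H are tiny relative to the prior precision, the proposal nearly matches the target and most proposals are accepted; shrinking k (or slowing the singular-value decay) lowers the acceptance rate, which is the dependence the paper analyzes.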

Citations (4)
