Bernstein-von Mises theorems and uncertainty quantification for linear inverse problems
Abstract: We consider the statistical inverse problem of recovering an unknown function $f$ from a linear measurement corrupted by additive Gaussian white noise. We employ a nonparametric Bayesian approach with standard Gaussian priors, for which the posterior-based reconstruction of $f$ corresponds to a Tikhonov regulariser $\bar f$ with a reproducing kernel Hilbert space norm penalty. We prove a semiparametric Bernstein-von Mises theorem for a large collection of linear functionals of $f$, implying that semiparametric posterior estimation and uncertainty quantification are valid and optimal from a frequentist point of view. The result is applied to study three concrete examples that cover both the mildly and severely ill-posed cases: specifically, an elliptic inverse problem, an elliptic boundary value problem and the heat equation. For the elliptic boundary value problem, we also obtain a nonparametric version of the theorem that entails the convergence of the posterior distribution to a prior-independent infinite-dimensional Gaussian probability measure with minimal covariance. As a consequence, it follows that the Tikhonov regulariser $\bar f$ is an efficient estimator of $f$, and we derive frequentist guarantees for certain credible balls centred at $\bar{f}$.
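The abstract notes that, under a Gaussian prior, the posterior-based reconstruction coincides with a Tikhonov regulariser with an RKHS-norm penalty. A minimal finite-dimensional sketch of this well-known correspondence (not code from the paper; the matrix `A`, prior covariance `C`, and noise level `sigma` below are illustrative choices) is:

```python
import numpy as np

# Hypothetical discretisation of the linear model Y = A f + noise,
# with Gaussian prior f ~ N(0, C) and noise variance sigma^2.
rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))      # forward operator (illustrative)
f_true = rng.standard_normal(d)
sigma = 0.1                          # noise level
Y = A @ f_true + sigma * rng.standard_normal(n)
C = np.eye(d)                        # prior covariance (identity for simplicity)

# Posterior mean of f given Y in the conjugate Gaussian model:
# solves (A^T A / sigma^2 + C^{-1}) x = A^T Y / sigma^2.
post_mean = np.linalg.solve(A.T @ A / sigma**2 + np.linalg.inv(C),
                            A.T @ Y / sigma**2)

# Tikhonov regulariser: minimiser of ||Y - A f||^2 + sigma^2 * f^T C^{-1} f,
# i.e. least squares with a prior-covariance-weighted quadratic penalty.
tikhonov = np.linalg.solve(A.T @ A + sigma**2 * np.linalg.inv(C), A.T @ Y)

print(np.allclose(post_mean, tikhonov))  # the two estimators coincide
```

Multiplying the posterior-mean normal equations by `sigma**2` gives exactly the Tikhonov normal equations, which is why the two solves agree to numerical precision. In the paper's infinite-dimensional setting, `C` is replaced by the covariance operator of the Gaussian prior and the penalty by the associated RKHS norm.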