
Consistency of Bayesian inference with Gaussian process priors in an elliptic inverse problem (1910.07343v3)

Published 16 Oct 2019 in math.ST, cs.NA, math.AP, math.NA, and stat.TH

Abstract: For $\mathcal{O}$ a bounded domain in $\mathbb{R}^d$ and a given smooth function $g:\mathcal{O}\to\mathbb{R}$, we consider the statistical nonlinear inverse problem of recovering the conductivity $f>0$ in the divergence form equation $$ \nabla\cdot(f\nabla u)=g\ \textrm{on}\ \mathcal{O}, \quad u=0\ \textrm{on}\ \partial\mathcal{O}, $$ from $N$ discrete noisy point evaluations of the solution $u=u_f$ on $\mathcal{O}$. We study the statistical performance of Bayesian nonparametric procedures based on a flexible class of Gaussian (or hierarchical Gaussian) process priors, whose implementation is feasible by MCMC methods. We show that, as the number $N$ of measurements increases, the resulting posterior distributions concentrate around the true parameter generating the data, and derive a convergence rate $N^{-\lambda}$, $\lambda>0$, for the reconstruction error of the associated posterior means, in $L^2(\mathcal{O})$-distance.
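To make the setup concrete, the sketch below illustrates (in one spatial dimension, which is not the paper's setting or implementation) a Bayesian inversion of this type: a finite-difference forward solver for $(fu')'=g$ with zero boundary conditions, noisy point observations of $u$, a Gaussian process prior placed on $\theta=\log f$ so that $f>0$ (the log link, kernel parameters, grid size, noise level, and pCN step size are all illustrative assumptions, not taken from the paper), and preconditioned Crank–Nicolson MCMC to sample the posterior.

```python
# Minimal 1D sketch (not the paper's implementation) of Bayesian inversion for
# the conductivity f in (f u')' = g on (0,1), u(0) = u(1) = 0, from noisy point
# observations of u. Prior: Gaussian process on theta = log f (ensures f > 0).
# Posterior sampling: preconditioned Crank-Nicolson (pCN) MCMC.
# All tuning parameters below are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

M = 100                                # number of interior grid points
x = np.linspace(0.0, 1.0, M + 2)       # grid including boundary nodes
h = x[1] - x[0]
g = -np.ones(M)                        # source term g at interior nodes

def solve_pde(f_nodes):
    """Finite-difference solve of (f u')' = g with u = 0 at both ends."""
    f_half = 0.5 * (f_nodes[:-1] + f_nodes[1:])      # f at cell midpoints
    A = np.zeros((M, M))
    for i in range(M):
        A[i, i] = -(f_half[i] + f_half[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = f_half[i] / h**2
        if i < M - 1:
            A[i, i + 1] = f_half[i + 1] / h**2
    u_int = np.linalg.solve(A, g)
    return np.concatenate(([0.0], u_int, [0.0]))

# Synthetic ground truth and N noisy point observations of u.
f_true = 1.0 + 0.5 * np.exp(-((x - 0.5) ** 2) / 0.02)
sigma = 1e-3
obs_idx = rng.choice(np.arange(1, M + 1), size=50, replace=False)
y = solve_pde(f_true)[obs_idx] + sigma * rng.normal(size=obs_idx.size)

# Gaussian process prior on theta = log f (squared-exponential covariance).
ell, tau = 0.2, 1.0
C = tau**2 * np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * ell**2))
L = np.linalg.cholesky(C + 1e-8 * np.eye(M + 2))

def prior_sample():
    return L @ rng.normal(size=M + 2)

def neg_log_lik(theta):
    resid = y - solve_pde(np.exp(theta))[obs_idx]
    return 0.5 * np.sum(resid**2) / sigma**2

# pCN MCMC: proposals preserve the Gaussian prior, so the acceptance ratio
# involves only the likelihood term.
beta, n_iter = 0.05, 5000
theta = prior_sample()
phi = neg_log_lik(theta)
samples = []
for it in range(n_iter):
    prop = np.sqrt(1.0 - beta**2) * theta + beta * prior_sample()
    phi_prop = neg_log_lik(prop)
    if np.log(rng.uniform()) < phi - phi_prop:
        theta, phi = prop, phi_prop
    samples.append(np.exp(theta))

# Posterior-mean reconstruction of f and its discretized L2 error.
f_post_mean = np.mean(samples[n_iter // 2:], axis=0)
print("L2 error of posterior mean:", np.sqrt(h * np.sum((f_post_mean - f_true) ** 2)))
```

The pCN proposal is a natural choice here because it is well defined on function space and its acceptance probability does not degenerate as the discretization of $f$ is refined; the paper's theoretical results concern the concentration of the exact posterior and the $L^2(\mathcal{O})$ rate of its mean, not any particular sampler.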

Citations (56)
