Optimal low-rank approximations for linear Gaussian inverse problems on Hilbert spaces, Part I: posterior covariance approximation (2411.01112v3)

Published 2 Nov 2024 in math.ST and stat.TH

Abstract: For linear inverse problems with Gaussian priors and Gaussian observation noise, the posterior is Gaussian, with mean and covariance determined by the conditioning formula. Using the Feldman-Hájek theorem, we analyse the prior-to-posterior update and its low-rank approximation for infinite-dimensional Hilbert parameter spaces and finite-dimensional observations. We show that the posterior distribution differs from the prior on a finite-dimensional subspace, and construct low-rank approximations to the posterior covariance while keeping the mean fixed. Since in infinite dimensions not all low-rank covariance approximations yield approximate posterior distributions that are equivalent to the posterior and prior distributions, we characterise the low-rank covariance approximations which do yield this equivalence, and their respective inverses, or 'precisions'. For such approximations, a family of measure approximation problems is solved by identifying the low-rank approximations which are optimal for various losses simultaneously. These loss functions include the family of Rényi divergences, the Amari $\alpha$-divergences for $\alpha\in(0,1)$, the Hellinger metric and the Kullback-Leibler divergence. Our results extend those of Spantini et al. (SIAM J. Sci. Comput. 2015) to Hilbertian parameter spaces, and provide theoretical underpinning for the construction of low-rank approximations of discretised versions of the infinite-dimensional inverse problem by formulating discretisation-independent results.
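In finite dimensions this construction becomes concrete: the posterior covariance equals the prior covariance minus a negative-semidefinite update, and the optimal rank-$r$ truncation of that update is driven by the dominant eigenpairs of the prior-preconditioned data-misfit Hessian. The sketch below is a minimal NumPy illustration of the finite-dimensional Spantini et al. setting, not the paper's Hilbert-space construction; the dense-matrix setup and the names G, Gamma_obs and Sigma_pr are assumptions made for the example.

```python
import numpy as np

def lowrank_posterior_cov(G, Gamma_obs, Sigma_pr, r):
    """Rank-r approximation of the posterior covariance, in the
    spirit of Spantini et al. (2015). Illustrative sketch only:
    dense matrices, hypothetical names."""
    # Factor the prior covariance: Sigma_pr = S @ S.T
    S = np.linalg.cholesky(Sigma_pr)
    # Prior-preconditioned data-misfit Hessian S.T @ H @ S,
    # where H = G.T @ inv(Gamma_obs) @ G
    Hpp = S.T @ G.T @ np.linalg.solve(Gamma_obs, G @ S)
    # Symmetric eigendecomposition; keep the r largest eigenpairs
    lam, V = np.linalg.eigh(Hpp)
    lam, V = lam[::-1][:r], V[:, ::-1][:, :r]
    # Directions w_i = S v_i; optimal update subtracts
    # sum_i lam_i / (1 + lam_i) * w_i w_i.T from the prior covariance
    W = S @ V
    return Sigma_pr - (W * (lam / (1.0 + lam))) @ W.T
```

With m-dimensional observations the Hessian has rank at most m, so the rank-m update already recovers the exact posterior covariance; this mirrors the paper's observation that the posterior differs from the prior only on a finite-dimensional subspace.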
