Subspace Splitting Fast Sampling from Gaussian Posterior Distributions of Linear Inverse Problems (2502.05703v1)

Published 8 Feb 2025 in math.NA and cs.NA

Abstract: It is well known that the posterior density of linear inverse problems with Gaussian prior and Gaussian likelihood is also Gaussian, hence completely described by its covariance and expectation. Sampling from a Gaussian posterior may be important in the analysis of various non-Gaussian inverse problems in which estimates from a Gaussian posterior distribution constitute an intermediate stage in a Bayesian workflow. Sampling from a Gaussian distribution is straightforward if the Cholesky factorization of the covariance matrix or its inverse is available; however, when the unknown is high dimensional, the computation of the posterior covariance may be infeasible. If the linear inverse problem is underdetermined, it is possible to exploit the orthogonality of the fundamental subspaces associated with the coefficient matrix together with the idea behind the Randomize-Then-Optimize approach to design a low-complexity posterior sampler that does not require the posterior covariance to be formed. The performance of the proposed sampler is illustrated with a few computed examples, including non-Gaussian problems with a non-linear forward model, and hierarchical models comprising a conditionally Gaussian submodel.
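The abstract builds on the Randomize-Then-Optimize (RTO) idea: for a linear Gaussian problem, an exact posterior sample can be drawn by solving a least-squares problem in which the data and the prior mean are randomly perturbed by draws from their own distributions. The following is a minimal NumPy sketch of that underlying idea only, assuming isotropic noise and prior covariances (sigma^2 I and gamma^2 I) and a zero prior mean; the function name rto_sample and all parameters are illustrative, and the sketch does not implement the paper's subspace-splitting acceleration.

```python
import numpy as np

def rto_sample(A, y, sigma, gamma, rng):
    """Draw one exact sample from the Gaussian posterior of
        y = A x + e,  e ~ N(0, sigma^2 I),  prior x ~ N(0, gamma^2 I),
    by solving a randomly perturbed least-squares problem
    (the Randomize-Then-Optimize idea the abstract refers to)."""
    m, n = A.shape
    # Perturb the data and the prior mean with draws from their own distributions.
    y_pert = y + sigma * rng.standard_normal(m)
    x_pert = gamma * rng.standard_normal(n)
    # Stack the whitened likelihood and prior terms into one least-squares system:
    #   minimize ||A x - y_pert||^2 / sigma^2 + ||x - x_pert||^2 / gamma^2
    K = np.vstack([A / sigma, np.eye(n) / gamma])
    b = np.concatenate([y_pert / sigma, x_pert / gamma])
    sample, *_ = np.linalg.lstsq(K, b, rcond=None)
    return sample

# Tiny usage example on a random underdetermined problem (m < n).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = rng.standard_normal(50)
y = A @ x_true + 0.1 * rng.standard_normal(20)
samples = np.stack([rto_sample(A, y, 0.1, 1.0, rng) for _ in range(100)])
print(samples.mean(axis=0)[:5])  # empirical posterior mean (first 5 entries)
```

Note that each sample is obtained by solving a least-squares problem, so the n-by-n posterior covariance is never formed explicitly, which is the motivation the abstract gives for avoiding Cholesky factorization in high dimensions.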
