Bayesian inversion with Student's t priors based on Gaussian scale mixtures (2403.13665v2)
Abstract: Many inverse problems focus on recovering a quantity of interest that is a priori known to exhibit either discontinuous or smooth behavior. Within the Bayesian approach to inverse problems, such structural information can be encoded using Markov random field priors. We propose a class of priors that combine Markov random field structure with the Student's t distribution. This approach offers flexibility in modeling diverse structural behaviors depending on the available data. Flexibility is achieved by including the degrees-of-freedom parameter of the Student's t distribution in the formulation of the Bayesian inverse problem. To facilitate posterior computations, we employ a Gaussian scale mixture representation for the Student's t Markov random field prior, which allows the prior to be expressed as a conditionally Gaussian distribution depending on auxiliary hyperparameters. Adopting this representation, we can derive most of the posterior conditional distributions in closed form and utilize the Gibbs sampler to explore the posterior. We illustrate the method with two numerical examples: signal deconvolution and image deblurring.
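The Gaussian scale mixture representation at the heart of the abstract can be illustrated in a few lines: drawing a precision from a Gamma distribution and a conditionally Gaussian sample with that precision yields, marginally, a Student's t draw. The sketch below is illustrative only (the degrees-of-freedom value and sample size are arbitrary choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
nu = 5.0        # degrees of freedom (illustrative choice)
n = 200_000     # number of Monte Carlo samples

# Gaussian scale mixture: lam ~ Gamma(nu/2, rate = nu/2),
# x | lam ~ N(0, 1/lam)  =>  marginally, x ~ Student's t with nu d.o.f.
# NumPy's gamma uses a scale (not rate) parameterization, hence scale = 2/nu.
lam = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)
x = rng.normal(size=n) / np.sqrt(lam)

# Sanity check: for nu > 2, Var(t_nu) = nu / (nu - 2)
print(np.var(x))  # close to 5/3 for nu = 5
```

Conditioning on the auxiliary precisions `lam` makes the prior Gaussian, which is what allows the closed-form conditional updates exploited by the Gibbs sampler.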
- Y. M. Marzouk, H. N. Najm, Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems, Journal of Computational Physics 228 (2009).
- Likelihood-informed dimension reduction for nonlinear inverse problems, Inverse Problems 30 (2014) 114015.
- Bayesian inference with subset simulation in varying dimensions applied to the Karhunen–Loève expansion, International Journal for Numerical Methods in Engineering 122 (2021) 5100–5127.
- D. F. Andrews, C. L. Mallows, Scale mixtures of normal distributions, Journal of the Royal Statistical Society: Series B (Methodological) 36 (1974) 99–102.
- S. Geman, D. Geman, Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images, IEEE Transactions on Pattern Analysis and Machine Intelligence PAMI-6 (1984) 721–741.
- J. Geweke, Bayesian treatment of the independent Student-t linear model, Journal of Applied Econometrics 8 (1993) S19–S40.
- C. Fernandez, M. F. J. Steel, On Bayesian modeling of fat tails and skewness, Journal of the American Statistical Association 93 (1998) 359–371.
- Objective Bayesian analysis for the Student-t regression model, Biometrika 95 (2008) 325–333.
- M. Juárez, M. Steel, Model-based clustering of non-Gaussian panel data based on skew-t distributions, Journal of Business & Economic Statistics 28 (2010) 52–66.
- Penalising model component complexity: A principled, practical approach to constructing priors, Statistical Science 32 (2017) 1–28.
- Objective Bayesian analysis for the Student-t linear regression, Bayesian Analysis 16 (2021) 129–145.
- S. Y. Lee, The use of a log-normal prior for the Student t-distribution, Axioms 11 (2022).
- J. M. Bardsley, Gaussian Markov random field priors for inverse problems, Inverse Problems and Imaging 7 (2013) 397–416.
- Whittle-Matérn priors for Bayesian statistical inversion with applications in electrical impedance tomography, Inverse Problems and Imaging 8 (2014) 561–586.
- J. M. Bardsley, Laplace-distributed increments, the Laplace prior, and edge-preserving regularization, Journal of Inverse and Ill-Posed Problems 20 (2012) 271–285.
- Cauchy Markov random field priors for Bayesian inversion, Statistics and Computing 32 (2022).
- Geometry parameter estimation for sparse X-ray log imaging, Journal of Mathematical Imaging and Vision 66 (2023) 154–166.
- Bayesian inversion with α-stable priors, Inverse Problems 39 (2023) 105007.
- Horseshoe priors for edge-preserving linear Bayesian inversion, SIAM Journal on Scientific Computing 45 (2023) B337–B365.
- D. Calvetti, E. Somersalo, A Gaussian hypermodel to recover blocky objects, Inverse Problems 23 (2007) 733–754.
- D. Calvetti, E. Somersalo, Hypermodels in the Bayesian imaging framework, Inverse Problems 24 (2008) 034013.
- Sparse reconstructions from few noisy data: Analysis of hierarchical Bayesian models with generalized gamma hyperpriors, Inverse Problems 36 (2020) 025010.
- K.-C. Chu, Estimation and decision for linear systems with elliptical random processes, IEEE Transactions on Automatic Control 18 (1973) 499–505.
- S. T. B. Choy, A. F. M. Smith, Hierarchical models with scale mixtures of normal distributions, Test 6 (1997).
- M. D. Hoffman, A. Gelman, The No-U-Turn Sampler: Adaptively setting path lengths in Hamiltonian Monte Carlo, Journal of Machine Learning Research 15 (2014) 1593–1623.
- P. Spanos, R. Ghanem, Stochastic finite element expansion for random media, Journal of Engineering Mechanics 115 (1989) 1035–1053.
- Discretization-invariant Bayesian inversion and Besov space priors, Inverse Problems and Imaging 3 (2009) 87–122.
- Bayesian neural network priors for edge-preserving inversion, Inverse Problems and Imaging 0 (2022) 1–26.
- M. West, On scale mixtures of normal distributions, Biometrika 74 (1987) 646–648.
- M. J. Wainwright, E. Simoncelli, Scale mixtures of Gaussians and the statistics of natural images, in: Advances in Neural Information Processing Systems, volume 12, MIT Press, 1999.
- D. Higdon, A primer on space-time modeling from a Bayesian perspective, in: Statistical Methods for Spatio-Temporal Systems, Chapman & Hall/CRC, 2007, pp. 217–279.
- Sparsity promoting hybrid solvers for hierarchical Bayesian inverse problems, SIAM Journal on Scientific Computing 42 (2020) A3761–A3784.
- K. J. H. Law, V. Zankin, Sparse online variational Bayesian regression, SIAM/ASA Journal on Uncertainty Quantification 10 (2022) 1070–1100.
- C. Villa, S. Walker, Objective prior for the number of degrees of freedom of a t distribution, Bayesian Analysis 9 (2013) 1–24.
- MCMC methods for functions: Modifying old algorithms to make them faster, Statistical Science 28 (2013) 424–446.
- C. Andrieu, J. Thoms, A tutorial on adaptive MCMC, Statistics and Computing 18 (2008) 343–373.
- J. Besag, Spatial interaction and the statistical analysis of lattice systems, Journal of the Royal Statistical Society. Series B (Methodological) 36 (1974) 192–236.
- G. O. Roberts, A. F. M. Smith, Simple conditions for the convergence of the Gibbs sampler and Metropolis-Hastings algorithms, Stochastic Processes and their Applications 49 (1994) 207–216.
- L. Tierney, Markov chains for exploring posterior distributions, The Annals of Statistics 22 (1994) 1701–1762.
- Integral equation models for image restoration: High accuracy methods and fast algorithms, Inverse Problems 26 (2010) 045006.