Well-posed Bayesian Inverse Problems with Infinitely-Divisible and Heavy-Tailed Prior Measures

Published 23 Sep 2016 in math.PR, math.NA, math.ST, and stat.TH | arXiv:1609.07532v2

Abstract: We present a new class of prior measures, based on the generalized Gamma distribution, in connection with $\ell_p$ regularization techniques for $p \in (0,1)$. We show that the resulting prior measure is heavy-tailed, non-convex and infinitely divisible. Motivated by this observation, we discuss the class of infinitely divisible prior measures and draw a connection between their tail behavior and the tail behavior of their Lévy measures. Next, we use the laws of pure-jump Lévy processes to define new classes of prior measures that are concentrated on the space of functions of bounded variation. These priors serve as an alternative to the classic total variation prior and result in well-defined inverse problems. We then study the well-posedness of Bayesian inverse problems in a setting general enough to encompass the above-mentioned classes of prior measures. We establish that well-posedness relies on a balance between the growth of the log-likelihood function and the tail behavior of the prior, and we apply our results to special cases such as additive noise models and linear problems. Finally, we discuss some practical aspects of Bayesian inverse problems, such as their consistent approximation, and present three concrete examples of well-posed Bayesian inverse problems with heavy-tailed or stochastic-process prior measures.
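For orientation, the following is the standard finite-dimensional correspondence the abstract alludes to; it is a sketch, not text from the paper, and the symbols $\Phi$, $\lambda$, $a$, $d$ are generic notation rather than the paper's. Interpreting the $\ell_p$ penalty as a negative log-prior gives

$$u_{\mathrm{MAP}} = \arg\min_{u \in \mathbb{R}^n} \Big( \Phi(u;y) + \lambda \sum_{j=1}^n |u_j|^p \Big) \quad\Longleftrightarrow\quad \pi_0(u) \propto \exp\Big( -\lambda \sum_{j=1}^n |u_j|^p \Big), \qquad p \in (0,1),$$

where $\Phi(\cdot\,;y)$ denotes the negative log-likelihood of the data $y$. Each marginal density $\propto \exp(-\lambda |t|^p)$ is a symmetrized member of the generalized Gamma family, whose density on $t > 0$ is $f(t; a, d, p) \propto t^{d-1} e^{-(t/a)^p}$, here with shape parameter $d = 1$. For $p < 1$ the penalty $|t|^p$ is non-convex and the tails of $e^{-\lambda|t|^p}$ decay slower than any exponential, matching the heavy-tailed, non-convex description in the abstract. The paper's contribution is to make this correspondence rigorous on infinite-dimensional function spaces, where the balance between the growth of $\Phi$ and the tail behavior of the prior determines well-posedness.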
