An objective prior that unifies objective Bayes and information-based inference

Published 2 Jun 2015 in stat.ML, cs.LG, and physics.data-an | arXiv:1506.00745v2

Abstract: There are three principal paradigms of statistical inference: (i) Bayesian, (ii) information-based and (iii) frequentist inference. We describe an objective prior (the weighting or $w$-prior) which unifies objective Bayes and information-based inference. The $w$-prior is chosen to make the marginal probability an unbiased estimator of the predictive performance of the model. This definition has several other natural interpretations. From the perspective of the information content of the prior, the $w$-prior is both uniformly and maximally uninformative. The $w$-prior can also be understood to result in a uniform density of distinguishable models in parameter space. Finally we demonstrate that the $w$-prior is equivalent to the Akaike Information Criterion (AIC) for regular models in the asymptotic limit. The $w$-prior appears to be generically applicable to statistical inference and is free of {\it ad hoc} regularization. The mechanism for suppressing complexity is analogous to AIC: model complexity reduces model predictivity. We expect this new objective-Bayes approach to inference to be widely applicable to machine-learning problems, including singular models.
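To make the abstract's defining condition concrete, here is a minimal LaTeX sketch of the two relations it states. The notation is our assumption, not the paper's: $Z$ denotes the marginal likelihood (evidence), $w(\theta)$ the prior, $\hat\theta$ the maximum-likelihood estimate, and $K$ the number of parameters; the paper's own formalization may differ in detail.

% Defining condition (hedged reconstruction): the w-prior w(\theta) is chosen
% so that the log marginal likelihood is an unbiased estimator of the model's
% expected predictive performance on new data x':
\[
  Z(x) = \int w(\theta)\, p(x \mid \theta)\, d\theta ,
  \qquad
  \mathbb{E}_x\!\big[\log Z(x)\big]
  = \mathbb{E}_x\,\mathbb{E}_{x'}\!\big[\log p(x' \mid x)\big].
\]
% Asymptotic claim for regular models: the log evidence reproduces the AIC
% score, AIC = -2 log p(x | \hat\theta) + 2K, so that
\[
  \log Z(x) \;\longrightarrow\; \log p(x \mid \hat\theta) - K
  \;=\; -\tfrac{1}{2}\,\mathrm{AIC}.
\]

Under this reading, the complexity penalty arises exactly as in AIC: each additional parameter costs one unit of expected log predictive performance, with no ad hoc regularization term.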
