
On the correspondence between thermodynamics and inference

Published 5 Jun 2017 in math.ST, physics.data-an, and stat.TH (arXiv:1706.01428v5)

Abstract: We expand upon a natural analogy between Bayesian statistics and statistical physics in which sample size corresponds to inverse temperature. This analogy motivates the definition of two novel statistical quantities: a learning capacity and a Gibbs entropy. The analysis of the learning capacity, which corresponds to the heat capacity in thermal physics, leads to new insight into the mechanism of learning and explains why some models have anomalously high learning performance. We explore the properties of the learning capacity in a number of examples, including a sloppy model. Next, we propose that the Gibbs entropy provides a natural device for counting distinguishable distributions in the context of Bayesian inference. We use this device to define a generalized principle of indifference (GPI) in which every distinguishable model is assigned equal a priori probability. This principle yields a new solution to a long-standing problem in Bayesian inference: the definition of an objective or uninformative prior. A key characteristic of this new approach is that it can be applied to analyses where the model dimension is unknown, and that it circumvents the automatic rejection of higher-dimensional models in Bayesian inference.
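
The dictionary behind the abstract can be made concrete. Under the analogy, the marginal likelihood Z_N = ∫ π(θ) ∏ p(x_i|θ) dθ plays the role of a partition function with the sample size N as inverse temperature; one natural reading is then that the average energy U(N) is the expected predictive loss −E[log p(x_N | x_{1:N−1})], and the learning capacity is the heat-capacity analogue C = ∂U/∂T with T = 1/N, i.e. C = −N² ∂U/∂N. The sketch below is a minimal illustration under these assumed definitions, not code from the paper: the conjugate-Gaussian model, the prior variance, and the finite-difference scheme are all choices made here for concreteness. For a regular one-parameter model, an equipartition-style argument predicts C → K/2 = 1/2.

```python
import numpy as np

# Minimal sketch of the sample-size <-> inverse-temperature analogy
# (illustrative definitions assumed here, not code from the paper).
# Toy model: x_i ~ Normal(mu, 1) with conjugate prior mu ~ Normal(0, tau^2).
# U(N) = expected predictive loss of the N-th point given the first N-1
#        (the discrete derivative of the free energy F(N) = -E[log Z_N]).
# C(N) = heat-capacity analogue dU/dT with T = 1/N, i.e. C = -N^2 dU/dN.
# For this regular K=1 model, equipartition predicts C -> K/2 = 0.5.

tau2 = 4.0          # prior variance (arbitrary choice for the example)
a = 1.0 / tau2      # prior precision

def U(N):
    """Expected negative log predictive density under the Bayes joint.

    After N-1 observations the posterior variance of mu is v = 1/(a + N - 1),
    so the predictive for the next point is Normal with variance 1 + v, and
    its expected negative log density is the Gaussian entropy
    0.5*log(2*pi*(1 + v)) + 0.5.
    """
    v = 1.0 / (a + (N - 1.0))
    return 0.5 * np.log(2.0 * np.pi * (1.0 + v)) + 0.5

def learning_capacity(N, dN=1e-3):
    """C = -N^2 dU/dN via a central finite difference in N."""
    return -N**2 * (U(N + dN) - U(N - dN)) / (2.0 * dN)

for N in (2, 5, 10, 100, 1000):
    print(f"N = {N:5d}   C = {learning_capacity(N):.4f}")
# The capacity approaches K/2 = 0.5, the equipartition value for a
# regular one-parameter model.
```

Read against this baseline, the abstract's "anomalously high learning performance" has a natural interpretation: regular models saturate at K/2, while sloppy models can depart from the equipartition value, which is the kind of deviation the learning-capacity analysis is designed to expose.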

Citations (22)
