An intuition for physicists: information gain from experiments (2205.00009v3)
Published 29 Apr 2022 in cond-mat.stat-mech, astro-ph.IM, and physics.data-an
Abstract: How much one has learned from an experiment is quantifiable by the information gain, also known as the Kullback-Leibler divergence. The narrowing of the posterior parameter distribution $P(\theta|D)$ compared with the prior parameter distribution $\pi(\theta)$ is quantified in units of bits as: $ D_{\mathrm{KL}}(P\,\|\,\pi)=\int\log_{2}\left(\frac{P(\theta|D)}{\pi(\theta)}\right)\,P(\theta|D)\,d\theta $. This research note gives an intuition for what one bit of information gain means: it corresponds to a Gaussian shrinking its standard deviation by a factor of three.
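As a minimal illustrative sketch (not taken from the paper), the "factor of three" intuition can be checked with the closed-form KL divergence between two one-dimensional Gaussians, evaluated in bits. The function name `kl_gaussians_bits` and the choice of unit prior width are assumptions made for this example.

```python
import numpy as np

def kl_gaussians_bits(mu_p, sigma_p, mu_q, sigma_q):
    """D_KL(P || Q) in bits for 1-D Gaussians.

    P = N(mu_p, sigma_p^2) plays the role of the posterior P(theta|D),
    Q = N(mu_q, sigma_q^2) the prior pi(theta).
    """
    nats = (np.log(sigma_q / sigma_p)
            + (sigma_p**2 + (mu_p - mu_q)**2) / (2.0 * sigma_q**2)
            - 0.5)
    return nats / np.log(2.0)  # convert nats to bits

# Posterior three times narrower than the prior, same mean:
print(kl_gaussians_bits(0.0, 1.0 / 3.0, 0.0, 1.0))  # ~0.94 bits, i.e. roughly one bit
```

Running this gives about 0.94 bits for a threefold shrinkage of the standard deviation at fixed mean, consistent with the note's rule of thumb that one bit of information gain corresponds to narrowing a Gaussian by roughly a factor of three.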