
An intuition for physicists: information gain from experiments (2205.00009v3)

Published 29 Apr 2022 in cond-mat.stat-mech, astro-ph.IM, and physics.data-an

Abstract: How much one has learned from an experiment is quantifiable by the information gain, also known as the Kullback-Leibler divergence. The narrowing of the posterior parameter distribution $P(\theta|D)$ compared with the prior parameter distribution $\pi(\theta)$ is quantified in units of bits as $ D_{\mathrm{KL}}(P\|\pi)=\int\log_{2}\left(\frac{P(\theta|D)}{\pi(\theta)}\right)\,P(\theta|D)\,d\theta $. This research note gives an intuition for what one bit of information gain means: it corresponds to a Gaussian shrinking its standard deviation by a factor of three.
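As a quick sanity check of the factor-of-three claim, here is a minimal numerical sketch (not from the paper; the grid, function names, and parameter values are illustrative choices) that evaluates the integral above for a Gaussian prior with standard deviation 3 and a Gaussian posterior with standard deviation 1:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Grid over the parameter theta, wide enough to cover both distributions.
theta = np.linspace(-30.0, 30.0, 200_001)
dtheta = theta[1] - theta[0]

prior = gaussian_pdf(theta, 0.0, 3.0)      # pi(theta): broad prior, sigma = 3
posterior = gaussian_pdf(theta, 0.0, 1.0)  # P(theta|D): sigma shrunk by a factor of 3

# D_KL(P || pi) = integral of P * log2(P / pi) dtheta, here as a Riemann sum.
# Where the posterior density underflows to zero, the integrand is zero.
with np.errstate(divide="ignore", invalid="ignore"):
    integrand = np.where(posterior > 0.0, posterior * np.log2(posterior / prior), 0.0)
kl_bits = np.sum(integrand) * dtheta

# Closed-form cross-check for two equal-mean Gaussians:
# D_KL = ln(s0/s1) + s1^2 / (2 s0^2) - 1/2 nats, converted to bits.
analytic_bits = (np.log(3.0) + 1.0 / 18.0 - 0.5) / np.log(2.0)

print(f"grid sum:    {kl_bits:.3f} bits")        # ~0.944
print(f"closed form: {analytic_bits:.3f} bits")  # ~0.944
```

Both evaluations give about 0.94 bits, consistent with the note's statement that shrinking a Gaussian's standard deviation by a factor of three corresponds to roughly one bit of information gain.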
