
Kolmogorov's Algorithmic Mutual Information Is Equivalent to Bayes' Law (1907.02943v1)

Published 29 Jun 2019 in cs.IT and math.IT

Abstract: Given two events $A$ and $B$, Bayes' law is based on the argument that the probability of $A$ given $B$ is proportional to the probability of $B$ given $A$. When probabilities are interpreted in the Bayesian sense, Bayes' law constitutes a learning algorithm: it shows how one can learn from a new observation to improve one's belief in a theory that is consistent with that observation. Kolmogorov's notion of algorithmic information, which is based on the theory of algorithms, proposes an objective measure of the amount of information in one finite string about another, and concludes that for any two finite strings $x$ and $y$, the amount of information in $x$ about $y$ is almost equal to the amount of information in $y$ about $x$. We view this conclusion of Kolmogorov as the algorithmic-information version of Bayes' law. This can be demonstrated easily by considering Levin's work on prefix Kolmogorov complexity and then expressing the Kolmogorov mutual information between two finite strings in terms of Solomonoff's a priori probability.
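
To see the analogy the abstract sketches, the standard textbook formulations (these are the usual statements of Levin's chain rule and Solomonoff's prior, not quoted from the paper; $x^*$ denotes a shortest prefix program for $x$, and the complexity equalities hold up to $O(1)$ additive terms) line up as follows. Bayes' law rests on the two factorizations of a joint probability,

$$P(A \mid B)\,P(B) \;=\; P(A, B) \;=\; P(B \mid A)\,P(A),$$

while Levin's chain rule gives the analogous two decompositions of joint prefix complexity,

$$K(x, y) \;=\; K(x) + K(y \mid x^*) \;=\; K(y) + K(x \mid y^*),$$

so that the algorithmic mutual information is symmetric,

$$I(x : y) \;=\; K(y) - K(y \mid x^*) \;=\; K(x) - K(x \mid y^*) \;=\; I(y : x).$$

Substituting Solomonoff's a priori probability $m(z) = 2^{-K(z)}$ turns the additive chain rule into the multiplicative form $m(x, y) = m(x)\,m(y \mid x^*)$ (up to constant factors), which has exactly the shape of Bayes' law.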
