Predictions and algorithmic statistics for infinite sequence (2005.03467v3)

Published 14 Mar 2020 in cs.IT and math.IT

Abstract: Consider the following prediction problem. Assume that there is a black box that produces bits according to some unknown computable distribution on the binary tree. We know the first $n$ bits $x_1 x_2 \ldots x_n$. We want to know the probability of the event that the next bit is equal to $1$. Solomonoff suggested using the universal semimeasure $m$ for solving this task. He proved that for every computable distribution $P$ and for every $b \in \{0,1\}$ the following holds: $$\sum_{n=1}^{\infty}\sum_{x:\, l(x)=n} P(x)\, \bigl(P(b \mid x) - m(b \mid x)\bigr)^2 < \infty\ .$$ However, Solomonoff's method has a negative aspect: Hutter and Muchnik proved that there are a universal semimeasure $m$, a computable distribution $P$, and a (Martin-Löf) random sequence $x_1 x_2\ldots$ such that $P(x_{n+1} \mid x_1\ldots x_n) - m(x_{n+1} \mid x_1\ldots x_n) \nrightarrow 0$ as $n \to \infty$. We suggest a new method of prediction. For every finite string $x$ we predict the next bit according to the best (in some sense) distribution for $x$. We prove a result similar to Solomonoff's theorem for our method of prediction. We also show that our method does not have the negative aspect of Solomonoff's method described above.
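To make the prediction setup concrete, here is a minimal, self-contained sketch, not the paper's construction and not the full Solomonoff mixture (which ranges over all lower-semicomputable semimeasures). It illustrates the same idea on a toy scale: a Bayesian mixture over a small, finite class of Bernoulli sources, where the predicted probability of the next bit is the conditional mixture probability $m(b \mid x)$. All names below (`HYPOTHESES`, `mixture`, `predict_next`) are illustrative assumptions.

```python
# Toy sketch of mixture-based prediction (assumption: not the paper's method).
# The hypothesis class is a small finite stand-in for "all computable distributions".

from fractions import Fraction

# Bernoulli biases playing the role of candidate computable sources.
HYPOTHESES = [Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)]
PRIOR = {p: Fraction(1, len(HYPOTHESES)) for p in HYPOTHESES}

def likelihood(p, bits):
    """P(bits | p) for an i.i.d. Bernoulli(p) source."""
    prob = Fraction(1)
    for b in bits:
        prob *= p if b == 1 else 1 - p
    return prob

def mixture(bits):
    """Mixture probability of the observed prefix: sum_p prior(p) * P(bits | p)."""
    return sum(PRIOR[p] * likelihood(p, bits) for p in HYPOTHESES)

def predict_next(bits):
    """Conditional mixture probability that the next bit is 1, i.e. m(1 | bits)."""
    return mixture(bits + [1]) / mixture(bits)

if __name__ == "__main__":
    observed = [1, 1, 0, 1, 1, 1]
    print("P(next bit = 1 | observed) =", float(predict_next(observed)))
```

In this finite toy case the conditional mixture converges to the true conditional probability for any source in the class; the abstract's point is that the analogous convergence for the genuine universal semimeasure can fail on some Martin-Löf random sequences, which motivates the authors' alternative predictor.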
