
Random noise increases Kolmogorov complexity and Hausdorff dimension (1808.04626v3)

Published 14 Aug 2018 in cs.IT and math.IT

Abstract: Consider a binary string $x$ of length $n$ whose Kolmogorov complexity is $\alpha n$ for some $\alpha<1$. We want to increase the complexity of $x$ by changing a small fraction of bits in $x$. This is always possible: Buhrman, Fortnow, Newman and Vereshchagin (2005) showed that the increase can be at least $\delta n$ for large $n$ (where $\delta$ is some positive number that depends on $\alpha$ and the allowed fraction of changed bits). We consider a related question: what happens to the complexity of $x$ when we randomly change a small fraction of the bits (changing each bit independently with some probability $\tau$)? It turns out that a linear increase in complexity happens with high probability, but this increase is smaller than in the case of arbitrary change. We note that the amount of the increase depends on $x$ (strings of the same complexity could behave differently), and we give exact lower and upper bounds for this increase (with $o(n)$ precision). The proof uses a combinatorial and probabilistic technique that goes back to Ahlswede, Gács and Körner (1976). For the reader's convenience (and also because we need a slightly stronger statement), we provide a simplified exposition of this technique, so the paper is self-contained.
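The noise model in the abstract (flip each bit of $x$ independently with probability $\tau$) is easy to simulate. Kolmogorov complexity itself is uncomputable, so the sketch below uses compressed size under zlib as a crude, computable stand-in; this proxy, and the specific helper names, are illustrative assumptions, not part of the paper.

```python
import random
import zlib

def flip_bits(x: str, tau: float, rng: random.Random) -> str:
    """Flip each bit of the binary string x independently with probability tau."""
    return "".join(("1" if b == "0" else "0") if rng.random() < tau else b for b in x)

def compressed_len(x: str) -> int:
    """Crude computable proxy for Kolmogorov complexity: zlib-compressed size in bytes.
    (True Kolmogorov complexity is uncomputable; this only gives an upper-bound flavor.)"""
    return len(zlib.compress(x.encode(), 9))

rng = random.Random(0)
n = 4096
x = "01" * (n // 2)          # highly compressible string: complexity alpha*n with small alpha
y = flip_bits(x, tau=0.05, rng=rng)
print(compressed_len(x), compressed_len(y))  # the noisy copy compresses worse
```

With a small $\tau$ the noisy string differs from $x$ in roughly $\tau n$ positions, yet its compressed size (and, per the paper, its actual complexity, with high probability) grows linearly in $n$.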

Citations (3)