Random noise increases Kolmogorov complexity and Hausdorff dimension (1808.04626v3)
Abstract: Consider a binary string $x$ of length $n$ whose Kolmogorov complexity is $\alpha n$ for some $\alpha<1$. We want to increase the complexity of $x$ by changing a small fraction of bits in $x$. This is always possible: Buhrman, Fortnow, Newman and Vereshchagin (2005) showed that the increase can be at least $\delta n$ for large $n$ (where $\delta$ is some positive number that depends on $\alpha$ and the allowed fraction of changed bits). We consider a related question: what happens to the complexity of $x$ when we randomly change a small fraction of the bits (changing each bit independently with some probability $\tau$)? It turns out that a linear increase in complexity happens with high probability, but this increase is smaller than in the case of arbitrary changes. We note that the amount of the increase depends on $x$ (strings of the same complexity may behave differently), and we give exact lower and upper bounds for this increase (with $o(n)$ precision). The proof uses a combinatorial and probabilistic technique that goes back to Ahlswede, G\'acs and K\"orner (1976). For the reader's convenience (and also because we need a slightly stronger statement) we provide a simplified exposition of this technique, so the paper is self-contained.
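Since Kolmogorov complexity is uncomputable, the noise effect described in the abstract can only be illustrated empirically with a computable proxy. The sketch below is not from the paper: it uses zlib-compressed length as a rough stand-in for complexity, a periodic (hence low-complexity) test string, and illustrative parameter choices for $n$ and $\tau$; it merely shows the qualitative phenomenon that flipping each bit independently with probability $\tau$ tends to raise the proxy complexity.

```python
# Minimal empirical sketch (illustrative assumptions throughout): compressed
# length is only a crude proxy for Kolmogorov complexity, and the pattern,
# n, and tau below are arbitrary choices, not values from the paper.
import random
import zlib


def noisy_copy(bits, tau, rng):
    """Flip each bit independently with probability tau (the random-noise operation)."""
    return [b ^ (rng.random() < tau) for b in bits]


def proxy_complexity(bits):
    """Compressed length in bits, used as a stand-in for Kolmogorov complexity."""
    packed = bytes(bits)  # one byte per bit keeps the example simple
    return 8 * len(zlib.compress(packed, 9))


if __name__ == "__main__":
    rng = random.Random(0)
    n, tau = 100_000, 0.05
    x = [(i // 50) % 2 for i in range(n)]  # periodic string: low complexity
    y = noisy_copy(x, tau, rng)
    print("proxy C(x)        =", proxy_complexity(x))
    print("proxy C(noisy x)  =", proxy_complexity(y))
```

Running this typically shows the noisy copy compressing to a much larger size than the original periodic string, consistent with (though of course not a proof of) the linear complexity increase established in the paper.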