Algorithmic information, plane Kakeya sets, and conditional dimension (1511.00442v4)

Published 2 Nov 2015 in cs.CC

Abstract: We formulate the conditional Kolmogorov complexity of x given y at precision r, where x and y are points in Euclidean spaces and r is a natural number. We demonstrate the utility of this notion in two ways.

1. We prove a point-to-set principle that enables one to use the (relativized, constructive) dimension of a single point in a set E in a Euclidean space to establish a lower bound on the (classical) Hausdorff dimension of E. We then use this principle, together with conditional Kolmogorov complexity in Euclidean spaces, to give a new proof of the known, two-dimensional case of the Kakeya conjecture. This theorem of geometric measure theory, proved by Davies in 1971, says that every plane set containing a unit line segment in every direction has Hausdorff dimension 2.

2. We use conditional Kolmogorov complexity in Euclidean spaces to develop the lower and upper conditional dimensions dim(x|y) and Dim(x|y) of x given y, where x and y are points in Euclidean spaces. Intuitively, these are the lower and upper asymptotic algorithmic information densities of x conditioned on the information in y. We prove that these conditional dimensions are robust and that they have the correct information-theoretic relationships with the well-studied dimensions dim(x) and Dim(x) and the mutual dimensions mdim(x:y) and Mdim(x:y).
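For orientation, the sketch below records the notation used in the abstract, written roughly as it appears in the algorithmic-dimension literature; the paper's precise formulation of the conditional complexity K_r(x|y) and of the point-to-set principle may differ in detail, so treat this as an informal gloss rather than the paper's own statements.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Kolmogorov complexity of a point x in R^n at precision r (standard in the
% algorithmic-dimension literature): the length of a shortest program for
% some rational point within 2^{-r} of x.
\[
  K_r(x) = \min\bigl\{\, K(q) : q \in \mathbb{Q}^n \cap B_{2^{-r}}(x) \,\bigr\}
\]

% Lower and upper (effective) dimensions of x, read as asymptotic
% algorithmic information densities.
\[
  \dim(x) = \liminf_{r\to\infty} \frac{K_r(x)}{r},
  \qquad
  \operatorname{Dim}(x) = \limsup_{r\to\infty} \frac{K_r(x)}{r}
\]

% Conditional dimensions of x given y, built from the conditional
% complexity K_r(x \mid y) at precision r that the paper formulates.
\[
  \dim(x \mid y) = \liminf_{r\to\infty} \frac{K_r(x \mid y)}{r},
  \qquad
  \operatorname{Dim}(x \mid y) = \limsup_{r\to\infty} \frac{K_r(x \mid y)}{r}
\]

% Point-to-set principle (as usually stated): the classical Hausdorff
% dimension of E equals the minimum, over oracles A, of the supremum of
% the relativized dimensions of the individual points of E.
\[
  \dim_{\mathrm{H}}(E) = \min_{A \subseteq \mathbb{N}} \; \sup_{x \in E} \dim^{A}(x)
\]

\end{document}
```

Read this way, the lower-bound use of the principle highlighted in the abstract amounts to showing, for each oracle A, that some single point of the plane Kakeya set has relativized dimension 2, which then forces the classical Hausdorff dimension of the set to be 2.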

Citations (59)
