Choices and intervals (1402.3931v2)
Abstract: We consider a random interval splitting process, in which the splitting rule depends on the empirical distribution of interval lengths. We show that this empirical distribution converges to a limit almost surely as the number of intervals goes to infinity. We give a characterization of this limit as a solution of an ODE and use this characterization to derive precise tail estimates. The convergence is established by showing that the size-biased empirical distribution evolves in the limit according to a certain deterministic evolution equation. Although this equation involves a non-local, non-linear operator, it can be studied by means of a carefully chosen norm with respect to which this operator is contractive. In finite-dimensional settings, convergence results of this kind usually go under the name of stochastic approximation and can be approached by a general method of Kushner and Clark. An important technical contribution of this article is the extension of this method to an infinite-dimensional setting.
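To make the setup concrete, here is a minimal simulation sketch of a splitting process whose rule depends on the current interval lengths. The specific rule used (draw two uniform candidate points and insert the one whose covering interval is longer) is an illustrative assumption, not taken from the abstract, and the function names `split_step` and `simulate` are hypothetical.

```python
import bisect
import random

def split_step(points, prefer_longer=True):
    """One step of an illustrative 'choice' splitting rule (an assumed example):
    draw two uniform candidate points and keep the one whose covering interval
    is longer (or shorter), so the rule depends on current interval lengths."""
    candidates = [random.random() for _ in range(2)]

    def covering_length(x):
        # Length of the interval of the current partition containing x.
        i = bisect.bisect(points, x)
        return points[i] - points[i - 1]

    if prefer_longer:
        chosen = max(candidates, key=covering_length)
    else:
        chosen = min(candidates, key=covering_length)
    bisect.insort(points, chosen)

def simulate(n_intervals=10000, prefer_longer=True, seed=0):
    """Run until [0, 1] is split into n_intervals pieces and return the sorted
    empirical collection of interval lengths."""
    random.seed(seed)
    points = [0.0, 1.0]  # current partition of the unit interval
    while len(points) - 1 < n_intervals:
        split_step(points, prefer_longer)
    return sorted(points[i + 1] - points[i] for i in range(len(points) - 1))

if __name__ == "__main__":
    lengths = simulate(5000)
    n = len(lengths)
    # With n intervals the mean length is 1/n; rescaled quantiles stabilizing as n
    # grows is the finite-sample analogue of the almost-sure convergence of the
    # empirical distribution described in the abstract.
    print("rescaled quantiles (n * length):",
          [round(n * lengths[int(q * (n - 1))], 3) for q in (0.1, 0.5, 0.9)])
```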