Non-asymptotic distributional bounds for the Dickman approximation of the running time of the Quickselect algorithm
Abstract: Given a non-negative random variable $W$ and $\theta>0$, let the generalized Dickman transformation map the distribution of $W$ to that of $$W^* =_d U^{1/\theta}(W+1),$$ where $U \sim {\cal U}[0,1]$ is uniformly distributed on the unit interval and independent of $W$, and where $=_d$ denotes equality in distribution. It is well known that $W^*$ and $W$ are equal in distribution if and only if $W$ has the generalized Dickman distribution ${\cal D}_\theta$. We demonstrate that the Wasserstein distance $d_1$ between $W$, a non-negative random variable with finite mean, and $D_\theta$, a variable with distribution ${\cal D}_\theta$, obeys the inequality $$d_1(W,D_\theta) \le (1+\theta)\,d_1(W,W^*).$$ Specializing this bound to the case $\theta=1$ and applying coupling constructions yields $$d_1(W_{n,1},D) \le \frac{8\log (n/2)+10}{n} \quad \mbox{for all $n \ge 1$, where} \quad W_{n,1}=\frac{1}{n}C_{n,1}-1,$$ and $C_{n,m}$ is the number of comparisons made by the Quickselect algorithm to find the $m^{\rm th}$ smallest element of a list of $n$ distinct numbers. A similar bound holds for $m \ge 2$, and together these bounds recover the results of [12], which show distributional convergence of $W_{n,m}$ to the standard Dickman distribution in the asymptotic regime $m=o(n)$. By developing an exact expression for the expected running time $E[C_{n,m}]$, we provide lower bounds showing that the rate is not improvable for all $m \neq 2$.
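As an illustrative sketch (not part of the paper itself), the two objects in the abstract can both be simulated directly: a draw approximately distributed as the generalized Dickman $D_\theta$ can be produced by iterating the transformation $W^* = U^{1/\theta}(W+1)$ from any starting point, and $C_{n,m}$ can be obtained by instrumenting a standard random-pivot Quickselect. The function names below are illustrative, and the iteration count and trial sizes are arbitrary choices, not values from the paper.

```python
import random

def dickman_sample(theta=1.0, iters=100):
    """Approximate one draw from the generalized Dickman distribution D_theta
    by iterating W <- U^(1/theta) * (W + 1), which has D_theta as its unique
    fixed point in distribution. More iterations give a closer approximation."""
    w = 0.0
    for _ in range(iters):
        u = random.random()
        w = u ** (1.0 / theta) * (w + 1.0)
    return w

def quickselect_comparisons(a, m):
    """Return C_{n,m}: the number of comparisons a random-pivot Quickselect
    makes to find the m-th smallest element (1-indexed) of the distinct list a."""
    a = list(a)
    rank = m
    comparisons = 0
    while True:
        if len(a) == 1:
            return comparisons
        pivot = random.choice(a)
        # Partitioning against the pivot compares it to every other element.
        comparisons += len(a) - 1
        less = [x for x in a if x < pivot]
        greater = [x for x in a if x > pivot]
        if rank <= len(less):
            a = less                      # target lies strictly below the pivot
        elif rank == len(less) + 1:
            return comparisons            # the pivot itself is the answer
        else:
            rank -= len(less) + 1         # target lies strictly above the pivot
            a = greater
```

For $\theta = 1$ the standard Dickman distribution has mean $1$ (the fixed point of $\mu = (\mu+1)/2$), so averaging many values of $W_{n,1} = C_{n,1}/n - 1$ for moderately large $n$ should land near $1$, consistent with the convergence the abstract describes.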