Efficient Rejection Sampling in the Entropy-Optimal Range (2504.04267v1)
Abstract: The problem of generating a random variate $X$ from a finite discrete probability distribution $P$ using an entropy source of independent unbiased coin flips is considered. The Knuth and Yao complexity theory of nonuniform random number generation furnishes a family of "entropy-optimal" sampling algorithms that consume between $H(P)$ and $H(P)+2$ coin flips per generated output, where $H$ is the Shannon entropy function. However, the space complexity of entropy-optimal samplers scales exponentially with the number of bits required to encode $P$. This article introduces a family of efficient rejection samplers and characterizes their entropy, space, and time complexity. Within this family is a distinguished sampling algorithm that requires linearithmic space and preprocessing time, and whose expected entropy cost always falls in the entropy-optimal range $[H(P), H(P)+2)$. No previous sampler for discrete probability distributions is known to achieve these characteristics. Numerical experiments demonstrate performance improvements in runtime and entropy of the proposed algorithm compared to the celebrated alias method.
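The abstract describes sampling a discrete distribution from fair coin flips and a family of rejection samplers. As context, the sketch below is a minimal, hypothetical illustration of a generic rejection sampler of this flavor (not the paper's specific algorithm): the target probabilities are assumed to be integer weights over a denominator at most $2^k$, a uniform $k$-bit integer is drawn from coin flips, and draws landing outside the weight buckets are rejected and retried.

```python
import random

def rejection_sample(weights, k):
    """Sample index i with probability weights[i] / sum(weights) using only
    fair coin flips, by rejection against the range [0, 2**k).

    Hypothetical sketch: the integer weights are assumed to sum to at most
    2**k; the unused part of the range triggers a retry.
    """
    total = sum(weights)
    assert total <= 2 ** k
    while True:
        # Draw a uniform k-bit integer from fair coin flips.
        u = 0
        for _ in range(k):
            u = (u << 1) | random.getrandbits(1)
        # Walk the cumulative weights; accept if u lands inside a bucket.
        cumulative = 0
        for i, w in enumerate(weights):
            cumulative += w
            if u < cumulative:
                return i
        # u fell in the rejection region [total, 2**k); flip again.

# Example: P = (1/2, 1/3, 1/6) encoded with denominator 6, k = 3 (2**3 = 8 >= 6).
print(rejection_sample([3, 2, 1], 3))
```

The choice of $k$ trades off the rejection probability (here $2/8$) against the number of coin flips consumed per attempt; the paper's contribution is characterizing this family's entropy, space, and time complexity and identifying a member whose expected flip count stays in $[H(P), H(P)+2)$.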