Efficient Online Random Sampling via Randomness Recycling (2505.18879v1)

Published 24 May 2025 in cs.DS, cs.DM, cs.IT, math.IT, math.PR, and stat.CO

Abstract: "Randomness recycling" is a powerful algorithmic technique for reusing a fraction of the random information consumed by a randomized algorithm to reduce its entropy requirements. This article presents a family of efficient randomness recycling algorithms for sampling a sequence $X_1, X_2, X_3, \dots$ of discrete random variables whose joint distribution follows an arbitrary stochastic process. We develop randomness recycling strategies to reduce the entropy cost of a variety of prominent sampling algorithms, including uniform sampling, inverse transform sampling, lookup table sampling, alias sampling, and discrete distribution generating (DDG) tree sampling. Our method achieves an expected amortized entropy cost of $H(X_1,\dots,X_k)/k + \varepsilon$ input random bits per output sample using $O(\log(1/\varepsilon))$ space, which is arbitrarily close to the optimal Shannon entropy rate. The combination of space, time, and entropy properties of our method improves upon the Han and Hoshi interval algorithm and the Knuth and Yao entropy-optimal algorithm for sampling a discrete random sequence. An empirical evaluation of the algorithm shows that it achieves state-of-the-art runtime performance on the Fisher-Yates shuffle when using a cryptographically secure pseudorandom number generator. Accompanying the manuscript is a performant random sampling library in the C programming language.
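To make the recycling idea concrete for the simplest case covered by the abstract, uniform sampling, here is a minimal C sketch. It is not code from the paper's accompanying library; the names (recycler, uniform_recycled, fresh_bit) and the fixed 2^32 buffer cap are illustrative assumptions, standing in for the paper's $O(\log(1/\varepsilon))$-bit state. The invariant is that the state value is always uniform on [0, range); leftover randomness from both the accept and reject branches is folded back into the state instead of being discarded.

```c
#include <stdint.h>
#include <stdlib.h>

/* State of the recycler: `value` is uniform on [0, range) at all times. */
typedef struct {
    uint64_t value;
    uint64_t range;   /* invariant: 1 <= range < 2^33; initialize to {0, 1} */
} recycler;

/* Hypothetical fresh-bit source; a real implementation would draw from a
   (cryptographically secure) PRNG rather than rand(). */
static uint64_t fresh_bit(void) { return (uint64_t)(rand() & 1); }

/* Sample a uniform integer in [0, n), for n < 2^32, recycling leftover
   entropy. The cap 2^32 is an illustrative stand-in for the paper's
   O(log(1/eps))-bit buffer. */
static uint64_t uniform_recycled(recycler *st, uint64_t n) {
    for (;;) {
        /* Top up with fresh random bits; doubling the range preserves
           the uniformity invariant. */
        while (st->range < (1ULL << 32)) {
            st->value = (st->value << 1) | fresh_bit();
            st->range <<= 1;
        }
        uint64_t q = st->range / n;   /* complete blocks of size n */
        if (st->value < n * q) {
            /* Accept: value is uniform on [0, n*q), so value % n is a
               uniform sample and value / n is uniform on [0, q);
               recycle the quotient instead of discarding it. */
            uint64_t sample = st->value % n;
            st->value /= n;
            st->range = q;
            return sample;
        }
        /* Reject: value - n*q is uniform on the leftover [0, range mod n);
           recycle it into the state and retry with fresh bits. */
        st->value -= n * q;
        st->range -= n * q;
    }
}
```

A recycler initialized to {0, 1} can then drive, for example, a Fisher-Yates shuffle by calling uniform_recycled(&st, i + 1) at each step, which is the setting of the paper's empirical evaluation; the paper's full algorithms generalize this scheme from uniform draws to arbitrary discrete stochastic processes.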
