Variance Reduction for Sequential Sampling in Stochastic Programming (2005.02458v2)

Published 5 May 2020 in math.OC and stat.ME

Abstract: This paper investigates the variance reduction techniques Antithetic Variates (AV) and Latin Hypercube Sampling (LHS) when used for sequential sampling in stochastic programming and presents a comparative computational study. It shows conditions under which sequential sampling with AV and LHS satisfies finite stopping guarantees and is asymptotically valid, discussing LHS in detail. It computationally compares their use in both the sequential and non-sequential settings through a collection of two-stage stochastic linear programs with different characteristics. The numerical results show that while both AV and LHS can be preferable to random sampling in either setting, LHS typically dominates in the non-sequential setting while performing well sequentially, and AV gains some advantages in the sequential setting. These results imply that, given their ease of implementation, the same theoretical guarantees as random sampling, and improved empirical performance, AV and LHS sequential procedures present attractive alternatives in practice for a class of stochastic programs.
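For readers unfamiliar with the two techniques the abstract compares, the following is a minimal sketch of AV and LHS sample generation on the unit interval/cube, applied to a toy Monte Carlo estimate. It illustrates only the sampling schemes themselves, not the paper's sequential stopping procedure; the function names and the toy integrand are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def antithetic_uniform(n):
    """Antithetic variates: draw n/2 uniforms and pair each U with 1 - U.

    The negative correlation within each pair reduces variance for
    monotone integrands. n is assumed even here for simplicity.
    """
    u = rng.random(n // 2)
    return np.concatenate([u, 1.0 - u])

def latin_hypercube(n, d):
    """Latin Hypercube Sampling: in each dimension, place exactly one
    point in each of n equal-probability strata, in a random order."""
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)           # random assignment of strata
        samples[:, j] = (perm + rng.random(n)) / n  # jitter within stratum
    return samples

# Toy estimand: E[f(U)] = e - 1 for f(u) = exp(u), U ~ Uniform(0, 1).
f = np.exp
est_av = f(antithetic_uniform(1000)).mean()
est_lhs = f(latin_hypercube(1000, 1)[:, 0]).mean()
```

Both estimators remain unbiased for the mean while typically having lower variance than plain i.i.d. sampling, which is what makes them drop-in replacements inside sampling-based stochastic programming procedures.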
