Uncertainty Quantification using Simulation Output: Batching as an Inferential Device (2311.04159v2)

Published 7 Nov 2023 in stat.ME and math.PR

Abstract: We present batching as an omnibus device for uncertainty quantification using simulation output. We consider the classical context of a simulationist performing uncertainty quantification on an estimator $\theta_n$ (of an unknown fixed quantity $\theta$) using only the output data $(Y_1,Y_2,\ldots,Y_n)$ gathered from a simulation. By uncertainty quantification, we mean approximating the sampling distribution of the error $\theta_n-\theta$ toward: (A) estimating an assessment functional $\psi$, e.g., bias, variance, or quantile; or (B) constructing a $(1-\alpha)$-confidence region on $\theta$. We argue that batching is a remarkably simple and effective device for this purpose, and is especially suited for handling dependent output data such as what one frequently encounters in simulation contexts. We demonstrate that if the number of batches and the extent of their overlap are chosen appropriately, batching retains bootstrap's attractive theoretical properties of strong consistency and higher-order accuracy. For constructing confidence regions, we characterize two limiting distributions associated with a Studentized statistic. Our extensive numerical experience confirms theoretical insight, especially about the effects of batch size and batch overlap.
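To make the batching idea concrete, below is a minimal, illustrative sketch of the classical nonoverlapping batch-means confidence interval for a mean, applied to dependent simulation output. It is not the paper's exact procedure (which also covers overlapping batches, general assessment functionals, and confidence regions); the function name, batch count, and AR(1) example are assumptions chosen for illustration.

```python
import numpy as np
from scipy import stats

def batch_means_ci(y, num_batches=30, alpha=0.05):
    """Nonoverlapping batch-means (1 - alpha) confidence interval for the
    mean of a stationary, possibly dependent output sequence y."""
    y = np.asarray(y, dtype=float)
    b = len(y) // num_batches          # batch size; trailing remainder dropped
    y = y[: b * num_batches]
    theta_n = y.mean()                 # point estimator theta_n
    # Batch estimators: the mean of each contiguous batch of size b.
    batch_means = y.reshape(num_batches, b).mean(axis=1)
    # Sample variance of the batch means approximates num_batches * Var(theta_n).
    s2 = batch_means.var(ddof=1)
    half_width = stats.t.ppf(1 - alpha / 2, df=num_batches - 1) * np.sqrt(s2 / num_batches)
    return theta_n - half_width, theta_n + half_width

# Example: dependent output from an AR(1) process, y_t = 0.7 * y_{t-1} + eps_t.
rng = np.random.default_rng(0)
n, phi = 100_000, 0.7
eps = rng.standard_normal(n)
y = np.empty(n)
y[0] = eps[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]
print(batch_means_ci(y, num_batches=30, alpha=0.05))
```

The key design point this sketch illustrates is that Studentizing with the batch-means variance, rather than the naive sample variance of the raw output, accounts for serial dependence; the paper's contribution is to characterize how the number of batches and their overlap govern consistency and higher-order accuracy of such procedures.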
