
Recovering Best Statistical Guarantees via the Empirical Divergence-based Distributionally Robust Optimization (1605.09349v1)

Published 30 May 2016 in math.OC

Abstract: We investigate the use of distributionally robust optimization (DRO) as a tractable tool to recover the asymptotic statistical guarantees provided by the Central Limit Theorem, for maintaining the feasibility of an expected value constraint under ambiguous probability distributions. We show that constructing the DRO from empirically defined Burg-entropy divergence balls can attain such guarantees. These balls, however, are not justified by the standard data-driven DRO framework, since by themselves they can have low or even zero probability of covering the true distribution. Rather, their superior statistical performance is endowed by linking the resulting DRO with empirical likelihood and empirical processes. We show that the sizes of these balls can be optimally calibrated using chi-square process excursion. We conduct numerical experiments to support our theoretical findings.
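To make the construction concrete, the snippet below is a minimal illustrative sketch, not the paper's implementation: it solves the inner worst-case expectation over an empirical Burg-entropy divergence ball centered at the empirical distribution, with the ball size set by a chi-square(1) quantile, which is the calibration the asymptotic theory suggests for a single expected-value constraint (the paper's general calibration uses a chi-square process excursion). The synthetic data, the identity cost function, and the use of cvxpy are assumptions chosen for illustration.

```python
# Illustrative sketch (assumptions: synthetic data, identity cost, cvxpy solver).
# Worst-case expectation over an empirical Burg-entropy divergence ball.
import numpy as np
import cvxpy as cp
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 200
x = rng.normal(loc=1.0, scale=2.0, size=n)  # hypothetical i.i.d. sample
h = x                                       # cost values h(x_i); identity cost for illustration

alpha = 0.05
# Ball radius from the chi-square(1) quantile: the empirical-likelihood
# calibration for a single expected-value constraint. (The paper's general
# result calibrates via chi-square process excursion.)
eta = chi2.ppf(1 - alpha, df=1) / (2 * n)

w = cp.Variable(n)  # candidate probability weights on the sample points
constraints = [
    w >= 0,
    cp.sum(w) == 1,
    # Burg-entropy divergence from the empirical distribution P_n:
    # d(w, P_n) = -(1/n) * sum_i log(n * w_i) <= eta
    -cp.sum(cp.log(n * w)) / n <= eta,
]
worst_case = cp.Problem(cp.Maximize(h @ w), constraints)
worst_case.solve()

print("empirical mean        :", h.mean())
print("worst-case expectation:", worst_case.value)
```

The optimal value serves as an upper confidence bound on the true expectation; the paper argues that this divergence-ball DRO asymptotically recovers the CLT-level statistical guarantee.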
