Maximum Mean Discrepancy Distributionally Robust Nonlinear Chance-Constrained Optimization with Finite-Sample Guarantee (2204.11564v1)

Published 25 Apr 2022 in math.OC, cs.SY, and eess.SY

Abstract: This paper is motivated by open questions in distributionally robust chance-constrained programs (DRCCP) using the popular Wasserstein ambiguity sets. Specifically, the computational techniques for those programs typically impose restrictive assumptions on the constraint functions, and the size of the Wasserstein ambiguity set is often chosen using costly cross-validation (CV) procedures or conservative measure-concentration bounds. In contrast, we propose a practical DRCCP algorithm using kernel maximum mean discrepancy (MMD) ambiguity sets, which we term MMD-DRCCP, that treats general nonlinear constraints without ad-hoc reformulation techniques. MMD-DRCCP handles general nonlinear and non-convex constraints with a proven finite-sample constraint-satisfaction guarantee at a dimension-independent $\mathcal{O}(\frac{1}{\sqrt{N}})$ rate, achievable by a practical algorithm. We further propose an efficient bootstrap scheme for constructing sharp MMD ambiguity sets in practice without resorting to CV. Our algorithm is validated numerically on a portfolio optimization problem and a tube-based distributionally robust model predictive control problem with non-convex constraints.
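To make the two core ingredients concrete, the sketch below shows (i) an empirical estimate of the squared MMD between two samples and (ii) a bootstrap scheme for picking an MMD ambiguity-set radius without cross-validation. This is a minimal illustration, not the paper's algorithm: the Gaussian (RBF) kernel, the median-heuristic bandwidth, the biased V-statistic estimator, and the percentile-bootstrap rule for the radius are all assumptions made here for readability, and the function names (`mmd_squared`, `bootstrap_mmd_radius`) are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, bandwidth):
    # Gaussian kernel matrix k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)).
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-np.maximum(sq_dists, 0.0) / (2.0 * bandwidth**2))

def mmd_squared(X, Y, bandwidth):
    # Biased (V-statistic) estimate of the squared MMD between the
    # empirical distributions of the samples X and Y.
    Kxx = rbf_kernel(X, X, bandwidth)
    Kyy = rbf_kernel(Y, Y, bandwidth)
    Kxy = rbf_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

def bootstrap_mmd_radius(X, n_boot=500, alpha=0.05, seed=0):
    # Hypothetical bootstrap rule for the ambiguity-set radius: resample the
    # data with replacement and take the (1 - alpha) quantile of the MMD
    # between each bootstrap resample and the original empirical distribution.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Median heuristic for the kernel bandwidth (an assumption, not from the paper).
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(X**2, axis=1)[None, :]
                - 2.0 * X @ X.T)
    dists = np.sqrt(np.maximum(sq_dists, 0.0))
    bandwidth = np.median(dists[dists > 0])
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        stats.append(np.sqrt(max(mmd_squared(X[idx], X, bandwidth), 0.0)))
    return float(np.quantile(stats, 1.0 - alpha))

# Example: radius for a synthetic dataset of N = 200 two-dimensional samples.
X = np.random.default_rng(1).normal(size=(200, 2))
print(bootstrap_mmd_radius(X))
```

Because the MMD concentrates at an $\mathcal{O}(\frac{1}{\sqrt{N}})$ rate independent of the data dimension, a radius chosen this way shrinks at the same rate as the sample size grows, which is consistent with the finite-sample guarantee the abstract claims.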

Citations (8)
