Beyond Chance-Constrained Convex Mixed-Integer Optimization: A Generalized Calafiore-Campi Algorithm and the notion of $S$-optimization (1504.00076v2)

Published 1 Apr 2015 in math.MG, math.CO, and math.OC

Abstract: The scenario approach developed by Calafiore and Campi for chance-constrained convex programs uses random sampling of the uncertainty parameter to replace the original problem with a representative continuous convex optimization problem with $N$ convex constraints, which is a relaxation of the original. Calafiore and Campi provided an explicit estimate of the sample size $N$ needed for the relaxation to yield high-likelihood feasible solutions of the chance-constrained problem; they measured the probability that the original constraints are violated by the random optimal solution of the relaxation of size $N$. This paper has two main contributions. First, we present a generalization of the Calafiore-Campi results to both integer and mixed-integer variables. In fact, we demonstrate that their sampling estimates work naturally for variables restricted to an arbitrary subset $S$ of $\mathbb R^d$. The key ingredients are generalizations of Helly's theorem in which the convex sets are required to intersect $S \subset \mathbb R^d$; the sample sizes in both algorithms are determined directly by the corresponding $S$-Helly numbers. Motivated by the first half of the paper, for any subset $S \subset \mathbb R^d$ we introduce the notion of an $S$-optimization problem, in which the variables take values in $S$. This notion generalizes continuous, integer, and mixed-integer optimization. We illustrate with examples the expressive power of $S$-optimization to capture sophisticated combinatorial optimization problems with difficult modular constraints. We reinforce the evidence that $S$-optimization is "the right concept" by showing that the well-known randomized sampling algorithm of K. Clarkson for low-dimensional convex optimization problems can be extended to variables taking values in $S$.
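
The scenario approach described in the abstract can be illustrated with a small numerical sketch. The following is a minimal, hypothetical example for the linear-programming special case of a chance-constrained convex program: the nominal constraint, the noise model, the sample size N, and all numerical values are illustrative assumptions, not taken from the paper, and the paper's sample-size bounds (via $S$-Helly numbers) are not implemented here.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative sketch of the scenario approach for a chance-constrained LP:
#   maximize  c^T x   subject to  a(delta)^T x <= 1  for "most" delta, x >= 0.
# We draw N i.i.d. samples of the uncertainty delta and solve the sampled
# relaxation whose constraints are exactly those N scenarios.

rng = np.random.default_rng(0)

d = 2                      # dimension of the decision variable x (assumed)
N = 200                    # sample size (in the paper, tied to Helly-type numbers)
c = np.array([1.0, 1.0])   # objective coefficients (illustrative)

# Each scenario perturbs a nominal constraint a_nominal^T x <= 1.
# This uncertainty model is purely illustrative.
a_nominal = np.array([1.0, 2.0])
A = a_nominal + 0.1 * rng.standard_normal((N, d))
b = np.ones(N)

# Sampled relaxation: an ordinary LP with the N sampled constraints.
# linprog minimizes, so we pass -c to maximize c^T x.
res = linprog(c=-c, A_ub=A, b_ub=b, bounds=[(0, None)] * d, method="highs")
print("scenario solution:", res.x)

# Empirical check of the violation probability on fresh, unseen scenarios.
A_test = a_nominal + 0.1 * rng.standard_normal((10_000, d))
violation_rate = np.mean(A_test @ res.x > 1.0)
print("estimated violation probability:", violation_rate)
```

In this continuous setting the sample size can be calibrated from the dimension $d$; the paper's contribution is that, when $x$ is restricted to a set $S$ (e.g., integer or mixed-integer points), the analogous calibration is governed by the $S$-Helly number rather than $d$.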
