Data-Driven Contextual Optimization with Gaussian Mixtures: Flow-Based Generalization, Robust Models, and Multistage Extensions (2509.14557v1)
Abstract: Contextual optimization enhances decision quality by leveraging side information to improve predictions of uncertain parameters. However, existing approaches face significant challenges when dealing with multimodal distributions or mixtures of distributions. The inherent complexity of such structures often precludes an explicit functional relationship between the contextual information and the uncertain parameters, limiting the direct applicability of parametric models. Conversely, while non-parametric models offer greater representational flexibility, they suffer from the "curse of dimensionality," leading to unsatisfactory performance in high-dimensional problems. To address these challenges, this paper proposes a novel contextual optimization framework based on Gaussian Mixture Models (GMMs). This framework naturally bridges the gap between parametric and non-parametric approaches, inheriting the favorable sample complexity of parametric models while retaining the expressiveness of non-parametric schemes. By employing normalizing flows, we further relax the Gaussian mixture assumption and extend our framework to arbitrary distributions. Finally, inspired by the structural properties of GMMs, we design a novel GMM-based solution scheme for multistage stochastic optimization problems with Markovian uncertainty. This method exhibits significantly better sample complexity than traditional approaches, offering a powerful methodology for solving long-horizon, high-dimensional multistage problems. We demonstrate the effectiveness of our framework through extensive numerical experiments on a series of operations management problems. The results show that our proposed approach consistently outperforms state-of-the-art methods, underscoring its practical value for complex decision-making under uncertainty.
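A key property that makes GMMs attractive for contextual optimization is that conditioning a joint Gaussian mixture over (context, uncertain parameter) on an observed context yields another GMM in closed form: each component is conditioned by the standard Gaussian formulas, and the mixture weights are reweighted by each component's marginal likelihood of the context. The sketch below illustrates this mechanism for a one-dimensional context and a one-dimensional uncertain parameter; the two-component GMM parameters are hypothetical, chosen for illustration, and are not taken from the paper.

```python
import numpy as np

# Hypothetical 2-component GMM over (context x, uncertain parameter y).
weights = np.array([0.6, 0.4])
means = np.array([[0.0, 10.0],   # [mu_x, mu_y] per component
                  [3.0, 25.0]])
covs = np.array([[[1.0, 0.8],    # [[S_xx, S_xy], [S_yx, S_yy]]
                  [0.8, 2.0]],
                 [[1.5, -0.5],
                  [-0.5, 3.0]]])

def conditional_gmm(x):
    """Condition the joint GMM on context x.

    Returns the weights, means, and variances of p(y | x),
    which is again a Gaussian mixture.
    """
    new_w, cond_mu, cond_var = [], [], []
    for pi, mu, S in zip(weights, means, covs):
        sxx, sxy, syy = S[0, 0], S[0, 1], S[1, 1]
        # Marginal likelihood of x under this component reweights it.
        lik = np.exp(-0.5 * (x - mu[0]) ** 2 / sxx) / np.sqrt(2 * np.pi * sxx)
        new_w.append(pi * lik)
        # Standard Gaussian conditioning formulas (scalar case).
        cond_mu.append(mu[1] + sxy / sxx * (x - mu[0]))
        cond_var.append(syy - sxy ** 2 / sxx)
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(cond_mu), np.array(cond_var)

w, mu, var = conditional_gmm(x=2.0)
print("E[y | x=2] =", float(w @ mu))  # posterior-weighted conditional mean
```

In a downstream decision problem (e.g., a newsvendor model), the conditional mixture can then be fed directly into a stochastic program, since expectations and quantiles of a one-dimensional GMM are cheap to evaluate.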