
Efficient Online Mirror Descent Stochastic Approximation for Multi-Stage Stochastic Programming (2506.15392v1)

Published 18 Jun 2025 in math.OC

Abstract: We study the unconstrained and the minimax saddle point variants of the convex multi-stage stochastic programming problem, where consecutive decisions are coupled through the objective functions, rather than through the constraints. Based on the analysis of deterministic mirror descent algorithms with inexact gradients, we introduce the idea of \textit{stochastic conditional gradient oracles}, a multi-stage analog of the stochastic gradient oracles used in (classical) stochastic programming. We show one approach to construct such oracles and prove the convergence of the (accelerated) mirror descent stochastic approximation, both in expectation and with high probability. To further reduce the oracle complexity, we view the problem from a \textit{semi-online} perspective, where the stage $t$ decision variables are constructed $s$ stages in advance, instead of before stage $1$. We show that the delay in decision making allows an asynchronous implementation of the mirror descent stochastic approximation algorithms. By avoiding computing solutions for scenarios that are inconsistent with information available during stage $t$, the complexity is reduced from exponential to linear in the number of stages.
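The classical (single-stage) stochastic gradient oracle setting that the paper's stochastic conditional gradient oracles generalize can be illustrated with a minimal mirror descent stochastic approximation sketch. The example below is not the paper's multi-stage algorithm: it is a standard entropy-mirror-map (exponentiated gradient) iteration on the probability simplex, applied to a toy quadratic objective, with the oracle returning a noisy (inexact) gradient; the function names and the step size are illustrative choices.

```python
import numpy as np

def mirror_descent_sa(grad_oracle, x0, steps, eta):
    """Mirror descent stochastic approximation on the probability simplex.

    With the entropy mirror map, the prox step is the exponentiated
    gradient update x_{k+1} proportional to x_k * exp(-eta * g_k),
    where g_k = grad_oracle(x_k) is a stochastic gradient.
    Returns the averaged iterate, as in classical SA analyses.
    """
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for _ in range(steps):
        g = grad_oracle(x)           # stochastic (inexact) gradient
        x = x * np.exp(-eta * g)     # entropy-prox (exponentiated gradient) step
        x /= x.sum()                 # renormalize onto the simplex
        avg += x
    return avg / steps

# Toy problem: minimize E[||x - c - noise||^2 / 2] over the simplex.
# Since c lies in the simplex, the minimizer is c itself.
rng = np.random.default_rng(0)
c = np.array([0.7, 0.2, 0.1])
oracle = lambda x: x - c - 0.05 * rng.standard_normal(3)

x_bar = mirror_descent_sa(oracle, np.ones(3) / 3, steps=2000, eta=0.1)
print(x_bar)  # averaged iterate, close to c
```

The averaged iterate concentrates near the minimizer despite the noisy oracle; the paper's contribution is extending this template to coupled multi-stage decisions, where naively each stage-$t$ oracle call would branch over exponentially many scenarios.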
