Efficient Conditional Gradient Methods for Solving Stochastic Convex Bilevel Optimization Problems (2505.18037v1)

Published 23 May 2025 in math.OC

Abstract: We propose efficient methods for solving stochastic convex bilevel optimization problems, where the goal is to minimize an outer stochastic objective function subject to the solution set of an inner stochastic optimization problem. Existing methods often rely on costly projection or linear optimization oracles over complex sets, which limits scalability. To overcome this, we propose an iteratively regularized conditional gradient framework that leverages efficient linear optimization oracles exclusively over the base feasible set. Our proposed methods employ a vanishing regularization sequence that progressively emphasizes the inner problem while biasing towards desirable minimal outer objective solutions. Under standard convexity assumptions, we establish non-asymptotic convergence rates of $O(t^{-(1/2-p)})$ for the outer objective and $O(t^{-p})$ for the inner objective, where $p \in (0,1/2)$ controls the regularization decay, in the one-sample stochastic setting, and $O(t^{-(1-p)})$ and $O(t^{-p})$ in the finite-sum setting using a mini-batch scheme, where $p \in (0,1)$. Experimental results on over-parametrized regression and $\ell_1$-constrained logistic regression tasks demonstrate the practical advantages of our approach over existing methods, confirming our theoretical findings.
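The iteratively regularized conditional gradient idea described in the abstract can be sketched as follows. This is an illustrative toy implementation under assumed choices, not the paper's exact algorithm: it uses an $\ell_1$-ball as the base feasible set (whose linear minimization oracle is a simple coordinate selection), a vanishing regularization weight $\lambda_t = t^{-p}$, the standard Frank-Wolfe step size $2/(t+2)$, and exact deterministic gradients in place of the stochastic one-sample or mini-batch gradients the paper analyzes. The function names `grad_inner`, `grad_outer`, and `lmo_l1_ball` are hypothetical.

```python
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    # Linear minimization oracle over the l1 ball:
    # argmin_{||s||_1 <= radius} <grad, s> puts all mass on the
    # coordinate with the largest-magnitude gradient entry.
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def ir_conditional_gradient(grad_inner, grad_outer, x0, T=500, p=0.25, radius=1.0):
    # Iteratively regularized conditional gradient (illustrative sketch):
    # at step t, take a Frank-Wolfe step on the combined objective
    # g(x) + lam_t * f(x), where g is the inner objective, f the outer one,
    # and lam_t = t^{-p} is a vanishing regularization sequence that
    # progressively emphasizes the inner problem while biasing the iterates
    # toward solutions with small outer objective.
    x = x0.copy()
    for t in range(1, T + 1):
        lam = t ** (-p)                  # vanishing regularization weight
        d = grad_inner(x) + lam * grad_outer(x)
        s = lmo_l1_ball(d, radius)       # only an LMO over the base set
        gamma = 2.0 / (t + 2)            # standard Frank-Wolfe step size
        x = x + gamma * (s - x)          # iterate stays inside the base set
    return x
```

As a usage example, one could take an under-determined least-squares inner problem $g(x) = \tfrac12\|Ax - b\|^2$ (which has many minimizers) and the outer objective $f(x) = \tfrac12\|x\|^2$, so the vanishing regularization biases the method toward a minimal-norm inner solution, mirroring the over-parametrized regression experiment mentioned in the abstract.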

