Optimal query complexity for SCO under a variance-bounded SGO (VSGO)

Determine whether stochastic convex optimization of an L-Lipschitz convex function f: R^d -> R, with minimizer x* in the Euclidean ball of radius R, can be solved to expected suboptimality epsilon using O(R^2 sigma_V^2 / epsilon^2 + d) queries to a sigma_V-variance-bounded stochastic gradient oracle (VSGO), i.e., an oracle O_V such that E[||O_V(x) - ∇f(x)||^2] ≤ sigma_V^2 for all x.

Background

The paper studies stochastic convex optimization (SCO) for L-Lipschitz convex functions using stochastic gradient information. Under a variance-bounded SGO (VSGO) with parameter sigma_V, stochastic gradient descent achieves O(R^2 sigma_V^2 / epsilon^2 + R^2 L^2 / epsilon^2) queries. When sigma_V = 0 (deterministic gradients), cutting plane methods achieve O(d) query complexity at sufficiently high precision.
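
As a concrete illustration of the SGD baseline, the following minimal sketch (Python with NumPy; the noise model, test objective, and step size are illustrative assumptions, not taken from the paper) runs averaged projected SGD against a toy sigma_V-VSGO: each query returns the true gradient plus zero-mean noise with second moment sigma_V^2, and T = O(R^2 (L^2 + sigma_V^2) / epsilon^2) queries suffice for expected suboptimality epsilon.

import numpy as np

def vsgo(x, grad_f, sigma_V, rng):
    # Toy sigma_V-VSGO (hypothetical noise model): the true gradient plus
    # zero-mean Gaussian noise with E||noise||^2 = sigma_V^2.
    d = x.shape[0]
    return grad_f(x) + rng.normal(scale=sigma_V / np.sqrt(d), size=d)

def projected_sgd(grad_f, d, R, L, sigma_V, eps, rng):
    # Averaged projected SGD over the Euclidean ball of radius R.
    # Standard analysis: T = O(R^2 (L^2 + sigma_V^2) / eps^2) oracle queries
    # give expected suboptimality <= eps (constants omitted for brevity).
    T = int(np.ceil(R**2 * (L**2 + sigma_V**2) / eps**2))
    eta = R / np.sqrt((L**2 + sigma_V**2) * T)  # fixed step size
    x = np.zeros(d)
    avg = np.zeros(d)
    for _ in range(T):
        x = x - eta * vsgo(x, grad_f, sigma_V, rng)
        nrm = np.linalg.norm(x)
        if nrm > R:                             # project back onto the ball
            x *= R / nrm
        avg += x / T
    return avg

# Usage on a toy L-Lipschitz convex objective f(x) = L * ||x - x_star||
# (hypothetical test problem; its gradient has norm L away from x_star).
rng = np.random.default_rng(0)
d, R, L, sigma_V, eps = 10, 1.0, 1.0, 1.0, 0.1
x_star = np.full(d, 0.5 / np.sqrt(d))           # ||x_star|| = 0.5 <= R
grad_f = lambda x: L * (x - x_star) / max(np.linalg.norm(x - x_star), 1e-12)
x_hat = projected_sgd(grad_f, d, R, L, sigma_V, eps, rng)
print("suboptimality:", L * np.linalg.norm(x_hat - x_star))

The R^2 L^2 / epsilon^2 part of T comes from the gradient-norm term in the standard SGD analysis; the open problem asks whether that part can be replaced by an additive d.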

This motivates seeking a bound that matches O(R^2 sigma_V^2 / epsilon^2) for the stochastic part and O(d) for the deterministic part. The authors explicitly pose whether O(R^2 sigma_V^2 / epsilon^2 + d) queries suffice under only the VSGO assumption, which, if true, would improve over SGD by replacing the R^2 L^2 / epsilon^2 term with d; the comparison below makes the relevant regimes concrete.
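
Written side by side (a restatement of the two bounds above, not a new result):

$$ \underbrace{O\!\left(\frac{R^2 \sigma_V^2}{\epsilon^2} + \frac{R^2 L^2}{\epsilon^2}\right)}_{\text{SGD under a VSGO}} \quad \text{vs.} \quad \underbrace{O\!\left(\frac{R^2 \sigma_V^2}{\epsilon^2} + d\right)}_{\text{conjectured}} $$

The conjectured rate strictly improves on SGD whenever $d \ll R^2 L^2 / \epsilon^2$, e.g., at high accuracy (small epsilon) in fixed dimension, and it recovers the O(d) cutting-plane complexity in the deterministic case sigma_V = 0.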

References

This begs the natural open problem:

Is it possible to solve SCO with $O(R^2 \sigma_V^2 / \epsilon^2 + d)$ queries to a $\sigma_V$-VSGO?

Isotropic Noise in Stochastic and Quantum Convex Optimization (arXiv:2510.20745, Marsden et al., 23 Oct 2025), in Open Problem (label openprob:conjectured-rate), Introduction.