Computer-aided analyses of stochastic first-order methods, via interpolation conditions for stochastic optimization (2507.05466v1)

Published 7 Jul 2025 in math.OC

Abstract: This work proposes a framework, embedded within the Performance Estimation framework (PEP), for obtaining worst-case performance guarantees on stochastic first-order methods. Given a first-order method, a function class, and a noise model with prescribed expectation and variance properties, we present a range of semidefinite programs (SDPs) of increasingly large size, whose solutions yield increasingly strong convergence guarantees on the problem. Eventually, we propose SDPs whose size depends on $2^N$, with $N$ the number of iterations analyzed, that yield tight guarantees, attained by specific functions and noise distributions within these classes. On the other end of the spectrum, we propose SDPs whose size depends linearly on $N$, and numerically show that, on many problems, they already provide tight guarantees. The framework accommodates a wide range of stochastic settings, with finite or infinite support, including the unstructured noise model with bounded variance, finite-sum optimization, and block-coordinate methods, in a unified manner, as guarantees apply to any setting consistent with the noise model, i.e., its expectation and variance. It covers both non-variance-reduced and variance-reduced methods. Using the framework, we analyze the stochastic gradient method under several noise models, and illustrate how the resulting numerical and analytical convergence rates connect with existing results. In particular, we provide improved convergence rates on the unstructured noise model with bounded variance and in the block-coordinate setting.
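
As context for the noise models discussed in the abstract, the unstructured bounded-variance model is commonly written as follows (standard notation, not necessarily the paper's): the stochastic oracle $\tilde g(x)$ satisfies $\mathbb{E}[\tilde g(x)] = \nabla f(x)$ and $\mathbb{E}[\|\tilde g(x) - \nabla f(x)\|^2] \le \sigma^2$, and the stochastic gradient method iterates $x_{k+1} = x_k - \gamma_k \tilde g(x_k)$.

To illustrate the SDP-based approach the framework builds on, the sketch below sets up the deterministic Performance Estimation Problem for $N$ steps of gradient descent with step size $1/L$ on $L$-smooth convex functions, posed as a small SDP over a Gram matrix via the standard smooth-convex interpolation conditions. This is a minimal sketch under stated assumptions: it does not reproduce the paper's stochastic SDPs, and the cvxpy formulation, variable names, and parameter values are illustrative choices rather than the authors' code.

import numpy as np
import cvxpy as cp

# Illustrative data (not taken from the paper)
L, R, N = 1.0, 1.0, 1        # smoothness constant, bound on ||x_0 - x_*||, iterations
gamma = 1.0 / L              # fixed step size of the gradient method

# Gram-matrix basis: [x_0 - x_*, g_0, g_1, ..., g_N]
dim = N + 2
G = cp.Variable((dim, dim), PSD=True)   # G[a, b] = <basis_a, basis_b>
F = cp.Variable(N + 1)                  # F[k] = f(x_k) - f(x_*)

def x_coeff(k):
    # Coefficients of x_k - x_* in the basis; k = -1 denotes x_* itself.
    c = np.zeros(dim)
    if k >= 0:
        c[0] = 1.0
        c[1:1 + k] = -gamma             # x_k = x_0 - gamma * (g_0 + ... + g_{k-1})
    return c

def g_coeff(k):
    # Coefficients of the gradient g_k; k = -1 denotes g_* = 0.
    c = np.zeros(dim)
    if k >= 0:
        c[1 + k] = 1.0
    return c

def fval(k):
    # f(x_k) - f(x_*); k = -1 denotes the optimum (value 0).
    return F[k] if k >= 0 else 0.0

points = list(range(-1, N + 1))              # {x_*, x_0, ..., x_N}
constraints = [G[0, 0] <= R ** 2]            # ||x_0 - x_*||^2 <= R^2
for i in points:
    for j in points:
        if i == j:
            continue
        dx = x_coeff(i) - x_coeff(j)
        dg = g_coeff(i) - g_coeff(j)
        gj = g_coeff(j)
        # Interpolation condition for L-smooth convex functions:
        # f_i >= f_j + <g_j, x_i - x_j> + (1 / 2L) * ||g_i - g_j||^2
        constraints.append(
            fval(i) >= fval(j) + gj @ G @ dx + (1.0 / (2 * L)) * (dg @ G @ dg)
        )

prob = cp.Problem(cp.Maximize(F[N]), constraints)
prob.solve()
print(f"worst-case f(x_N) - f(x_*): {prob.value:.4f}")
# For this deterministic setting, the tight value is known to be
# L * R**2 / (4 * N + 2) (Drori & Teboulle, 2014), i.e. about 0.1667 for N = 1.

Per the abstract, the paper's contribution is to extend SDPs of this kind with interpolation conditions for stochastic optimization, so that the resulting guarantees hold for any setting consistent with a noise model specified through its expectation and variance.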
