
Stochastic Moving Anchor Algorithms and a Popov's Scheme with Moving Anchor (2506.07290v1)

Published 8 Jun 2025 in math.OC

Abstract: Since their introduction, anchoring methods in extragradient-type saddle-point problems have inspired a flurry of research due to their ability to provide order-optimal rates of accelerated convergence in very general problem settings. Such guarantees are especially important as researchers consider problems in AI and ML, where large problem sizes demand immense computational power. Much of the recent work explores theoretical aspects of this new acceleration framework, connecting it to existing methods and order-optimal convergence rates from the literature. In practice, however, introducing stochastic oracles allows for greater computational efficiency given the size of many modern optimization problems. To this end, this work equips the moving anchor variants [1] of the original anchoring algorithms [36] with stochastic implementations and robust analyses, bridging the gap from deterministic to stochastic algorithm settings. In particular, we demonstrate that an accelerated convergence rate theory for stochastic oracles also exists for our moving anchor scheme, itself a generalization of the original fixed anchor algorithms, and provide numerical results that validate our theoretical findings. We also develop a tentative moving anchor Popov scheme based on the work in [33], with promising numerical results pointing toward an as-yet-undiscovered general convergence theory for such methods.
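To make the anchoring idea concrete, here is a rough sketch of a fixed-anchor extragradient (EAG-style) iteration on a toy bilinear saddle-point problem. Each half-step is pulled toward the initial iterate `z0` with a vanishing anchoring coefficient `beta_k = 1/(k+2)`, as in the original fixed-anchor literature; the step size and problem data below are illustrative assumptions, and the paper's moving-anchor and stochastic variants modify this basic template.

```python
import numpy as np

# Toy bilinear saddle-point problem: min_x max_y x^T A y.
# Its saddle operator F(z) = (A y, -A^T x) is monotone and Lipschitz,
# with Lipschitz constant L = ||A||_2, and F vanishes at the saddle (0, 0).
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))

def F(z):
    x, y = z[:n], z[n:]
    return np.concatenate([A @ y, -A.T @ x])

L = np.linalg.norm(A, 2)
eta = 1.0 / (8.0 * L)        # conservative step size (assumption)
z0 = rng.standard_normal(2 * n)
z = z0.copy()

for k in range(2000):
    beta = 1.0 / (k + 2)     # anchoring weight decays over iterations
    # Half-step and full step are both pulled toward the fixed anchor z0.
    z_half = z + beta * (z0 - z) - eta * F(z)
    z = z + beta * (z0 - z) - eta * F(z_half)

# The operator norm ||F(z_k)|| shrinks toward zero at an accelerated rate.
print(np.linalg.norm(F(z)), np.linalg.norm(F(z0)))
```

A moving-anchor variant, as studied in this paper, would additionally update the anchor point itself along the trajectory rather than keeping it fixed at `z0`.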
