Online Stochastic Gradient Methods Under Sub-Weibull Noise and the Polyak-Łojasiewicz Condition (2108.03285v2)

Published 6 Aug 2021 in math.OC, cs.SY, and eess.SY

Abstract: This paper focuses on the online gradient and proximal-gradient methods with stochastic gradient errors. In particular, we examine the performance of the online gradient descent method when the cost satisfies the Polyak-Łojasiewicz (PL) inequality. We provide bounds in expectation and in high probability (that hold iteration-wise), with the latter derived by leveraging a sub-Weibull model for the errors affecting the gradient. The convergence results show that the instantaneous regret converges linearly up to an error that depends on the variability of the problem and the statistics of the sub-Weibull gradient error. Similar convergence results are then provided for the online proximal-gradient method, under the assumption that the composite cost satisfies the proximal-PL condition. In the case of static costs, we provide new bounds for the regret incurred by these methods when the gradient errors are modeled as sub-Weibull random variables. Illustrative simulations are provided to corroborate the technical findings.
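To make the setting concrete, the following is a minimal sketch (not the paper's code) of online gradient descent under the conditions the abstract describes: a drifting quadratic cost, which satisfies the PL inequality, with heavy-tailed Weibull-distributed gradient errors standing in for the sub-Weibull error model. The cost sequence, step size, and tail parameter are illustrative assumptions.

import numpy as np

# Illustrative setting (assumed, not from the paper): the online cost
# f_t(x) = 0.5 * ||x - c_t||^2 satisfies the PL inequality with mu = 1,
# and its minimizer c_t drifts over time (problem variability).
rng = np.random.default_rng(0)
dim, steps, alpha = 5, 200, 0.5        # dimension, iterations, step size (assumed)
theta = 2.0                            # sub-Weibull tail parameter (assumed)

x = np.zeros(dim)
regret = []
for t in range(steps):
    c_t = np.sin(0.05 * t) * np.ones(dim)               # drifting minimizer of f_t
    grad = x - c_t                                      # exact gradient of f_t at x
    # Symmetrized Weibull noise: tails decay like exp(-(|e|/K)^(1/theta)),
    # i.e. a sub-Weibull(theta) gradient error.
    noise = rng.weibull(1.0 / theta, size=dim) * rng.choice([-1.0, 1.0], size=dim)
    x = x - alpha * (grad + noise)                      # noisy online gradient step
    regret.append(0.5 * np.linalg.norm(x - c_t) ** 2)   # instantaneous regret f_t(x_t) - f_t^*

print(f"final instantaneous regret: {regret[-1]:.3f}")

With a suitably small step size, the recorded instantaneous regret contracts roughly linearly and then hovers at a level set by the drift of c_t and the noise statistics, which matches the qualitative behavior described in the abstract.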

Citations (3)
