Distributed Stochastic Optimization for Non-Smooth and Weakly Convex Problems under Heavy-Tailed Noise (2505.09279v1)

Published 14 May 2025 in math.OC

Abstract: Existing studies of distributed stochastic optimization usually assume that the gradient noise has bounded variance. However, recent research shows that heavy-tailed noise, which allows unbounded variance, better reflects practical scenarios in many tasks. Under heavy-tailed noise, traditional methods such as stochastic gradient descent may perform poorly or even diverge. It is therefore important to study distributed stochastic optimization algorithms that remain applicable in the heavy-tailed noise setting. However, most existing distributed algorithms for heavy-tailed noise are developed for convex and smooth problems, which limits their applicability. This paper proposes a clipping-based distributed stochastic algorithm under heavy-tailed noise that is suitable for non-smooth and weakly convex problems. The convergence of the proposed algorithm is proven, and conditions on the algorithm parameters are given. A numerical experiment demonstrates the effectiveness of the proposed algorithm.
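
To illustrate the general idea behind clipping-based distributed stochastic methods for this setting, the sketch below is a minimal, illustrative example rather than the paper's algorithm. It runs a decentralized clipped stochastic subgradient iteration on a toy non-smooth, weakly convex objective with synthetic heavy-tailed (Student-t) noise; the network mixing matrix `W`, the clipping threshold `tau`, the step size, and the toy loss are all assumptions made for the sketch.

```python
# Minimal sketch of a decentralized clipped stochastic subgradient method
# under heavy-tailed gradient noise. Illustrative only; not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

n_agents, dim, n_iters = 5, 10, 2000
step, tau = 0.05, 1.0          # step size and clipping threshold (assumed values)

# Ring-topology mixing matrix (doubly stochastic), one row/column per agent.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

x_star = rng.standard_normal(dim)          # common minimizer of the toy objective

def subgradient(x):
    """Subgradient of the non-smooth toy loss ||x - x_star||_1."""
    return np.sign(x - x_star)

def clip(g, tau):
    """Rescale g so its Euclidean norm does not exceed tau (gradient clipping)."""
    norm = np.linalg.norm(g)
    return g if norm <= tau else (tau / norm) * g

X = rng.standard_normal((n_agents, dim))   # local iterates, one row per agent
for t in range(n_iters):
    X = W @ X                              # consensus / mixing step over the network
    for i in range(n_agents):
        # Heavy-tailed noise: Student-t with df < 2 has infinite variance.
        noise = rng.standard_t(df=1.5, size=dim)
        g = subgradient(X[i]) + noise
        X[i] -= step * clip(g, tau)        # clipped stochastic subgradient update

print("mean distance to minimizer:",
      np.mean(np.linalg.norm(X - x_star, axis=1)))
```

The clipping step is what keeps the update bounded even when individual noise realizations are extremely large, which is the key mechanism the abstract points to; the consensus step is the standard mixing used in decentralized methods.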
