Compressed Zeroth-Order Algorithm for Stochastic Distributed Nonconvex Optimization (2503.23426v2)

Published 30 Mar 2025 in math.OC

Abstract: This paper studies the stochastic distributed nonconvex optimization problem over a network of agents, where each agent accesses only stochastic zeroth-order information about its local cost function and all agents collaboratively optimize the global objective over a bandwidth-limited communication network. To mitigate communication overhead and handle the unavailability of explicit gradient information, we propose a communication-compressed zeroth-order stochastic distributed (CZSD) algorithm. By integrating a generalized contractive compressor and a stochastic two-point zeroth-order oracle, CZSD achieves convergence rates comparable to those of its exact-communication counterpart while reducing both communication overhead and sampling complexity. Specifically, to the best of our knowledge, CZSD is the first compressed zeroth-order algorithm to achieve linear speedup, with convergence rates of $\mathcal{O}(\sqrt{p}/\sqrt{nT})$ and $\mathcal{O}(p/(nT))$ under general nonconvex settings and the Polyak–Łojasiewicz condition, respectively. Numerical experiments validate the algorithm's effectiveness and communication efficiency.
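The abstract names two building blocks, a stochastic two-point zeroth-order oracle and a contractive compressor, without spelling out their forms. The Python snippet below is a minimal sketch of these generic techniques, not the paper's CZSD implementation: it shows the standard two-point gradient estimator and top-k sparsification, a common instance of a contractive compressor. The helper names (`two_point_zo_grad`, `top_k`) and the Gaussian direction choice are assumptions; in the rates above, $p$ is presumably the problem dimension, $n$ the number of agents, and $T$ the iteration count.

```python
import numpy as np

def two_point_zo_grad(f, x, delta=1e-4, rng=None):
    """Standard two-point zeroth-order gradient estimate (a generic sketch,
    not necessarily the exact oracle used in the paper).

    Draws a random Gaussian direction u and returns
        (f(x + delta*u) - f(x - delta*u)) / (2*delta) * u,
    an unbiased estimate of the gradient of a Gaussian-smoothed version
    of f, obtained from only two function evaluations.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    return (f(x + delta * u) - f(x - delta * u)) / (2.0 * delta) * u

def top_k(v, k):
    """Top-k sparsification, a standard example of a contractive compressor:
    it satisfies ||C(v) - v||^2 <= (1 - k/p) * ||v||^2 with p = v.size,
    so only k entries (plus their indices) need to be communicated.
    """
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]  # indices of the k largest magnitudes
    out[idx] = v[idx]
    return out

if __name__ == "__main__":
    # Toy check on f(x) = 0.5 * ||x||^2, whose true gradient at x is x itself.
    f = lambda x: 0.5 * np.dot(x, x)
    x = np.ones(10)
    g_hat = np.mean([two_point_zo_grad(f, x) for _ in range(5000)], axis=0)
    print(np.round(g_hat, 1))                          # close to all ones
    print(top_k(np.array([3.0, -5.0, 1.0, 0.5]), 2))   # keeps -5 and 3, zeroes the rest
```

In compressed distributed methods of this kind, agents typically exchange compressed differences between their current iterates and locally maintained reference variables rather than raw states, which is what allows a contractive compressor to be combined with consensus updates without breaking convergence.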
