
Communication Compression for Distributed Nonconvex Optimization (2201.03930v3)

Published 11 Jan 2022 in math.OC

Abstract: This paper considers distributed nonconvex optimization in which the cost functions are distributed over agents. Noting that information compression is a key tool for reducing the heavy communication load of distributed algorithms, where agents iteratively communicate with neighbors, we propose three distributed primal--dual algorithms with compressed communication. The first two algorithms are applicable to a general class of compressors with bounded relative compression error, and the third algorithm is suitable for two general classes of compressors with bounded absolute compression error. We show that the proposed distributed algorithms with compressed communication have convergence properties comparable to those of state-of-the-art algorithms with exact communication. Specifically, we show that they can find first-order stationary points at the sublinear convergence rate $\mathcal{O}(1/T)$ when each local cost function is smooth, where $T$ is the total number of iterations, and find global optima at a linear convergence rate under the additional condition that the global cost function satisfies the Polyak--{\L}ojasiewicz condition. Numerical simulations are provided to illustrate the effectiveness of the theoretical results.
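To make the compressor classes in the abstract concrete: a standard example of a compressor with bounded relative compression error (the class used by the first two algorithms) is top-$k$ sparsification, which satisfies $\|C(x)-x\|^2 \le (1 - k/d)\|x\|^2$ for $x \in \mathbb{R}^d$. The sketch below is illustrative and not taken from the paper; the function name and dimensions are hypothetical.

```python
import numpy as np

def top_k(x, k):
    """Top-k sparsification: keep the k largest-magnitude entries, zero the rest.

    Illustrative compressor (not the paper's code). It has bounded relative
    compression error: ||C(x) - x||^2 <= (1 - k/d) ||x||^2 for x in R^d.
    """
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]  # indices of the k largest-magnitude entries
    out[idx] = x[idx]
    return out

# Numerically check the relative-error bound on a random vector.
rng = np.random.default_rng(0)
d, k = 100, 10
x = rng.standard_normal(d)
c = top_k(x, k)
err = np.linalg.norm(c - x) ** 2
bound = (1 - k / d) * np.linalg.norm(x) ** 2
print(err <= bound)
```

A compressor with bounded absolute error (the class for the third algorithm) would instead guarantee $\|C(x)-x\|^2 \le \sigma^2$ for a fixed $\sigma$ independent of $x$, e.g., deterministic rounding of each entry to a fixed grid.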
