Decentralized Composite Optimization with Compression (2108.04448v2)

Published 10 Aug 2021 in cs.LG, cs.DC, and math.OC

Abstract: Decentralized optimization and communication compression have shown great potential for accelerating distributed machine learning by mitigating the communication bottleneck in practice. While existing decentralized algorithms with communication compression mostly focus on problems with only smooth components, we study the decentralized stochastic composite optimization problem with a potentially non-smooth component. A \underline{Prox}imal gradient \underline{L}in\underline{EA}r convergent \underline{D}ecentralized algorithm with compression, Prox-LEAD, is proposed with rigorous theoretical analyses in both the general stochastic setting and the finite-sum setting. Our theorems indicate that Prox-LEAD works with arbitrary compression precision and substantially reduces the communication cost almost for free. The superiority of the proposed algorithm is demonstrated through comparisons with state-of-the-art algorithms in terms of convergence complexity and through numerical experiments. Our algorithmic framework also sheds light on compressed communication in other primal-dual algorithms by reducing the impact of inexact iterations, which might be of independent interest.
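Since the abstract names the algorithmic ingredients (proximal gradient steps, decentralized communication, and compression) without detail, a minimal sketch may help fix ideas. The following illustrates one synchronized round of decentralized proximal gradient descent with a top-k compressor and an l1 regularizer. All function names and the naive compress-then-gossip scheme are assumptions for illustration only; this is not the paper's Prox-LEAD algorithm, which embeds compression in a primal-dual scheme to retain linear convergence.

```python
import numpy as np

# Illustrative sketch (not Prox-LEAD's pseudocode): one round of decentralized
# proximal gradient descent with compressed communication for
#     min_x  (1/n) * sum_i f_i(x) + r(x),
# where node i holds a smooth local loss f_i and r is a shared non-smooth
# regularizer (here the l1 norm, handled via soft-thresholding).

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def top_k(v, k):
    """A simple top-k sparsifier, standing in for an arbitrary compressor."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def decentralized_prox_step(X, grads, W, lr, lam, k):
    """One synchronized round over all n nodes.

    X     : (n, d) array; row i is node i's current iterate
    grads : (n, d) array; row i is a stochastic gradient of f_i at X[i]
    W     : (n, n) doubly stochastic mixing matrix of the network
    lr    : step size; lam : l1 weight; k : compression level
    """
    # Each node compresses the message it sends to its neighbors.
    compressed = np.stack([top_k(x, k) for x in X])
    mixed = W @ compressed                  # gossip averaging of compressed iterates
    X_new = mixed - lr * grads              # local stochastic gradient step
    return soft_threshold(X_new, lr * lam)  # proximal step for the non-smooth r
```

Iterating such rounds with a connected mixing matrix W drives the node iterates toward consensus on a minimizer; as the abstract suggests, however, naively compressing the iterates like this generally forfeits linear convergence, which is the gap Prox-LEAD is designed to close.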

Authors (5)
  1. Yao Li (192 papers)
  2. Xiaorui Liu (50 papers)
  3. Jiliang Tang (204 papers)
  4. Ming Yan (190 papers)
  5. Kun Yuan (118 papers)
Citations (9)
