Compressed Gradient Tracking Methods for Decentralized Optimization with Linear Convergence (2103.13748v3)

Published 25 Mar 2021 in math.OC, cs.DC, cs.MA, and cs.SI

Abstract: Communication compression techniques are of growing interests for solving the decentralized optimization problem under limited communication, where the global objective is to minimize the average of local cost functions over a multi-agent network using only local computation and peer-to-peer communication. In this paper, we first propose a novel compressed gradient tracking algorithm (C-GT) that combines gradient tracking technique with communication compression. In particular, C-GT is compatible with a general class of compression operators that unifies both unbiased and biased compressors. We show that C-GT inherits the advantages of gradient tracking-based algorithms and achieves linear convergence rate for strongly convex and smooth objective functions. In the second part of this paper, we propose an error feedback based compressed gradient tracking algorithm (EF-C-GT) to further improve the algorithm efficiency for biased compression operators. Numerical examples complement the theoretical findings and demonstrate the efficiency and flexibility of the proposed algorithms.

Authors (4)
  1. Yiwei Liao (5 papers)
  2. Zhuorui Li (4 papers)
  3. Kun Huang (85 papers)
  4. Shi Pu (109 papers)
Citations (13)
