
An Exact Quantized Decentralized Gradient Descent Algorithm (1806.11536v3)

Published 29 Jun 2018 in cs.LG, cs.DC, math.OC, and stat.ML

Abstract: We consider the problem of decentralized consensus optimization, where the sum of $n$ smooth and strongly convex functions is minimized over $n$ distributed agents that form a connected network. In particular, we consider the case in which the local decision variables communicated among nodes are quantized in order to alleviate the communication bottleneck in distributed optimization. We propose the Quantized Decentralized Gradient Descent (QDGD) algorithm, in which nodes update their local decision variables by combining the quantized information received from their neighbors with their local information. We prove that under standard strong convexity and smoothness assumptions on the objective function, QDGD achieves a vanishing mean solution error under customary conditions for quantizers. To the best of our knowledge, this is the first algorithm that achieves vanishing consensus error in the presence of quantization noise. Moreover, we provide simulation results that show tight agreement between our derived theoretical convergence rate and the numerical results.
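To make the update rule concrete, here is a minimal sketch of a QDGD-style iteration. It assumes an unbiased stochastic (dithered) quantizer and a doubly stochastic mixing matrix `W`; the function names, parameters (`levels`, `scale`, `eps`, `alpha`), and the exact coefficient placement are illustrative readings of the abstract, not a verified transcription of the paper's algorithm.

```python
import numpy as np

def quantize(x, levels=2**8, scale=1.0):
    """Unbiased stochastic quantizer: E[Q(x)] = x for x in [-scale, scale].
    `levels` and `scale` are illustrative knobs, not the paper's notation."""
    step = 2 * scale / levels
    low = np.floor(x / step) * step
    p = (x - low) / step                  # probability of rounding up
    return low + step * (np.random.rand(*x.shape) < p)

def qdgd_step(x, grads, W, eps, alpha):
    """One synchronous round of a QDGD-style update (hypothetical sketch).
    x:     (n, d) array of local decision variables, one row per node
    grads: (n, d) local gradients, grads[i] = grad f_i(x[i])
    W:     (n, n) doubly stochastic mixing matrix of the network
    eps:   consensus step size; alpha: gradient step size."""
    q = quantize(x)                       # each node broadcasts Q(x_j)
    mixed = W @ q                         # combine quantized neighbor info
    return (1 - eps) * x + eps * mixed - eps * alpha * grads
```

Because the quantizer is unbiased, the quantization error enters only as zero-mean noise in the consensus term, which is consistent with the paper's claim of a vanishing mean solution error.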

Authors (4)
  1. Amirhossein Reisizadeh (14 papers)
  2. Aryan Mokhtari (95 papers)
  3. Hamed Hassani (120 papers)
  4. Ramtin Pedarsani (82 papers)
Citations (115)
