
Decentralized Gradient Tracking with Local Steps (2301.01313v1)

Published 3 Jan 2023 in math.OC, cs.DC, and cs.LG

Abstract: Gradient tracking (GT) is an algorithm designed for solving decentralized optimization problems over a network (such as training a machine learning model). A key feature of GT is a tracking mechanism that allows it to overcome data heterogeneity between nodes. We develop a novel decentralized tracking mechanism, $K$-GT, that enables communication-efficient local updates in GT while inheriting the data-independence property of GT. We prove a convergence rate for $K$-GT on smooth non-convex functions and prove that it reduces the communication overhead asymptotically by a linear factor $K$, where $K$ denotes the number of local steps. We illustrate the robustness and effectiveness of this heterogeneity correction on convex and non-convex benchmark problems and on a non-convex neural network training task with the MNIST dataset.
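To make the tracking mechanism concrete, the following is a minimal sketch of the classic gradient-tracking update that $K$-GT builds on, not the paper's $K$-GT algorithm itself. The toy problem (quadratic local objectives $f_i(x) = \tfrac{1}{2}(x - b_i)^2$ with heterogeneous $b_i$) and the ring-like mixing matrix are illustrative assumptions; each node keeps an iterate and a tracker that estimates the global gradient, so all nodes converge to the global optimum despite heterogeneous local data.

```python
import numpy as np

def gradient_tracking(b, W, eta=0.1, T=500):
    """Classic gradient tracking on toy objectives f_i(x) = 0.5*(x - b_i)^2.

    b : array of heterogeneous local optima (one per node)
    W : doubly stochastic mixing matrix of the communication graph
    Each node i keeps an iterate x_i and a tracker y_i; by construction
    the average of the trackers always equals the average local gradient,
    which is what lets GT cancel the data heterogeneity between nodes.
    """
    n = len(b)
    x = np.zeros(n)          # one scalar iterate per node
    grad = x - b             # local gradients at the current iterates
    y = grad.copy()          # trackers initialised to the local gradients
    for _ in range(T):
        x_new = W @ (x - eta * y)          # gossip step on corrected iterates
        grad_new = x_new - b
        y = W @ y + grad_new - grad        # tracker update (gradient difference)
        x, grad = x_new, grad_new
    return x

# 3 nodes with heterogeneous local optima; global optimum is mean(b) = 3.0
b = np.array([0.0, 3.0, 6.0])
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
x = gradient_tracking(b, W)
# every node's iterate approaches the global optimum 3.0
```

$K$-GT modifies this scheme so that nodes take $K$ cheap local steps (with a tracking-based correction) between communication rounds, which is the source of the factor-$K$ communication saving stated in the abstract.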

Authors (4)
  1. Yue Liu (257 papers)
  2. Tao Lin (167 papers)
  3. Anastasia Koloskova (18 papers)
  4. Sebastian U. Stich (66 papers)
Citations (28)
