Gradient Coding with Dynamic Clustering for Straggler Mitigation (2011.01922v1)

Published 3 Nov 2020 in cs.IT, cs.DC, cs.LG, eess.SP, and math.IT

Abstract: In distributed synchronous gradient descent (GD), the slowest, straggling workers are the main performance bottleneck for the per-iteration completion time. To speed up GD iterations in the presence of stragglers, coded distributed computation techniques are implemented by assigning redundant computations to workers. In this paper, we propose a novel gradient coding (GC) scheme that utilizes dynamic clustering, denoted by GC-DC, to speed up the gradient calculation. Under time-correlated straggling behavior, GC-DC aims at regulating the number of straggling workers in each cluster based on the straggler behavior in the previous iteration. We numerically show that GC-DC provides significant improvements in the average completion time (of each iteration) with no increase in the communication load compared to the original GC scheme.
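The core idea of the clustering step can be illustrated with a minimal sketch. This is an illustrative assumption, not the paper's exact GC-DC assignment rule: it simply spreads the workers that straggled in the previous iteration round-robin across clusters, so that under time-correlated straggling no single cluster is likely to exceed the number of stragglers its gradient code can tolerate. Worker IDs, the cluster count, and the `straggled_last` set are all hypothetical.

```python
def dynamic_clustering(workers, straggled_last, num_clusters):
    """Spread last iteration's stragglers evenly across clusters.

    Illustrative sketch of the dynamic-clustering idea: because straggling
    is time-correlated, a worker that straggled in iteration t is likely to
    straggle in iteration t+1, so we place those workers in different
    clusters first, then fill the remaining slots with the fast workers.
    """
    stragglers = [w for w in workers if w in straggled_last]
    fast = [w for w in workers if w not in straggled_last]
    clusters = [[] for _ in range(num_clusters)]
    # Round-robin assignment: stragglers land in distinct clusters first.
    for i, w in enumerate(stragglers + fast):
        clusters[i % num_clusters].append(w)
    return clusters

workers = list(range(8))
straggled_last = {1, 2, 5}  # hypothetical stragglers from the previous iteration
clusters = dynamic_clustering(workers, straggled_last, num_clusters=4)
# Each cluster now contains at most one of the likely stragglers.
```

A static clustering, by contrast, could leave two or three of these correlated stragglers in the same cluster, stalling that cluster's partial gradient and hence the whole iteration.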

Authors (4)
  1. Baturalp Buyukates (26 papers)
  2. Emre Ozfatura (33 papers)
  3. Sennur Ulukus (258 papers)
  4. Deniz Gunduz (506 papers)
Citations (6)
