Distributed Gradient Descent with Coded Partial Gradient Computations (1811.09271v1)

Published 22 Nov 2018 in cs.LG, cs.DC, cs.IT, eess.SP, math.IT, and stat.ML

Abstract: Coded computation techniques provide robustness against straggling servers in distributed computing, but they have several limitations: they increase decoding complexity; they ignore computations carried out by straggling servers; and, because they are typically designed to recover the full gradient, they cannot trade off the accuracy of the gradient against the per-iteration completion time. Here we introduce a hybrid approach, called coded partial gradient computation (CPGC), that combines the advantages of coded and uncoded computation schemes and reduces both the computation time and the decoding complexity.
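
The abstract describes CPGC only at a high level. For orientation, below is a minimal numpy sketch of the surrounding setting: distributed gradient descent in which data partitions are replicated across workers, some workers straggle, and the master aggregates whichever partial gradients arrive in time. The problem sizes, replication factor, straggler probability, and helper names are illustrative assumptions, and the sketch uses plain replication rather than the paper's coded construction.

```python
# Toy sketch (NOT the paper's CPGC encoding): straggler-tolerant distributed
# gradient descent for linear regression. Partitions are replicated across
# workers; the master sums whatever partial gradients arrive before the
# deadline. All sizes and probabilities below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: loss(w) = 0.5 * ||X w - y||^2
n_samples, n_features = 1200, 10
X = rng.normal(size=(n_samples, n_features))
y = X @ rng.normal(size=n_features) + 0.1 * rng.normal(size=n_samples)
w = np.zeros(n_features)

k = 6   # number of data partitions
r = 2   # each partition is replicated on r workers (simple redundancy)
parts = np.array_split(np.arange(n_samples), k)

def partial_gradient(idx, w):
    """Gradient of the loss restricted to one data partition."""
    Xp, yp = X[idx], y[idx]
    return Xp.T @ (Xp @ w - yp)

for it in range(50):
    # Each (partition, replica) pair is a worker task; a task straggles
    # (and its result is lost) with some probability.
    received = {}
    for p in range(k):
        for _ in range(r):
            if rng.random() < 0.3:
                continue  # this replica straggled past the deadline
            if p not in received:
                received[p] = partial_gradient(parts[p], w)

    # Aggregate whatever partitions arrived. If every partition is present
    # this is the full gradient; otherwise it is a partial gradient, which
    # is the accuracy / completion-time trade-off the abstract mentions.
    grad = sum(received.values())
    w -= (0.5 / n_samples) * grad

print("final loss:", 0.5 * np.linalg.norm(X @ w - y) ** 2)
```

Replacing the replication step with a coded assignment of partitions, plus a corresponding decoding step at the master, is where schemes such as CPGC differ from this toy setup.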

Authors (3)
  1. Emre Ozfatura (33 papers)
  2. Sennur Ulukus (258 papers)
  3. Deniz Gunduz (506 papers)
Citations (40)
