Distributed Gradient Descent with Coded Partial Gradient Computations (1811.09271v1)
Published 22 Nov 2018 in cs.LG, cs.DC, cs.IT, eess.SP, math.IT, and stat.ML
Abstract: Coded computation techniques provide robustness against straggling servers in distributed computing, but they have several limitations: they increase decoding complexity; they discard the computations carried out by straggling servers; and they are typically designed to recover the full gradient, so they cannot trade off the accuracy of the gradient against the per-iteration completion time. Here we introduce a hybrid approach, called coded partial gradient computation (CPGC), that combines the advantages of coded and uncoded computation schemes, reducing both the computation time and the decoding complexity.
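To make the coded-computation setting concrete, the following is a minimal sketch of classical gradient coding (in the style of Tandon et al.), the baseline family of schemes CPGC builds on, not the paper's CPGC scheme itself. Three workers each return one linear combination of partial gradients on a toy least-squares problem; the full gradient is then recoverable from any two of the three workers, so a single straggler can be ignored. The data sizes, coding coefficients, and decoding table are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: the gradient of (1/2)||Xw - y||^2 is X^T (Xw - y).
X = rng.normal(size=(9, 4))
y = rng.normal(size=9)
w = rng.normal(size=4)

# Split the data into 3 partitions; g[i] is the partial gradient on partition i.
parts = np.split(np.arange(9), 3)
g = [X[p].T @ (X[p] @ w - y[p]) for p in parts]
full_gradient = g[0] + g[1] + g[2]

# Gradient-coding assignment for n = 3 workers tolerating s = 1 straggler:
# each worker computes two partial gradients and returns one coded combination.
worker = [
    0.5 * g[0] + g[1],   # worker 1
    g[1] - g[2],         # worker 2
    0.5 * g[0] + g[2],   # worker 3
]

# Decoding coefficients for each pair of surviving (non-straggling) workers.
decode = {
    (0, 1): (2.0, -1.0),   # workers 1 and 2 survive
    (0, 2): (1.0, 1.0),    # workers 1 and 3 survive
    (1, 2): (1.0, 2.0),    # workers 2 and 3 survive
}

# Whichever single worker straggles, the remaining two suffice.
for (i, j), (a, b) in decode.items():
    recovered = a * worker[i] + b * worker[j]
    assert np.allclose(recovered, full_gradient)
print("full gradient recovered from any 2 of 3 workers")
```

This illustrates the trade-off the abstract points to: pure coded schemes like this one always wait for enough workers to recover the *full* gradient, whereas CPGC also exploits partial computations to balance gradient accuracy against per-iteration completion time.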
- Emre Ozfatura (33 papers)
- Sennur Ulukus (258 papers)
- Deniz Gunduz (506 papers)