
On Gradient Coding with Partial Recovery (2102.10163v4)

Published 19 Feb 2021 in cs.IT, cs.DC, and math.IT

Abstract: We consider a generalization of the gradient coding framework where a dataset is divided across $n$ workers and each worker transmits to a master node one or more linear combinations of the gradients over its assigned data subsets. Unlike the conventional framework which requires the master node to recover the sum of the gradients over all the data subsets in the presence of straggler workers, we relax the goal to computing the sum of at least some $\alpha$ fraction of the gradients. We begin by deriving a lower bound on the computation load of any scheme and also propose two strategies which achieve this lower bound, albeit at the cost of high communication load and a number of data partitions which can be polynomial in $n$. We then propose schemes based on cyclic assignment which utilize $n$ data partitions and have a lower communication load. When each worker transmits a single linear combination, we prove lower bounds on the computation load of any scheme using $n$ data partitions. Finally, we describe a class of schemes which achieve different intermediate operating points for the computation and communication load and provide simulation results to demonstrate the empirical performance of our schemes.
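
To make the setup concrete, below is a minimal Python sketch of the partial-recovery setting described in the abstract: $n$ data partitions assigned cyclically to $n$ workers with computation load $d$, each worker returning a single linear combination (here, an uncoded all-ones sum) of its assigned gradients. The greedy selection of non-straggler workers with disjoint assignments is an illustrative assumption, not the coded schemes or bounds from the paper; the names `cyclic_assignment` and `simulate` are hypothetical.

```python
import numpy as np

def cyclic_assignment(n, d):
    # Worker i is assigned the d consecutive partitions {i, i+1, ..., i+d-1} mod n.
    return [set((i + j) % n for j in range(d)) for i in range(n)]

def simulate(n=12, d=3, dim=4, num_stragglers=3, seed=0):
    rng = np.random.default_rng(seed)
    grads = rng.normal(size=(n, dim))       # one gradient vector per data partition
    assignment = cyclic_assignment(n, d)

    # Each worker transmits a single all-ones combination of its assigned gradients.
    messages = [grads[list(parts)].sum(axis=0) for parts in assignment]

    # A random subset of workers straggles and never responds.
    stragglers = set(rng.choice(n, size=num_stragglers, replace=False).tolist())
    alive = [i for i in range(n) if i not in stragglers]

    # The master greedily picks responding workers with pairwise disjoint
    # assignments, so their messages sum to the exact gradient sum over the
    # covered partitions (an alpha fraction of all partitions).
    covered, partial_sum = set(), np.zeros(dim)
    for i in alive:
        if assignment[i].isdisjoint(covered):
            covered |= assignment[i]
            partial_sum += messages[i]

    alpha = len(covered) / n
    exact = grads[list(covered)].sum(axis=0)
    assert np.allclose(partial_sum, exact)  # recovered sum is exact on covered partitions
    return alpha, partial_sum

if __name__ == "__main__":
    alpha, _ = simulate()
    print(f"recovered an alpha = {alpha:.2f} fraction of the gradient sum")
```

This toy scheme already exposes the trade-off the paper studies: increasing the computation load $d$ makes each message cover more partitions but also makes overlaps (and hence discarded messages) more likely, whereas the paper's coded constructions achieve better computation/communication operating points for a target recovery fraction $\alpha$.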

Citations (9)
