Securely Aggregated Coded Matrix Inversion (2301.03539v3)

Published 9 Jan 2023 in cs.IT, cs.CR, cs.NA, math.IT, and math.NA

Abstract: Coded computing is a method for mitigating the effect of straggling workers in a centralized computing network by using erasure-coding techniques. Federated learning is a decentralized model for training on data distributed across client devices. In this work we propose approximating the inverse of an aggregated data matrix, where the data is generated by the clients, similar to the federated learning paradigm, while also being resilient to stragglers. To do so, we propose a coded computing method based on gradient coding. We modify this method so that the coordinator does not access the local data at any point, while the clients access the aggregated matrix in order to complete their tasks. The network we consider is not centrally administered, and the communications which take place are secure against potential eavesdroppers.
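
The abstract combines two ingredients: straggler-tolerant aggregation of client-held data via a gradient-coding-style scheme, and approximation of the inverse of the aggregated matrix. The Python sketch below is only a toy illustration of those two ideas, assuming a simple replication (fractional-repetition) code in place of the paper's gradient coding and a Newton-Schulz iteration in place of its gradient-based inversion; it does not reproduce the paper's protocol, its decentralized coordination, or its security guarantees, and the names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each client j holds local data X_j; the quantity of interest is
# the inverse of the aggregated Gram matrix A = sum_j X_j^T X_j.
d, n_clients, rows_per_client = 5, 6, 40
local_data = [rng.normal(size=(rows_per_client, d)) for _ in range(n_clients)]
local_grams = [X.T @ X for X in local_data]
A_true = sum(local_grams)

# Replication-style (fractional-repetition) coding, a simple stand-in for the
# gradient coding used in the paper: clients are split into groups, and every
# client in a group can report the group's partial sum, so any single
# straggler per group can be ignored when decoding.
groups = [(0, 1), (2, 3), (4, 5)]
group_sums = {g: sum(local_grams[j] for j in g) for g in groups}

def aggregate(stragglers=frozenset()):
    """Recover the aggregate A from one surviving client per group."""
    total = np.zeros((d, d))
    for g in groups:
        if all(j in stragglers for j in g):
            raise RuntimeError(f"no survivor in group {g}")
        total += group_sums[g]  # any surviving group member reports this sum
    return total

A = aggregate(stragglers={1, 5})   # two stragglers, one per group: exact recovery
assert np.allclose(A, A_true)

# Approximate A^{-1} with the Newton-Schulz iteration X <- X (2I - A X);
# the scaled initial guess below guarantees convergence for any invertible A.
X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
for _ in range(50):
    X = X @ (2 * np.eye(d) - A @ X)

print("||X A - I|| =", np.linalg.norm(X @ A - np.eye(d)))
```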

Authors (3)
  1. Neophytos Charalambides (15 papers)
  2. Mert Pilanci (102 papers)
  3. Alfred Hero (67 papers)
Citations (2)
