
Private and Secure Distributed Matrix Multiplication Schemes for Replicated or MDS-Coded Servers (2106.11214v2)

Published 21 Jun 2021 in cs.IT and math.IT

Abstract: In this paper, we study the problem of \emph{private and secure distributed matrix multiplication (PSDMM)}, in which a user holds a private matrix $A$ and $N$ non-colluding servers share a library of $L$ ($L>1$) matrices $B^{(0)}, B^{(1)},\ldots,B^{(L-1)}$. The user wishes to compute $AB^{(\theta)}$ for some $\theta\in [0, L)$ without revealing any information about the matrix $A$ to the servers, while also keeping the index $\theta$ private from them. Previous work is limited to the case where the shared library (\textit{i.e.,} the matrices $B^{(0)}, B^{(1)},\ldots,B^{(L-1)}$) is stored across the servers in replicated form; such schemes are scarce in the literature, and there is still much room for improvement. In this paper, we propose two PSDMM schemes. The first is likewise limited to the case where the shared library is stored across the servers in replicated form, but it outperforms state-of-the-art schemes in that it achieves a smaller recovery threshold and download cost. The second addresses the case where the shared library is stored across the servers in MDS-coded form, which requires less storage at the servers. The second PSDMM code does not subsume the first, even when the underlying MDS code degrades to a repetition code, as the two are entirely different schemes.
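To make the PSDMM setup concrete, the following is a minimal toy sketch (not the paper's scheme) of the secrecy half of the problem: the user masks $A$ with a degree-1 random polynomial over a small prime field, sends one evaluation to each server, and interpolates $AB^{(\theta)}$ from the answers. All parameters ($N$, $L$, the prime $p$, the evaluation points) are illustrative assumptions; a real PSDMM scheme additionally hides the index $\theta$ (e.g. via private information retrieval queries), which this sketch omits.

```python
import numpy as np

p = 65537  # small prime; all arithmetic is over GF(p) (assumed for illustration)
rng = np.random.default_rng(0)

# Toy parameters: N servers, a library of L matrices (all assumptions).
N, L = 3, 4
A = rng.integers(0, p, size=(2, 2), dtype=np.int64)
library = [rng.integers(0, p, size=(2, 2), dtype=np.int64) for _ in range(L)]
theta = 2  # index of the desired product AB^(theta)

# User hides A with a degree-1 masking polynomial f(x) = A + R*x (mod p),
# where R is a uniformly random matrix, so any single share leaks nothing.
R = rng.integers(0, p, size=A.shape, dtype=np.int64)
evals = [1, 2, 3]  # distinct nonzero evaluation points, one per server
shares = [(A + R * x) % p for x in evals]

# Each server multiplies its share by B^(theta). NOTE: sending theta in the
# clear breaks index privacy; PSDMM schemes replace this with private queries.
answers = [(shares[i] @ library[theta]) % p for i in range(N)]

# answers[i] = (A + R*x_i) B = AB + (RB)*x_i is degree 1 in x_i, so any two
# answers suffice (recovery threshold 2): interpolate at x = 0 to get AB.
x0, x1 = evals[0], evals[1]
inv = pow(x1 - x0, -1, p)  # modular inverse of (x1 - x0)
AB = ((answers[0] * x1 - answers[1] * x0) % p * inv) % p

assert np.array_equal(AB, (A @ library[theta]) % p)
```

Because the answers lie on a degree-1 polynomial, 2 of the 3 servers already determine $AB^{(\theta)}$; the paper's schemes optimize exactly this recovery threshold and the download cost while also keeping $\theta$ private.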

Citations (18)
