Secure Linear Aggregation Using Decentralized Threshold Additive Homomorphic Encryption For Federated Learning (2111.10753v1)

Published 21 Nov 2021 in cs.CR

Abstract: Secure linear aggregation linearly aggregates the private inputs of different users while protecting their privacy. Through secure linear aggregation, the server in a federated learning (FL) environment can perform any linear computation on the private inputs of users. At present, the sum of user inputs in FL can be computed efficiently using pseudo-random number generators and one-time pads, but general linear computations on user inputs are not well supported. Based on decentralized threshold additive homomorphic encryption (DTAHE) schemes, this paper provides a secure linear aggregation protocol that allows the server to multiply the user inputs by arbitrary coefficients and sum them, so that the server can build a fully connected layer or a convolutional layer on top of user inputs. The protocol adopts the framework of Bonawitz et al. to tolerate users dropping out, and uses a blockchain smart contract to incentivize honest behavior by the server. The paper gives a security model, security proofs, and a concrete lattice-based DTAHE scheme for the protocol. It evaluates the communication and computation costs of known DTAHE construction methods, showing that an elliptic-curve-based DTAHE is friendly to users while the lattice-based version keeps server-side computation light.
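
The following is a minimal illustrative sketch, not the paper's DTAHE construction: it uses a toy single-key Paillier cryptosystem (with deliberately small parameters) to show the additive-homomorphic property that secure linear aggregation relies on, namely that the server can obtain an encryption of sum_i c_i * x_i from the users' ciphertexts Enc(x_i) and public coefficients c_i without seeing any plaintext. Key generation and decryption are centralized here for brevity; the paper instead distributes the decryption capability via a threshold scheme.

```python
# Toy additive-homomorphic aggregation sketch (assumed illustration, NOT the
# paper's lattice-based DTAHE). Single-key Paillier with small demo primes.
from math import gcd
import secrets


def lcm(a, b):
    return a * b // gcd(a, b)


def keygen(p=104_729, q=1_299_709):            # well-known small primes, far too small for real use
    n = p * q
    lam = lcm(p - 1, q - 1)
    # With g = n + 1, L(g^lam mod n^2) = lam mod n, so mu = lam^{-1} mod n.
    mu = pow(lam, -1, n)
    return (n,), (lam, mu, n)                   # (public key, secret key)


def encrypt(pk, m):
    (n,) = pk
    n2 = n * n
    r = secrets.randbelow(n - 2) + 2            # random blinding factor coprime to n
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 2) + 2
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2


def decrypt(sk, c):
    lam, mu, n = sk
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n              # L(u) = (u - 1) / n
    return L * mu % n


def linear_aggregate(pk, ciphertexts, coeffs):
    """Homomorphically compute Enc(sum_i c_i * x_i) from Enc(x_i) and public c_i."""
    (n,) = pk
    n2 = n * n
    acc = 1
    for ct, w in zip(ciphertexts, coeffs):
        acc = acc * pow(ct, w, n2) % n2         # exponentiation adds w * plaintext under encryption
    return acc


if __name__ == "__main__":
    pk, sk = keygen()
    inputs = [12, 7, 30]                        # private user inputs x_i (one per user)
    coeffs = [3, 5, 2]                          # server-chosen public weights c_i
    cts = [encrypt(pk, x) for x in inputs]      # each user encrypts its own input
    agg = linear_aggregate(pk, cts, coeffs)     # server aggregates without decrypting
    assert decrypt(sk, agg) == sum(c * x for c, x in zip(coeffs, inputs))  # 3*12 + 5*7 + 2*30 = 131
```

In the paper's setting the secret key is never held by a single party: decryption of the aggregate requires a threshold of users to cooperate, and the Bonawitz-style framework handles users who drop out before contributing their decryption shares.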

Citations (8)
