PPT: A Privacy-Preserving Global Model Training Protocol for Federated Learning in P2P Networks (2105.14408v2)

Published 30 May 2021 in cs.CR

Abstract: The concept of Federated Learning (FL) has emerged as a convergence of machine learning, information, and communication technology. It is vital to the development of machine learning, which is expected to be fully decentralized, privacy-preserving, secure, and robust. However, general federated learning settings with a central server cannot meet the requirements of decentralized environments. In this paper, we propose a decentralized, secure, and privacy-preserving global model training protocol, named PPT, for federated learning in Peer-to-peer (P2P) networks. PPT uses a one-hop communication form to aggregate local model update parameters and adopts a symmetric cryptosystem to ensure security. Notably, PPT modifies the Eschenauer-Gligor (E-G) scheme to distribute keys for encryption. For privacy preservation, PPT generates random noise to disturb local model update parameters. The noise is ultimately eliminated, which preserves global model performance compared with other noise-based privacy-preserving methods in FL, e.g., differential privacy. PPT also adopts game theory to resist collusion attacks. Through extensive analysis, we demonstrate that PPT resists various security threats and preserves user privacy. Experiments further demonstrate its utility and efficiency.
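
The abstract's noise-cancellation idea can be illustrated with a minimal sketch: each peer adds a random mask to its local update, the masks are constructed so they sum to zero across peers, and the noise therefore vanishes in the aggregate. This is only an illustrative toy under that assumption; the function names and the zero-sum mask construction are hypothetical and do not reflect PPT's actual E-G-keyed, one-hop protocol.

```python
import numpy as np

def make_zero_sum_masks(num_peers, dim, seed=0):
    """Generate random masks that sum to zero across peers (illustrative only)."""
    rng = np.random.default_rng(seed)
    masks = rng.normal(size=(num_peers, dim))
    masks -= masks.mean(axis=0)  # enforce column-wise sum-to-zero
    return masks

def aggregate(local_updates, masks):
    """Average the masked updates; the injected noise cancels in the sum."""
    masked = [u + m for u, m in zip(local_updates, masks)]
    return np.sum(masked, axis=0) / len(local_updates)

# Toy usage: 4 peers, 3-dimensional model updates.
updates = [np.ones(3) * i for i in range(4)]
masks = make_zero_sum_masks(num_peers=4, dim=3)
print(aggregate(updates, masks))  # approximately the plain average of the updates
```

Because the masks cancel exactly rather than being calibrated to a privacy budget, the aggregate is unperturbed, which is the property the abstract contrasts with differential-privacy-style noise.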

Citations (14)
