Privacy-Preserving Data Deduplication for Enhancing Federated Learning of Language Models (Extended Version) (2407.08152v2)

Published 11 Jul 2024 in cs.CR, cs.AI, cs.CL, and cs.LG

Abstract: Deduplication is a vital preprocessing step that enhances machine learning model performance and saves training time and energy. However, enhancing federated learning through deduplication poses challenges, especially regarding scalability and potential privacy violations if deduplication involves sharing all clients' data. In this paper, we address the problem of deduplication in a federated setup by introducing a pioneering protocol, Efficient Privacy-Preserving Multi-Party Deduplication (EP-MPD). It efficiently removes duplicates from multiple clients' datasets without compromising data privacy. EP-MPD is constructed in a modular fashion, utilizing two novel variants of the Private Set Intersection protocol. Our extensive experiments demonstrate the significant benefits of deduplication in federated learning of LLMs. For instance, we observe up to 19.62% improvement in perplexity and up to 27.95% reduction in running time while varying the duplication level between 10% and 30%. EP-MPD effectively balances privacy and performance in federated learning, making it a valuable solution for large-scale applications.
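The abstract describes EP-MPD as being built from Private Set Intersection (PSI) components. As a rough intuition for how PSI lets two clients find shared (duplicate) records without revealing the rest of their data, the following is a minimal, didactic sketch of a Diffie-Hellman-style PSI in Python. It is not the paper's EP-MPD protocol: the toy group size, the hash-to-group step, and the two-client setting are simplifications for illustration only.

```python
# Hedged sketch of Diffie-Hellman-style Private Set Intersection (PSI), the generic
# building block the paper's EP-MPD protocol is described as using. Toy parameters only.
import hashlib
import secrets

P = 1019          # toy safe prime p = 2q + 1 (real PSI uses >=2048-bit groups or elliptic curves)
Q = (P - 1) // 2  # prime order of the quadratic-residue subgroup

def hash_to_group(item: str) -> int:
    """Map a record into the order-Q subgroup by hashing, reducing mod P, and squaring."""
    digest = int.from_bytes(hashlib.sha256(item.encode()).digest(), "big")
    return pow(digest % P, 2, P)

def mask(items, secret):
    """First-round message: H(x)^secret for every item, hiding the raw records."""
    return [pow(hash_to_group(x), secret, P) for x in items]

def remask(masked_values, secret):
    """Second-round message: raise the peer's masked values to one's own secret."""
    return [pow(v, secret, P) for v in masked_values]

# Two clients whose local training corpora share some duplicate sentences.
client_a = ["the cat sat", "hello world", "federated learning"]
client_b = ["hello world", "deep nets", "federated learning"]

a_secret = secrets.randbelow(Q - 1) + 1
b_secret = secrets.randbelow(Q - 1) + 1

# Each client sends its singly masked set; the peer re-masks it and returns it,
# so both ends only ever see H(x)^(ab), never the other client's plaintext records.
a_double = remask(mask(client_a, a_secret), b_secret)   # H(x)^(ab) for A's items
b_double = remask(mask(client_b, b_secret), a_secret)   # H(y)^(ba) for B's items

# Items whose double-masked values collide are duplicates held by both clients.
duplicates = [x for x, v in zip(client_a, a_double) if v in set(b_double)]
print(duplicates)  # ['hello world', 'federated learning']
```

In a federated setting one could then apply a tie-breaking convention (for example, only the client with the smaller ID keeps each shared record before local training); that convention is an assumption here, not something specified by the abstract.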

Citations (1)
