
Over-the-Air Federated Learning with Enhanced Privacy (2212.11486v1)

Published 22 Dec 2022 in cs.CR and eess.SP

Abstract: Federated learning (FL) has emerged as a promising learning paradigm in which only local model parameters (gradients) are shared. Private user data never leaves the local devices, thus preserving data privacy. However, recent research has shown that even when local data is never shared by a user, exchanging model parameters without protection can also leak private information. Moreover, in wireless systems, the frequent transmission of model parameters can cause tremendous bandwidth consumption and network congestion when the model is large. To address this problem, we propose a new FL framework with efficient over-the-air parameter aggregation and strong privacy protection of both user data and models. We achieve this by introducing pairwise cancellable random artificial noises (PCR-ANs) on end devices. Compared to existing over-the-air computation (AirComp) based FL schemes, our design provides stronger privacy protection. We analytically show the secrecy capacity and the convergence rate of the proposed wireless FL aggregation algorithm.
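The core idea of pairwise cancellable noise can be sketched in a few lines: devices are grouped into pairs, each pair derives identical noise from a shared seed, and the two members apply it with opposite signs, so the noise vanishes in the over-the-air sum while every individual transmission stays masked. The snippet below is a minimal toy illustration of this general principle, not the paper's exact PCR-AN construction; the function name and pairing scheme are assumptions for illustration.

```python
import numpy as np

def perturbed_updates(gradients, pair_seeds):
    """Mask per-device gradients with pairwise cancellable noise (toy sketch).

    gradients:  list of equal-shape arrays, one per device (even count).
    pair_seeds: one shared seed per device pair (hypothetical key agreement).
    """
    masked = []
    for p, seed in enumerate(pair_seeds):
        rng = np.random.default_rng(seed)
        noise = rng.standard_normal(gradients[0].shape)
        masked.append(gradients[2 * p] + noise)      # first pair member: +noise
        masked.append(gradients[2 * p + 1] - noise)  # second pair member: -noise
    return masked

# Toy example with 4 devices: the channel superposition (here, a plain sum)
# recovers the true aggregate because the artificial noises cancel pairwise.
grads = [np.full(4, float(i)) for i in range(4)]
masked = perturbed_updates(grads, pair_seeds=[0, 1])
aggregate = sum(masked)
assert np.allclose(aggregate, sum(grads))  # noise cancels in the sum
```

Note that no single masked transmission reveals its underlying gradient; only the superposed sum, which AirComp computes for free on the wireless channel, is meaningful to the aggregator.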

Authors (5)
  1. Xiaochan Xue (1 paper)
  2. Moh Khalid Hasan (6 papers)
  3. Shucheng Yu (18 papers)
  4. Laxima Niure Kandel (2 papers)
  5. Min Song (25 papers)
Citations (2)
