SaFL: Sybil-aware Federated Learning with Application to Face Recognition (2311.04346v2)

Published 7 Nov 2023 in cs.CV, cs.CR, and cs.LG

Abstract: Federated Learning (FL) is a machine learning paradigm for conducting collaborative learning among clients on a joint model. The primary goal is to share clients' local training parameters with an aggregating server while preserving their privacy. This method makes it possible to exploit the potential of massive mobile users' data for the benefit of machine learning models' performance while keeping sensitive data on local devices. On the downside, FL raises security and privacy concerns that have only recently begun to be studied. To address some of the key threats in FL, researchers have proposed secure aggregation methods (e.g., homomorphic encryption, secure multiparty computation). These solutions improve some security and privacy metrics, but at the same time leave room for other serious threats such as poisoning attacks, backdoor attacks, and free-riding attacks. This paper proposes SaFL (Sybil-aware Federated Learning), a new defense against poisoning attacks in FL that minimizes the effect of sybils with a novel time-variant aggregation scheme.
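The paper does not spell out its aggregation rule in the abstract, but the core idea of a sybil-aware, time-variant scheme can be illustrated with a minimal sketch: clients whose updates persistently deviate from the round consensus lose weight over successive rounds. Everything below (the median-based outlier test, the `decay` factor, the scalar updates) is an illustrative assumption, not the method from the paper.

```python
import statistics

def aggregate(updates, trust, decay=0.5):
    """One round of a hypothetical sybil-aware, time-variant aggregation.

    updates: {client_id: update_value} (scalar updates for simplicity)
    trust:   {client_id: weight in (0, 1]}, carried across rounds
    Clients whose update deviates strongly from the round median lose
    trust over time; the aggregate is the trust-weighted mean.
    """
    median = statistics.median(updates.values())
    # Median absolute deviation as a robust spread estimate (assumption)
    mad = statistics.median(abs(u - median) for u in updates.values()) or 1e-9
    for cid, u in updates.items():
        if abs(u - median) > 3 * mad:   # flagged as a possible sybil
            trust[cid] *= decay         # down-weight it in future rounds too
    total = sum(trust[cid] for cid in updates)
    return sum(trust[cid] * u for cid, u in updates.items()) / total

# Demo: four honest clients send updates near 1.0; one sybil sends 10.0.
trust = {c: 1.0 for c in ["a", "b", "c", "d", "sybil"]}
for _ in range(3):
    agg = aggregate({"a": 1.0, "b": 1.1, "c": 0.9, "d": 1.0, "sybil": 10.0}, trust)
```

Because the down-weighting compounds across rounds, the sybil's influence on the aggregate shrinks over time while honest clients keep full weight; this captures the "time-variant" aspect, though the paper's actual scoring rule may differ.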

Authors (5)
  1. Mahdi Ghafourian (7 papers)
  2. Julian Fierrez (131 papers)
  3. Ruben Vera-Rodriguez (66 papers)
  4. Ruben Tolosana (79 papers)
  5. Aythami Morales (93 papers)
Citations (1)