Genetic CFL: Optimization of Hyper-Parameters in Clustered Federated Learning (2107.07233v3)

Published 15 Jul 2021 in cs.AI

Abstract: Federated learning (FL) is a distributed deep-learning paradigm that integrates client-server architecture, edge computing, and real-time intelligence. FL has the potential to revolutionize ML, but its practical implementation is hindered by technological limitations, communication overhead, non-IID (not independent and identically distributed) data, and privacy concerns. Training an ML model over heterogeneous non-IID data severely degrades the convergence rate and performance. Existing traditional and clustered FL algorithms exhibit two main limitations: inefficient client training and static hyper-parameter utilization. To overcome these limitations, we propose a novel hybrid algorithm, genetic clustered FL (Genetic CFL), that clusters edge devices based on their training hyper-parameters and genetically modifies the parameters cluster-wise. We then introduce an algorithm that drastically increases individual cluster accuracy by integrating density-based clustering with genetic hyper-parameter optimization. The results are benchmarked on the MNIST handwritten-digit and CIFAR-10 datasets. The proposed Genetic CFL shows significant improvements and works well in realistic cases of non-IID and ambiguous data.
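The cluster-wise genetic step described in the abstract can be illustrated with a minimal sketch: within one cluster of clients, hyper-parameters (learning rate, batch size) are ranked by a fitness score, the best survive, and the rest of the population is refilled by crossover and mutation. All function names, the placeholder fitness, and the operator details below are illustrative assumptions, not the paper's exact implementation.

```python
import random

def crossover(p1, p2):
    """Blend two (learning_rate, batch_size) parents (illustrative operator)."""
    lr = (p1[0] + p2[0]) / 2          # average the learning rates
    bs = random.choice([p1[1], p2[1]])  # inherit one parent's batch size
    return (lr, bs)

def mutate(hp, rate=0.3):
    """Randomly perturb a hyper-parameter tuple with probability `rate`."""
    lr, bs = hp
    if random.random() < rate:
        lr *= random.uniform(0.5, 2.0)
    if random.random() < rate:
        bs = max(8, int(bs * random.choice([0.5, 2.0])))
    return (lr, bs)

def evolve_cluster(hparams, fitness):
    """One genetic generation over a single cluster's hyper-parameters.

    hparams: list of (learning_rate, batch_size) tuples, one per client
    fitness: callable scoring a tuple (higher is better)
    """
    ranked = sorted(hparams, key=fitness, reverse=True)
    elite = ranked[: max(2, len(ranked) // 2)]  # keep the best half
    children = []
    while len(elite) + len(children) < len(hparams):
        p1, p2 = random.sample(elite, 2)
        children.append(mutate(crossover(p1, p2)))
    return elite + children

random.seed(0)
cluster = [(0.1, 32), (0.01, 64), (0.001, 32), (0.05, 128)]
# Placeholder fitness: prefers learning rates near 0.01, standing in for
# the validation accuracy each client would report after local training.
fit = lambda hp: -abs(hp[0] - 0.01)
new_gen = evolve_cluster(cluster, fit)
print(len(new_gen))  # population size is preserved across generations
```

In the full scheme, clients would first be grouped by a density-based method (e.g. DBSCAN over their hyper-parameter vectors), and this evolution step would run independently inside each resulting cluster between communication rounds.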

Authors (6)
  1. Shaashwat Agrawal (3 papers)
  2. Sagnik Sarkar (5 papers)
  3. Mamoun Alazab (14 papers)
  4. Praveen Kumar Reddy Maddikunta (24 papers)
  5. Thippa Reddy Gadekallu (36 papers)
  6. Quoc-Viet Pham (66 papers)
Citations (48)
