Effectively Heterogeneous Federated Learning: A Pairing and Split Learning Based Approach (2308.13849v1)

Published 26 Aug 2023 in cs.LG, cs.AI, cs.SY, and eess.SY

Abstract: As a promising paradigm for privacy-preserving machine learning, federated learning (FL) allows distributed devices to collaboratively train a model while avoiding raw-data transmission among clients. Despite its immense potential, FL suffers from training-speed bottlenecks due to client heterogeneity, which escalates training latency and causes straggling at server aggregation. To address this challenge, a novel split federated learning (SFL) framework is proposed that pairs clients with different computational resources, based on their computing capabilities and inter-client communication rates. The neural network model is split into two parts at the logical level, and each client computes only the part assigned to it, using split learning (SL) for forward inference and backward training. Moreover, to solve the client pairing problem effectively, a heuristic greedy algorithm is proposed that recasts the minimization of training latency as a graph edge-selection problem. Simulation results show that the proposed method significantly improves FL training speed and achieves high performance under both independent and identically distributed (IID) and non-IID data distributions.

Authors (6)
  1. Jinglong Shen (3 papers)
  2. Xiucheng Wang (24 papers)
  3. Nan Cheng (94 papers)
  4. Longfei Ma (11 papers)
  5. Conghao Zhou (37 papers)
  6. Yuan Zhang (331 papers)
Citations (3)
