FedDCT: A Dynamic Cross-Tier Federated Learning Framework in Wireless Networks (2307.04420v2)

Published 10 Jul 2023 in cs.DC and cs.AI

Abstract: Federated Learning (FL), as a privacy-preserving machine learning paradigm, trains a global model across devices without exposing local data. However, resource heterogeneity and inevitable stragglers in wireless networks severely impact the efficiency and accuracy of FL training. In this paper, we propose a novel Dynamic Cross-Tier Federated Learning framework (FedDCT). First, we design a dynamic tiering strategy that partitions devices into tiers based on their response times and assigns a specific timeout threshold to each tier to reduce single-round training time. Then, we propose a cross-tier device selection algorithm that selects devices that respond quickly and are conducive to model convergence, improving convergence efficiency and accuracy. Experimental results demonstrate that, in wireless networks, the proposed approach outperforms the baseline, with an average reduction of 54.7% in convergence time and an average improvement of 1.83% in convergence accuracy.
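
The tiering and selection ideas described in the abstract can be illustrated with a rough sketch. The Python snippet below is an assumption-laden illustration, not the paper's actual FedDCT algorithm: the sorted partitioning rule, the per-tier timeout computed as a multiple of the tier's slowest response time, and the "fastest few per tier" selection rule are all placeholders chosen for demonstration.

```python
# Illustrative sketch only: the partitioning, timeout, and selection rules
# below are assumptions for demonstration, not the exact FedDCT procedure.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Device:
    device_id: int
    response_time: float  # observed seconds per local training round


def assign_tiers(devices: List[Device], num_tiers: int = 3) -> Dict[int, List[Device]]:
    """Partition devices into tiers by response time (fastest tier first)."""
    ordered = sorted(devices, key=lambda d: d.response_time)
    tier_size = max(1, len(ordered) // num_tiers)
    tiers: Dict[int, List[Device]] = {}
    for t in range(num_tiers):
        start = t * tier_size
        end = None if t == num_tiers - 1 else (t + 1) * tier_size
        tiers[t] = ordered[start:end]
    return tiers


def tier_timeouts(tiers: Dict[int, List[Device]], slack: float = 1.2) -> Dict[int, float]:
    """Give each tier a timeout slightly above its slowest member's response time."""
    return {
        t: slack * max(d.response_time for d in members)
        for t, members in tiers.items()
        if members
    }


def select_devices(tiers: Dict[int, List[Device]], per_tier: int = 2) -> List[Device]:
    """Cross-tier selection: take the fastest few devices from every tier so
    slower tiers still contribute data without stalling the training round."""
    selected: List[Device] = []
    for members in tiers.values():
        selected.extend(sorted(members, key=lambda d: d.response_time)[:per_tier])
    return selected


if __name__ == "__main__":
    devices = [Device(i, rt) for i, rt in enumerate([0.8, 1.1, 2.5, 3.0, 5.2, 7.9])]
    tiers = assign_tiers(devices)
    print(tier_timeouts(tiers))
    print([d.device_id for d in select_devices(tiers)])
```

In this toy setup, re-running assign_tiers as response times change would re-partition the devices each round, which is one plausible reading of "dynamic" tiering; the paper's actual update rule and selection criterion (including how convergence benefit is measured) should be taken from the full text.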

Authors (7)
  1. Peng Liu (372 papers)
  2. Youquan Xian (10 papers)
  3. Chuanjian Yao (2 papers)
  4. Xiaoyun Gan (2 papers)
  5. Dongcheng Li (8 papers)
  6. Peng Wang (832 papers)
  7. Ying Zhao (69 papers)
