Bandwidth Allocation for Multiple Federated Learning Services in Wireless Edge Networks (2101.03627v1)

Published 10 Jan 2021 in cs.NI and cs.LG

Abstract: This paper studies a federated learning (FL) system in which multiple FL services co-exist in a wireless network and share common wireless resources, filling a gap in the existing literature on wireless resource allocation for multiple simultaneous FL services. Our method designs a two-level resource allocation framework comprising intra-service and inter-service resource allocation. The intra-service resource allocation problem aims to minimize the length of FL rounds by optimizing the bandwidth allocation among the clients of each FL service. Building on this, an inter-service resource allocation problem distributes bandwidth resources among multiple simultaneous FL services. We consider both cooperative and selfish FL service providers. For cooperative FL service providers, we design a distributed bandwidth allocation algorithm that optimizes the overall performance of multiple FL services while preserving fairness among FL services and the privacy of clients. For selfish FL service providers, a new auction scheme is designed with the FL service owners as the bidders and the network provider as the auctioneer; the scheme strikes a balance between overall FL performance and fairness. Our simulation results show that the proposed algorithms outperform other benchmarks under various network conditions.
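The abstract does not give the paper's optimization formulas, but the intra-service step can be illustrated with a minimal sketch. The snippet below assumes each client's round time has the common form computation time plus upload size divided by allocated bandwidth times spectral efficiency, and it bisects on the round length to find the smallest value whose required bandwidth shares fit within the total budget. All names and parameters (intra_service_allocation, comp_times, data_sizes, rates, total_bw) are hypothetical illustrations, not taken from the paper.

```python
import math

def intra_service_allocation(comp_times, data_sizes, rates, total_bw, tol=1e-9):
    """Hypothetical sketch of intra-service bandwidth allocation.

    Assumed model: client i finishes a round at
        T_i = comp_times[i] + data_sizes[i] / (b_i * rates[i]),
    where b_i is its bandwidth share (Hz) and rates[i] its spectral
    efficiency (bits/s per Hz). The round length is max_i T_i. For a
    candidate round length T, the minimum bandwidth client i needs is
        b_i(T) = data_sizes[i] / ((T - comp_times[i]) * rates[i]);
    T is feasible if these shares sum to at most total_bw.
    """
    def bw_needed(T):
        total = 0.0
        for t, d, r in zip(comp_times, data_sizes, rates):
            if T <= t:
                return math.inf  # client cannot even finish computing by T
            total += d / ((T - t) * r)
        return total

    # Bracket the optimal round length, then bisect on T.
    lo = max(comp_times)          # infeasible lower end
    hi = lo + 1.0
    while bw_needed(hi) > total_bw:
        hi *= 2.0                 # grow until feasible
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bw_needed(mid) <= total_bw:
            hi = mid
        else:
            lo = mid

    T = hi
    shares = [d / ((T - t) * r) for t, d, r in zip(comp_times, data_sizes, rates)]
    return T, shares


# Example: three clients of one FL service sharing 10 MHz.
round_len, shares = intra_service_allocation(
    comp_times=[0.8, 1.2, 0.5],   # seconds of local computation
    data_sizes=[4e6, 4e6, 4e6],   # bits to upload per round
    rates=[2.0, 1.5, 3.0],        # bits/s per Hz
    total_bw=10e6,                # Hz
)
print(round_len, shares)
```

The inter-service step and the auction mechanism described in the abstract would then operate on top of such per-service solutions, e.g., by deciding how much of the total bandwidth each FL service receives; those mechanisms are not sketched here.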

Authors (3)
  1. Jie Xu (467 papers)
  2. Heqiang Wang (9 papers)
  3. Lixing Chen (26 papers)
Citations (34)
