Multi-Resource Allocation for On-Device Distributed Federated Learning Systems (2211.00481v1)

Published 1 Nov 2022 in eess.SY, cs.LG, and cs.SY

Abstract: This work proposes a distributed multi-resource allocation scheme for minimizing the weighted sum of latency and energy consumption in an on-device distributed federated learning (FL) system. Each mobile device in the system engages in the model training process within a specified area and allocates its computation and communication resources for deriving and uploading parameters, respectively, to minimize the system objective subject to computation/communication budgets and a target latency requirement. In particular, mobile devices are connected via wireless TCP/IP architectures. Exploiting the structure of the optimization problem, it can be decomposed into two convex sub-problems. Drawing on Lagrangian dual and harmony search techniques, we characterize the globally optimal solution through closed-form solutions to all sub-problems, which give qualitative insights into the multi-resource tradeoff. Numerical simulations are used to validate the analysis and assess the performance of the proposed algorithm.
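The abstract describes a per-device weighted latency-energy minimization. A minimal sketch of such a formulation, using illustrative notation not taken from the paper (device index k, CPU frequency f_k for computation, transmit power p_k for uploading), might read:

$$\min_{\{f_k,\, p_k\}}\;\; \sum_{k=1}^{K} \Big( \omega_t\, T_k(f_k, p_k) + \omega_e\, E_k(f_k, p_k) \Big) \quad \text{s.t.} \quad T_k(f_k, p_k) \le T^{\max},\;\; 0 \le f_k \le f_k^{\max},\;\; 0 \le p_k \le p_k^{\max},$$

where T_k sums local computation and parameter-upload latency, E_k aggregates the corresponding energy terms, and the weights ω_t, ω_e trade latency against energy. Under this kind of structure, fixing one resource typically leaves a problem that is convex in the other, which is consistent with the two-convex-sub-problem decomposition and the Lagrangian-dual treatment mentioned in the abstract; the exact model and closed-form solutions are given in the paper itself.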

Authors (6)
  1. Yulan Gao (25 papers)
  2. Ziqiang Ye (10 papers)
  3. Han Yu (218 papers)
  4. Zehui Xiong (177 papers)
  5. Yue Xiao (46 papers)
  6. Dusit Niyato (671 papers)
Citations (6)