
Cell-Free Massive MIMO for Wireless Federated Learning (1909.12567v5)

Published 27 Sep 2019 in eess.SP, cs.IT, and math.IT

Abstract: This paper proposes a novel scheme for cell-free massive multiple-input multiple-output (CFmMIMO) networks to support any federated learning (FL) framework. This scheme requires each iteration of the FL framework, rather than all iterations, to be completed within one large-scale coherence time, which guarantees stable operation of the FL process. To show how to optimize the FL performance under this proposed scheme, we consider an existing FL framework as an example and target FL training time minimization for this framework. An optimization problem is then formulated to jointly optimize the local accuracy, transmit power, data rate, and users' processing frequency. This mixed-timescale stochastic nonconvex problem captures the complex interactions among the training time and the transmission and computation of training updates within one FL process. By employing the online successive convex approximation approach, we develop a new algorithm to solve the formulated problem with proven convergence to the neighbourhood of its stationary points. Our numerical results confirm that the presented joint design reduces the training time by up to $55\%$ over baseline approaches. They also show that the proposed CFmMIMO scheme requires the lowest training time for FL processes compared with cell-free time-division multiple access massive MIMO and collocated massive MIMO.
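The abstract mentions solving a mixed-timescale stochastic nonconvex problem via online successive convex approximation (SCA). The sketch below is only a hypothetical illustration of the generic SCA idea on a toy objective, not the paper's algorithm: the function names (grad, sca_minimize), the quadratic surrogate, the parameters rho and gamma, and the toy cost standing in for the training-time objective are all assumptions introduced here for illustration.

```python
# Hypothetical illustration of a generic successive convex approximation
# (SCA) loop on a toy nonconvex objective. This is NOT the paper's
# algorithm; names and parameters are made up for this sketch.
import numpy as np

def grad(f, x, eps=1e-6):
    """Central finite-difference gradient of a scalar function f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def sca_minimize(f, x0, rho=10.0, gamma=0.5, iters=300):
    """At each step, build a strongly convex quadratic surrogate of f
    around the current iterate, minimize it in closed form, then move a
    fraction gamma toward the surrogate minimizer (the smoothing step
    typical of online SCA-style schemes)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(f, x)
        # Surrogate: f(x_k) + g^T (x - x_k) + (rho/2) ||x - x_k||^2,
        # whose minimizer is x_k - g / rho.
        x_hat = x - g / rho
        x = x + gamma * (x_hat - x)  # convex-combination (smoothing) update
    return x

if __name__ == "__main__":
    # Toy nonconvex cost standing in for a training-time objective.
    f = lambda x: np.sin(3 * x[0]) + x[0] ** 2 + 0.5 * (x[1] - 1) ** 2
    x_star = sca_minimize(f, x0=[2.0, -1.0])
    print("approx. stationary point:", x_star, "f =", f(x_star))
```

In the paper's setting the surrogate would instead be built over the coupled variables (local accuracy, transmit power, data rate, processing frequency) under the network constraints; the sketch only conveys the surrogate-then-smooth structure of SCA.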

Citations (135)
