Completion Time Minimization of Fog-RAN-Assisted Federated Learning With Rate-Splitting Transmission (2206.01373v1)

Published 3 Jun 2022 in eess.SP, cs.IT, cs.LG, and math.IT

Abstract: This work studies federated learning (FL) over a fog radio access network (Fog-RAN), in which multiple Internet-of-Things (IoT) devices cooperatively learn a shared machine learning model by communicating with a cloud server (CS) through distributed access points (APs). Under the assumption that the fronthaul links connecting the APs to the CS have finite capacity, a rate-splitting transmission scheme is proposed at the IoT devices (IDs), which enables hybrid edge and cloud decoding of the split uplink messages. The problem of minimizing the FL completion time is tackled by jointly optimizing the rate-splitting transmission and fronthaul quantization strategies along with training hyperparameters such as the precision and the number of iterations. Numerical results show that the proposed rate-splitting transmission achieves notable gains over benchmark schemes that rely solely on edge or cloud decoding.
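To make the completion-time objective concrete, the sketch below is a toy model (not the paper's formulation or algorithm): each device's uplink payload is split between an edge-decoded stream and a cloud-decoded stream whose effective rate is capped by a finite fronthaul capacity, and the total FL completion time scales with the number of training iterations. All rates, capacities, payload sizes, and the iteration count are illustrative assumptions.

```python
import numpy as np

def round_time(payload_bits, edge_rate, cloud_rate, fronthaul_capacity, split):
    """Uplink time for one FL round for a single device (toy model).

    `split` in [0, 1] is the fraction of the payload sent on the edge-decoded
    stream; the remainder is decoded at the cloud server, so its effective
    rate is limited by both the access link and the fronthaul capacity.
    """
    edge_bits = split * payload_bits
    cloud_bits = (1.0 - split) * payload_bits
    t_edge = edge_bits / edge_rate
    t_cloud = cloud_bits / min(cloud_rate, fronthaul_capacity)
    # Both streams belong to the same round, so the round lasts as long as
    # the slower of the two.
    return max(t_edge, t_cloud)

def completion_time(num_iterations, payload_bits, edge_rate, cloud_rate,
                    fronthaul_capacity, split, compute_time):
    """Total FL completion time = iterations x (local compute + upload)."""
    per_round = compute_time + round_time(payload_bits, edge_rate, cloud_rate,
                                          fronthaul_capacity, split)
    return num_iterations * per_round

# Sweep the rate split to find the best edge/cloud fraction for these toy numbers.
splits = np.linspace(0.0, 1.0, 101)
times = [completion_time(num_iterations=100, payload_bits=1e6,
                         edge_rate=2e6, cloud_rate=5e6,
                         fronthaul_capacity=3e6, split=s, compute_time=0.05)
         for s in splits]
best = splits[int(np.argmin(times))]
print(f"best split toward edge decoding: {best:.2f}, "
      f"completion time: {min(times):.2f} s")
```

In this simplified picture, the optimal split balances the two decoding paths so neither dominates the round duration; the paper additionally optimizes fronthaul quantization and training hyperparameters, which this sketch does not model.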

Citations (12)
