Gradient Statistics Aware Power Control for Over-the-Air Federated Learning (2003.02089v3)

Published 4 Mar 2020 in eess.SP, cs.IT, cs.LG, and math.IT

Abstract: Federated learning (FL) is a promising technique that enables many edge devices to train a machine learning model collaboratively in wireless networks. By exploiting the superposition nature of wireless waveforms, over-the-air computation (AirComp) can accelerate model aggregation and hence facilitate communication-efficient FL. Due to channel fading, power control is crucial in AirComp. Prior works assume that the signals to be aggregated from each device, i.e., the local gradients, have identical statistics. In FL, however, gradient statistics vary over both training iterations and feature dimensions, and are unknown in advance. This paper studies the power control problem for over-the-air FL by taking gradient statistics into account. The goal is to minimize the aggregation error by optimizing the transmit power at each device subject to peak power constraints. We obtain the optimal policy in closed form when gradient statistics are given. Notably, we show that the optimal transmit power is continuous and monotonically decreases with the squared multivariate coefficient of variation (SMCV) of the gradient vectors. We then propose a method to estimate gradient statistics with negligible communication cost. Experimental results demonstrate that the proposed gradient-statistics-aware power control achieves higher test accuracy than existing schemes across a wide range of scenarios.
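The abstract describes the optimal policy only qualitatively: transmit power is continuous and decreases monotonically with the SMCV of the gradient vector, subject to a peak power constraint. The sketch below is a minimal illustration of that trend, assuming one common definition of the SMCV (total variance over squared mean norm) and a hypothetical truncated-channel-inversion power rule whose target level decays as 1/(1 + SMCV). The function names, the decay form, and the channel model are illustrative assumptions, not the paper's closed-form solution.

```python
import numpy as np

def smcv(grad_mean: np.ndarray, grad_var: np.ndarray) -> float:
    # One common definition of the squared multivariate coefficient of
    # variation: total variance divided by the squared norm of the mean.
    # The paper may use a different normalization (assumption).
    return float(grad_var.sum() / (grad_mean @ grad_mean))

def transmit_power(smcv_val: float, h: complex, p_max: float) -> float:
    # Hypothetical policy (assumption): truncated channel inversion whose
    # target receive level shrinks as the SMCV grows. This reproduces only
    # the qualitative behavior stated in the abstract -- power decreasing
    # in the SMCV under the peak power constraint p_max -- not the paper's
    # actual closed-form policy.
    target = 1.0 / (1.0 + smcv_val)  # illustrative decay, not the paper's form
    return min(p_max, target / abs(h) ** 2)

# Toy usage: per-dimension statistics of a 4-dimensional gradient on one device.
mu = np.array([0.5, -0.2, 0.1, 0.3])
var = np.array([0.04, 0.09, 0.01, 0.02])
print(transmit_power(smcv(mu, var), h=0.8 + 0.3j, p_max=1.0))
```

In this toy rule, a noisier gradient (larger SMCV) contributes less reliable information per unit energy, so the device backs off its transmit power rather than spending its peak budget inverting the channel.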

Authors (2)
  1. Naifu Zhang (8 papers)
  2. Meixia Tao (155 papers)
Citations (20)
