Online Federated Learning via Non-Stationary Detection and Adaptation amidst Concept Drift (2211.12578v2)

Published 22 Nov 2022 in cs.LG, cs.AI, cs.SY, eess.SY, and stat.ML

Abstract: Federated Learning (FL) is an emerging domain within the broader context of artificial intelligence research. FL methodologies assume distributed model training over a collection of clients and a server, with the main goal of learning an optimal global model under restrictions on data sharing due to privacy concerns. It is worth highlighting that the diverse existing literature in FL mostly assumes stationary data generation processes; such an assumption is unrealistic in real-world conditions, where concept drift occurs due to, for instance, seasonal or periodic observations or faults in sensor measurements. In this paper, we introduce a multiscale algorithmic framework that combines the theoretical guarantees of \textit{FedAvg} and \textit{FedOMD} in near-stationary settings with a non-stationarity detection and adaptation technique to improve FL generalization performance in the presence of concept drift. The framework achieves $\tilde{\mathcal{O}}\left( \min \left\{ \sqrt{LT},\ \Delta^{\frac{1}{3}}T^{\frac{2}{3}} + \sqrt{T} \right\} \right)$ \textit{dynamic regret} over $T$ rounds with an underlying general convex loss function, where $L$ is the number of non-stationary drifts and $\Delta$ is the cumulative magnitude of drift experienced within the $T$ rounds.
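The abstract does not specify the framework's internals, but it names its two ingredients: FedAvg-style server averaging and a restart-on-drift rule. As a rough, hypothetical illustration only (the loss, drift test, threshold, and window below are assumptions, not the paper's method), a minimal sketch might look like:

```python
import numpy as np

def fedavg_round(global_w, client_data, lr=0.1):
    """One FedAvg round: each client takes a gradient step on a local
    least-squares loss, then the server averages the client models."""
    client_ws = []
    for X, y in client_data:
        w = global_w.copy()
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
        client_ws.append(w - lr * grad)
    return np.mean(client_ws, axis=0)

def run_with_drift_detection(stream, dim, threshold=2.0, window=5):
    """Run FedAvg over a stream of per-round client datasets, restarting
    the model when the recent average loss spikes above `threshold` times
    the earlier baseline (a crude stand-in for a non-stationarity test)."""
    w = np.zeros(dim)
    losses, restarts = [], 0
    for client_data in stream:
        loss = np.mean([np.mean((X @ w - y) ** 2) for X, y in client_data])
        losses.append(loss)
        if len(losses) > 2 * window:
            recent = np.mean(losses[-window:])
            base = np.mean(losses[:-window])
            if recent > threshold * base + 1e-8:
                w = np.zeros(dim)                 # restart: forget the stale model
                losses, restarts = [], restarts + 1
        w = fedavg_round(w, client_data)
    return w, restarts
```

The paper's actual detection test and its multiscale scheduling of restarts are what yield the stated dynamic-regret guarantee; the threshold rule above is only a placeholder for that machinery.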

Authors (2)
  1. Bhargav Ganguly (8 papers)
  2. Vaneet Aggarwal (222 papers)
Citations (6)
