Federated Learning and catastrophic forgetting in pervasive computing: demonstration in HAR domain (2207.08180v1)

Published 17 Jul 2022 in cs.LG, cs.AI, and eess.SP

Abstract: Federated Learning (FL) has been introduced as a new machine learning paradigm enhancing the use of local devices. At the server level, FL regularly aggregates models learned locally on distributed clients to obtain a more general model. In this way, no private data is sent over the network, and the communication cost is reduced. However, current solutions rely on the availability of large amounts of stored data at the client side in order to fine-tune the models sent by the server. Such a setting is not realistic in mobile pervasive computing, where data storage must be kept low and data characteristics (distributions) can change dramatically. To account for this variability, a solution is to use the data regularly collected by the client to progressively adapt the received model. But such a naive approach exposes clients to the well-known problem of catastrophic forgetting. The purpose of this paper is to demonstrate this problem in the context of mobile human activity recognition on smartphones.
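The server-side aggregation the abstract describes can be illustrated with a minimal FedAvg-style sketch. This is an illustrative example, not code from the paper: the function name `fedavg` and the sample-size weighting are standard FL conventions assumed here.

```python
# Minimal sketch of server-side federated averaging: the server
# periodically combines models learned locally on clients, weighting
# each client by its number of local samples. Illustrative only.
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of client model parameters.

    client_weights: one list of per-layer np.ndarrays per client.
    client_sizes:   number of local training samples per client.
    """
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    return [
        sum(w[layer] * (n / total)
            for w, n in zip(client_weights, client_sizes))
        for layer in range(n_layers)
    ]

# Example: two clients, a one-layer "model" each.
clients = [[np.array([1.0, 2.0])], [np.array([3.0, 4.0])]]
sizes = [1, 3]  # second client has 3x the local data
global_model = fedavg(clients, sizes)
# global_model[0] -> array([2.5, 3.5])
```

Note that this aggregation assumes clients can fine-tune on stored local data; the paper's point is that when clients instead adapt continuously on freshly collected (and shifting) data, the local models drift and catastrophic forgetting emerges.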

Authors (4)
  1. Anastasiia Usmanova (3 papers)
  2. François Portet (29 papers)
  3. Philippe Lalanda (13 papers)
  4. German Vega (119 papers)
Citations (3)
