
Optimizing Federated Learning in LEO Satellite Constellations via Intra-Plane Model Propagation and Sink Satellite Scheduling (2302.13447v1)

Published 27 Feb 2023 in cs.LG and cs.NI

Abstract: Advances in satellite technology have recently seen a large number of small satellites launched into Low Earth Orbit (LEO) to collect massive data such as Earth observation imagery. The traditional approach of downloading such data to a ground station (GS) to train an ML model is undesirable due to the limited bandwidth and intermittent connectivity between LEO satellites and the GS. Satellite edge computing (SEC), on the other hand, allows each satellite to train an ML model onboard and upload only the model to the GS, which appears to be a promising concept. This paper proposes FedLEO, a novel federated learning (FL) framework that realizes the concept of SEC and overcomes the limitation (slow convergence) of existing FL-based solutions. FedLEO (1) augments the conventional FL star topology with "horizontal" intra-plane communication pathways along which model propagation among satellites takes place, and (2) optimally schedules communication between "sink" satellites and the GS by exploiting the predictability of satellite orbiting patterns. We evaluate FedLEO extensively and benchmark it against the state of the art. Our results show that FedLEO drastically expedites FL convergence without sacrificing -- in fact, it considerably increases -- the model accuracy.
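
To make the two ideas in the abstract concrete, the following is a minimal, hypothetical sketch of one FL round: satellites in each orbital plane train locally, their models are averaged along the intra-plane pathway, and the plane's sink satellite uploads at a ground-station contact time predicted from the orbit's periodicity. All function names, the linear-model local update, and the numeric parameters (orbit period, contact offsets) are illustrative assumptions, not FedLEO's actual algorithm or code.

```python
import numpy as np

def local_train(model, data, lr=0.01, epochs=1):
    """Placeholder local update: gradient descent on a least-squares objective."""
    X, y = data
    for _ in range(epochs):
        grad = X.T @ (X @ model - y) / len(y)
        model = model - lr * grad
    return model

def intra_plane_propagation(models, weights=None):
    """Aggregate models along one orbital plane (the "horizontal" pathway)."""
    if weights is None:
        weights = np.ones(len(models)) / len(models)
    return sum(w * m for w, m in zip(weights, models))

def next_gs_contact(current_time, orbit_period, contact_offset):
    """Predict the sink satellite's next GS visibility window from the
    deterministic orbit (illustrative arithmetic only)."""
    k = np.ceil((current_time - contact_offset) / orbit_period)
    return contact_offset + k * orbit_period

# --- toy simulation of one FL round over two orbital planes ---
rng = np.random.default_rng(0)
dim = 5
planes = [
    [(rng.normal(size=(20, dim)), rng.normal(size=20)) for _ in range(4)],  # plane 0
    [(rng.normal(size=(20, dim)), rng.normal(size=20)) for _ in range(4)],  # plane 1
]
global_model = np.zeros(dim)

plane_models = []
for plane_data in planes:
    # each satellite trains on its own onboard data
    local_models = [local_train(global_model.copy(), d) for d in plane_data]
    # models propagate and are aggregated within the plane
    plane_models.append(intra_plane_propagation(local_models))

# each plane's sink satellite uploads at its predicted GS contact time;
# the GS then merges the plane-level models into the new global model
contact_times = [next_gs_contact(100.0, orbit_period=95.0, contact_offset=o)
                 for o in (10.0, 40.0)]
global_model = intra_plane_propagation(plane_models)
print("predicted GS contacts (min):", contact_times)
print("updated global model:", global_model)
```

In this sketch the intra-plane aggregation replaces many individual satellite-to-GS uploads with a single upload per plane, which is the mechanism the abstract credits for faster convergence; the scheduling function only illustrates that contact times are computable in advance, not the paper's optimization itself.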

Citations (19)
