
Extend Shadowheart SGD to statistical heterogeneity

Extend the Shadowheart SGD framework and its analysis to the setting with statistical heterogeneity across workers (non-identically distributed data), while preserving both the model of arbitrary, heterogeneous computation and communication times and the regime of compressed, asynchronous centralized training.
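A standard formalization (our notation, not the paper's) makes the gap concrete. Shadowheart SGD is analyzed in the homogeneous regime, where every worker samples stochastic gradients of one shared objective; the proposed extension targets the federated finite-sum objective with per-worker data distributions:

    \min_{x \in \mathbb{R}^d} f(x), \qquad f(x) = \mathbb{E}_{\xi \sim \mathcal{D}}[f(x; \xi)]    % homogeneous: all workers sample from the same \mathcal{D}

    \min_{x \in \mathbb{R}^d} \frac{1}{n} \sum_{i=1}^{n} f_i(x), \qquad f_i(x) = \mathbb{E}_{\xi \sim \mathcal{D}_i}[f(x; \xi)]    % heterogeneous: worker i samples from its own \mathcal{D}_i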


Background

The paper focuses on device heterogeneity, that is, arbitrary and heterogeneous computation and communication times across workers, under centralized asynchronous training with compressed communication. It does not model differences in data distributions across workers, which are central to federated learning.
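To make "compressed" concrete: frameworks of this kind typically assume unbiased compressors applied to the gradients workers transmit. The following is a minimal sketch of one standard example, the Rand-K sparsifier (illustrative Python with NumPy; it shows the class of compressors usually assumed, not the paper's specific algorithm):

    import numpy as np

    def rand_k(x: np.ndarray, k: int, rng: np.random.Generator) -> np.ndarray:
        """Rand-K sparsifier (illustrative): keep k coordinates chosen
        uniformly at random and rescale by d/k, which makes the
        compressor unbiased: E[rand_k(x)] = x."""
        d = x.size
        idx = rng.choice(d, size=k, replace=False)
        out = np.zeros_like(x)
        out[idx] = x[idx] * (d / k)
        return out

    # A worker would compress its stochastic gradient before sending it
    # to the server; only the k selected coordinates (and their indices)
    # travel over the network.
    rng = np.random.default_rng(0)
    g = rng.normal(size=10)
    msg = rand_k(g, k=2, rng=rng)

Unbiasedness is what lets an analysis treat compression as additional stochastic noise, which is why this compressor class pairs naturally with asynchronous centralized training.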

The authors explicitly defer handling statistical heterogeneity, identifying it as a next step to broaden the applicability of their results to more realistic federated learning scenarios.

References

Due to our in-depth focus on device heterogeneity and the challenges that need to be overcome, we do not consider statistical heterogeneity, and leave an extension to this setup to future work.