
Synchronization and consistency in federated learning

Develop robust, efficient synchronization and communication protocols that maintain consistent model updates and convergence across heterogeneous, intermittently connected devices in federated learning.


Background

Federated learning distributes model training across many devices without centralizing raw data. The paper explains that communication disruptions, device heterogeneity, and statistical diversity pose unique systems challenges.

Among these, keeping distributed devices synchronized during training is highlighted as a core unresolved issue affecting training robustness and convergence: when some devices drop out or fall behind, the server must decide whether to wait, skip them, or aggregate stale updates, and each choice degrades either efficiency or consistency.
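The synchronization difficulty can be made concrete with a toy simulation (a minimal sketch under assumed simplifications; the function names, learning rate, and client values are illustrative, not from the paper): a FedAvg-style server averages updates only from the clients that happen to be reachable each round, so the effective training objective drifts with connectivity.

```python
import random

def local_update(weights, data_mean, lr=0.1):
    # One simulated local step: pull the weights toward this client's data mean.
    return [w - lr * (w - data_mean) for w in weights]

def fedavg_round(global_w, client_means, reachable):
    # Aggregate updates only from clients that responded this round
    # (partial participation under intermittent connectivity).
    updates = [local_update(global_w, client_means[i]) for i in reachable]
    return [sum(ws) / len(ws) for ws in zip(*updates)]

random.seed(0)
client_means = [0.0, 1.0, 2.0, 3.0]  # each client's (toy) local data mean
global_w = [10.0]                     # deliberately poor initialization

for _ in range(50):
    # Each client is reachable with probability 0.6 this round.
    reachable = [i for i in range(len(client_means)) if random.random() < 0.6]
    if reachable:  # if no device responded, the round is skipped entirely
        global_w = fedavg_round(global_w, client_means, reachable)

print(global_w[0])
```

With full participation the model would converge to the mean of all client means (1.5); with random dropouts it instead tracks a connectivity-weighted average that fluctuates round to round, illustrating why robust synchronization protocols, not just averaging, are needed for consistent convergence.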

References

Getting all the federated devices to talk to each other and stay in sync is still an open issue.

Modern Computing: Vision and Challenges (2401.02469 - Gill et al., 4 Jan 2024) in Section 4.4 (Decentralized Computing → Technologies/Impact Areas) — Federated Learning