WHALE-FL: Wireless and Heterogeneity Aware Latency Efficient Federated Learning over Mobile Devices via Adaptive Subnetwork Scheduling
Abstract: As a popular distributed learning paradigm, federated learning (FL) over mobile devices enables numerous applications, but its practical deployment is hindered by the computing and communication heterogeneity of participating devices. Pioneering research efforts proposed to extract subnetworks from the global model and to assign each device as large a subnetwork as possible for local training, based on its full computing and communication capacity. Although such fixed-size subnetwork assignment enables FL training over heterogeneous mobile devices, it is unaware of (i) the dynamic changes in devices' communication and computing conditions and (ii) the FL training progress and its evolving requirements for local training contributions, both of which can substantially prolong FL training. Motivated by these dynamics, in this paper we develop a wireless and heterogeneity aware latency efficient FL (WHALE-FL) approach that accelerates FL training through adaptive subnetwork scheduling. Instead of sticking to a fixed-size subnetwork, WHALE-FL introduces a novel subnetwork selection utility function that captures device and FL training dynamics, and guides each mobile device to adaptively select its subnetwork size for local training based on (a) its computing and communication capacity, (b) its dynamic computing and/or communication conditions, and (c) the FL training status and its corresponding requirements for local training contributions. Our evaluation shows that, compared with peer designs, WHALE-FL effectively accelerates FL training without sacrificing learning accuracy.
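To make the idea concrete, the selection rule described above can be sketched as a per-device utility that trades off a subnetwork's expected local-training contribution against the round latency it induces under the device's current conditions. The sketch below is an illustrative assumption, not the paper's actual utility function: the quadratic cost scaling with width ratio, the contribution exponent `gamma * progress`, and all parameter names are hypothetical choices made for this example.

```python
def estimated_round_latency(ratio, compute_speed, bandwidth,
                            flops_full, bits_full, overhead=1.0):
    """Estimate one local round's latency for a width-scaled subnetwork.

    Assumes (hypothetically) that a width ratio r scales both compute and
    communicated parameters roughly by r^2, plus a fixed per-round overhead.
    """
    compute_time = (ratio ** 2) * flops_full / compute_speed
    comm_time = (ratio ** 2) * bits_full / bandwidth
    return overhead + compute_time + comm_time


def select_subnetwork(ratios, compute_speed, bandwidth,
                      flops_full, bits_full, progress, gamma=3.0):
    """Pick the subnetwork width ratio maximizing an illustrative utility.

    progress in [0, 1] tracks FL training maturity: early on, small
    subnetworks already contribute useful updates, so the contribution
    term is flat in r and the latency term dominates; as training
    matures, the contribution term increasingly rewards larger
    subnetworks.
    """
    def utility(r):
        contribution = r ** (gamma * progress)  # hypothetical weighting
        latency = estimated_round_latency(r, compute_speed, bandwidth,
                                          flops_full, bits_full)
        return contribution / latency

    return max(ratios, key=utility)
```

Under this toy utility, a device with fixed conditions drifts from the smallest candidate subnetwork early in training toward the full model late in training, which mirrors the adaptive behavior the abstract describes; the real WHALE-FL utility additionally reacts to per-round changes in the device's computing and communication conditions.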