CC-FedAvg: Computationally Customized Federated Averaging (2212.13679v3)
Abstract: Federated learning (FL) is an emerging paradigm for training models with distributed data from numerous Internet of Things (IoT) devices. It implicitly assumes uniform capacity among participants. In practice, however, participants have diverse computational resources due to differing conditions such as energy budgets or concurrent unrelated tasks. Participants with insufficient computation budgets must plan the use of their restricted computational resources appropriately; otherwise they cannot complete the entire training procedure, and model performance declines. To address this issue, we propose a strategy for estimating local models without computationally intensive iterations. Building on it, we propose Computationally Customized Federated Averaging (CC-FedAvg), which lets each participant decide in every round, based on its current computational budget, whether to perform traditional local training or model estimation. Both theoretical analysis and extensive experiments indicate that CC-FedAvg attains the same convergence rate as FedAvg and comparable performance in the absence of resource constraints. Moreover, CC-FedAvg can be viewed as a computation-efficient version of FedAvg that retains model performance while considerably lowering computation overhead.
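The abstract does not spell out the estimation rule, so the following is a minimal NumPy sketch of one CC-FedAvg-style round under a stated assumption: a client that skips training estimates its local model by replaying its most recently recorded local update. The function names, the least-squares task, and this estimation rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def local_sgd(model, data, lr=0.1, steps=5):
    """Traditional local training: a few gradient steps on a least-squares loss."""
    X, y = data
    w = model.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of (1/2n) * ||Xw - y||^2
        w -= lr * grad
    return w

def cc_fedavg_round(global_model, clients, budgets, last_updates):
    """One hypothetical CC-FedAvg round: each client trains if its budget
    allows, otherwise estimates its local model with no iterations by
    reusing its last recorded update (assumed estimation rule)."""
    local_models = []
    for i, data in enumerate(clients):
        if budgets[i] > 0:  # sufficient budget: perform real local training
            w_i = local_sgd(global_model, data)
            last_updates[i] = w_i - global_model  # record update for later reuse
        else:  # budget exhausted: estimate the local model instead
            w_i = global_model + last_updates[i]
        local_models.append(w_i)
    return np.mean(local_models, axis=0)  # FedAvg aggregation (equal weights)

# Toy usage: four clients sharing one underlying model, random per-round budgets.
rng = np.random.default_rng(0)
d, n = 5, 100
w_true = rng.normal(size=d)
clients = []
for _ in range(4):
    X = rng.normal(size=(n, d))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=n)))

w = np.zeros(d)
last_updates = [np.zeros(d) for _ in clients]  # zero update until first training
for t in range(20):
    budgets = rng.integers(0, 2, size=len(clients))  # 0 = skip, 1 = train
    w = cc_fedavg_round(w, clients, budgets, last_updates)
print("distance to optimum:", np.linalg.norm(w - w_true))
```

Even with roughly half of the local training rounds replaced by zero-cost estimates, the sketch still drives the global model toward the optimum, which is the trade-off the abstract claims: comparable performance at a fraction of the computation.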
Authors: Hao Zhang, Tingting Wu, Siyao Cheng, Jie Liu