Tackling Computational Heterogeneity in FL: A Few Theoretical Insights (2307.06283v1)
Abstract: The future of machine learning lies in moving both data collection and training to the edge. Federated Learning (FL) has recently been proposed to achieve this goal. The principle of this approach is to aggregate models learned over a large number of distributed clients, i.e., resource-constrained mobile devices that collect data from their environment, to obtain a new, more general model. The latter is then redistributed to clients for further training. A key feature that distinguishes federated learning from data-center-based distributed training is its inherent heterogeneity. In this work, we introduce and analyze a novel aggregation framework that allows for formalizing and tackling computational heterogeneity in federated optimization, in terms of both heterogeneous data and local updates. The proposed aggregation algorithms are extensively analyzed from both theoretical and experimental perspectives.
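The aggregation principle described in the abstract can be illustrated with a minimal FedAvg-style sketch: the server forms a weighted average of client parameter vectors, with weights proportional to local data sizes (one simple way to account for data heterogeneity). This is only an illustrative baseline, not the paper's proposed framework; the names `aggregate`, `client_models`, and `data_sizes` are assumptions for the example.

```python
def aggregate(client_models, data_sizes):
    """Weighted average of client model parameters (FedAvg-style).

    client_models: list of parameter vectors (lists of floats), one per client.
    data_sizes:    number of local samples each client trained on, used as
                   aggregation weights.
    """
    total = sum(data_sizes)
    dim = len(client_models[0])
    global_model = [0.0] * dim
    for params, n in zip(client_models, data_sizes):
        w = n / total
        for i in range(dim):
            global_model[i] += w * params[i]
    return global_model

# Example: two clients with unequal data; the data-heavier client dominates.
clients = [[1.0, 1.0], [3.0, 5.0]]
sizes = [1, 3]
print(aggregate(clients, sizes))  # [2.5, 4.0]
```

The resulting `global_model` would then be redistributed to clients for the next round of local training, as the abstract describes.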
- Adnan Ben Mansour
- Gaia Carenini
- Alexandre Duplessis