
Factor-Assisted Federated Learning for Personalized Optimization with Heterogeneous Data (2312.04281v1)

Published 7 Dec 2023 in stat.ML and cs.LG

Abstract: Federated learning is an emerging distributed machine learning framework that aims to protect data privacy. Data heterogeneity is one of the core challenges in federated learning, as it can severely degrade the convergence rate and prediction performance of deep neural networks. To address this issue, we develop a novel personalized federated learning framework for heterogeneous data, which we refer to as FedSplit. The framework is motivated by the finding that data on different clients contain both common knowledge and personalized knowledge, so the hidden elements in each neural layer can be split into a shared group and a personalized group. With this decomposition, a novel objective function is established and optimized. We demonstrate, both theoretically and empirically, that FedSplit enjoys a faster convergence rate than the standard federated learning method. The generalization bound of the FedSplit method is also studied. To implement the proposed method on real datasets, factor analysis is introduced to facilitate the decoupling of hidden elements. This yields a practical implementation of FedSplit, which we further refer to as FedFac. Simulation studies show that factor analysis can well recover the underlying shared/personalized decomposition. The superior prediction performance of FedFac is further verified empirically through comparisons with various state-of-the-art federated learning methods on several real datasets.
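The core idea behind FedFac's decoupling step can be sketched as follows. This is a minimal, hypothetical illustration (all names and the thresholding rule are assumptions, not taken from the paper's code): fit a factor analysis model to hidden-unit activations pooled across clients, then flag units whose loadings on the common factor are large as "shared" and treat the rest as "personalized".

```python
# Illustrative sketch only: factor analysis to separate shared vs.
# personalized hidden units, in the spirit of FedFac. The data
# generation, single-factor choice, and mean-loading threshold are
# all assumptions for demonstration, not the paper's exact procedure.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_samples, n_shared, n_personal = 200, 8, 8

# Simulate activations: the first 8 units are driven by one common
# factor (shared knowledge); the last 8 are client-specific noise.
common_factor = rng.normal(size=(n_samples, 1))
shared_part = common_factor @ (2.0 * np.ones((1, n_shared))) \
    + 0.1 * rng.normal(size=(n_samples, n_shared))
personal_part = rng.normal(size=(n_samples, n_personal))
activations = np.hstack([shared_part, personal_part])

# Fit a one-factor model and inspect the absolute loadings per unit.
fa = FactorAnalysis(n_components=1, random_state=0).fit(activations)
loadings = np.abs(fa.components_[0])

# Simple threshold: units loading above the mean are marked "shared".
shared_mask = loadings > loadings.mean()
print("shared units:", np.where(shared_mask)[0])
print("personalized units:", np.where(~shared_mask)[0])
```

Under this simulation the recovered split matches the ground truth: the strongly-loading units are exactly those generated from the common factor, mirroring the abstract's claim that factor analysis can recover the underlying shared/personalized decomposition.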

