PeFLL: Personalized Federated Learning by Learning to Learn (2306.05515v4)

Published 8 Jun 2023 in cs.LG

Abstract: We present PeFLL, a new personalized federated learning algorithm that improves over the state-of-the-art in three aspects: 1) it produces more accurate models, especially in the low-data regime, and not only for clients present during its training phase, but also for any that may emerge in the future; 2) it reduces the amount of on-client computation and client-server communication by providing future clients with ready-to-use personalized models that require no additional finetuning or optimization; 3) it comes with theoretical guarantees that establish generalization from the observed clients to future ones. At the core of PeFLL lies a learning-to-learn approach that jointly trains an embedding network and a hypernetwork. The embedding network is used to represent clients in a latent descriptor space in a way that reflects their similarity to each other. The hypernetwork takes as input such descriptors and outputs the parameters of fully personalized client models. In combination, both networks constitute a learning algorithm that achieves state-of-the-art performance in several personalized federated learning benchmarks.
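To make the described pipeline concrete, below is a minimal sketch of the two-network design from the abstract: an embedding network that pools a client's examples into a latent descriptor, and a hypernetwork that maps that descriptor to the parameters of a personalized model, with both trained jointly end-to-end. This is an illustrative reconstruction only; the layer sizes, the mean-pooling encoder, and the choice of a linear personal model are assumptions for brevity, not the paper's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Maps a batch of one client's examples to a single latent descriptor.

    A per-example encoder followed by mean pooling (a Deep-Sets-style choice)
    is assumed here; the paper's exact architecture may differ.
    """
    def __init__(self, in_dim=784, desc_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, desc_dim)
        )

    def forward(self, x):                    # x: (n_examples, in_dim)
        return self.encoder(x).mean(dim=0)   # (desc_dim,) client descriptor

class HyperNet(nn.Module):
    """Maps a client descriptor to the parameters of a personal model
    (a linear classifier here, purely to keep the sketch small)."""
    def __init__(self, desc_dim=32, in_dim=784, n_classes=10):
        super().__init__()
        self.in_dim, self.n_classes = in_dim, n_classes
        n_params = in_dim * n_classes + n_classes  # weight matrix + bias
        self.generator = nn.Sequential(
            nn.Linear(desc_dim, 128), nn.ReLU(), nn.Linear(128, n_params)
        )

    def forward(self, descriptor):
        theta = self.generator(descriptor)
        W = theta[: self.in_dim * self.n_classes].view(self.n_classes, self.in_dim)
        b = theta[self.in_dim * self.n_classes:]
        return W, b

# Joint learning-to-learn step: the loss of the generated personal model
# backpropagates through both the hypernetwork and the embedding network.
embed, hyper = EmbeddingNet(), HyperNet()
opt = torch.optim.Adam(list(embed.parameters()) + list(hyper.parameters()), lr=1e-3)

x = torch.randn(20, 784)                 # one client's toy data
y = torch.randint(0, 10, (20,))
W, b = hyper(embed(x))                   # personalized model, no local finetuning
loss = F.cross_entropy(x @ W.t() + b, y)
opt.zero_grad()
loss.backward()
opt.step()
```

Note how this matches the abstract's second claim: once the two networks are trained, a previously unseen client obtains a ready-to-use personalized model from a single forward pass through them, with no additional on-client optimization.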
