
FedSheafHN: Personalized Federated Learning on Graph-structured Data (2405.16056v3)

Published 25 May 2024 in cs.LG

Abstract: Personalized subgraph federated learning (FL) customizes graph neural networks (GNNs) to individual clients' needs, accommodating diverse data distributions. However, applying hypernetworks in FL to facilitate model personalization often runs into difficulty because client-specific characteristics are inadequately represented. To overcome these limitations, we propose FedSheafHN, a model that combines enhanced collaboration-graph embedding with efficient personalized model parameter generation. Specifically, our model embeds each client's local subgraph into a server-constructed collaboration graph and applies sheaf diffusion on that graph to learn client representations, improving the integration and interpretation of complex client characteristics. Personalized models are then generated by an advanced hypernetwork optimized for parallel operation across clients. Empirical evaluations demonstrate that FedSheafHN outperforms existing methods in most scenarios in terms of client model performance on various graph-structured datasets. It also converges quickly and generalizes effectively to new clients.
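The abstract describes a three-step server-side pipeline: embed each client's local subgraph, run sheaf diffusion over a server-constructed collaboration graph, and feed the resulting client representations through a hypernetwork that emits per-client model parameters in parallel. The following is a minimal numpy sketch of that flow under illustrative assumptions: all dimensions are made up, the restriction maps defining the sheaf Laplacian are random rather than learned, and the hypernetwork is a single toy linear layer. It is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (illustrative, not from the paper).
num_clients = 4   # clients, each holding a local subgraph
embed_dim = 8     # per-client representation size (used as the sheaf stalk dimension)
param_dim = 16    # flattened size of one client's personalized GNN parameters

# Step 1: each client summarizes its local subgraph as a vector
# (a stand-in for the paper's learned subgraph embedding).
client_embeds = rng.normal(size=(num_clients, embed_dim))

def sheaf_diffusion_step(X, restriction_maps, alpha=0.5):
    """One simplified diffusion step x <- x - alpha * L_F x on a fully
    connected collaboration graph, where L_F is the sheaf Laplacian built
    from d x d restriction maps (random here; learned in the model)."""
    n, d = X.shape
    L = np.zeros((n * d, n * d))
    for u in range(n):
        for v in range(n):
            if u == v:
                continue
            Fu = restriction_maps[u, v]  # map from node u's stalk onto edge (u, v)
            Fv = restriction_maps[v, u]  # map from node v's stalk onto edge (u, v)
            L[u*d:(u+1)*d, u*d:(u+1)*d] += Fu.T @ Fu   # diagonal block
            L[u*d:(u+1)*d, v*d:(v+1)*d] -= Fu.T @ Fv   # off-diagonal block
    x = X.reshape(-1)
    return (x - alpha * L @ x / n).reshape(n, d)

# Step 2: the server runs sheaf diffusion on the collaboration graph
# to refine the client representations.
maps = rng.normal(size=(num_clients, num_clients, embed_dim, embed_dim)) * 0.1
refined = sheaf_diffusion_step(client_embeds, maps)

# Step 3: a toy single-layer hypernetwork maps every refined client
# representation to that client's flattened model parameters in one
# batched matrix product, i.e. in parallel across clients.
W_hn = rng.normal(size=(embed_dim, param_dim)) * 0.1
client_params = refined @ W_hn  # one parameter vector per client
```

In the full model each `client_params[i]` would be reshaped into the weights of client i's GNN and sent back for local fine-tuning; the point of the batched matrix product is that all clients' parameters come out of a single forward pass.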

Authors (5)
  1. Wenfei Liang (6 papers)
  2. Yanan Zhao (13 papers)
  3. Rui She (37 papers)
  4. Yiming Li (199 papers)
  5. Wee Peng Tay (101 papers)

