
MAPL: Model Agnostic Peer-to-peer Learning (2403.19792v1)

Published 28 Mar 2024 in cs.LG, cs.AI, cs.CR, and cs.DC

Abstract: Effective collaboration among heterogeneous clients in a decentralized setting is a rather unexplored avenue in the literature. To structurally address this, we introduce Model Agnostic Peer-to-peer Learning (MAPL), a novel approach to simultaneously learn heterogeneous personalized models as well as a collaboration graph through peer-to-peer communication among neighboring clients. MAPL comprises two main modules: (i) local-level Personalized Model Learning (PML), leveraging a combination of intra- and inter-client contrastive losses; and (ii) network-wide decentralized Collaborative Graph Learning (CGL), dynamically refining collaboration weights in a privacy-preserving manner based on local task similarities. Our extensive experiments demonstrate the efficacy of MAPL and its competitive (or, in most cases, superior) performance compared to its centralized model-agnostic counterparts, without relying on any central server. Our code is available at: https://github.com/SayakMukherjee/MAPL
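
To make the two modules concrete, the sketch below illustrates one plausible reading of the abstract: a local contrastive objective combining an intra-client supervised term with an inter-client prototype-alignment term (PML), and a neighbor-weighting step that maps task similarity to collaboration weights on the simplex (CGL). All function names, shapes, and update rules here are illustrative assumptions, not the authors' implementation; the actual method is in the linked repository.

```python
# Hypothetical sketch of MAPL's two modules, inferred only from the abstract.
# Names and formulas are assumptions for illustration, not the paper's code
# (see https://github.com/SayakMukherjee/MAPL for the real implementation).
import torch
import torch.nn.functional as F


def pml_loss(local_emb, labels, peer_protos, temperature=0.5):
    """Toy Personalized Model Learning loss: an intra-client supervised
    contrastive term plus an inter-client term pulling embeddings toward
    same-class prototypes received from neighbors (assumed formulation)."""
    z = F.normalize(local_emb, dim=1)                      # [B, d]
    sim = z @ z.t() / temperature                          # pairwise similarities
    pos_mask = (labels[:, None] == labels[None, :]).float()
    pos_mask.fill_diagonal_(0)

    # Intra-client: supervised contrastive cross-entropy over positive pairs.
    logits = sim - sim.max(dim=1, keepdim=True).values.detach()
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    intra = -(pos_mask * log_prob).sum(1) / pos_mask.sum(1).clamp(min=1)

    # Inter-client: cosine distance to the neighbor prototype of each label.
    proto = F.normalize(peer_protos[labels], dim=1)         # [B, d]
    inter = 1.0 - (z * proto).sum(1)

    return (intra + inter).mean()


def cgl_weights(my_proto, neighbour_protos, temperature=0.1):
    """Toy Collaborative Graph Learning step: score each neighbor by task
    similarity (prototype cosine similarity) and normalize onto the simplex,
    so the weights can drive aggregation without sharing raw data."""
    sims = F.cosine_similarity(my_proto[None, :], neighbour_protos, dim=1)
    return torch.softmax(sims / temperature, dim=0)          # sums to 1
```

Under these assumptions, each client would alternate between minimizing `pml_loss` on its private data and recomputing `cgl_weights` from prototypes exchanged with neighbors, so that the collaboration graph adapts to local task similarity as training proceeds.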

