Replica Tree-based Federated Learning using Limited Data (2312.17159v1)

Published 28 Dec 2023 in cs.LG

Abstract: Learning from limited data has been extensively studied in machine learning, since deep neural networks achieve optimal performance when trained on large numbers of samples. Although various strategies have been proposed for centralized training, the topic of federated learning with small datasets remains largely unexplored. Moreover, in realistic scenarios, such as settings involving medical institutions, the number of participating clients is also constrained. In this work, we propose a novel federated learning framework, named RepTreeFL. At the core of the solution is the concept of a replica, where we replicate each participating client by copying its model architecture and perturbing its local data distribution. Our approach enables learning from limited data and a small number of clients by aggregating a larger number of models with diverse data distributions. Furthermore, we leverage the hierarchical structure of the client network (both original and virtual), alongside the model diversity across replicas, and introduce a diversity-based tree aggregation, where replicas are combined in a tree-like manner and the aggregation weights are dynamically updated based on the model discrepancy. We evaluated our method on two tasks and two types of data, graph generation and image classification (binary and multi-class), with both homogeneous and heterogeneous model architectures. Experimental results demonstrate the effectiveness and superior performance of RepTreeFL in settings where both data and clients are limited. Our code is available at https://github.com/basiralab/RepTreeFL.
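The replica and tree-aggregation ideas described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); the perturbation scheme (additive Gaussian noise) and the discrepancy-based weighting rule (L2 distance from the mean model) are illustrative assumptions, and models are represented as flat weight vectors for simplicity.

```python
import numpy as np

def make_replicas(model_weights, local_data, n_replicas, noise_scale=0.1, seed=0):
    # Each replica copies the client's model and perturbs its local data
    # (Gaussian noise is an illustrative choice of perturbation).
    rng = np.random.default_rng(seed)
    return [
        (model_weights.copy(),
         local_data + rng.normal(0.0, noise_scale, size=local_data.shape))
        for _ in range(n_replicas)
    ]

def discrepancy_weights(weight_vectors):
    # Weight each model by its L2 distance from the mean model --
    # one plausible reading of discrepancy-based dynamic weighting.
    stacked = np.stack(weight_vectors)
    dist = np.linalg.norm(stacked - stacked.mean(axis=0), axis=1)
    total = dist.sum()
    if total == 0.0:  # identical models: fall back to uniform averaging
        return np.full(len(weight_vectors), 1.0 / len(weight_vectors))
    return dist / total

def tree_aggregate(clients):
    # Two-level tree: first aggregate each client's replicas into a
    # per-client model, then aggregate those into the global model.
    per_client = []
    for replicas in clients:
        vecs = [w for w, _ in replicas]
        a = discrepancy_weights(vecs)
        per_client.append(sum(ai * vi for ai, vi in zip(a, np.stack(vecs))))
    a = discrepancy_weights(per_client)
    return sum(ai * vi for ai, vi in zip(a, np.stack(per_client)))
```

In this toy setting the tree has one replica level below the clients; the paper's hierarchy generalizes this to deeper client networks of original and virtual nodes.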

