
Federated Multi-Task Learning on Non-IID Data Silos: An Experimental Study (2402.12876v2)

Published 20 Feb 2024 in cs.LG, cs.CR, and cs.DC

Abstract: Federated Multi-Task Learning (FMTL) combines the benefits of Federated Learning (FL) and Multi-Task Learning (MTL), enabling collaborative model training on multi-task datasets. However, the field currently lacks a comprehensive evaluation method that integrates the distinctive features of both FL and MTL. This paper fills that gap by introducing FMTL-Bench, a novel framework for systematic evaluation of the FMTL paradigm. The benchmark covers the data, model, and optimization-algorithm levels and comprises seven sets of comparative experiments spanning a wide array of non-independent and identically distributed (Non-IID) data partitioning scenarios. We propose a systematic process for comparing baselines across diverse indicators and conduct a case study on communication cost, time, and energy consumption. Through these exhaustive experiments, we aim to provide valuable insights into the strengths and limitations of existing baseline methods, contributing to the ongoing discourse on applying FMTL effectively in practical scenarios. The source code is available at https://github.com/youngfish42/FMTL-Benchmark .
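The Non-IID data silos mentioned in the abstract are commonly simulated in FL benchmarks via Dirichlet label-skew partitioning, where a concentration parameter alpha controls how unevenly each class is spread across clients. This is a minimal illustrative sketch of that standard technique, not the paper's actual partitioning code; the function name and parameters are our own.

```python
import numpy as np

def dirichlet_partition(labels, num_clients, alpha, seed=0):
    """Split sample indices across clients with Dirichlet label skew.

    Smaller alpha -> more concentrated class proportions -> more Non-IID.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        cls_idx = rng.permutation(np.flatnonzero(labels == cls))
        # Draw this class's proportion for every client from Dirichlet(alpha).
        props = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(props)[:-1] * len(cls_idx)).astype(int)
        for client, shard in zip(client_indices, np.split(cls_idx, cuts)):
            client.extend(shard.tolist())
    return [np.array(sorted(idx)) for idx in client_indices]

# Toy dataset: 10 classes, 100 samples each, split over 4 client silos.
labels = np.repeat(np.arange(10), 100)
parts = dirichlet_partition(labels, num_clients=4, alpha=0.5)
```

With small alpha (e.g. 0.1) each client ends up dominated by a few classes; with large alpha (e.g. 100) the split approaches IID.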
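At the optimization-algorithm level, a common FMTL baseline pattern keeps task-specific heads local to each client while federating a shared encoder via FedAvg-style weighted averaging. The sketch below illustrates only that generic aggregation step on NumPy parameter dicts; the names (`fedavg_shared`, `"enc.w"`) are illustrative assumptions, not APIs from the benchmark.

```python
import numpy as np

def fedavg_shared(client_params, weights):
    """Weighted average of shared-encoder parameters across clients.

    client_params: list of {param_name: ndarray} dicts, one per client.
    weights: per-client weights (e.g. local dataset sizes).
    """
    total = sum(weights)
    return {
        name: sum(w * p[name] for w, p in zip(weights, client_params)) / total
        for name in client_params[0]
    }

# Two client silos; only the shared encoder is sent to the server,
# each client's task head never leaves the silo.
client_a = {"enc.w": np.ones((2, 2))}
client_b = {"enc.w": np.zeros((2, 2))}
global_enc = fedavg_shared([client_a, client_b], weights=[100, 300])
# Each entry: 100/400 * 1 + 300/400 * 0 = 0.25
```

Weighting by local dataset size mirrors the original FedAvg rule; the personalized heads are what distinguish the multi-task setting from plain FL.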

Authors (6)
  1. Yuwen Yang
  2. Yuxiang Lu
  3. Suizhi Huang
  4. Shalayiding Sirejiding
  5. Hongtao Lu
  6. Yue Ding