Advances in Robust Federated Learning: Heterogeneity Considerations (2405.09839v1)
Abstract: In heterogeneous federated learning (FL), the key challenge is to efficiently and collaboratively train models across multiple clients that differ in data distributions, model structures, task objectives, computational capabilities, and communication resources. This diversity introduces significant heterogeneity, which increases the complexity of model training. In this paper, we first outline the basic concepts of heterogeneous federated learning and summarize its research challenges along five aspects: data, model, task, device, and communication. We then explore how existing state-of-the-art approaches cope with this heterogeneity, categorizing and reviewing them at three levels: data-level, model-level, and architecture-level. Subsequently, we examine privacy-preserving strategies in heterogeneous federated learning environments. Finally, we discuss current open issues and directions for future research, aiming to promote the further development of heterogeneous federated learning.
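To make the collaborative-training setting concrete, the following is a minimal sketch of one possible simulation of FedAvg-style training (McMahan et al., 2017) over clients with heterogeneous, non-IID data. The linear model, client data generation, and all hyperparameters are illustrative assumptions, not details taken from the survey:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """Run local gradient steps for a least-squares loss on one client."""
    w = w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

# Simulate feature skew: each client draws inputs from a shifted distribution.
true_w = np.array([2.0, -1.0])
clients = []
for shift in (-2.0, 0.0, 2.0):
    X = rng.normal(loc=shift, size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w_global = np.zeros(2)
for _ in range(30):  # communication rounds
    # Each client trains locally starting from the current global model...
    updates = [local_sgd(w_global, X, y) for X, y in clients]
    # ...and the server aggregates by data-size-weighted averaging.
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    w_global = np.average(updates, axis=0, weights=sizes)
```

Even in this toy setup the heterogeneity issue is visible in structure: each client optimizes against its own data distribution, and the server only ever sees parameter updates, never raw data. The approaches surveyed in the paper address what happens when, unlike here, the clients' local optima genuinely conflict.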
- Q. Li, Z. Wen, Z. Wu, S. Hu, N. Wang, Y. Li, X. Liu, and B. He, “A survey on federated learning systems: Vision, hype and reality for data privacy and protection,” IEEE Transactions on Knowledge and Data Engineering, vol. 35, no. 4, pp. 3347–3366, 2021.
- B. McMahan, E. Moore, D. Ramage, S. Hampson, and B. A. y Arcas, “Communication-efficient learning of deep networks from decentralized data,” in Artificial intelligence and statistics. PMLR, 2017, pp. 1273–1282.
- R. S. Antunes, C. André da Costa, A. Küderle, I. A. Yari, and B. Eskofier, “Federated learning for healthcare: Systematic review and architecture proposal,” ACM Transactions on Intelligent Systems and Technology (TIST), vol. 13, no. 4, pp. 1–23, 2022.
- T. Nevrataki, A. Iliadou, G. Ntolkeras, I. Sfakianakis, L. Lazaridis, G. Maraslidis, N. Asimopoulos, and G. F. Fragulis, “A survey on federated learning applications in healthcare, finance, and data privacy/data security,” in AIP Conference Proceedings, vol. 2909, no. 1. AIP Publishing, 2023.
- D. C. Nguyen, M. Ding, P. N. Pathirana, A. Seneviratne, J. Li, and H. V. Poor, “Federated learning for internet of things: A comprehensive survey,” IEEE Communications Surveys & Tutorials, vol. 23, no. 3, pp. 1622–1658, 2021.
- Z. Zheng, Y. Zhou, Y. Sun, Z. Wang, B. Liu, and K. Li, “Applications of federated learning in smart cities: recent advances, taxonomy, and open challenges,” Connection Science, vol. 34, no. 1, pp. 1–28, 2022.
- H. Wang, Z. Kaplan, D. Niu, and B. Li, “Optimizing federated learning on non-iid data with reinforcement learning,” in IEEE INFOCOM 2020-IEEE conference on computer communications. IEEE, 2020, pp. 1698–1707.
- D. Li and J. Wang, “Fedmd: Heterogenous federated learning via model distillation,” arXiv preprint arXiv:1910.03581, 2019.
- V. Smith, C.-K. Chiang, M. Sanjabi, and A. S. Talwalkar, “Federated multi-task learning,” Advances in neural information processing systems, vol. 30, 2017.
- M. S. H. Abad, E. Ozfatura, D. Gunduz, and O. Ercetin, “Hierarchical federated learning across heterogeneous cellular networks,” in ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2020, pp. 8866–8870.
- K. Pfeiffer, M. Rapp, R. Khalili, and J. Henkel, “Federated learning for computationally constrained heterogeneous devices: A survey,” ACM Computing Surveys, vol. 55, no. 14s, pp. 1–27, 2023.
- L. Zhu, Z. Liu, and S. Han, “Deep leakage from gradients,” Advances in neural information processing systems, vol. 32, 2019.
- J. Geiping, H. Bauermeister, H. Dröge, and M. Moeller, “Inverting gradients-how easy is it to break privacy in federated learning?” Advances in neural information processing systems, vol. 33, pp. 16 937–16 947, 2020.
- M. Nasr, R. Shokri, and A. Houmansadr, “Comprehensive privacy analysis of deep learning: Passive and active white-box inference attacks against centralized and federated learning,” in 2019 IEEE symposium on security and privacy (SP). IEEE, 2019, pp. 739–753.
- L. Lyu, H. Yu, X. Ma, C. Chen, L. Sun, J. Zhao, Q. Yang, and S. Y. Philip, “Privacy and robustness in federated learning: Attacks and defenses,” IEEE transactions on neural networks and learning systems, 2022.
- M. Ye, X. Fang, B. Du, P. C. Yuen, and D. Tao, “Heterogeneous federated learning: State-of-the-art and research challenges,” ACM Computing Surveys, vol. 56, no. 3, pp. 1–44, 2023.
- J. Wen, Z. Zhang, Y. Lan, Z. Cui, J. Cai, and W. Zhang, “A survey on federated learning: challenges and applications,” International Journal of Machine Learning and Cybernetics, vol. 14, no. 2, pp. 513–535, 2023.
- C. Zhang, Y. Xie, H. Bai, B. Yu, W. Li, and Y. Gao, “A survey on federated learning,” Knowledge-Based Systems, vol. 216, p. 106775, 2021.
- S. Ji, Y. Tan, T. Saravirta, Z. Yang, Y. Liu, L. Vasankari, S. Pan, G. Long, and A. Walid, “Emerging trends in federated learning: From model fusion to federated x learning,” International Journal of Machine Learning and Cybernetics, pp. 1–22, 2024.
- D. Gao, X. Yao, and Q. Yang, “A survey on heterogeneous federated learning,” arXiv preprint arXiv:2210.04505, 2022.
- T. M. Mengistu, T. Kim, and J.-W. Lin, “A survey on heterogeneity taxonomy, security and privacy preservation in the integration of iot, wireless sensor networks and federated learning,” Sensors, vol. 24, no. 3, p. 968, 2024.
- B. S. Guendouzi, S. Ouchani, H. E. Assaad, and M. E. Zaher, “A systematic review of federated learning: Challenges, aggregation methods, and development tools,” Journal of Network and Computer Applications, p. 103714, 2023.
- Z. Lu, H. Pan, Y. Dai, X. Si, and Y. Zhang, “Federated learning with non-iid data: A survey,” IEEE Internet of Things Journal, 2024.
- Q. Yang, Y. Liu, T. Chen, and Y. Tong, “Federated machine learning: Concept and applications,” ACM Transactions on Intelligent Systems and Technology (TIST), vol. 10, no. 2, pp. 1–19, 2019.
- J. Zhang, S. Guo, Z. Qu, D. Zeng, Y. Zhan, Q. Liu, and R. Akerkar, “Adaptive federated learning on non-iid data with resource constraint,” IEEE Transactions on Computers, vol. 71, no. 7, pp. 1655–1667, 2021.
- L. Zhang, L. Shen, L. Ding, D. Tao, and L.-Y. Duan, “Fine-tuning global model via data-free knowledge distillation for non-iid federated learning,” in Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 2022, pp. 10 174–10 183.
- L. Gao, H. Fu, L. Li, Y. Chen, M. Xu, and C.-Z. Xu, “Feddc: Federated learning with non-iid data via local drift decoupling and correction,” in Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 2022, pp. 10 112–10 121.
- X. Ma, J. Zhu, Z. Lin, S. Chen, and Y. Qin, “A state-of-the-art survey on solving non-iid data in federated learning,” Future Generation Computer Systems, vol. 135, pp. 244–258, 2022.
- Z. Wang, Y. Zhu, D. Wang, and Z. Han, “Fedacs: Federated skewness analytics in heterogeneous decentralized data environments,” in 2021 IEEE/ACM 29th International Symposium on Quality of Service (IWQOS). IEEE, 2021, pp. 1–10.
- J. Zhang, Z. Li, B. Li, J. Xu, S. Wu, S. Ding, and C. Wu, “Federated learning with label distribution skew via logits calibration,” in International Conference on Machine Learning. PMLR, 2022, pp. 26 311–26 329.
- X.-C. Li and D.-C. Zhan, “Fedrs: Federated learning with restricted softmax for label distribution non-iid data,” in Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2021, pp. 995–1005.
- Z. Luo, Y. Wang, Z. Wang, Z. Sun, and T. Tan, “Disentangled federated learning for tackling attributes skew via invariant aggregation and diversity transferring,” arXiv preprint arXiv:2206.06818, 2022.
- T. Zhou, J. Zhang, and D. H. Tsang, “Fedfa: Federated learning with feature anchors to align features and classifiers for heterogeneous data,” IEEE Transactions on Mobile Computing, 2023.
- T. Chen, X. Jin, Y. Sun, and W. Yin, “Vafl: a method of vertical asynchronous federated learning,” arXiv preprint arXiv:2007.06081, 2020.
- K. Wei, J. Li, C. Ma, M. Ding, S. Wei, F. Wu, G. Chen, and T. Ranbaduge, “Vertical federated learning: Challenges, methodologies and experiments,” arXiv preprint arXiv:2202.04309, 2022.
- S. Yang, H. Park, J. Byun, and C. Kim, “Robust federated learning with noisy labels,” IEEE Intelligent Systems, vol. 37, no. 2, pp. 35–43, 2022.
- J. Xu, Z. Chen, T. Q. Quek, and K. F. E. Chong, “Fedcorr: Multi-stage federated learning for label noise correction,” in Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 2022, pp. 10 184–10 193.
- X. Fang and M. Ye, “Robust federated learning with noisy and heterogeneous clients,” in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022, pp. 10 072–10 081.
- L. Wang, J. Bian, and J. Xu, “Federated learning with instance-dependent noisy label,” in ICASSP 2024-2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2024, pp. 8916–8920.
- Z. Chai, A. Ali, S. Zawad, S. Truex, A. Anwar, N. Baracaldo, Y. Zhou, H. Ludwig, F. Yan, and Y. Cheng, “Tifl: A tier-based federated learning system,” in Proceedings of the 29th international symposium on high-performance parallel and distributed computing, 2020, pp. 125–136.
- Q. Li, Y. Diao, Q. Chen, and B. He, “Federated learning on non-iid data silos: An experimental study,” in 2022 IEEE 38th international conference on data engineering (ICDE). IEEE, 2022, pp. 965–978.
- L. Qu, N. Balachandar, and D. L. Rubin, “An experimental study of data heterogeneity in federated learning methods for medical imaging,” arXiv preprint arXiv:2107.08371, 2021.
- T. Li, A. K. Sahu, M. Zaheer, M. Sanjabi, A. Talwalkar, and V. Smith, “Federated optimization in heterogeneous networks,” Proceedings of Machine learning and systems, vol. 2, pp. 429–450, 2020.
- S. Alam, L. Liu, M. Yan, and M. Zhang, “Fedrolex: Model-heterogeneous federated learning with rolling sub-model extraction,” Advances in neural information processing systems, vol. 35, pp. 29 677–29 690, 2022.
- L. Yi, G. Wang, X. Liu, Z. Shi, and H. Yu, “Fedgh: Heterogeneous federated learning with generalized global header,” in Proceedings of the 31st ACM International Conference on Multimedia, 2023, pp. 8686–8696.
- T. Castiglia, S. Wang, and S. Patterson, “Flexible vertical federated learning with heterogeneous parties,” IEEE Transactions on Neural Networks and Learning Systems, 2023.
- A. M. Abdelmoniem, C.-Y. Ho, P. Papageorgiou, and M. Canini, “A comprehensive empirical study of heterogeneity in federated learning,” IEEE Internet of Things Journal, 2023.
- L. Yao, D. Gao, Z. Wang, Y. Xie, W. Kuang, D. Chen, H. Wang, C. Dong, B. Ding, and Y. Li, “A benchmark for federated hetero-task learning,” arXiv preprint arXiv:2206.03436, 2022.
- Y. SarcheshmehPour, Y. Tian, L. Zhang, and A. Jung, “Networked federated multi-task learning,” Authorea Preprints, 2023.
- Z. Zhu, J. Hong, S. Drew, and J. Zhou, “Resilient and communication efficient learning for heterogeneous federated systems,” Proceedings of machine learning research, vol. 162, p. 27504, 2022.
- E. Diao, J. Ding, and V. Tarokh, “Heterofl: Computation and communication efficient federated learning for heterogeneous clients,” arXiv preprint arXiv:2010.01264, 2020.
- A. A. Abdellatif, N. Mhaisen, A. Mohamed, A. Erbad, M. Guizani, Z. Dawy, and W. Nasreddine, “Communication-efficient hierarchical federated learning for iot heterogeneous systems with imbalanced data,” Future Generation Computer Systems, vol. 128, pp. 406–419, 2022.
- T. Nishio and R. Yonetani, “Client selection for federated learning with heterogeneous resources in mobile edge,” in ICC 2019-2019 IEEE international conference on communications (ICC). IEEE, 2019, pp. 1–7.
- L. Li, D. Shi, R. Hou, H. Li, M. Pan, and Z. Han, “To talk or to work: Flexible communication compression for energy efficient federated learning over heterogeneous mobile edge devices,” in IEEE INFOCOM 2021-IEEE Conference on Computer Communications. IEEE, 2021, pp. 1–10.
- S. Wang, M. Lee, S. Hosseinalipour, R. Morabito, M. Chiang, and C. G. Brinton, “Device sampling for heterogeneous federated learning: Theory, algorithms, and implementation,” in IEEE INFOCOM 2021-IEEE Conference on Computer Communications. IEEE, 2021, pp. 1–10.
- X. Xu, S. Duan, J. Zhang, Y. Luo, and D. Zhang, “Optimizing federated learning on device heterogeneity with a sampling strategy,” in 2021 IEEE/ACM 29th International Symposium on Quality of Service (IWQOS). IEEE, 2021, pp. 1–10.
- A. Li, L. Zhang, J. Tan, Y. Qin, J. Wang, and X.-Y. Li, “Sample-level data selection for federated learning,” in IEEE INFOCOM 2021-IEEE Conference on Computer Communications. IEEE, 2021, pp. 1–10.
- L. Ma, Q. Pei, L. Zhou, H. Zhu, L. Wang, and Y. Ji, “Federated data cleaning: Collaborative and privacy-preserving data cleaning for edge intelligence,” IEEE Internet of Things Journal, vol. 8, no. 8, pp. 6757–6770, 2020.
- L. Chen, F. Ang, Y. Chen, and W. Wang, “Robust federated learning with noisy labeled data through loss function correction,” IEEE Transactions on Network Science and Engineering, vol. 10, no. 3, pp. 1501–1511, 2022.
- S. Duan, C. Liu, Z. Cao, X. Jin, and P. Han, “Fed-dr-filter: Using global data representation to reduce the impact of noisy labels on the performance of federated learning,” Future Generation Computer Systems, vol. 137, pp. 336–348, 2022.
- V. Tsouvalas, A. Saeed, T. Ozcelebi, and N. Meratnia, “Labeling chaos to learning harmony: Federated learning with noisy labels,” ACM Transactions on Intelligent Systems and Technology, vol. 15, no. 2, pp. 1–26, 2024.
- J. Li, G. Li, H. Cheng, Z. Liao, and Y. Yu, “Feddiv: Collaborative noise filtering for federated learning with noisy labels,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 38, no. 4, 2024, pp. 3118–3126.
- X. Xu, H. Li, Z. Li, and X. Zhou, “Safe: Synergic data filtering for federated learning in cloud-edge computing,” IEEE Transactions on Industrial Informatics, vol. 19, no. 2, pp. 1655–1665, 2022.
- W. Hao, M. El-Khamy, J. Lee, J. Zhang, K. J. Liang, C. Chen, and L. C. Duke, “Towards fair federated learning with zero-shot data augmentation,” in Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 2021, pp. 3310–3319.
- H. Chen, A. Frikha, D. Krompass, J. Gu, and V. Tresp, “Fraug: Tackling federated learning with non-iid features via representation augmentation,” in Proceedings of the IEEE/CVF International Conference on Computer Vision, 2023, pp. 4849–4859.
- D. Lewy, J. Mańdziuk, M. Ganzha, and M. Paprzycki, “Statmix: Data augmentation method that relies on image statistics in federated learning,” in International Conference on Neural Information Processing. Springer, 2022, pp. 574–585.
- B. Xin, W. Yang, Y. Geng, S. Chen, S. Wang, and L. Huang, “Private fl-gan: Differential privacy synthetic data generation based on federated learning,” in ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2020, pp. 2927–2931.
- X. Zhang, J. M. Parra-Ullauri, S. Moazzeni, X. Vasilakos, R. Nejabati, and D. Simeonidou, “Federated analytics with data augmentation in domain generalization towards future networks,” IEEE Transactions on Machine Learning in Communications and Networking, 2024.
- J. Zhang and Y. Jiang, “A data augmentation method for vertical federated learning,” Wireless Communications and Mobile Computing, vol. 2022, pp. 1–16, 2022.
- Y. Xiao, X. Li, T. Li, R. Wang, Y. Pang, and G. Wang, “A distributed generative adversarial network for data augmentation under vertical federated learning,” IEEE Transactions on Big Data, 2024.
- Y. Liu, Y. Kang, C. Xing, T. Chen, and Q. Yang, “A secure federated transfer learning framework,” IEEE Intelligent Systems, vol. 35, no. 4, pp. 70–82, 2020.
- A. Li, J. Sun, X. Zeng, M. Zhang, H. Li, and Y. Chen, “Fedmask: Joint computation and communication-efficient personalized federated learning via heterogeneous masking,” in Proceedings of the 19th ACM Conference on Embedded Networked Sensor Systems, 2021, pp. 42–55.
- F. Zhang, K. Kuang, L. Chen, Z. You, T. Shen, J. Xiao, Y. Zhang, C. Wu, F. Wu, Y. Zhuang et al., “Federated unsupervised representation learning,” Frontiers of Information Technology & Electronic Engineering, vol. 24, no. 8, pp. 1181–1193, 2023.
- B. van Berlo, A. Saeed, and T. Ozcelebi, “Towards federated unsupervised representation learning,” in Proceedings of the third ACM international workshop on edge systems, analytics and networking, 2020, pp. 31–36.
- Z. Wu, Q. Li, and B. He, “Practical vertical federated learning with unsupervised representation learning,” IEEE Transactions on Big Data, 2022.
- Q. Li, B. He, and D. Song, “Model-contrastive federated learning,” in Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 2021, pp. 10 713–10 722.
- J. Jang, H. Ha, D. Jung, and S. Yoon, “Fedclassavg: Local representation learning for personalized federated learning on heterogeneous neural networks,” in Proceedings of the 51st International Conference on Parallel Processing, 2022, pp. 1–10.
- Y. Tan, G. Long, L. Liu, T. Zhou, Q. Lu, J. Jiang, and C. Zhang, “Fedproto: Federated prototype learning across heterogeneous clients,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, no. 8, 2022, pp. 8432–8440.
- M. G. Arivazhagan, V. Aggarwal, A. K. Singh, and S. Choudhary, “Federated learning with personalization layers,” arXiv preprint arXiv:1912.00818, 2019.
- L. Collins, H. Hassani, A. Mokhtari, and S. Shakkottai, “Exploiting shared representations for personalized federated learning,” in International conference on machine learning. PMLR, 2021, pp. 2089–2099.
- M. Luo, F. Chen, D. Hu, Y. Zhang, J. Liang, and J. Feng, “No fear of heterogeneity: Classifier calibration for federated learning with non-iid data,” Advances in Neural Information Processing Systems, vol. 34, pp. 5972–5984, 2021.
- X. Shang, Y. Lu, G. Huang, and H. Wang, “Federated learning on heterogeneous and long-tailed data via classifier re-training with federated features,” arXiv preprint arXiv:2204.13399, 2022.
- P. P. Liang, T. Liu, L. Ziyin, N. B. Allen, R. P. Auerbach, D. Brent, R. Salakhutdinov, and L.-P. Morency, “Think locally, act globally: Federated learning with local and global representations,” arXiv preprint arXiv:2001.01523, 2020.
- J. Xu, X. Tong, and S.-L. Huang, “Personalized federated learning with feature alignment and classifier collaboration,” arXiv preprint arXiv:2306.11867, 2023.
- Y. Shen, Y. Zhou, and L. Yu, “Cd2-pfed: Cyclic distillation-guided channel decoupling for model personalization in federated learning,” in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022, pp. 10 041–10 050.
- J. Chen, J. Zhu, Q. Zheng, Z. Li, and Z. Tian, “Watch your head: Assembling projection heads to save the reliability of federated models,” arXiv preprint arXiv:2402.16255, 2024.
- K. Pillutla, K. Malik, A.-R. Mohamed, M. Rabbat, M. Sanjabi, and L. Xiao, “Federated learning with partial model personalization,” in International Conference on Machine Learning. PMLR, 2022, pp. 17 716–17 758.
- J. Wang, X. Yang, S. Cui, L. Che, L. Lyu, D. D. Xu, and F. Ma, “Towards personalized federated learning via heterogeneous model reassembly,” Advances in Neural Information Processing Systems, vol. 36, 2024.
- K. Yi, N. Gazagnadou, P. Richtárik, and L. Lyu, “Fedp3: Federated personalized and privacy-friendly network pruning under model heterogeneity,” arXiv preprint arXiv:2404.09816, 2024.
- K. Pfeiffer, R. Khalili, and J. Henkel, “Aggregating capacity in fl through successive layer training for computationally-constrained devices,” Advances in Neural Information Processing Systems, vol. 36, 2024.
- J. Bernstein, Y.-X. Wang, K. Azizzadenesheli, and A. Anandkumar, “signsgd: Compressed optimisation for non-convex problems,” in International Conference on Machine Learning. PMLR, 2018, pp. 560–569.
- F. Sattler, S. Wiedemann, K.-R. Müller, and W. Samek, “Robust and communication-efficient federated learning from non-iid data,” IEEE transactions on neural networks and learning systems, vol. 31, no. 9, pp. 3400–3413, 2019.
- M. M. Amiri, D. Gunduz, S. R. Kulkarni, and H. V. Poor, “Federated learning with quantized global model updates,” arXiv preprint arXiv:2006.10672, 2020.
- S. M. Shah and V. K. Lau, “Model compression for communication efficient federated learning,” IEEE Transactions on Neural Networks and Learning Systems, vol. 34, no. 9, pp. 5937–5951, 2021.
- Z. Tang, Y. Wang, and T.-H. Chang, “z-signfedavg: A unified stochastic sign-based compression for federated learning,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 38, no. 14, 2024, pp. 15 301–15 309.
- D. Yao, W. Pan, M. J. O’Neill, Y. Dai, Y. Wan, H. Jin, and L. Sun, “Fedhm: Efficient federated learning for heterogeneous models via low-rank factorization,” arXiv preprint arXiv:2111.14655, 2021.
- W. Jeong and S. J. Hwang, “Factorized-fl: Personalized federated learning with parameter factorization & similarity matching,” Advances in Neural Information Processing Systems, vol. 35, pp. 35 684–35 695, 2022.
- Y. Chen, H. Vikalo, and C. Wang, “Fed-qssl: A framework for personalized federated learning under bitwidth and data heterogeneity,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 38, no. 10, 2024, pp. 11 443–11 452.
- G. Hinton, O. Vinyals, and J. Dean, “Distilling the knowledge in a neural network,” arXiv preprint arXiv:1503.02531, 2015.
- W. Huang, M. Ye, and B. Du, “Learn from others and be yourself in heterogeneous federated learning,” in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022, pp. 10 143–10 153.
- Z. Zhu, J. Hong, and J. Zhou, “Data-free knowledge distillation for heterogeneous federated learning,” in International conference on machine learning. PMLR, 2021, pp. 12 878–12 889.
- Z. Chen, H. Yang, T. Quek, and K. F. E. Chong, “Spectral co-distillation for personalized federated learning,” Advances in Neural Information Processing Systems, vol. 36, 2024.
- L. Corinzia, A. Beuret, and J. M. Buhmann, “Variational federated multi-task learning,” arXiv preprint arXiv:1906.06268, 2019.
- R. Li, F. Ma, W. Jiang, and J. Gao, “Online federated multitask learning,” in 2019 IEEE International Conference on Big Data (Big Data). IEEE, 2019, pp. 215–220.
- C. T. Dinh, T. T. Vu, N. H. Tran, M. N. Dao, and H. Zhang, “Fedu: A unified framework for federated multi-task learning with laplacian regularization,” arXiv preprint arXiv:2102.07148, vol. 400, 2021.
- O. Marfoq, G. Neglia, A. Bellet, L. Kameni, and R. Vidal, “Federated multi-task learning under a mixture of distributions,” Advances in Neural Information Processing Systems, vol. 34, pp. 15 434–15 447, 2021.
- F. Chen, M. Luo, Z. Dong, Z. Li, and X. He, “Federated meta-learning with fast convergence and efficient communication,” arXiv preprint arXiv:1802.07876, 2018.
- I. Jeon, M. Hong, J. Yun, and G. Kim, “Federated learning via meta-variational dropout,” Advances in Neural Information Processing Systems, vol. 36, 2024.
- R. Lee, M. Kim, D. Li, X. Qiu, T. Hospedales, F. Huszár, and N. Lane, “Fedl2p: Federated learning to personalize,” Advances in Neural Information Processing Systems, vol. 36, 2024.
- J. Scott, H. Zakerinia, and C. H. Lampert, “Pefll: Personalized federated learning by learning to learn,” in The Twelfth International Conference on Learning Representations, 2023.
- J. Kim, G. Kim, and B. Han, “Multi-level branched regularization for federated learning,” in International Conference on Machine Learning. PMLR, 2022, pp. 11 058–11 073.
- N. Shoham, T. Avidor, A. Keren, N. Israel, D. Benditkis, L. Mor-Yosef, and I. Zeitak, “Overcoming forgetting in federated learning on non-iid data,” arXiv preprint arXiv:1910.07796, 2019.
- X. Yao and L. Sun, “Continual local training for better initialization of federated models,” in 2020 IEEE International Conference on Image Processing (ICIP). IEEE, 2020, pp. 1736–1740.
- C. T Dinh, N. Tran, and J. Nguyen, “Personalized federated learning with moreau envelopes,” Advances in Neural Information Processing Systems, vol. 33, pp. 21 394–21 405, 2020.
- S. P. Karimireddy, S. Kale, M. Mohri, S. Reddi, S. Stich, and A. T. Suresh, “Scaffold: Stochastic controlled averaging for federated learning,” in International conference on machine learning. PMLR, 2020, pp. 5132–5143.
- F. Hanzely and P. Richtárik, “Federated learning of a mixture of global and local models,” arXiv preprint arXiv:2002.05516, 2020.
- T. Li, S. Hu, A. Beirami, and V. Smith, “Ditto: Fair and robust federated learning through personalization,” in International conference on machine learning. PMLR, 2021, pp. 6357–6368.
- X. Zhou and X. Wang, “Federated label-noise learning with local diversity product regularization,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 38, no. 15, 2024, pp. 17 141–17 149.
- M. Zhi, Y. Bi, W. Xu, H. Wang, and T. Xiang, “Knowledge-aware parameter coaching for personalized federated learning,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 38, no. 15, 2024, pp. 17 069–17 077.
- J. Li, A. Li, C. Tian, Q. Ho, E. Xing, and H. Wang, “Fednar: Federated optimization with normalized annealing regularization,” Advances in Neural Information Processing Systems, vol. 36, 2024.
- Y. Wang and B. Kantarci, “A novel reputation-aware client selection scheme for federated learning within mobile environments,” in 2020 IEEE 25th International Workshop on Computer Aided Modeling and Design of Communication Links and Networks (CAMAD). IEEE, 2020, pp. 1–6.
- T. Huang, W. Lin, L. Shen, K. Li, and A. Y. Zomaya, “Stochastic client selection for federated learning with volatile clients,” IEEE Internet of Things Journal, vol. 9, no. 20, pp. 20 055–20 070, 2022.
- N. Yoshida, T. Nishio, M. Morikura, K. Yamamoto, and R. Yonetani, “Hybrid-fl for wireless networks: Cooperative learning mechanism using non-iid data,” in ICC 2020-2020 IEEE International Conference On Communications (ICC). IEEE, 2020, pp. 1–7.
- J. Xu and H. Wang, “Client selection and bandwidth allocation in wireless federated learning networks: A long-term perspective,” IEEE Transactions on Wireless Communications, vol. 20, no. 2, pp. 1188–1200, 2020.
- F. Lai, X. Zhu, H. V. Madhyastha, and M. Chowdhury, “Oort: Efficient federated learning via guided participant selection,” in 15th {{\{{USENIX}}\}} Symposium on Operating Systems Design and Implementation ({{\{{OSDI}}\}} 21), 2021, pp. 19–35.
- H. Wu and P. Wang, “Node selection toward faster convergence for federated learning on non-iid data,” IEEE Transactions on Network Science and Engineering, vol. 9, no. 5, pp. 3099–3111, 2022.
- Y. J. Cho, J. Wang, and G. Joshi, “Towards understanding biased client selection in federated learning,” in International Conference on Artificial Intelligence and Statistics. PMLR, 2022, pp. 10 351–10 375.
- C. Li, X. Zeng, M. Zhang, and Z. Cao, “Pyramidfl: A fine-grained client selection framework for efficient federated learning,” in Proceedings of the 28th Annual International Conference on Mobile Computing And Networking, 2022, pp. 158–171.
- M. Ribero, H. Vikalo, and G. De Veciana, “Federated learning under intermittent client availability and time-varying communication constraints,” IEEE Journal of Selected Topics in Signal Processing, vol. 17, no. 1, pp. 98–111, 2022.
- F. Sattler, K.-R. Müller, and W. Samek, “Clustered federated learning: Model-agnostic distributed multitask optimization under privacy constraints,” IEEE Transactions on Neural Networks and Llearning Systems, vol. 32, no. 8, pp. 3710–3722, 2020.
- A. Ghosh, J. Chung, D. Yin, and K. Ramchandran, “An efficient framework for clustered federated learning,” IEEE Transactions on Information Theory, vol. 68, no. 12, pp. 8076–8091, 2022.
- S. Vahidian, M. Morafah, W. Wang, V. Kungurtsev, C. Chen, M. Shah, and B. Lin, “Efficient distribution similarity identification in clustered federated learning via principal angles between client data subspaces,” in Proceedings of the AAAI Conference on Artificial Intelligence, 2023, pp. 10 043–10 052.
- Y. Ruan and C. Joe-Wong, “Fedsoft: Soft clustered federated learning with proximal local updating,” in Proceedings of the AAAI conference on artificial intelligence, vol. 36, no. 7, 2022, pp. 8124–8131.
- R. Lu, W. Zhang, Y. Wang, Q. Li, X. Zhong, H. Yang, and D. Wang, “Auction-based cluster federated learning in mobile edge computing systems,” IEEE Transactions on Parallel and Distributed Systems, vol. 34, no. 4, pp. 1145–1158, 2023.
- L. Liu, J. Zhang, S. Song, and K. B. Letaief, “Client-edge-cloud hierarchical federated learning,” in ICC 2020-2020 IEEE international conference on communications (ICC). IEEE, 2020, pp. 1–6.
- C. Briggs, Z. Fan, and P. Andras, “Federated learning with hierarchical clustering of local updates to improve training on non-iid data,” in 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020, pp. 1–9.
- B. Xu, W. Xia, W. Wen, P. Liu, H. Zhao, and H. Zhu, “Adaptive hierarchical federated learning over wireless networks,” IEEE Transactions on Vehicular Technology, vol. 71, no. 2, pp. 2070–2083, 2021.
- W. Y. B. Lim, J. S. Ng, Z. Xiong, D. Niyato, C. Miao, and D. I. Kim, “Dynamic edge association and resource allocation in self-organizing hierarchical federated learning networks,” IEEE Journal on Selected Areas in Communications, vol. 39, no. 12, pp. 3640–3653, 2021.
- X. Zhou, X. Ye, I. Kevin, K. Wang, W. Liang, N. K. C. Nair, S. Shimizu, Z. Yan, and Q. Jin, “Hierarchical federated learning with social context clustering-based participant selection for internet of medical things applications,” IEEE Transactions on Computational Social Systems, 2023.
- Q. Ma, Y. Xu, H. Xu, J. Liu, and L. Huang, “Feduc: A unified clustering approach for hierarchical federated learning,” IEEE Transactions on Mobile Computing, 2024.
- Y. Deng, F. Lyu, T. Xia, Y. Zhou, Y. Zhang, J. Ren, and Y. Yang, “A communication-efficient hierarchical federated learning framework via shaping data distribution at edge,” IEEE/ACM Transactions on Networking, 2024.
- Y. Li, C. Chen, N. Liu, H. Huang, Z. Zheng, and Q. Yan, “A blockchain-based decentralized federated learning framework with committee consensus,” IEEE Network, vol. 35, no. 1, pp. 234–241, 2020.
- C. Che, X. Li, C. Chen, X. He, and Z. Zheng, “A decentralized federated learning framework via committee mechanism with convergence guarantee,” IEEE Transactions on Parallel and Distributed Systems, vol. 33, no. 12, pp. 4783–4800, 2022.
- W. Y. B. Lim, J. S. Ng, Z. Xiong, J. Jin, Y. Zhang, D. Niyato, C. Leung, and C. Miao, “Decentralized edge intelligence: A dynamic resource allocation framework for hierarchical federated learning,” IEEE Transactions on Parallel and Distributed Systems, vol. 33, no. 3, pp. 536–550, 2021.
- H. Ye, L. Liang, and G. Y. Li, “Decentralized federated learning with unreliable communications,” IEEE journal of selected topics in signal processing, vol. 16, no. 3, pp. 487–500, 2022.
- C. Pappas, D. Chatzopoulos, S. Lalis, and M. Vavalis, “Ipls: A framework for decentralized federated learning,” in 2021 IFIP Networking Conference (IFIP Networking). IEEE, 2021, pp. 1–6.
- W. Liu, L. Chen, and W. Zhang, “Decentralized federated learning: Balancing communication and computing costs,” IEEE Transactions on Signal and Information Processing over Networks, vol. 8, pp. 131–143, 2022.
- S. Kalra, J. Wen, J. C. Cresswell, M. Volkovs, and H. R. Tizhoosh, “Decentralized federated learning through proxy model sharing,” Nature communications, vol. 14, no. 1, p. 2899, 2023.
- I. Hegedűs, G. Danner, and M. Jelasity, “Decentralized learning works: An empirical comparison of gossip learning and federated learning,” Journal of Parallel and Distributed Computing, vol. 148, pp. 109–124, 2021.
- Y. Qu, H. Dai, Y. Zhuang, J. Chen, C. Dong, F. Wu, and S. Guo, “Decentralized federated learning for uav networks: Architecture, challenges, and opportunities,” IEEE Network, vol. 35, no. 6, pp. 156–162, 2021.
- E. T. M. Beltrán, Á. L. P. Gómez, C. Feng, P. M. S. Sánchez, S. L. Bernal, G. Bovet, M. G. Pérez, G. M. Pérez, and A. H. Celdrán, “Fedstellar: A platform for decentralized federated learning,” Expert Systems with Applications, vol. 242, p. 122861, 2024.
- Y. Zhou, M. Shi, Y. Tian, Q. Ye, and J. Lv, “Defta: A plug-and-play peer-to-peer decentralized federated learning framework,” Information Sciences, p. 120582, 2024.
- P.-C. Cheng, K. Eykholt, Z. Gu, H. Jamjoom, K. Jayaram, E. Valdez, and A. Verma, “Deta: Minimizing data leaks in federated learning via decentralized and trustworthy aggregation,” in Proceedings of the Nineteenth European Conference on Computer Systems, 2024, pp. 219–235.
- A. Shafahi, W. R. Huang, M. Najibi, O. Suciu, C. Studer, T. Dumitras, and T. Goldstein, “Poison frogs! targeted clean-label poisoning attacks on neural networks,” Advances in Neural Information Processing Systems, vol. 31, 2018.
- L. Su and J. Xu, “Securing distributed gradient descent in high dimensional statistical learning,” Proceedings of the ACM on Measurement and Analysis of Computing Systems, vol. 3, no. 1, pp. 1–41, 2019.
- L. Melis, C. Song, E. De Cristofaro, and V. Shmatikov, “Exploiting unintended feature leakage in collaborative learning,” in 2019 IEEE Symposium on Security and Privacy (SP). IEEE, 2019, pp. 691–706.
- X. Yin, Y. Zhu, and J. Hu, “A comprehensive survey of privacy-preserving federated learning: A taxonomy, review, and future directions,” ACM Computing Surveys (CSUR), vol. 54, no. 6, pp. 1–36, 2021.
- G. Xia, J. Chen, C. Yu, and J. Ma, “Poisoning attacks in federated learning: A survey,” IEEE Access, vol. 11, pp. 10708–10722, 2023.
- X. Zhang, Y. Kang, K. Chen, L. Fan, and Q. Yang, “Trading off privacy, utility, and efficiency in federated learning,” ACM Transactions on Intelligent Systems and Technology, vol. 14, no. 6, pp. 1–32, 2023.
- Y. Li, T. Wang, C. Chen, J. Lou, B. Chen, L. Yang, and Z. Zheng, “Clients collaborate: Flexible differentially private federated learning with guaranteed improvement of utility-privacy trade-off,” arXiv preprint arXiv:2402.07002, 2024.
- C. Dwork, “Differential privacy,” in International Colloquium on Automata, Languages, and Programming. Springer, 2006, pp. 1–12.
- C. Dwork, A. Roth et al., “The algorithmic foundations of differential privacy,” Foundations and Trends® in Theoretical Computer Science, vol. 9, no. 3–4, pp. 211–407, 2014.
- M. Abadi, A. Chu, I. Goodfellow, H. B. McMahan, I. Mironov, K. Talwar, and L. Zhang, “Deep learning with differential privacy,” in Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, 2016, pp. 308–318.
- H. B. McMahan, D. Ramage, K. Talwar, and L. Zhang, “Learning differentially private recurrent language models,” arXiv preprint arXiv:1710.06963, 2017.
- K. Wei, J. Li, M. Ding, C. Ma, H. H. Yang, F. Farokhi, S. Jin, T. Q. Quek, and H. V. Poor, “Federated learning with differential privacy: Algorithms and performance analysis,” IEEE Transactions on Information Forensics and Security, vol. 15, pp. 3454–3469, 2020.
- M. Seif, R. Tandon, and M. Li, “Wireless federated learning with local differential privacy,” in 2020 IEEE International Symposium on Information Theory (ISIT). IEEE, 2020, pp. 2604–2609.
- A. Girgis, D. Data, S. Diggavi, P. Kairouz, and A. T. Suresh, “Shuffled model of differential privacy in federated learning,” in International Conference on Artificial Intelligence and Statistics. PMLR, 2021, pp. 2521–2529.
- W. Wei, L. Liu, Y. Wu, G. Su, and A. Iyengar, “Gradient-leakage resilient federated learning,” in 2021 IEEE 41st International Conference on Distributed Computing Systems (ICDCS). IEEE, 2021, pp. 797–807.
- I. Mironov, “Rényi differential privacy,” in 2017 IEEE 30th computer security foundations symposium (CSF). IEEE, 2017, pp. 263–275.
- Y. Shi, Y. Liu, K. Wei, L. Shen, X. Wang, and D. Tao, “Make landscape flatter in differentially private federated learning,” in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023, pp. 24552–24562.
- R. Hu, Y. Guo, and Y. Gong, “Federated learning with sparsified model perturbation: Improving accuracy under client-level differential privacy,” IEEE Transactions on Mobile Computing, 2023.
- Z. Xiong, Z. Cai, D. Takabi, and W. Li, “Privacy threat and defense for federated learning with non-iid data in aiot,” IEEE Transactions on Industrial Informatics, vol. 18, no. 2, pp. 1310–1321, 2021.
- J. Liu, J. Lou, L. Xiong, J. Liu, and X. Meng, “Projected federated averaging with heterogeneous differential privacy,” Proceedings of the VLDB Endowment, vol. 15, no. 4, pp. 828–840, 2021.
- Z. Gao, Y. Duan, Y. Yang, L. Rui, and C. Zhao, “Fedsec: a robust differential private federated learning framework in heterogeneous networks,” in 2022 IEEE Wireless Communications and Networking Conference (WCNC). IEEE, 2022, pp. 1868–1873.
- M. Noble, A. Bellet, and A. Dieuleveut, “Differentially private federated learning on heterogeneous data,” in International Conference on Artificial Intelligence and Statistics. PMLR, 2022, pp. 10110–10145.
- O. Goldreich, “Secure multi-party computation,” Manuscript. Preliminary version, vol. 78, no. 110, pp. 1–108, 1998.
- D. Evans, V. Kolesnikov, M. Rosulek et al., “A pragmatic introduction to secure multi-party computation,” Foundations and Trends® in Privacy and Security, vol. 2, no. 2-3, pp. 70–246, 2018.
- R. L. Rivest, L. Adleman, M. L. Dertouzos et al., “On data banks and privacy homomorphisms,” Foundations of secure computation, vol. 4, no. 11, pp. 169–180, 1978.
- A. Acar, H. Aksu, A. S. Uluagac, and M. Conti, “A survey on homomorphic encryption schemes: Theory and implementation,” ACM Computing Surveys (CSUR), vol. 51, no. 4, pp. 1–35, 2018.
- Y. Aono, T. Hayashi, L. Wang, S. Moriai et al., “Privacy-preserving deep learning via additively homomorphic encryption,” IEEE Transactions on Information Forensics and Security, vol. 13, no. 5, pp. 1333–1345, 2017.
- P. Paillier, “Public-key cryptosystems based on composite degree residuosity classes,” in International Conference on the Theory and Applications of Cryptographic Techniques. Springer, 1999, pp. 223–238.
- C. Gentry, A. Sahai, and B. Waters, “Homomorphic encryption from learning with errors: Conceptually-simpler, asymptotically-faster, attribute-based,” in Advances in Cryptology–CRYPTO 2013: 33rd Annual Cryptology Conference, Santa Barbara, CA, USA, August 18-22, 2013. Proceedings, Part I. Springer, 2013, pp. 75–92.
- J. H. Cheon, A. Kim, M. Kim, and Y. Song, “Homomorphic encryption for arithmetic of approximate numbers,” in Advances in Cryptology–ASIACRYPT 2017: 23rd International Conference on the Theory and Applications of Cryptology and Information Security, Hong Kong, China, December 3-7, 2017, Proceedings, Part I 23. Springer, 2017, pp. 409–437.
- J. Ma, S.-A. Naas, S. Sigg, and X. Lyu, “Privacy-preserving federated learning based on multi-key homomorphic encryption,” International Journal of Intelligent Systems, vol. 37, no. 9, pp. 5880–5901, 2022.
- D. Boneh, A. Sahai, and B. Waters, “Functional encryption: Definitions and challenges,” in Theory of Cryptography: 8th Theory of Cryptography Conference, TCC 2011, Providence, RI, USA, March 28-30, 2011. Proceedings 8. Springer, 2011, pp. 253–273.
- S. Goldwasser, S. D. Gordon, V. Goyal, A. Jain, J. Katz, F.-H. Liu, A. Sahai, E. Shi, and H.-S. Zhou, “Multi-input functional encryption,” in Advances in Cryptology–EUROCRYPT 2014: 33rd Annual International Conference on the Theory and Applications of Cryptographic Techniques, Copenhagen, Denmark, May 11-15, 2014. Proceedings 33. Springer, 2014, pp. 578–602.
- R. Xu, N. Baracaldo, Y. Zhou, A. Anwar, and H. Ludwig, “Hybridalpha: An efficient approach for privacy-preserving federated learning,” in Proceedings of the 12th ACM workshop on artificial intelligence and security, 2019, pp. 13–23.
- J. Chotard, E. Dufour Sans, R. Gay, D. H. Phan, and D. Pointcheval, “Decentralized multi-client functional encryption for inner product,” in Advances in Cryptology–ASIACRYPT 2018: 24th International Conference on the Theory and Application of Cryptology and Information Security, Brisbane, QLD, Australia, December 2–6, 2018, Proceedings, Part II 24. Springer, 2018, pp. 703–732.
- Y. Chang, K. Zhang, J. Gong, and H. Qian, “Privacy-preserving federated learning via functional encryption, revisited,” IEEE Transactions on Information Forensics and Security, vol. 18, pp. 1855–1869, 2023.
- A. Shamir, “How to share a secret,” Communications of the ACM, vol. 22, no. 11, pp. 612–613, 1979.
- G. R. Blakley, “Safeguarding cryptographic keys,” in Managing Requirements Knowledge, International Workshop on. IEEE Computer Society, 1979, pp. 313–313.
- K. Bonawitz, V. Ivanov, B. Kreuter, A. Marcedone, H. B. McMahan, S. Patel, D. Ramage, A. Segal, and K. Seth, “Practical secure aggregation for privacy-preserving machine learning,” in Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security, 2017, pp. 1175–1191.
- J. H. Bell, K. A. Bonawitz, A. Gascón, T. Lepoint, and M. Raykova, “Secure single-server aggregation with (poly) logarithmic overhead,” in Proceedings of the 2020 ACM SIGSAC Conference on Computer and Communications Security, 2020, pp. 1253–1269.
- H. Fereidooni, S. Marchal, M. Miettinen, A. Mirhoseini, H. Möllering, T. D. Nguyen, P. Rieger, A.-R. Sadeghi, T. Schneider, H. Yalame et al., “Safelearn: Secure aggregation for private federated learning,” in 2021 IEEE Security and Privacy Workshops (SPW). IEEE, 2021, pp. 56–62.
- H. Fazli Khojir, D. Alhadidi, S. Rouhani, and N. Mohammed, “Fedshare: secure aggregation based on additive secret sharing in federated learning,” in Proceedings of the 27th International Database Engineered Applications Symposium, 2023, pp. 25–33.
- I. Cox, M. Miller, J. Bloom, and C. Honsinger, “Digital watermarking,” Journal of Electronic Imaging, vol. 11, no. 3, pp. 414–414, 2002.
- B. G. Tekgul, Y. Xia, S. Marchal, and N. Asokan, “Waffle: Watermarking in federated learning,” in 2021 40th International Symposium on Reliable Distributed Systems (SRDS). IEEE, 2021, pp. 310–320.
- H. Nie and S. Lu, “Fedcrmw: Federated model ownership verification with compression-resistant model watermarking,” Expert Systems with Applications, p. 123776, 2024.
- B. Han, R. H. Jhaveri, H. Wang, D. Qiao, and J. Du, “Application of robust zero-watermarking scheme based on federated learning for securing the healthcare data,” IEEE Journal of Biomedical and Health Informatics, vol. 27, no. 2, pp. 804–813, 2021.
- S. Nakamoto, “Bitcoin: A peer-to-peer electronic cash system,” 2008.
- Q. Wang, R. Li, Q. Wang, and S. Chen, “Non-fungible token (nft): Overview, evaluation, opportunities and challenges,” arXiv preprint arXiv:2105.07447, 2021.
- L. Cui, X. Su, Z. Ming, Z. Chen, S. Yang, Y. Zhou, and W. Xiao, “Creat: Blockchain-assisted compression algorithm of federated learning for content caching in edge computing,” IEEE Internet of Things Journal, vol. 9, no. 16, pp. 14151–14161, 2020.
- K. Toyoda and A. N. Zhang, “Mechanism design for an incentive-aware blockchain-enabled federated learning platform,” in 2019 IEEE International Conference on Big Data (Big Data). IEEE, 2019, pp. 395–403.
- Y. Li, Y. Lai, C. Chen, and Z. Zheng, “Veryfl: A verify federated learning framework embedded with blockchain,” arXiv preprint arXiv:2311.15617, 2023.
- Y. Huang, L. Chu, Z. Zhou, L. Wang, J. Liu, J. Pei, and Y. Zhang, “Personalized cross-silo federated learning on non-iid data,” in Proceedings of the AAAI conference on artificial intelligence, vol. 35, no. 9, 2021, pp. 7865–7873.
- H. Jiang, M. Liu, B. Yang, Q. Liu, J. Li, and X. Guo, “Customized federated learning for accelerated edge computing with heterogeneous task targets,” Computer Networks, vol. 183, p. 107569, 2020.
- P. Zheng, Y. Zhu, Y. Hu, Z. Zhang, and A. Schmeink, “Federated learning in heterogeneous networks with unreliable communication,” IEEE Transactions on Wireless Communications, 2023.
- Y. H. Ezzeldin, S. Yan, C. He, E. Ferrara, and A. S. Avestimehr, “Fairfed: Enabling group fairness in federated learning,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 37, no. 6, 2023, pp. 7494–7502.
- F.-Y. Wang, R. Qin, J. Li, X. Wang, H. Qi, X. Jia, and B. Hu, “Federated management: Toward federated services and federated security in federated ecology,” IEEE Transactions on Computational Social Systems, vol. 8, no. 6, pp. 1283–1290, 2021.
- Z. Wang, Q. Hu, R. Li, M. Xu, and Z. Xiong, “Incentive mechanism design for joint resource allocation in blockchain-based federated learning,” IEEE Transactions on Parallel and Distributed Systems, vol. 34, no. 5, pp. 1536–1547, 2023.
- S. Feng, B. Li, H. Yu, Y. Liu, and Q. Yang, “Semi-supervised federated heterogeneous transfer learning,” Knowledge-Based Systems, vol. 252, p. 109384, 2022.