Federated Distillation: A Survey (2404.08564v1)

Published 2 Apr 2024 in cs.LG

Abstract: Federated Learning (FL) seeks to train a model collaboratively without sharing private training data from individual clients. Despite its promise, FL encounters challenges such as high communication costs for large-scale models and the necessity for uniform model architectures across all clients and the server. These challenges severely restrict the practical applications of FL. To address these limitations, the integration of knowledge distillation (KD) into FL has been proposed, forming what is known as Federated Distillation (FD). FD enables more flexible knowledge transfer between clients and the server, surpassing the mere sharing of model parameters. By eliminating the need for identical model architectures across clients and the server, FD mitigates the communication costs associated with training large-scale models. This paper aims to offer a comprehensive overview of FD, highlighting its latest advancements. It delves into the fundamental principles underlying the design of FD frameworks, delineates FD approaches for tackling various challenges, and provides insights into the diverse applications of FD across different scenarios.
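
To make the idea concrete, a typical FD round exchanges model outputs rather than model weights: each client runs its own (possibly architecture-unique) model over a shared public dataset, the server averages the resulting soft labels, and every client then distills that consensus locally. The sketch below is illustrative only, in the spirit of FedMD-style soft-label exchange rather than any specific framework from the survey; the public dataset, the temperature, and the plain averaging rule are assumptions made for the example.

```python
# Minimal federated-distillation round (illustrative sketch, FedMD-style).
# Assumptions, not the survey's own method: clients hold heterogeneous models,
# knowledge is exchanged as soft labels on a shared public dataset, and the
# server fuses them by simple averaging. The public loader must iterate in a
# fixed order (shuffle=False) so predictions and distillation targets align.
import torch
import torch.nn.functional as F


def client_soft_labels(model, public_loader, temperature=3.0):
    """Client side: predict temperature-softened labels on the public set."""
    model.eval()
    outputs = []
    with torch.no_grad():
        for x, _ in public_loader:
            outputs.append(F.softmax(model(x) / temperature, dim=1))
    return torch.cat(outputs)  # shape: [N_public, num_classes]


def aggregate(soft_label_list):
    """Server side: fuse client knowledge by averaging their soft labels."""
    return torch.stack(soft_label_list).mean(dim=0)


def distill_to_consensus(model, public_loader, consensus,
                         temperature=3.0, lr=1e-3, epochs=1):
    """Client side: distill the aggregated soft labels into the local model."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        offset = 0
        for x, _ in public_loader:
            target = consensus[offset:offset + x.size(0)]
            offset += x.size(0)
            log_probs = F.log_softmax(model(x) / temperature, dim=1)
            loss = F.kl_div(log_probs, target, reduction="batchmean")
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()


def fd_round(client_models, public_loader):
    """One communication round: predict -> aggregate -> distill."""
    soft_labels = [client_soft_labels(m, public_loader) for m in client_models]
    consensus = aggregate(soft_labels)  # only soft labels cross the network
    for m in client_models:
        distill_to_consensus(m, public_loader, consensus)
    return consensus
```

Because only the consensus soft labels cross the network, the per-round communication cost scales with the size of the public set and the number of classes rather than with the number of model parameters, which is what lets FD decouple client and server architectures and reduce the cost of training large models.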

Authors (6)
  1. Lin Li (329 papers)
  2. Jianping Gou (7 papers)
  3. Baosheng Yu (51 papers)
  4. Lan Du (46 papers)
  5. Zhang Yi
  6. Dacheng Tao
Citations (4)
