
Towards Efficient Communications in Federated Learning: A Contemporary Survey (2208.01200v2)

Published 2 Aug 2022 in cs.DC

Abstract: In the traditional distributed machine learning scenario, the user's private data is transmitted between clients and a central server, which creates significant privacy risks. To balance data privacy against the need for joint model training, federated learning (FL) has been proposed as a distributed machine learning procedure with privacy-protection mechanisms, enabling multi-party collaborative computing without revealing the original data. In practice, however, FL faces a variety of challenging communication problems. This review seeks to elucidate the relationships among these communication issues by methodically assessing the development of FL communication research from three perspectives: communication efficiency, communication environment, and communication resource allocation. First, we outline the current challenges in FL communications. Second, we collate FL communication-related papers and describe the overall development trend of the field based on their logical relationships. Finally, we discuss future research directions for communications in FL.
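
To make the FL procedure described in the abstract concrete, the sketch below shows one round of federated averaging (FedAvg), the canonical aggregation scheme in this line of work: each client trains locally on its private data and sends only model weights to the server, which averages them. This is a minimal illustration, not code from the surveyed paper; all names (local_train, fedavg_round, the linear model, learning rate, and client counts) are illustrative assumptions.

```python
# Minimal FedAvg sketch: raw data never leaves the clients; only model
# weights are communicated. Illustrative only, not the paper's method.
import numpy as np

def local_train(weights: np.ndarray, data: np.ndarray, labels: np.ndarray,
                lr: float = 0.1, epochs: int = 5) -> np.ndarray:
    """One client's local update: gradient descent on a linear model."""
    w = weights.copy()
    for _ in range(epochs):
        preds = data @ w
        grad = data.T @ (preds - labels) / len(labels)  # MSE gradient
        w -= lr * grad
    return w

def fedavg_round(global_w: np.ndarray, client_datasets) -> np.ndarray:
    """Server step: average client weights, weighted by local dataset size."""
    updates, sizes = [], []
    for data, labels in client_datasets:
        updates.append(local_train(global_w, data, labels))
        sizes.append(len(labels))
    total = sum(sizes)
    return sum((n / total) * w for n, w in zip(sizes, updates))

# Toy usage: three clients holding synthetic data from the same true model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = fedavg_round(w, clients)
print(w)  # approaches [2.0, -1.0] without pooling raw data on the server
```

Note that every round transmits full weight vectors between clients and server; the communication-efficiency techniques this survey covers (e.g., compression, quantization, and client scheduling) aim to reduce exactly this per-round cost.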

Authors (7)
  1. Zihao Zhao (42 papers)
  2. Yuzhu Mao (6 papers)
  3. Yang Liu (2253 papers)
  4. Linqi Song (93 papers)
  5. Ye Ouyang (16 papers)
  6. Xinlei Chen (106 papers)
  7. Wenbo Ding (53 papers)
Citations (49)
