
DeceFL: A Principled Decentralized Federated Learning Framework (2107.07171v2)

Published 15 Jul 2021 in cs.LG, cs.DC, cs.SY, and eess.SY

Abstract: Traditional machine learning relies on a centralized data pipeline: data are provided to a central server for model training. In many applications, however, data are inherently fragmented, and this decentralized nature presents the biggest challenge for collaboration, since sending all decentralized datasets to a central server raises serious privacy concerns. Although there has been a joint effort to tackle this critical issue through privacy-preserving machine learning frameworks such as federated learning, most state-of-the-art frameworks are still built in a centralized way, in which a central client is needed to collect model information (instead of the data itself) from every other client and distribute it back, leading to high communication pressure and high vulnerability when the central client fails or is attacked. Here we propose a principled decentralized federated learning algorithm (DeceFL) that does not require a central client and relies only on local information transmission between clients and their neighbors, representing a fully decentralized learning framework. We further prove that every client reaches the global minimum with zero performance gap and achieves the same convergence rate $O(1/T)$ (where $T$ is the number of gradient-descent iterations) as centralized federated learning when the loss function is smooth and strongly convex. Finally, the proposed algorithm is applied to a number of tasks with both convex and nonconvex loss functions, demonstrating its effectiveness in a wide range of real-world medical and industrial applications.
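The abstract's core idea, training without a central client by combining local gradient steps with parameter averaging over graph neighbors, can be illustrated with a minimal decentralized gradient-descent sketch. The ring topology, quadratic local losses, Metropolis-style mixing weights, and step size below are illustrative assumptions for the sketch, not the paper's exact setup:

```python
import numpy as np

# Sketch of neighbor-only decentralized learning in the spirit of DeceFL:
# each client keeps its own parameter vector, averages it with its graph
# neighbors via a doubly stochastic mixing matrix W, then takes a gradient
# step on its *local* loss. No central client is involved.

rng = np.random.default_rng(0)
n_clients, dim, T = 4, 3, 500

# Local quadratic losses f_i(w) = 0.5 * ||A_i w - b_i||^2 (assumed for
# illustration; the paper covers general smooth strongly convex losses).
A = [rng.standard_normal((10, dim)) for _ in range(n_clients)]
b = [rng.standard_normal(10) for _ in range(n_clients)]

def local_grad(i, w):
    return A[i].T @ (A[i] @ w - b[i])

# Doubly stochastic mixing matrix for a ring: each client mixes only
# with its two neighbors.
W = np.zeros((n_clients, n_clients))
for i in range(n_clients):
    W[i, i] = 0.5
    W[i, (i - 1) % n_clients] = 0.25
    W[i, (i + 1) % n_clients] = 0.25

w = np.zeros((n_clients, dim))   # one parameter vector per client
eta = 0.01                       # constant step size (assumed)
for t in range(T):
    w = W @ w                                                  # neighbor averaging
    grads = np.stack([local_grad(i, w[i]) for i in range(n_clients)])
    w = w - eta * grads                                        # local gradient step

# All clients end up near the minimizer of the global loss sum_i f_i.
A_all = np.vstack(A)
b_all = np.concatenate(b)
w_star, *_ = np.linalg.lstsq(A_all, b_all, rcond=None)
print(np.max(np.linalg.norm(w - w_star, axis=1)))
```

With a constant step size this scheme reaches a small neighborhood of the global minimizer; the paper's zero-gap, $O(1/T)$ result concerns the DeceFL algorithm itself under smooth strongly convex losses, which this toy sketch does not reproduce exactly.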

Authors (15)
  1. Ye Yuan (274 papers)
  2. Jun Liu (606 papers)
  3. Dou Jin (1 paper)
  4. Zuogong Yue (8 papers)
  5. Ruijuan Chen (3 papers)
  6. Maolin Wang (29 papers)
  7. Chuan Sun (8 papers)
  8. Lei Xu (172 papers)
  9. Feng Hua (2 papers)
  10. Xin He (135 papers)
  11. Xinlei Yi (50 papers)
  12. Tao Yang (520 papers)
  13. Hai-Tao Zhang (13 papers)
  14. Shaochun Sui (1 paper)
  15. Han Ding (38 papers)
Citations (8)
