
BFLN: A Blockchain-based Federated Learning Model for Non-IID Data (2407.05276v3)

Published 7 Jul 2024 in cs.DC

Abstract: As the application of federated learning becomes increasingly widespread, the issue of imbalanced training data distribution has emerged as a significant challenge. Federated learning utilizes local data stored on different training clients for model training, rather than centralizing data on a server, thereby greatly enhancing the privacy and security of training data. However, the distribution of training data across different clients may be imbalanced, with different categories of data potentially residing on different clients. This presents a challenge to traditional federated learning, which assumes data distribution is independent and identically distributed (IID). This paper proposes a Blockchain-based Federated Learning Model for Non-IID Data (BFLN), which combines federated learning with blockchain technology. By introducing a new aggregation method and incentive algorithm, BFLN enhances the model performance of federated learning on non-IID data. Experiments on public datasets demonstrate that, compared to other state-of-the-art models, BFLN improves training accuracy and provides a sustainable incentive mechanism for personalized federated learning.
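The abstract does not spell out BFLN's aggregation or incentive algorithm, but the non-IID setting it targets is standard in the federated learning literature. The sketch below is a minimal illustration of that setting, not the paper's method: it partitions labels across clients with a Dirichlet label skew (small `alpha` means heavily imbalanced clients) and then applies plain size-weighted FedAvg aggregation, the baseline whose accuracy degrades under such skew. All function names and parameters here are assumptions for illustration only.

```python
import numpy as np

def dirichlet_noniid_split(labels, n_clients, alpha=0.3, seed=0):
    """Partition sample indices across clients with Dirichlet label skew.
    Smaller alpha -> each client sees a more imbalanced (non-IID) label mix."""
    rng = np.random.default_rng(seed)
    n_classes = int(labels.max()) + 1
    client_indices = [[] for _ in range(n_clients)]
    for c in range(n_classes):
        idx = np.where(labels == c)[0]
        rng.shuffle(idx)
        # Fraction of class-c samples assigned to each client.
        proportions = rng.dirichlet(alpha * np.ones(n_clients))
        cut_points = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client_id, part in enumerate(np.split(idx, cut_points)):
            client_indices[client_id].extend(part.tolist())
    return [np.array(ix) for ix in client_indices]

def fedavg_aggregate(client_weights, client_sizes):
    """Baseline FedAvg: average client parameter vectors weighted by local data size.
    (BFLN's own aggregation rule is not described in the abstract.)"""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * w for c, w in zip(coeffs, client_weights))

# Toy usage: 10 classes, 5 clients, skewed label partition, dummy parameter vectors.
labels = np.random.default_rng(1).integers(0, 10, size=1000)
parts = dirichlet_noniid_split(labels, n_clients=5, alpha=0.3)
client_params = [np.random.default_rng(i).normal(size=4) for i in range(5)]
global_params = fedavg_aggregate(client_params, [len(p) for p in parts])
print("client sizes:", [len(p) for p in parts])
print("aggregated params:", global_params)
```

Under this kind of label skew, each client's local optimum drifts toward its dominant classes, which is the failure mode BFLN's modified aggregation and blockchain-backed incentive mechanism are designed to mitigate.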
