A Blockchain-based Decentralized Federated Learning Framework with Committee Consensus (2004.00773v1)

Published 2 Apr 2020 in cs.DC and cs.LG

Abstract: Federated learning has been widely studied and applied to various scenarios. In mobile computing scenarios, federated learning protects users from exposing their private data, while cooperatively training the global model for a variety of real-world applications. However, the security of federated learning is increasingly being questioned, due to constant attacks on the global model or user privacy data by malicious clients or central servers. To address these security issues, we propose a decentralized federated learning framework based on blockchain, i.e., a Blockchain-based Federated Learning framework with Committee consensus (BFLC). The framework uses the blockchain for global model storage and local model update exchange. To enable the proposed BFLC, we also devise an innovative committee consensus mechanism, which effectively reduces the amount of consensus computation and mitigates malicious attacks. We then discuss the scalability of BFLC, including theoretical security, storage optimization, and incentives. Finally, we perform experiments on real-world datasets to verify the effectiveness of the BFLC framework.

Citations (357)

Summary

  • The paper presents a novel blockchain-based federated learning framework that eliminates the need for a central server by storing local updates and global models on a decentralized ledger.
  • The committee consensus mechanism validates model updates efficiently, reducing computational overhead and mitigating risks from malicious contributions.
  • Empirical results on the FEMNIST dataset show that BFLC achieves performance comparable to centralized FL while enhancing security through incentive-based node management.

Decentralizing Federated Learning Through a Blockchain-Based Framework

The paper presents an innovative framework, "Blockchain-based Federated Learning Framework with Committee Consensus" (BFLC), designed to enhance security and decentralization in Federated Learning (FL) systems. The integration of blockchain technology with FL is posited as a solution to address key security concerns associated with traditional FL settings, particularly those related to malicious clients or central servers.

Core Contributions and Methodology

The authors propose using a blockchain infrastructure to store both global models and local model updates, thereby eliminating the central server typically employed in FL. This storage mechanism leverages blockchain's decentralized nature to enhance security and reliability. Central to this framework is the introduction of a novel committee consensus mechanism designed to optimize consensus efficiency and reduce susceptibility to malicious activities.
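As a rough sketch of this storage layout, the ledger can be modeled as an append-only sequence of interleaved model and update blocks, with nodes reading the most recent model block to start local training. The class and field names below are illustrative assumptions, not identifiers from the paper:

```python
from dataclasses import dataclass


@dataclass
class UpdateBlock:
    round_id: int   # training round this local update belongs to
    node_id: str    # identity of the contributing node
    delta: list     # local model update (e.g., flattened weight delta)


@dataclass
class ModelBlock:
    round_id: int   # round whose aggregation produced this global model
    weights: list   # global model parameters after aggregation


class Ledger:
    """Append-only chain holding both global models and local updates."""

    def __init__(self):
        self.blocks = []

    def append(self, block):
        self.blocks.append(block)

    def latest_model(self):
        # Nodes fetch the most recent ModelBlock to begin local training.
        for block in reversed(self.blocks):
            if isinstance(block, ModelBlock):
                return block
        return None
```

In this sketch, local updates and aggregated models share one chain, so any node can reconstruct training history or fetch the current global model without a central server.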

Key features of BFLC include:

  • Blockchain Storage Architecture: The framework employs a specific block structure that maintains both model and update blocks. This structure allows for efficient model storage and utilization, enabling nodes to access the latest global model quickly.
  • Committee Consensus Mechanism: The proposed consensus method replaces the typical broadcasting approach with delegated validation by a committee. This committee, composed of a small number of nodes, validates updates and reduces computational overhead. It also safeguards against malicious updates by ensuring only validated contributions are integrated into the global model.
  • Node Management and Incentive Design: BFLC includes an incentive mechanism, "profit sharing by contribution," to encourage active participation in the learning process. New nodes undergo a vetting process to join the network, ensuring permission and security are well-managed.
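The committee validation step above can be sketched as a single training round: each committee member scores every local update (e.g., by evaluating it on the member's own validation data), and only updates whose median score clears a threshold are aggregated into the next global model. The median-score acceptance rule and the FedAvg-style unweighted averaging are illustrative assumptions, not the paper's exact procedure:

```python
import statistics


def committee_round(global_weights, updates, score_fn, committee, threshold):
    """One BFLC-style round (illustrative sketch).

    score_fn(member, update) -> float stands in for evaluating an update
    on a committee member's local validation set (higher is better).
    Updates whose median committee score clears the threshold are
    averaged into the next global model.
    """
    accepted = []
    for update in updates:
        scores = [score_fn(member, update) for member in committee]
        if statistics.median(scores) >= threshold:
            accepted.append(update)
    if not accepted:
        # No validated contributions this round; keep the current model.
        return global_weights
    # Unweighted FedAvg-style average of the accepted updates.
    return [sum(u[i] for u in accepted) / len(accepted)
            for i in range(len(global_weights))]
```

Because only a small committee scores each update, consensus cost scales with committee size rather than network size, and a poisoned update is rejected unless it fools the median of the committee.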

Results and Evaluation

Empirical evaluations conducted on the FEMNIST dataset demonstrate that BFLC achieves performance comparable to traditional centralized FL, even under varying proportions of participating nodes. In scenarios involving malicious attacks, BFLC shows pronounced resilience, outperforming both the basic FL framework and robust aggregation baselines such as CwMed (coordinate-wise median).

Implications and Future Directions

The implications of this research are significant for both theoretical exploration and practical deployment of FL systems. By eliminating reliance on a central server, BFLC addresses major vulnerabilities inherent in current models and lays the groundwork for further decentralization of training processes.

Future research could explore enhancing the BFLC framework's scalability and efficiency through more sophisticated consensus mechanisms and cross-disciplinary innovations. Additionally, the integration of privacy-preserving technologies within this framework could further bolster the security and application potential of decentralized FL systems.

In conclusion, the paper presents a compelling case for integrating blockchain technology into federated learning frameworks, offering a solution to longstanding security concerns while paving the way for innovation in secure machine learning architectures.