Transparent Contribution Evaluation for Secure Federated Learning on Blockchain (2101.10572v2)

Published 26 Jan 2021 in cs.LG and cs.CR

Abstract: Federated Learning is a promising machine learning paradigm in which multiple parties collaborate to build a high-quality machine learning model. Nonetheless, these parties are only willing to participate when given sufficient incentives, such as a fair reward based on their contributions. Many studies have explored Shapley-value-based methods to evaluate each party's contribution to the learned model. However, they commonly assume a semi-trusted server that trains the model and evaluates the data owners' model contributions, which lacks transparency and may hinder the success of federated learning in practice. In this work, we propose a blockchain-based federated learning framework and a protocol to transparently evaluate each participant's contribution. Our framework protects all parties' privacy in the model-building phase and transparently evaluates contributions based on the model updates. The experiment with the handwritten digits dataset demonstrates that the proposed method can effectively evaluate the contributions.
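The Shapley-value evaluation the abstract refers to assigns each party its average marginal contribution to model quality over all possible coalitions of participants. A minimal sketch of that computation, assuming a hypothetical `utility` function (e.g., validation accuracy of a model trained on a coalition's combined updates; the coalition scores below are illustrative, not from the paper):

```python
from itertools import combinations
from math import factorial

def shapley_values(parties, utility):
    """Exact Shapley values: each party's marginal contribution to
    utility, averaged over all coalitions with the standard weights."""
    n = len(parties)
    values = {p: 0.0 for p in parties}
    for p in parties:
        others = [q for q in parties if q != p]
        for k in range(n):
            for coalition in combinations(others, k):
                # Weight = |S|! * (n - |S| - 1)! / n!
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                marginal = utility(set(coalition) | {p}) - utility(set(coalition))
                values[p] += weight * marginal
    return values

# Hypothetical coalition utilities (e.g., test accuracy per coalition).
scores = {
    frozenset(): 0.0,
    frozenset({"A"}): 0.6,
    frozenset({"B"}): 0.5,
    frozenset({"A", "B"}): 0.9,
}
print(shapley_values(["A", "B"], lambda s: scores[frozenset(s)]))
# → {'A': 0.5, 'B': 0.4}  (the values sum to the grand-coalition utility, 0.9)
```

Exact computation enumerates 2^(n-1) coalitions per party, so practical schemes approximate it; the framework's contribution is making such an evaluation verifiable on-chain from the recorded model updates rather than trusting a central server to compute it.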

Citations (34)
