Robust softmax aggregation on blockchain based federated learning with convergence guarantee (2311.07027v2)

Published 13 Nov 2023 in cs.CR

Abstract: Blockchain-based federated learning is a distributed learning scheme that allows model training without participants sharing their local datasets; the blockchain components eliminate the need for the trusted central server required by traditional federated learning algorithms. In this paper, we propose a softmax aggregation blockchain-based federated learning framework. First, we propose a new blockchain-based federated learning architecture that utilizes the well-tested proof-of-stake consensus mechanism on an existing blockchain network to select validators and miners that aggregate the participants' updates and compute the blocks. Second, to ensure the robustness of the aggregation process, we design a novel softmax aggregation method based on approximated population loss values that relies on our specific blockchain architecture. Additionally, we show that our softmax aggregation technique converges to the global minimum in the convex setting under non-restrictive assumptions. Our comprehensive experiments show that our framework outperforms existing robust aggregation algorithms in various settings by large margins.
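The core aggregation idea can be illustrated with a short sketch. The snippet below is a minimal, hypothetical Python implementation that assumes each participant update is weighted by a softmax over the negative approximated population loss values, so that updates with lower approximated loss receive exponentially larger weight and poisoned or low-quality updates are down-weighted. The function name, the temperature parameter, and the toy numbers are illustrative assumptions, not the paper's exact formulation or blockchain mechanics.

```python
import numpy as np

def softmax_aggregate(client_updates, approx_losses, temperature=1.0):
    """Sketch of loss-based softmax aggregation of client updates.

    client_updates: list of 1-D parameter (or gradient) arrays, one per client.
    approx_losses:  approximated population loss for each client's update.
    temperature:    hypothetical smoothing knob (not specified in the abstract).
    """
    losses = np.asarray(approx_losses, dtype=float)
    # Lower loss -> larger weight; shift by the max for numerical stability.
    scores = -losses / temperature
    scores -= scores.max()
    weights = np.exp(scores)
    weights /= weights.sum()
    # Weighted average of the stacked client updates.
    stacked = np.stack([np.asarray(u, dtype=float) for u in client_updates])
    return np.tensordot(weights, stacked, axes=1)

# Toy example: two benign updates and one outlier with high approximated loss.
updates = [np.array([0.10, 0.20]), np.array([0.12, 0.18]), np.array([5.0, -4.0])]
losses = [0.35, 0.40, 2.50]
print(softmax_aggregate(updates, losses))  # stays close to the benign updates
```

In this reading, the softmax weighting plays the role that coordinate-wise medians or trimmed means play in other robust aggregation rules, while remaining a smooth function of the approximated losses.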
