
Quantum Federated Learning for Distributed Quantum Networks (2212.12913v2)

Published 25 Dec 2022 in quant-ph

Abstract: Federated learning is a framework for learning from distributed networks: it builds a global model from virtually fused data without sharing the actual data. However, the traditional federated learning process faces two main challenges: high computational cost and the security of message transmission. To address these challenges, we propose a quantum federated learning framework for distributed quantum networks that exploits characteristic features of quantum mechanics. First, we give two methods for encoding data information into quantum states, which accommodate different data-acquisition frequencies. Next, a quantum gradient descent algorithm is provided to help clients in the distributed quantum network train local models; it gives clients a mechanism to estimate the gradient of the local model in parallel. Compared with its classical counterpart, the proposed algorithm achieves exponential acceleration in dataset size and a quadratic speedup in data dimensionality. In addition, a quantum secure multi-party computation protocol is designed based on the Chinese remainder theorem, which avoids the errors and overflow problems that may occur in large-number arithmetic. Security analysis shows that the protocol resists common external and internal attacks. Finally, to demonstrate the effectiveness of the proposed framework, we use it to train a federated linear regression model and execute the essential computation steps on the Qiskit quantum computing framework.
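The overflow-avoidance idea behind the Chinese-remainder-theorem aggregation mentioned in the abstract can be sketched in plain Python. This is only an illustrative classical sketch under assumed parameters (the moduli and client values below are hypothetical, and the paper's actual protocol runs over quantum channels): each client releases only small residues of its value modulo several pairwise coprime moduli, and the aggregator reconstructs the exact sum via CRT without any single message carrying a full-precision number.

```python
from math import prod

# Pairwise coprime moduli (hypothetical choice); their product must
# exceed the largest possible aggregated sum for exact recovery.
MODULI = [1_000_003, 1_000_033, 1_000_037]  # three primes near 10^6


def crt_reconstruct(residues, moduli):
    """Chinese remainder theorem: recover x mod prod(moduli) from its residues."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        # pow(Mi, -1, m) is the modular inverse of Mi modulo m (Python 3.8+).
        x += r * Mi * pow(Mi, -1, m)
    return x % M


# Each client holds one scaled-integer gradient component (hypothetical values).
client_values = [123456, 789012, 345678]

# Clients only ever exchange residues modulo each small modulus.
residue_sums = [sum(v % m for v in client_values) % m for m in MODULI]

# The aggregator recovers the exact sum, with no overflow in any single channel.
total = crt_reconstruct(residue_sums, MODULI)
assert total == sum(client_values)
```

The design point the abstract hints at is that arithmetic on each residue channel stays bounded by a small modulus, so large-number operations never occur; exactness holds as long as the true sum is below the product of the moduli.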

Citations (3)


Authors (3)
