
CryptoQFL: Quantum Federated Learning on Encrypted Data (2307.07012v1)

Published 13 Jul 2023 in quant-ph and cs.ET

Abstract: Recent advancements in Quantum Neural Networks (QNNs) have demonstrated theoretical and experimental performance superior to their classical counterparts in a wide range of applications. However, existing centralized QNNs cannot solve many real-world problems because collecting large amounts of training data to a common public site is time-consuming and, more importantly, violates data privacy. Federated Learning (FL) is an emerging distributed machine learning framework that allows collaborative model training on decentralized data residing on multiple devices without breaching data privacy. Some initial attempts at Quantum Federated Learning (QFL) either focus only on improving QFL performance or rely on a trusted quantum server that fails to preserve data privacy. In this work, we propose CryptoQFL, a QFL framework that allows distributed QNN training on encrypted data. CryptoQFL is (1) secure, because it allows each edge to train a QNN with local private data and encrypt its updates using quantum homomorphic encryption before sending them to the central quantum server; (2) communication-efficient, as CryptoQFL quantizes local gradient updates to ternary values and communicates only non-zero values to the server for aggregation; and (3) computation-efficient, as CryptoQFL presents an efficient quantum aggregation circuit with significantly reduced latency compared to state-of-the-art approaches.
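The abstract does not specify the exact ternarization rule or wire format, so the following is only an illustrative sketch of the general idea: quantizing a local gradient vector to {-1, 0, +1} and sending just the non-zero entries (as index/sign pairs plus a scale factor). All function names and the threshold rule here are assumptions, not the paper's method.

```python
import numpy as np

def ternarize(grad, threshold=0.05):
    """Quantize a gradient vector to {-1, 0, +1}.

    Entries with magnitude below a fraction of the max magnitude are
    zeroed; the mean absolute value is kept as a per-vector scale.
    (Threshold rule is illustrative, not from the paper.)
    """
    scale = float(np.abs(grad).mean())
    t = threshold * np.abs(grad).max()
    q = np.zeros_like(grad, dtype=np.int8)
    q[grad > t] = 1
    q[grad < -t] = -1
    return q, scale

def sparse_encode(q):
    """Keep only non-zero entries as (index, sign) pairs for transmission."""
    idx = np.nonzero(q)[0]
    return idx, q[idx]

# Example: a 5-element local gradient update
grad = np.array([0.8, -0.02, 0.5, -0.7, 0.01])
q, scale = ternarize(grad)          # q = [1, 0, 1, -1, 0]
idx, signs = sparse_encode(q)       # only 3 of 5 entries are sent
```

The server would then aggregate the sparse ternary updates (in CryptoQFL, inside a quantum aggregation circuit over encrypted values) and rescale by each client's scale factor.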

Citations (6)
