
DyLoC: A Dual-Layer Architecture for Secure and Trainable Quantum Machine Learning Under Polynomial-DLA constraint (2512.00699v1)

Published 30 Nov 2025 in quant-ph and cs.CR

Abstract: Variational quantum circuits face a critical trade-off between privacy and trainability: the high expressivity required for robust privacy induces an exponentially large dynamical Lie algebra (DLA), which inevitably leads to barren plateaus, while trainable models restricted to polynomial-sized algebras remain transparent to algebraic attacks. To resolve this impasse, DyLoC is proposed: a dual-layer architecture built on an orthogonal decoupling strategy, in which trainability is anchored to a polynomial-DLA ansatz while privacy is externalized to the input and output interfaces. Specifically, Truncated Chebyshev Graph Encoding (TCGE) thwarts snapshot inversion, and Dynamic Local Scrambling (DLS) obfuscates gradients. Experiments demonstrate that DyLoC maintains baseline-level convergence (final loss 0.186) while increasing the gradient reconstruction error by 13 orders of magnitude over the baseline; snapshot inversion attacks are blocked, with reconstruction mean squared error exceeding 2.0. These results confirm that DyLoC establishes a verifiable pathway toward secure and trainable quantum machine learning.
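The paper's TCGE operates on quantum state encodings, whose details are not given in the abstract. As a rough classical intuition for the "truncated Chebyshev" part, the sketch below maps each input feature to its first few Chebyshev polynomials via the standard recurrence T_n(x) = 2x·T_{n-1}(x) - T_{n-2}(x), then discards higher-order terms; the function name, the degree parameter, and the overall structure are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def truncated_chebyshev_encoding(x, degree=4):
    """Map each feature x_i in [-1, 1] to the Chebyshev polynomials
    T_0(x_i) .. T_{degree-1}(x_i), computed via the recurrence
    T_n(x) = 2x T_{n-1}(x) - T_{n-2}(x). Illustrative classical
    analogue only; the paper's TCGE is a quantum encoding."""
    x = np.asarray(x, dtype=float)
    feats = [np.ones_like(x), x]  # T_0 = 1, T_1 = x
    for _ in range(2, degree):
        feats.append(2 * x * feats[-1] - feats[-2])
    # Truncation: keep only the first `degree` terms; higher-order
    # information is deliberately dropped, which is the intuition
    # behind frustrating snapshot-inversion attacks.
    return np.stack(feats[:degree], axis=-1)

enc = truncated_chebyshev_encoding([0.5, -0.2, 0.9], degree=4)
print(enc.shape)  # (3, 4)
```

Because the map is lossy by construction, recovering the raw input from the truncated features is ill-posed, which mirrors (at an intuitive level) the abstract's claim that TCGE blocks snapshot inversion.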
