Stochastic Shadow Descent: Training Parametrized Quantum Circuits with Shadows of Gradients
Abstract: In this paper, we focus on the task of optimizing the parameters in Parametrized Quantum Circuits (PQCs). While popular algorithms, such as Simultaneous Perturbation Stochastic Approximation (SPSA), limit the number of circuit executions to two per iteration, irrespective of the number of parameters in the circuit, they come with their own challenges. These methods use central differences to compute biased estimates of directional derivatives. We show, both theoretically and numerically, that this bias may lead to instabilities in \emph{training} the PQCs. To remedy this, we propose Stochastic Shadow Descent (\texttt{SSD}), which uses random projections (or \emph{shadows}) of the gradient to update the parameters iteratively. We eliminate the bias in the directional-derivative estimates by employing the Parameter-Shift Rule, together with techniques from Quantum Signal Processing, to construct a quantum circuit that parsimoniously computes \emph{unbiased estimates} of directional derivatives. Finally, we prove the convergence of the \texttt{SSD} algorithm, provide worst-case bounds on the number of iterations, and numerically demonstrate its efficacy.
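The abstract describes the core update: descend along a random projection (a \emph{shadow}) of the gradient, with the directional derivative estimated without bias via the Parameter-Shift Rule. The following is a minimal classical sketch of that idea, not the paper's QSP-based circuit construction: the objective `f`, the step size `eta`, and the iteration count are illustrative assumptions, and `f` is a toy trigonometric function chosen because the parameter-shift rule is exact for it.

```python
import numpy as np

# Toy stand-in for a PQC expectation value: f(theta) = sum_i cos(theta_i).
# For such trigonometric objectives the parameter-shift rule is exact:
#   df/dtheta_i = [f(theta + (pi/2) e_i) - f(theta - (pi/2) e_i)] / 2.
def f(theta):
    return np.cos(theta).sum()

def shift_directional_derivative(theta, v):
    """Unbiased directional derivative v . grad f from parameter shifts.
    (The paper builds a single quantum circuit for this estimate; here we
    simply evaluate the shifted objective classically, coordinate by coordinate.)"""
    grad = np.array([
        (f(theta + 0.5 * np.pi * e) - f(theta - 0.5 * np.pi * e)) / 2.0
        for e in np.eye(len(theta))
    ])
    return grad @ v

def ssd(theta0, eta=0.1, iters=500, seed=0):
    """Stochastic Shadow Descent sketch: step along random shadows of the gradient."""
    rng = np.random.default_rng(seed)
    theta = theta0.copy()
    for _ in range(iters):
        v = rng.normal(size=theta.shape)                # random direction
        v /= np.linalg.norm(v)                          # normalize to a unit vector
        g = shift_directional_derivative(theta, v)      # unbiased estimate of v . grad f
        theta -= eta * g * v                            # update along the shadow
    return theta

theta = ssd(np.array([2.0, -1.5, 0.7]))
print(f(theta))  # should approach the minimum -3, attained at theta_i = pi (mod 2*pi)
```

Since E[v vᵀ] is proportional to the identity for a uniformly random unit direction v, the shadow update g·v is, in expectation, a rescaled gradient step, which is the intuition behind the convergence guarantees the abstract mentions.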