Quantum Circuit Optimization through Iteratively Pre-Conditioned Gradient Descent (2309.09957v1)
Abstract: For typical quantum subroutines in the gate-based model of quantum computing, explicit decompositions of circuits in terms of single-qubit and two-qubit entangling gates may exist. However, they often lead to large-depth circuits that are challenging for noisy intermediate-scale quantum (NISQ) hardware. Moreover, exact decompositions may only exist for certain modular quantum circuits. Therefore, it is essential to find gate combinations that approximate these circuits to high fidelity with potentially low depth, for example, using gradient-based optimization. Traditional optimizers often suffer from slow convergence, requiring many iterations, and perform poorly in the presence of noise. Here we present iteratively preconditioned gradient descent (IPG) for optimizing quantum circuits and demonstrate performance speedups for state preparation and implementation of quantum algorithmic subroutines. IPG is a noise-resilient, higher-order algorithm that has shown promising gains in convergence speed for classical optimizations, converging locally at a linear rate for convex problems and superlinearly when the solution is unique. Specifically, we show an improvement in fidelity by a factor of $10^4$ for preparing a 4-qubit W state and a maximally entangled 5-qubit GHZ state compared to other commonly used classical optimizers tuning the same ansatz. We also show gains for optimizing a unitary for a quantum Fourier transform using IPG, and report results of running such optimized circuits on IonQ's quantum processing unit (QPU). Such faster convergence with promise for noise-resilience could provide advantages for quantum algorithms on NISQ hardware, especially since the cost of running each iteration on a quantum computer is substantially higher than the classical optimizer step.
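To make the IPG idea concrete, below is a minimal sketch of a preconditioned update loop applied to a toy state-preparation problem: a 2-qubit ansatz (Ry rotations followed by a CNOT) is tuned to minimize the infidelity with a Bell (2-qubit GHZ) state. Everything here is an illustrative assumption rather than the paper's implementation: the ansatz, the finite-difference gradient and Hessian estimates, and the step sizes `alpha`, `beta`, `delta` are chosen for readability, whereas the paper optimizes larger circuits with costs evaluated on a simulator or QPU. The key structural feature shown is that the preconditioner `K` is itself refined iteratively toward a (regularized) inverse Hessian and then used to precondition the gradient step.

```python
import numpy as np

# --- Toy 2-qubit ansatz: |psi(theta)> = CNOT (Ry(t0) x Ry(t1)) |00> ---
def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def ansatz_state(params):
    psi = np.zeros(4)
    psi[0] = 1.0
    psi = np.kron(ry(params[0]), ry(params[1])) @ psi
    return CNOT @ psi

# Target: 2-qubit GHZ (Bell) state (|00> + |11>) / sqrt(2)
target = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

def cost(params):
    """Infidelity 1 - |<target|psi(theta)>|^2, to be minimized."""
    return 1.0 - abs(target @ ansatz_state(params)) ** 2

# --- Finite-difference derivatives (stand-ins for circuit-based estimates) ---
def num_grad(f, x, eps=1e-5):
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def num_hess(f, x, eps=1e-4):
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros_like(x)
        e[i] = eps
        H[:, i] = (num_grad(f, x + e, eps) - num_grad(f, x - e, eps)) / (2 * eps)
    return 0.5 * (H + H.T)  # symmetrize

# --- IPG loop (sketch): refine preconditioner K toward the regularized
#     inverse Hessian, then take a preconditioned gradient step. ---
def ipg_minimize(f, x0, iters=200, alpha=0.3, beta=0.1, delta=1e-3):
    x = np.array(x0, dtype=float)
    K = np.eye(len(x))                                # initial preconditioner
    for _ in range(iters):
        g = num_grad(f, x)
        H = num_hess(f, x) + delta * np.eye(len(x))   # regularized Hessian
        K = K - beta * (H @ K - np.eye(len(x)))       # iterative preconditioning
        x = x - alpha * (K @ g)                       # preconditioned descent step
    return x

theta = ipg_minimize(cost, x0=[0.1, -0.2])
print("final infidelity:", cost(theta))
```

As `K` approaches the inverse Hessian, the parameter update approaches a (damped) Newton step, which is the source of the fast local convergence the abstract refers to; on hardware, the gradient and Hessian information would come from measured circuit costs rather than the finite differences used in this sketch.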