Denoising Gradient Descent in Variational Quantum Algorithms (2403.03826v1)

Published 6 Mar 2024 in quant-ph, cs.NA, and math.NA

Abstract: In this article we introduce an algorithm for mitigating the adverse effects of noise on gradient descent in variational quantum algorithms. This is accomplished by computing a regularized local classical approximation to the objective function at every gradient descent step. The computational overhead of our algorithm is entirely classical, i.e., the number of circuit evaluations is exactly the same as when carrying out gradient descent using the parameter-shift rules. We empirically demonstrate the advantages offered by our algorithm on randomized parametrized quantum circuits.
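
The abstract's central idea lends itself to a small illustration. Below is a minimal, hypothetical NumPy sketch (not the authors' implementation): it reuses the 2n noisy evaluations that the parameter-shift rule already requires at each step, fits a ridge-regularized local linear surrogate through them, and steps along the surrogate's gradient, so no extra circuit evaluations are added. The names noisy_objective and denoised_gradient_step, and the sigma, lr, shift, and ridge values, are illustrative assumptions, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)

def noisy_objective(theta, sigma=0.05):
    # Stand-in for a noisy quantum expectation value (hypothetical trigonometric model).
    return float(np.sum(np.cos(theta)) + sigma * rng.normal())

def denoised_gradient_step(theta, lr=0.1, shift=np.pi / 2, ridge=1e-2):
    # Reuse the 2n parameter-shift evaluation points theta +/- shift * e_j.
    n = theta.size
    displacements, values = [], []
    for j in range(n):
        for s in (shift, -shift):
            d = np.zeros(n)
            d[j] = s
            displacements.append(d)
            values.append(noisy_objective(theta + d))
    X = np.asarray(displacements)   # shape (2n, n): local displacements
    y = np.asarray(values)          # shape (2n,): noisy function values
    # Ridge-regularized local linear surrogate  y ~= c0 + g . d ; g estimates the gradient.
    A = np.hstack([np.ones((2 * n, 1)), X])
    coef = np.linalg.solve(A.T @ A + ridge * np.eye(n + 1), A.T @ y)
    grad = coef[1:]
    return theta - lr * grad

theta = rng.uniform(-np.pi, np.pi, size=4)
for _ in range(100):
    theta = denoised_gradient_step(theta)
print("final parameters:", np.round(theta, 3))

In this toy setup the unregularized fit is proportional to the standard parameter-shift estimate, and the ridge penalty simply shrinks each gradient component; this is one simple way a regularized local classical fit can damp noise in the gradient while keeping the circuit-evaluation count unchanged.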

