A Novel Noise-Aware Classical Optimizer for Variational Quantum Algorithms (2401.10121v2)

Published 18 Jan 2024 in quant-ph and math.OC

Abstract: A key component of variational quantum algorithms (VQAs) is the choice of classical optimizer employed to update the parameterization of an ansatz. It is well recognized that quantum algorithms will, for the foreseeable future, necessarily be run on noisy devices with limited fidelities. Thus, the evaluation of an objective function (e.g., the guiding function in the quantum approximate optimization algorithm (QAOA) or the expectation of the electronic Hamiltonian in the variational quantum eigensolver (VQE)) required by a classical optimizer is subject not only to stochastic error from estimating an expected value but also to error resulting from intermittent hardware noise. Based on empirical studies, model-based derivative-free optimization methods have emerged as popular choices of classical optimizer in the noisy VQA setting. However, these optimization methods were not explicitly designed with noise in mind. In this work we adapt recent developments from the "noise-aware numerical optimization" literature to these commonly used derivative-free model-based methods. We introduce the key defining characteristics of these novel noise-aware derivative-free model-based methods that separate them from standard model-based methods. We study an implementation of such noise-aware derivative-free model-based methods and compare its performance on demonstrative VQA simulations to classical solvers packaged in scikit-quant.
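The noise-aware ideas the abstract describes (accounting for stochastic estimator error when building a local surrogate model, and not letting the optimizer's trust region collapse below the noise level) can be sketched generically. The following toy is an illustration of that style of method, not the paper's algorithm: the quadratic test landscape, the shot-allocation rule, and the noise-floor constant `0.05` are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_objective(x, shots=100):
    """Toy stand-in for a VQA cost: a smooth landscape plus shot noise.

    The standard error of a shot-averaged estimator scales like
    1/sqrt(shots), mimicking expectation-value estimation on hardware.
    """
    true_val = np.sum((x - 1.0) ** 2)              # hypothetical energy landscape
    noise = rng.normal(0.0, 1.0 / np.sqrt(shots))  # stochastic estimator error
    return true_val + noise

def fit_linear_model(center, radius, shots, n_samples=8):
    """Fit a local linear surrogate by least squares from noisy evaluations."""
    dim = center.size
    displacements = rng.uniform(-radius, radius, size=(n_samples, dim))
    y = np.array([noisy_objective(center + d, shots) for d in displacements])
    A = np.hstack([np.ones((n_samples, 1)), displacements])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[0], coef[1:]        # model value at center, model gradient

def noise_aware_trust_region(x0, iters=60, radius=0.5, sigma=1.0):
    """Minimal noise-aware model-based trust-region loop (illustrative)."""
    x = x0.copy()
    for _ in range(iters):
        # Noise-aware rule (an assumption of this sketch): spend enough shots
        # that the estimator's standard error is small relative to the radius.
        shots = max(10, int((sigma / radius) ** 2 * 10))
        _, g = fit_linear_model(x, radius, shots)
        if np.linalg.norm(g) < 1e-3:
            break
        step = -radius * g / np.linalg.norm(g)     # model minimizer on the ball
        if noisy_objective(x + step, shots) < noisy_objective(x, shots):
            x = x + step
        else:
            # Do not shrink below a noise floor; below it, the surrogate
            # model would only be fitting noise rather than the landscape.
            radius = max(0.6 * radius, 0.05)
    return x

x_star = noise_aware_trust_region(np.zeros(2))
print(np.round(x_star, 1))
```

The two hedged design choices here, coupling the shot budget to the trust-region radius and imposing a noise floor on that radius, are the kind of "defining characteristics" that separate noise-aware model-based methods from standard ones, in which the radius is typically allowed to shrink to zero.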
