
Performance comparison of optimization methods on variational quantum algorithms (2111.13454v3)

Published 26 Nov 2021 in quant-ph

Abstract: Variational quantum algorithms (VQAs) offer a promising path toward using near-term quantum hardware for applications in academic and industrial research. These algorithms aim to find approximate solutions to quantum problems by optimizing a parametrized quantum circuit using a classical optimization algorithm. A successful VQA requires fast and reliable classical optimization algorithms. Understanding and optimizing how off-the-shelf optimization methods perform in this context is important for the future of the field. In this work, we study the performance of four commonly used gradient-free optimization methods: SLSQP, COBYLA, CMA-ES, and SPSA, at finding ground-state energies of a range of small chemistry and material science problems. We test a telescoping sampling scheme (where the accuracy of the cost-function estimate provided to the optimizer is increased as the optimization converges) on all methods, demonstrating mixed results across our range of optimizers and problems chosen. We further hyperparameter tune two of the four optimizers (CMA-ES and SPSA) across a large range of models and demonstrate that with appropriate hyperparameter tuning, CMA-ES is competitive with and sometimes outperforms SPSA (which is not observed in the absence of hyperparameter tuning). Finally, we investigate the ability of an optimizer to beat the `sampling noise floor' given by the sampling noise on each cost-function estimate provided to the optimizer. Our results demonstrate the necessity of tailoring and hyperparameter-tuning known optimization techniques for inherently noisy variational quantum algorithms, and that the variational landscape one finds in a VQA is highly problem- and system-dependent. This provides guidance for future experimental implementations of these algorithms.
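The VQA optimization loop and the telescoping sampling scheme described in the abstract can be sketched with a minimal SPSA optimizer on a toy cost function. Everything here is illustrative: the cost landscape, the shot schedule, and the SPSA gain constants are assumptions standing in for the paper's actual quantum circuits and tuned hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_cost(theta, shots):
    # Stand-in for a VQA energy estimate: a smooth landscape plus
    # sampling noise that shrinks like 1/sqrt(shots). Illustrative only;
    # in a real VQA this would be a shot-based circuit measurement.
    exact = np.sum(np.sin(theta) ** 2)
    return exact + rng.normal(scale=1.0 / np.sqrt(shots))

def spsa_minimize(theta, shots_schedule, a=0.2, c=0.1):
    # Minimal SPSA: two cost evaluations per step, one random +/-1
    # perturbation shared by all parameters. Gain decay exponents are
    # the commonly used defaults (0.602 and 0.101).
    for k, shots in enumerate(shots_schedule, start=1):
        ak = a / k ** 0.602
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        g = (noisy_cost(theta + ck * delta, shots)
             - noisy_cost(theta - ck * delta, shots)) / (2 * ck * delta)
        theta = theta - ak * g
    return theta

# Telescoping sampling: raise the shot count (i.e. the accuracy of each
# cost estimate) as the optimization converges.
schedule = [100] * 50 + [1000] * 50 + [10000] * 50
theta0 = rng.uniform(-1, 1, size=4)
theta_opt = spsa_minimize(theta0, schedule)
print(np.sum(np.sin(theta_opt) ** 2))  # exact cost at the final parameters
```

The same noisy cost function could instead be handed to SciPy's COBYLA or SLSQP via `scipy.optimize.minimize`; the point of the sketch is only the structure of the loop, where the per-evaluation noise floor set by `shots` is tightened in stages rather than fixed.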
