
Improving Parameter Training for VQEs by Sequential Hamiltonian Assembly (2312.05552v1)

Published 9 Dec 2023 in quant-ph and cs.LG

Abstract: A central challenge in quantum machine learning is the design and training of parameterized quantum circuits (PQCs). As in deep learning, vanishing gradients pose immense problems for the trainability of PQCs and have been shown to arise from a multitude of sources. One such cause is non-local loss functions, which demand the measurement of a large subset of the involved qubits. To facilitate parameter training for quantum applications using global loss functions, we propose Sequential Hamiltonian Assembly, which iteratively approximates the loss function using local components. Aiming for a proof of principle, we evaluate our approach on the Graph Coloring problem with a Variational Quantum Eigensolver (VQE). Simulation results show that our approach outperforms conventional parameter training by 29.99% and the empirical state of the art, Layerwise Learning, by 5.12% in mean accuracy. This paves the way towards locality-aware learning techniques, allowing vanishing gradients to be evaded for a large class of practically relevant problems.
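The core idea of Sequential Hamiltonian Assembly (optimizing against a growing sum of local Hamiltonian terms, warm-starting each stage from the previous one) can be illustrated with a toy sketch. Everything below is an illustrative assumption, not the paper's actual setup: a simple product-state ansatz stands in for the VQE circuit, a 4-node cycle graph with a 2-coloring (MaxCut-style) cost stands in for the graph-coloring Hamiltonian, and SciPy's COBYLA is used as the classical optimizer.

```python
# Toy sketch of Sequential Hamiltonian Assembly: optimize against a
# partial Hamiltonian built from local (edge) terms, adding one term
# per stage and warm-starting from the previous stage's parameters.
import numpy as np
from scipy.optimize import minimize

# Hypothetical conflict graph: a 4-cycle, which is 2-colorable.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

def energy(theta, active_edges):
    # Expectation of the partial Hamiltonian H_k = sum over active
    # edges of (1 + Z_i Z_j) / 2, under a product state where each
    # qubit i has <Z_i> = cos(2 * theta_i). Each edge term penalizes
    # equal "colors"; a valid 2-coloring has energy 0.
    z = np.cos(2 * theta)
    return sum((1 + z[i] * z[j]) / 2 for i, j in active_edges)

rng = np.random.default_rng(0)
theta = rng.uniform(0, np.pi, n)        # random initial parameters
e_init = energy(theta, edges)           # energy of the full Hamiltonian

# Sequentially assemble the Hamiltonian: stage k trains against the
# first k edge terms, reusing the optimized parameters as the next
# stage's starting point.
for k in range(1, len(edges) + 1):
    res = minimize(energy, theta, args=(edges[:k],), method="COBYLA")
    theta = res.x

e_final = energy(theta, edges)
print(f"full-Hamiltonian energy: {e_init:.3f} -> {e_final:.3f}")
```

Because each stage's loss involves only a few qubits, its gradients stay comparatively large; the paper's contribution is showing that this locality-aware schedule also trains the *global* loss well, outperforming both direct training and Layerwise Learning in their experiments.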
