
Modified Conjugate Quantum Natural Gradient (2501.05847v1)

Published 10 Jan 2025 in quant-ph

Abstract: The efficient optimization of variational quantum algorithms (VQAs) is critical for their successful application in quantum computing. The Quantum Natural Gradient (QNG) method, which leverages the geometry of quantum state space, has demonstrated improved convergence compared to standard gradient descent [Quantum 4, 269 (2020)]. In this work, we introduce the Modified Conjugate Quantum Natural Gradient (CQNG), an optimization algorithm that integrates QNG with principles from the nonlinear conjugate gradient method. Unlike QNG, which employs a fixed learning rate, CQNG dynamically adjusts hyperparameters at each step, enhancing both efficiency and flexibility. Numerical simulations show that CQNG achieves faster convergence than QNG across various optimization scenarios, even when strict conjugacy conditions are not always satisfied -- hence the term ``Modified Conjugate.'' These results highlight CQNG as a promising optimization technique for improving the performance of VQAs.


Authors (1)
