Line Search Strategy for Navigating through Barren Plateaus in Quantum Circuit Training (2402.05227v3)

Published 7 Feb 2024 in quant-ph

Abstract: Variational quantum algorithms are viewed as promising candidates for demonstrating quantum advantage on near-term devices. These approaches typically involve the training of parameterized quantum circuits through a classical optimization loop. However, they often encounter challenges attributed to the exponentially diminishing gradient components, known as the barren plateau (BP) problem. This work introduces a novel optimization approach designed to alleviate the adverse effects of BPs during circuit training. In contrast to conventional gradient descent methods with a small learning parameter, our approach relies on making finite hops along a search direction determined on randomly chosen subsets of the free parameters. The optimization search direction, together with the range of the search, is determined by the distant features of the cost-function landscape. This enables the optimization path to navigate around barren plateaus without the need for external control mechanisms. We have successfully applied our optimization strategy to quantum circuits comprising 16 qubits and 15000 entangling gates, demonstrating robust resistance against BPs. Additionally, we have extended our optimization strategy by incorporating an evolutionary selection framework, enhancing its ability to avoid local minima in the landscape. The modified algorithm has been successfully utilized in quantum gate synthesis applications, showcasing a significantly improved efficiency in generating highly compressed quantum circuits compared to traditional gradient-based optimization approaches.
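
The abstract gives no pseudocode, so the following is only a minimal illustrative sketch of the general idea: restrict each update to a random subset of parameters, build a search direction on that subset, and scan a wide grid of finite hop lengths rather than taking a small gradient step. The cost function here is a toy periodic stand-in for a circuit landscape, and the names `cost`, `line_search_step`, `subset_size`, and `n_hops` are hypothetical; the paper's actual direction rule (based on distant landscape features) is approximated with a subset-restricted finite difference for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(theta):
    # Toy stand-in for a parameterized-circuit cost landscape:
    # a sum of cosines, periodic like rotation-gate angles.
    return float(np.sum(np.cos(theta) * np.cos(2.0 * theta + 0.3)))

def line_search_step(theta, subset_size=4, n_hops=33, max_range=2 * np.pi, eps=1e-3):
    """One update: pick a random parameter subset, form a search
    direction from finite differences on that subset only, then
    evaluate the cost at a coarse grid of finite hop lengths and
    keep the best candidate (a sketch, not the paper's exact rule)."""
    idx = rng.choice(len(theta), size=subset_size, replace=False)

    # Finite-difference direction restricted to the chosen subset.
    base = cost(theta)
    direction = np.zeros_like(theta)
    for i in idx:
        shifted = theta.copy()
        shifted[i] += eps
        direction[i] = (cost(shifted) - base) / eps
    norm = np.linalg.norm(direction)
    if norm == 0.0:
        return theta  # flat along this subset; try another subset next call
    direction /= norm

    # Coarse scan over a wide range of hop lengths, so a single
    # update can jump across flat (barren) regions of the landscape.
    hops = np.linspace(-max_range, max_range, n_hops)
    candidates = [theta - h * direction for h in hops]
    costs = [cost(c) for c in candidates]
    best = int(np.argmin(costs))
    return candidates[best] if costs[best] < base else theta

theta = rng.uniform(-np.pi, np.pi, size=16)
for _ in range(50):
    theta = line_search_step(theta)
print("final cost:", cost(theta))
```

Because the hop grid spans the full period of the rotation angles, the update can accept a large move whenever the landscape improves far from the current point, which is the mechanism the abstract credits for escaping vanishing-gradient regions.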

Citations (2)
