Scaling advantage with quantum-enhanced memetic tabu search for LABS (2511.04553v1)
Abstract: We introduce quantum-enhanced memetic tabu search (QE-MTS), a non-variational hybrid algorithm that achieves state-of-the-art scaling for the low-autocorrelation binary sequence (LABS) problem. By seeding the classical MTS with high-quality initial states from digitized counterdiabatic quantum optimization (DCQO), our method suppresses the empirical time-to-solution scaling to $\mathcal{O}(1.24^N)$ for sequence length $N \in [27,37]$. This scaling surpasses the best-known classical heuristic $\mathcal{O}(1.34^N)$ and improves upon the $\mathcal{O}(1.46^N)$ of the quantum approximate optimization algorithm, achieving superior performance with a $6\times$ reduction in circuit depth. A two-stage bootstrap analysis confirms the scaling advantage and projects a crossover point at $N \gtrsim 47$, beyond which QE-MTS outperforms its classical counterpart. These results provide evidence that quantum enhancement can directly improve the scaling of classical optimization algorithms for the paradigmatic LABS problem.
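The seeding idea in the abstract can be sketched in a few lines. The sketch below is illustrative only: `labs_energy` is the standard LABS sidelobe-energy objective, but `tabu_search` is a minimal single-flip tabu loop and `qe_mts_sketch` simply restarts it from a pool of seeds. The random $\pm 1$ seeds stand in for DCQO samples, and the paper's actual QE-MTS additionally uses memetic (evolutionary) components not shown here.

```python
import random

def labs_energy(s):
    """Sidelobe energy E(s) = sum_k C_k(s)^2 of a +/-1 sequence s."""
    n = len(s)
    return sum(sum(s[i] * s[i + k] for i in range(n - k)) ** 2
               for k in range(1, n))

def tabu_search(s, iters=200, tenure=5):
    """Minimal single-flip tabu search minimizing LABS energy."""
    s = list(s)
    best, best_e = list(s), labs_energy(s)
    tabu = {}  # index -> iteration until which the flip is tabu
    for t in range(iters):
        candidates = []
        for i in range(len(s)):
            if tabu.get(i, -1) >= t:
                continue  # flipping bit i is currently tabu
            s[i] = -s[i]
            candidates.append((labs_energy(s), i))
            s[i] = -s[i]
        if not candidates:
            continue
        e, i = min(candidates)  # best non-tabu single flip
        s[i] = -s[i]
        tabu[i] = t + tenure
        if e < best_e:
            best, best_e = list(s), e
    return best, best_e

def qe_mts_sketch(seeds):
    """Run tabu search from each (quantum-sampled) seed; keep the best."""
    return min((tabu_search(s) for s in seeds), key=lambda r: r[1])

random.seed(0)
# Stand-in for DCQO output: random +/-1 sequences of length N = 13.
seeds = [[random.choice([-1, 1]) for _ in range(13)] for _ in range(4)]
best, energy = qe_mts_sketch(seeds)
print(energy)
```

In the actual algorithm the seed distribution is the point: DCQO concentrates samples on low-energy sequences, so the classical local search starts closer to the optimum than random restarts would.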