
A competitive NISQ and qubit-efficient solver for the LABS problem (2506.17391v1)

Published 20 Jun 2025 in quant-ph

Abstract: Pauli Correlation Encoding (PCE) has recently been introduced as a qubit-efficient approach to combinatorial optimization problems within variational quantum algorithms (VQAs). The method offers a polynomial reduction in qubit count and a super-polynomial suppression of barren plateaus. Moreover, it has been shown to perform competitively with state-of-the-art classical methods on MaxCut. Here, we extend the PCE-based framework to solve the Low Autocorrelation Binary Sequences (LABS) problem. This is a notoriously hard problem with a single instance per problem size, considered a major benchmark for classical and quantum solvers. We simulate our PCE variational quantum solver for LABS instances of up to $N=44$ binary variables using only $n=6$ qubits and a brickwork circuit Ansatz of depth $10$, with a total of $30$ two-qubit gates, i.e. well inside the NISQ regime. We observe a significant scaling advantage in the total time to (the exact) solution of our solver with respect to previous studies using QAOA, and even a modest advantage with respect to the leading classical heuristic, Tabu search. Our findings point to PCE-based solvers as a promising quantum-inspired classical heuristic for hard-in-practice problems, as well as a tool to reduce the resource requirements of actual quantum algorithms, with potential implications both fundamental and applied.
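The LABS objective mentioned in the abstract has a standard closed form: for a sequence $s \in \{\pm 1\}^N$, one minimizes the sidelobe energy $E(s) = \sum_{k=1}^{N-1} C_k^2$ with aperiodic autocorrelations $C_k = \sum_{i=1}^{N-k} s_i s_{i+k}$; quality is often reported as the merit factor $F = N^2 / (2E)$. The abstract does not spell out these definitions, so as a minimal sketch (function names are illustrative, not from the paper):

```python
def labs_energy(s):
    # Sidelobe energy E(s) = sum_{k=1}^{N-1} C_k^2, where
    # C_k = sum_i s_i * s_{i+k} is the aperiodic autocorrelation
    # of the +/-1 sequence s at lag k.
    n = len(s)
    return sum(
        sum(s[i] * s[i + k] for i in range(n - k)) ** 2
        for k in range(1, n)
    )

def merit_factor(s):
    # Merit factor F = N^2 / (2 * E); larger is better.
    return len(s) ** 2 / (2 * labs_energy(s))
```

For example, the length-13 Barker sequence, whose off-peak autocorrelations all have magnitude at most 1, achieves energy 6 and merit factor $169/12 \approx 14.08$, the best known for that length. The solver in the paper searches this landscape variationally rather than by enumeration, which is what makes the single instance per $N$ such a demanding benchmark.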
