
Algorithmic Polynomial Freiman-Ruzsa Theorems (2509.02338v1)

Published 2 Sep 2025 in math.CO

Abstract: We prove algorithmic versions of the polynomial Freiman-Ruzsa theorem of Gowers, Green, Manners, and Tao (Annals of Mathematics, 2025) in additive combinatorics. In particular, we give classical and quantum polynomial-time algorithms that, for $A \subseteq \mathbb{F}_2^n$ with doubling constant $K$, learn an explicit description of a subspace $V \subseteq \mathbb{F}_2^n$ of size $|V| \leq |A|$ such that $A$ can be covered by $K^C$ translates of $V$, for a universal constant $C>1$.


Summary

  • The paper introduces efficient randomized algorithms that, given query access to a set in $\mathbb{F}_2^n$, construct a covering subspace matching the polynomial Freiman-Ruzsa guarantees.
  • It shows that the quantum algorithm achieves a quadratic query-complexity improvement over the classical method, with both algorithms nearly optimal up to logarithmic factors.
  • The framework extends to homomorphism testing and structure-vs-randomness decomposition, with applications to property testing, coding theory, and extractor constructions.

Algorithmic Polynomial Freiman-Ruzsa Theorems: Efficient Structure Discovery in Additive Combinatorics

Introduction and Context

The paper "Algorithmic Polynomial Freiman-Ruzsa Theorems" (2509.02338) addresses the algorithmic aspects of the Polynomial Freiman-Ruzsa (PFR) theorem in additive combinatorics over the vector space $\mathbb{F}_2^n$. The classical Freiman-Ruzsa theorem asserts that a set $A$ with small doubling constant $K$ (i.e., $|A+A| \leq K|A|$) is covered by exponentially many (in $K$) translates of a subspace of size at most $|A|$. The PFR conjecture, recently resolved, improves this bound to polynomial in $K$. This work advances the state of the art by providing explicit, efficient algorithms, both classical and quantum, that, given query and sample access to $A$, construct a covering subspace $V$ and a polynomial number of translates, matching the existential guarantees of the PFR theorem.
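To make the central quantity concrete, here is a minimal sketch (our own example, not from the paper) of computing the doubling constant of a small set, with vectors of $\mathbb{F}_2^n$ encoded as bit strings and addition as XOR:

```python
import itertools

def doubling_constant(A):
    """Return |A+A| / |A|, where A+A = {a + b : a, b in A} over F_2 (XOR)."""
    A = set(A)
    sumset = {a ^ b for a, b in itertools.product(A, repeat=2)}
    return len(sumset) / len(A)

# A subspace is closed under addition, so its doubling constant is 1.
V = {0b00, 0b01, 0b10, 0b11}                   # all of F_2^2
print(doubling_constant(V))                    # → 1.0

# A set without additive structure grows under addition.
A = {0b001, 0b010, 0b100}
print(doubling_constant(A))                    # → 1.333... (|A+A| = 4)
```

Sets with $K$ close to 1 are thus "almost subspaces"; the PFR theorem quantifies how closely, and this paper shows the witnessing subspace can be found efficiently.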

Main Results and Algorithmic Contributions

Classical and Quantum Algorithmic PFR

The central technical contribution is the design of randomized algorithms that, for $A \subseteq \mathbb{F}_2^n$ with doubling constant $K$, efficiently learn a subspace $V$ of size $|V| \leq |A|$ such that $A$ is covered by $K^C$ translates of $V$ for a universal constant $C > 1$. The classical algorithm runs in $\tilde{O}(n^4)$ time and uses $O(\log|A|)$ random samples and $\tilde{O}(\log^2|A|)$ queries to $A$. The quantum algorithm achieves $O(n^3)$ time and $O(\log|A|)$ quantum queries, a quadratic improvement in query complexity over the classical approach.

Both algorithms are shown to be optimal up to logarithmic factors: $\Omega(n^2)$ queries are necessary classically and $\Omega(n)$ quantum queries are necessary, as established by information-theoretic lower bounds.

Algorithmic Homomorphism Testing and Structure-vs-Randomness

The paper extends the algorithmic framework to two key structural results:

  • Homomorphism Testing: If $f: \mathbb{F}_2^m \to \mathbb{F}_2^n$ satisfies a local affine-linear constraint with probability at least $1/K$, then there exists an affine-linear $g$ such that $f(x) = g(x)$ for at least $2^m/P_2(K)$ values of $x$, where $P_2$ is a polynomial, and $g$ can be efficiently learned.
  • Structured Approximate Homomorphism: If $f$ is locally an approximate homomorphism (i.e., the set $\{f(x)+f(y)-f(x+y)\}$ is small), then $f$ decomposes as $g+h$ with $g$ linear and $|\mathrm{Im}(h)|$ polynomially bounded in $K$, and $g$ is efficiently learnable.

These results are algorithmic analogues of classical structure-vs-randomness decompositions, with efficient learning guarantees.
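The homomorphism-testing statement generalizes the classical Blum-Luby-Rubinfeld (BLR) linearity test to the low-agreement regime (acceptance probability only $\geq 1/K$ rather than close to 1). As a point of reference, here is a minimal sketch of the simpler BLR-style agreement estimate over $\mathbb{F}_2^n$; all function names and parameters below are our own illustration:

```python
import random

def blr_agreement(f, n, trials=2000, rng=random.Random(0)):
    """Estimate Pr[f(x) + f(y) = f(x + y)] over random x, y in F_2^n,
    with n-bit integers as vectors and XOR as addition."""
    hits = 0
    for _ in range(trials):
        x, y = rng.getrandbits(n), rng.getrandbits(n)
        if f(x) ^ f(y) == f(x ^ y):
            hits += 1
    return hits / trials

n = 8
mask = 0b10110001
linear = lambda x: bin(x & mask).count("1") % 2   # <mask, x>: exactly linear
print(blr_agreement(linear, n))                   # → 1.0 (linear maps always pass)
```

The paper's regime is harder: even a tiny $1/K$ pass rate forces a nontrivial affine-linear approximation $g$, and the contribution here is learning $g$ efficiently.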

Technical Approach

Quantum-to-Classical Algorithmization

A key innovation is the use of quantum algorithms for stabilizer state learning as a bridge to algorithmic additive combinatorics. The quantum algorithm leverages the efficient agnostic learning of stabilizer states (Chen et al., STOC 2025), exploiting the connection between the Gowers $U^3$-norm and additive structure. The authors then dequantize this approach using machinery of Briët and Castro-Silva (2025), yielding classical algorithms with polynomial overhead in query complexity.

Dense Model Localization and Freiman Isomorphisms

The algorithms begin by localizing $A$ via random sampling and linear spans, ensuring that a large fraction of $A$ is captured in a smaller ambient space. A random linear map is then used to construct a dense model $S$ of $A$ that is, with high probability, Freiman-isomorphic to $A$. This step is crucial for reducing the ambient dimension and enabling efficient learning.
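The sampling-and-span step can be pictured with the standard $\mathbb{F}_2$ "XOR basis" routine, which maintains a row-echelon basis of the span of the samples (an illustrative reconstruction, not the paper's exact procedure):

```python
def insert_vector(basis, v, n):
    """Add v to an F_2 row-echelon basis (basis[i] holds a vector whose
    leading bit is i). Returns True iff v was independent of the basis."""
    for i in reversed(range(n)):
        if not (v >> i) & 1:
            continue
        if basis[i] == 0:
            basis[i] = v
            return True
        v ^= basis[i]          # eliminate bit i and keep reducing
    return False               # v reduced to 0: already in the span

def span_dimension(samples, n):
    basis = [0] * n
    for v in samples:
        insert_vector(basis, v, n)
    return sum(b != 0 for b in basis)

# Four vectors in F_2^4, one of which (0b0110 = 0b0011 ^ 0b0101) is dependent.
A = [0b0011, 0b0101, 0b0110, 0b1001]
print(span_dimension(A, 4))    # → 3
```

Each insertion costs $O(n)$ word operations, so a basis for $O(\log|A|)$ samples is cheap relative to the stated running times.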

Quadratic Goldreich-Levin and Stabilizer Learning

The core learning step finds a quadratic function with high correlation to a function encoding the additive structure of $A$. The quadratic Goldreich-Levin algorithm (and its quantum analogue) efficiently identifies this function, which corresponds to the desired subspace $V$. The quantum algorithm achieves this via state preparation and fidelity estimation with stabilizer states, while the classical algorithm emulates this process via query access.
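As a simplified illustration of the underlying primitive (the linear rather than quadratic case; names and parameters are ours), the correlation of a Boolean function with a candidate linear character can be estimated from random samples:

```python
import random

def correlation(f, a, n, trials=4000, rng=random.Random(1)):
    """Estimate E[(-1)^(f(x) + <a, x>)] over random x in F_2^n."""
    total = 0
    for _ in range(trials):
        x = rng.getrandbits(n)
        chi = bin(a & x).count("1") % 2        # the linear character <a, x>
        total += 1 if f(x) == chi else -1
    return total / trials

n = 6
a_true = 0b101101
f = lambda x: bin(a_true & x).count("1") % 2   # f equals the character chi_{a_true}
print(correlation(f, a_true, n))               # → 1.0
print(correlation(f, 0b000001, n))             # near 0 for a wrong character
```

Goldreich-Levin-type algorithms avoid testing all $2^n$ candidates $a$, recovering every high-correlation character (or, in the quadratic case, quadratic phase) with polynomially many queries.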

Covering via Ruzsa's Lemma

Once the subspace $V$ is identified, Ruzsa's covering lemma is applied to guarantee that $A$ is covered by a polynomial number of translates of $V$, matching the combinatorial PFR guarantee.
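The covering step can be pictured with a greedy procedure (an illustration of the outcome; Ruzsa's lemma supplies the bound on the number of translates, not this algorithm):

```python
def cover_by_translates(A, V):
    """Greedily choose representatives a so that A is contained in the
    union of cosets a + V (addition in F_2^n is XOR)."""
    A, V = set(A), set(V)
    reps, uncovered = [], set(A)
    while uncovered:
        a = min(uncovered)                 # any uncovered point works
        reps.append(a)
        uncovered -= {a ^ v for v in V}    # remove the whole coset a + V
    return reps

V = {0b00, 0b01}                           # a one-dimensional subspace
A = {0b00, 0b01, 0b10, 0b11}
print(len(cover_by_translates(A, V)))      # → 2 translates of V cover A
```

When $A$ has doubling constant $K$ and $V$ is the learned subspace, the number of representatives returned is bounded by $K^C$.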

Numerical and Complexity Guarantees

  • Classical Algorithm: $\tilde{O}(n^4)$ time, $O(\log|A|)$ samples, $\tilde{O}(\log^2|A|)$ queries, polynomial dependence on $K$.
  • Quantum Algorithm: $O(n^3)$ time, $O(\log|A|)$ quantum queries, polynomial dependence on $K$.
  • Lower Bounds: $\Omega(n^2)$ classical queries, $\Omega(n)$ quantum queries, both tight up to logarithmic factors.

The algorithms succeed with probability at least $2/3$, which can be amplified via standard error reduction.
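The standard amplification argument can be checked empirically (a generic sketch, not specific to this paper): repeating a procedure that succeeds with probability $2/3$ and taking a majority vote drives the failure probability down exponentially in the number of repetitions, by a Chernoff bound.

```python
import random

def majority(run_once, t, rng):
    """Majority vote over t independent runs of a Boolean procedure."""
    return sum(run_once(rng) for _ in range(t)) > t / 2

rng = random.Random(0)
single = lambda r: r.random() < 2 / 3       # toy procedure: succeeds w.p. 2/3

TRIALS = 10_000
fail_single = sum(not single(rng) for _ in range(TRIALS)) / TRIALS
fail_boosted = sum(not majority(single, 15, rng) for _ in range(TRIALS)) / TRIALS
print(fail_single, fail_boosted)            # failure rate drops sharply after boosting
```

With 15 repetitions the failure rate falls from about $1/3$ to under $10\%$, and it continues shrinking exponentially as $t$ grows.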

Implications and Future Directions

The results have significant implications for theoretical computer science, particularly in areas where explicit structure discovery is required, such as property testing, coding theory, extractor constructions, and communication complexity. The algorithmic PFR theorem enables efficient transition from combinatorial to algebraic structure, facilitating practical applications in these domains.

The quantum-to-classical dequantization paradigm introduced here suggests further exploration of quantum algorithms as a source of efficient classical algorithms in additive combinatorics and beyond. An open problem is to improve the dependence on the doubling constant $K$ for growing $K$, potentially yielding algorithms with polynomial complexity in both $n$ and $K$.

Conclusion

This work establishes the first efficient algorithmic versions of the polynomial Freiman-Ruzsa theorem, providing both classical and quantum algorithms that match the existential combinatorial guarantees with optimal query and time complexity. The approach leverages deep connections between quantum learning, Gowers norms, and additive combinatorics, and extends to algorithmic homomorphism testing and structure-vs-randomness decompositions. The results open new avenues for algorithmic additive combinatorics and highlight the utility of quantum-inspired techniques in classical algorithm design.
