
Rational degree is polynomially related to degree

Published 13 Jan 2026 in cs.CC, cs.DM, and quant-ph | (2601.08727v1)

Abstract: We prove that $\mathrm{deg}(f) \leq 2 \, \mathrm{rdeg}(f)^4$ for every Boolean function $f$, where $\mathrm{deg}(f)$ is the degree of $f$ and $\mathrm{rdeg}(f)$ is the rational degree of $f$. This resolves the second of the three open problems stated by Nisan and Szegedy, and attributed to Fortnow, in 1994.

Summary

  • The paper demonstrates a quartic upper bound: deg(f) ≤ 2·rdeg(f)^4, establishing polynomial equivalence between classical degree and rational degree.
  • It employs innovative techniques such as hitting sets, polynomial symmetrization, and Markov’s inequality to connect rational, sign, and nondeterministic degrees.
  • The findings constrain postselected quantum query complexity and set a benchmark for structural separations in Boolean function complexity.

Introduction and Background

The paper "Rational degree is polynomially related to degree" (2601.08727) settles a long-standing open question in Boolean function complexity theory concerning the relationship between the degree $\deg(f)$ and the rational degree $\rdeg(f)$ of Boolean functions $f : \{0,1\}^n \to \{0,1\}$. While it is straightforward that $\rdeg(f) \leq \deg(f)$ (by choosing the denominator to be $1$), the magnitude by which $\deg(f)$ can exceed $\rdeg(f)$ was unknown for over three decades. This paper resolves the question by showing that the degree of any Boolean function is bounded above by a quartic function of its rational degree: specifically, $\deg(f) \leq 2\,\rdeg(f)^4$.

This result addresses the second of three open problems posed by Nisan and Szegedy (attributed to Fortnow) in 1994, concerning the tightness of the relationship between these complexity measures. Beyond addressing a foundational complexity theory question, the result closes a gap in our quantitative understanding of postselected quantum query complexity, threshold phenomena, and the structural properties of rational representations for Boolean functions.

Main Results

The primary technical result is the proof that for any Boolean function $f$,

$\deg(f) \leq 2\rdeg(f)^4.$

Moreover, a stronger structural result is established:

$\deg(f) \leq O(\sdeg(f)^2 \rdeg(f)^2),$

where $\sdeg(f)$ denotes the sign degree (threshold degree) of $f$. This yields sharper bounds in contexts where the sign degree is much less than the rational degree.

The analysis also yields a result in terms of nondeterministic degree ($\ndeg(f)$, the minimum degree of a polynomial that is nonzero exactly where $f$ is $1$):

$\deg(f) \leq 2\ndeg(f)^{2} \ndeg(\neg f)^{2} \leq 2\rdeg(f)^4.$

A key corollary is that rational degree, a complexity measure capturing exact quantum query complexity with postselection, is polynomially equivalent (up to a quartic gap) to classical degree, ruling out an exponential separation between classical Boolean function degree and its rational counterpart.
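
The quantity $\deg(f)$ is the degree of the unique multilinear polynomial agreeing with $f$ on the cube, and for small $n$ it can be computed directly by Möbius inversion. The following sketch is ours, not the paper's code, and is purely illustrative:

```python
from itertools import product

def multilinear_coeffs(f, n):
    """Coefficients c[S] of the unique multilinear polynomial
    sum_S c[S] * prod_{i in S} x_i equal to f on {0,1}^n, via
    Mobius inversion: c[S] = sum_{T subset of S} (-1)^{|S|-|T|} f(T)."""
    coeffs = {}
    for S in product((0, 1), repeat=n):
        c = 0
        for T in product((0, 1), repeat=n):
            if all(T[i] <= S[i] for i in range(n)):  # T is a sub-assignment of S
                c += (-1) ** (sum(S) - sum(T)) * f(T)
        if c != 0:
            coeffs[tuple(i for i in range(n) if S[i])] = c
    return coeffs

def degree(f, n):
    """deg(f): largest monomial size with a nonzero coefficient."""
    return max((len(S) for S in multilinear_coeffs(f, n)), default=0)

AND3 = lambda x: x[0] & x[1] & x[2]
XOR3 = lambda x: x[0] ^ x[1] ^ x[2]
print(degree(AND3, 3), degree(XOR3, 3))  # both have full degree 3
```

Since $\rdeg(f) \leq \deg(f)$ trivially, a degree computed this way also upper-bounds the rational degree; the theorem caps how far below it $\rdeg(f)$ can fall.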

Techniques and Proof Structure

The proof is based on a sophisticated interplay between several Boolean function complexity measures and algebraic constructs:

  • The paper exploits a characterization $\rdeg(f) = \max\{\ndeg(f), \ndeg(\neg f)\}$, linking rational degree to nondeterministic degree.
  • It establishes a new bound relating sign degree to the minimal block sensitivity of ff, a lower bound conceptually related but not previously combined with sign degree in this way.
  • The decision tree complexity $D(f)$ is shown to be bounded by $O(\sdeg(f)^2 \rdeg(f)^2)$, via inductively restricting the function using variables from hitting sets of the nondeterministic representations.
  • An explicit construction using hitting sets, polynomial symmetrization, and Markov's inequality underpins the analytic lower bound, while a combinatorial analysis gives the necessary upper bounds via decision tree simulations.
  • Via this method, the quartic upper bound follows as a consequence: degree is polynomially related to rational degree, and explicit separation witnesses are discussed.

Theoretical and Practical Implications

Complexity Theory

Resolving whether rational degree is polynomially related to degree tightens the landscape of Boolean function complexity measures: rational degree is now known to sit polynomially between degree and several other measures (e.g., decision tree complexity, approximate degree, quantum query complexity variants).

This relationship has immediate consequences:

  • Postselected Quantum Query Complexity: Given that rational degree characterizes the exact postselected quantum query complexity, the result bounds the computational power of quantum algorithms with postselection in terms of classical degree, preventing super-polynomial speedups in this model.
  • Nullstellensatz over Hypercube: The results yield an effective Nullstellensatz for the Boolean hypercube: if a collection of polynomials has no simultaneous zeros on the cube, then the certificate polynomials can be chosen with degree bounded by a quartic polynomial in the maximum degree of the input polynomials.
  • Complexity Class Separations: As detailed in the paper, the rational degree bound implies certain collapse results for classes defined via zero-testing using generic oracles, echoing classic results tied to polynomial degree.
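
The hypercube Nullstellensatz admits a simple pointwise witness that is easy to check on toy examples: when $g_1 \cdot g_2$ vanishes everywhere on the cube and the zero sets are disjoint, setting $h_i(x) = 1/g_i(x)$ wherever $g_i(x) \neq 0$ (and $0$ elsewhere) satisfies the identity pointwise. This verifier is our own illustration and ignores the paper's degree bounds entirely:

```python
from itertools import product

def verify_bezout(g1, g2, n):
    """Check the hypercube Bezout identity h1*g1 + h2*g2 = 1 pointwise,
    with h_i(x) = 1/g_i(x) where g_i(x) != 0 and h_i(x) = 0 otherwise.
    Requires g1(x)*g2(x) == 0 and (g1(x), g2(x)) != (0, 0) on all of {0,1}^n."""
    for x in product((0, 1), repeat=n):
        a, b = g1(x), g2(x)
        assert a * b == 0 and (a, b) != (0, 0), "hypotheses fail at %s" % (x,)
        h1 = 1 / a if a else 0.0
        h2 = 1 / b if b else 0.0
        if abs(h1 * a + h2 * b - 1) > 1e-12:
            return False
    return True

# g1 vanishes unless x0 = x1 = 1; g2 vanishes only at x0 = x1 = 1.
g1 = lambda x: x[0] * x[1]
g2 = lambda x: 2 - x[0] - x[1]
print(verify_bezout(g1, g2, 2))  # True
```

The substance of the paper's result is that such $h_i$ can be interpolated by polynomials of controlled degree, which this pointwise check does not exhibit.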

Further Open Problems

The tightness of the quartic ($d^4$) exponent is not fully understood. The best-known explicit separations (e.g., for the balanced AND-OR tree or pointer function constructions) are only quadratic. Thus, a central avenue for progress is to tighten either the lower or upper bounds, possibly reducing the exponent or finding new function families with higher separation.

The connections established with sign degree raise the possibility that related phenomena in polynomial threshold functions might benefit from these structural insights. Additionally, the implications for Gotsman–Linial-type conjectures and influence bounds are explored, providing algebraic tools that might generalize to further settings involving approximate degrees and influences.

The extension to partial Boolean functions is precluded by explicit counterexamples, demonstrating that the polynomial relationship is specific to the setting of total Boolean functions.

Numerical Strengths and Contrasts

The main theorem provides a fully quantified polynomial relationship: for total Boolean functions, the gap between degree and rational degree is at most quartic. The existing separation functions do not reach this exponent, and the question of tightness remains open.

A significant contrast is established with partial Boolean functions, for which much larger gaps can exist. This demarcates the precise landscape for total functions and draws a sharp boundary for structural theorems in Boolean complexity.

Speculation on Future Developments

  • Separation Construction: Constructing Boolean function families with degree/rational degree separations matching the quartic upper bound remains a focus.
  • Approximate Measures: Extensions to approximate rational degree and their relationships to classical/quantum approximate degree could leverage these techniques.
  • Decision Tree Models: The analytic/combinatorial framework developed for relating hitting sets, block sensitivity, and degree should export to other query models and structural complexity questions.
  • Connection with Influence and Thresholds: The potential impact on conjectures relating polynomial degree and total influence, such as the Gotsman-Linial conjecture, as well as implications for learning theory, are immediate targets.

Conclusion

This paper settles a pivotal foundational question in Boolean function analysis: for total functions, rational degree cannot be substantially smaller than classical degree; they are polynomially related, with at most a quartic gap. The robust analytic-combinatorial framework deployed here deepens the connections between disparate complexity measures and further integrates rational representations into the core narrative of Boolean complexity theory. The work catalyzes both new questions regarding optimal separations and a program for unifying degree-based measures under a common algebraic and quantum perspective.


Explain it Like I'm 14

Overview

This paper is about two ways to measure how “complicated” a yes/no rule on bits is. Imagine you have n light switches (each 0 or 1), and a rule f decides yes/no from the switch positions. One way to measure complexity is the degree of a single polynomial that exactly equals f on all 0–1 inputs. Another (more flexible) way is the rational degree, where you can use a fraction of two polynomials to equal f. The big question (asked in 1994) was: Can using a fraction make things dramatically simpler, or are these measures always within a reasonable (polynomial) factor?

The authors resolve this open problem. They prove that for every such rule f, the usual degree is at most a fixed constant times the fourth power of the rational degree:

  • Main result: deg(f) ≤ 2 * rdeg(f)^4

So, using fractions can help, but not more than a polynomial (fourth power) amount.

What questions did the authors ask?

In simple terms, they asked:

  • If a Boolean function f can be written as “one polynomial divided by another” with low degree (small rdeg), how small can the usual single-polynomial degree be in comparison?
  • Is there always a polynomial relationship between those two measures?
  • Can this connection help us understand other complexity ideas (like decision trees, quantum query models with postselection, and some algebraic identities on the 0–1 cube)?

How did they approach the problem?

They combine ideas from algebra, probability, and combinatorics. Here are the core ingredients, each with an everyday analogy:

  • Block sensitivity (fragile groups): For any input, look for disjoint groups of switches such that flipping all switches in any one group flips the answer. The more such groups, the more “fragile” the function is at that input.
  • Hitting sets for big terms (cut the tallest trees): A polynomial is a sum of terms; the tallest “trees” are the highest-degree terms. A hitting set picks at least one variable from each tallest term. If you learn (query) these variables’ values, you force the polynomial’s degree to drop when you plug those values in.
  • A decision tree (20 questions): You can think of computing f by asking questions about input bits. The number of questions needed (in the best strategy) is the decision-tree complexity D(f).
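
The "fragile groups" picture can be made concrete with a brute-force block-sensitivity computation, feasible only for tiny n. This sketch is ours, not the paper's:

```python
def _subsets(n):
    """All subsets of {0, ..., n-1}, encoded as Python sets."""
    for mask in range(1 << n):
        yield {i for i in range(n) if mask >> i & 1}

def block_sensitivity_at(f, x):
    """bs_x(f): the maximum number of pairwise disjoint blocks B whose
    simultaneous flip changes f's value at x (exhaustive search)."""
    n = len(x)
    def flip(x, B):
        return tuple(b ^ (1 if i in B else 0) for i, b in enumerate(x))
    sensitive = [frozenset(B) for B in _subsets(n) if B and f(flip(x, B)) != f(x)]
    def best(blocks, used):
        # Max packing of pairwise disjoint sensitive blocks.
        m = 0
        for i, B in enumerate(blocks):
            if not (B & used):
                m = max(m, 1 + best(blocks[i + 1:], used | B))
        return m
    return best(sensitive, frozenset())

OR3 = lambda x: int(any(x))
print(block_sensitivity_at(OR3, (0, 0, 0)))  # 3: each singleton {i} flips OR
```

At the all-ones input of OR, by contrast, only the block flipping every set bit is sensitive, so OR is "fragile" at 0 but not at 1.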

They also use two key math tools:

  • Symmetrization (make it fair): Average the polynomial so that all variables are treated equally; this gives a one-variable summary that’s easier to analyze.
  • Markov’s inequality for polynomials (speed limit for slopes): If a polynomial stays within a vertical band on an interval, there’s a limit on how steep it can be. This helps bound degrees by how much a curve must change.
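
Symmetrization has a very concrete form for multilinear polynomials: under the product distribution Ber_y^n, every monomial over a set S of variables averages to y^{|S|}, so the symmetrized P(y) is univariate of degree at most deg(p). A minimal sketch, assuming the monomial-dictionary representation used here is ours:

```python
def bernoulli_symmetrize(coeffs):
    """Given multilinear coefficients {monomial_support: c}, return the
    coefficients of P(y) = E_{x ~ Ber_y^n}[p(x)] as {k: coeff of y^k}.
    Each monomial prod_{i in S} x_i has expectation y^{|S|} under Ber_y^n."""
    P = {}
    for S, c in coeffs.items():
        P[len(S)] = P.get(len(S), 0) + c
    return P

# p = OR_2 = x0 + x1 - x0*x1  =>  P(y) = 2y - y^2 = 1 - (1-y)^2
coeffs = {(0,): 1, (1,): 1, (0, 1): -1}
print(bernoulli_symmetrize(coeffs))  # {1: 2, 2: -1}
```

Markov's inequality then enters as the "speed limit": if the univariate P must move between values 0 and 1 over a short subinterval of [0,1] while staying bounded, its degree cannot be too small.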

Putting it all together, their strategy looks like this:

  1. From a special “sign” polynomial for f, they find an input where the function is not too fragile (low block sensitivity).
  2. From special “nondeterministic” polynomials for f and for its opposite (not f), they build small hitting sets that, when queried, strictly reduce the polynomials’ degrees.
  3. They repeat: each round queries only a limited number of bits (controlled by the function’s “sign degree”) and guarantees the polynomial degrees drop (controlled by the “rational/nondeterministic” degrees).
  4. After a bounded number of rounds, the function becomes constant on what remains, so they’ve computed f with a limited number of questions.
  5. Finally, they use the known fact that deg(f) ≤ D(f) to turn this into a bound on deg(f) in terms of rdeg(f).
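
Step 2's degree-reduction can be illustrated on a polynomial stored as a monomial dictionary: collect the maximum-degree monomials ("maxonomials"), pick variables hitting all of them, and observe that fixing the queried variables strictly lowers the degree. The greedy selection below is only an illustration of ours; the paper's hitting sets come from the nondeterministic polynomials and are bounded differently:

```python
def maxonomials(coeffs):
    """Supports of the maximum-degree monomials of a multilinear
    polynomial given as {support_tuple: coefficient}."""
    d = max(len(S) for S in coeffs)
    return [set(S) for S in coeffs if len(S) == d]

def greedy_hitting_set(coeffs):
    """Greedily pick variables until every maxonomial contains one."""
    remaining, H = maxonomials(coeffs), set()
    while remaining:
        counts = {}
        for S in remaining:
            for i in S:
                counts[i] = counts.get(i, 0) + 1
        v = max(counts, key=counts.get)  # most frequent unhit variable
        H.add(v)
        remaining = [S for S in remaining if v not in S]
    return H

def restrict(coeffs, assignment):
    """Plug in x_i = b for (i, b) in assignment; result stays multilinear."""
    out = {}
    for S, c in coeffs.items():
        val, keep = c, []
        for i in S:
            if i in assignment:
                val *= assignment[i]
            else:
                keep.append(i)
        if val:
            out[tuple(keep)] = out.get(tuple(keep), 0) + val
    return {S: c for S, c in out.items() if c}

# p = x0*x1 + x1*x2 + x3: maxonomials {0,1} and {1,2}; H = {1} hits both.
p = {(0, 1): 1, (1, 2): 1, (3,): 1}
H = greedy_hitting_set(p)
print(H, restrict(p, {i: 0 for i in H}))  # degree drops from 2 to 1
```

Because every maxonomial loses at least one variable, any assignment to a hitting set kills all degree-d terms, which is exactly the "strictly reduce the degree" guarantee used in each round.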

A key intermediate bound they prove is:

  • D(f) ≤ 4 * sdeg(f)^2 * rdeg(f)^2, where sdeg(f) is the “sign degree” (the degree of a polynomial that is negative exactly when f=1 and positive when f=0). Since sdeg(f) ≤ 2·rdeg(f), this gives deg(f) ≤ D(f) ≤ 16 * rdeg(f)^4. With a more refined argument, they strengthen the constant to 2, which is their headline result.

What did they find, and why is it important?

Main findings:

  • The usual polynomial degree and the rational degree are polynomially related: deg(f) ≤ 2 * rdeg(f)^4.
  • In particular, using a fraction of polynomials never makes the problem exponentially easier. The worst possible speedup is at most a fourth-power factor.

Why this matters:

  • It resolves a 30-year-old open problem posed by Nisan and Szegedy (attributed to Fortnow).
  • The rational degree matches an exact measure in a model of quantum algorithms with postselection. So this result ties a quantum complexity measure tightly to classical polynomial degree.
  • Degree is connected to many other complexity measures (decision trees, sensitivities, influences, certificate complexities). Bounding degree in terms of rational degree immediately bounds these other measures too, showing a unified picture: lots of different ways to measure “how hard f is” are all polynomially connected.

Implications and potential impact

Here are several clean takeaways and what they could lead to:

  • A better understanding of “questions to answers”: Their decision-tree bound D(f) ≤ 4 * sdeg(f)^2 * rdeg(f)^2 shows how “threshold-like behavior” (sign degree) and “fractional representability” (rational degree) combine to limit the number of questions needed to compute f. This bridges algebraic and algorithmic views.
  • Algebra on the 0–1 cube (effective Nullstellensatz): They derive a concrete bound for when two polynomials never vanish together on {0,1}^n but one of them vanishes at every 0-input and the other at every 1-input. In plain terms: you can build an identity h1·g1 + h2·g2 = 1 on the cube with controlled degrees. This is a strong algebraic consequence with neat applications.
  • Limits of generalization: They show that their kind of algebraic identity cannot be pushed to “partial” functions (where f is not defined on all inputs) without losing control of degrees. So their theorem is sharp in an important sense.
  • Universal lower bound on rational degree: Any function that truly depends on n inputs must have rdeg(f) at least on the order of log n. So you can’t represent a genuinely n-variable function with an extremely tiny rational degree.
  • Open directions:
    • Is the fourth power necessary? The authors conjecture the quartic relation may be tight (some functions might really need deg ≈ rdeg^4).
    • Extend and sharpen connections to other measures (like influences), and to approximate versions (where tiny errors are allowed).
    • Quantum angle: since rdeg equals exact postselected quantum query complexity, these results inform how powerful postselection is compared to classical measures.

In short, the paper draws a strong, precise line between two fundamental ways of representing Boolean functions (single polynomials vs. polynomial fractions), settles a long-standing question, and opens new paths linking algebra, combinatorics, and both classical and quantum computation.

Knowledge Gaps

Knowledge gaps, limitations, and open questions

Below is a concise list of gaps and open directions left unresolved by the paper. Each item is framed to be actionable for future research.

  • Tightness of the quartic exponent: Determine whether the upper bound deg(f) ≤ O(rdeg(f)⁴) is optimal. Either:
    • Construct an explicit family with D(f) ≥ Ω(rdeg(f)⁴) (as conjectured), implying deg(f) can be quartically larger than rdeg(f); or
    • Improve the upper bound to deg(f) ≤ O(rdeg(f)³) or deg(f) ≤ O(rdeg(f)²), or prove such improvements are impossible.
  • Tightness and role of sign degree in the decision-tree bound: Assess whether D(f) ≤ 4·sdeg(f)²·rdeg(f)² can be improved (e.g., to linear dependence in sdeg(f)). Identify functions that make this sdeg-dependent factor necessary or provide constructions showing the bound is not tight.
  • Approximate nondeterministic degree—combinatorial side: Extend Lemma 3.5 (hitting-set upper bound) to the approximate setting to match Theorem 6.1 analytically, thereby proving D(f) ≤ O(ndeg_ε(f)²·ndeg_ε(¬f)²) for constant ε. Clarify whether approximate hitting sets exist with size bounded in terms of degree and min-block sensitivity.
  • Generalization of the Effective Hypercube Nullstellensatz: Prove the conjectured extension for m ≥ 2 polynomials g₁,…,g_m with the product-zero condition on {0,1}ⁿ, establishing h₁,…,h_m such that ∑ h_i·g_i ≡ 1 on {0,1}ⁿ, with deg(h_i·g_i) bounded by a polynomial in deg(g_i). Determine optimal degree dependence (e.g., linear vs quadratic vs higher).
  • Nullstellensatz beyond product-zero: Characterize promise classes (subsets of the hypercube) for which an effective Nullstellensatz (polynomial degree bounds for Bezout-type certificates) holds. The paper’s counterexample shows the general statement fails even for m = 2; identify structural conditions under which it succeeds.
  • Influence-based bounds and Gotsman–Linial variants: Resolve the conjecture Inf[f] ≤ O(√n * ndeg(f)) and quantify its consequences for rdeg(f). Alternatively, develop different techniques to strengthen unconditional lower bounds on rdeg(f) for functions depending on n variables beyond Ω(log n).
  • Number of relevant variables versus rdeg: Improve the upper bound that a Boolean function of rational degree d can depend on at most O(d⁴·2^{2d}) variables. Aim for sharper dependence (e.g., O(poly(d)·2^d)) and produce matching lower bounds.
  • Properties of minimum block sensitivity: Systematically study min_x bs_x(f) introduced in this work:
    • Tighten the inequality min_x bs_x(f) ≤ 2·sdeg(f)² (e.g., to linear or near-linear dependence).
    • Establish its relationships to other measures (sensitivity, certificate complexity, approximate degree, etc.).
    • Characterize its behavior under restrictions and composition.
  • Constants and tight examples: Beyond PARITY₂, identify functions that make deg(f) = 2·rdeg(f)⁴ or D(f) = 4·sdeg(f)²·rdeg(f)² tight. Clarify whether the bound deg(f) ≤ 16·rdeg(f)⁴ (via sign degree) is ever tight, and if not, improve constants.
  • Polynomial relations to other measures: The quartic relation turns question marks to “4” in known tables (e.g., Iyer 2025). Derive matching lower bounds or improved upper bounds to pin down tight exponents and constants relating rdeg to:
    • Certificate complexity, sensitivity, block sensitivity, approximate degree, and exact quantum query complexity Q_E.
    • Explore whether Q_E(f) ≤ O(ndeg(f)·ndeg(¬f)) (de Wolf’s conjecture) holds, and its implications for rdeg–degree relations.
  • Postselected quantum query complexity for ε > 0: Establish whether the factor-2 slack in rdeg(f) ≤ 2·PostQ_ε(f) (for ε ∈ (0, 1/2)) can be eliminated or is inherent. Characterize exact constants in the relationship between rational degree and postselected quantum query complexity for ε > 0.
  • Composition behavior of rational degree: Develop general composition theorems (beyond AND-composition) for rdeg, ndeg, and sdeg. Use these to build explicit separations, optimize upper bounds, or obtain lower bounds via structured function compositions.
  • Strengthening analytic tools: Improve Corollary 2.4 (√(n/2) ≤ deg(p) under specific sign/maximum conditions). Determine if the √(n/2) constant can be sharpened, generalized to broader constraints, or leveraged to reduce exponents in the main bounds.
  • Optimizing the decision-tree construction: Design alternative query strategies that avoid or reduce the sdeg(f)² factor in Theorem 4.1 (e.g., via refined hitting-set selection or structural properties of nondeterministic polynomials), potentially yielding D(f) ≤ O(rdeg(f)²) or better.
  • Partial functions—polynomial relation criteria: Since general polynomial relation between rdeg and degree fails for partial functions, develop necessary and sufficient conditions (e.g., promise structure, regularity, monotonicity) under which deg(f) and rdeg(f) remain polynomially related in the partial setting.

Practical Applications

Immediate Applications

The items below outline practical use cases that can be deployed today, based directly on the paper’s proven bounds, constructions, and techniques. Each item notes relevant sectors, what a tool/product/workflow could look like, and assumptions or dependencies that affect feasibility.

  • Query-efficient testing of Boolean decision logic
    • Sector: software engineering (QA), security (policy testing), embedded systems
    • What: A “Polynomial Hitting-Set Test Planner” that builds deterministic decision trees for black-box Boolean logic using the paper’s iterative hitting-set construction (Theorem D, Corollary Final).
    • How:
    • If you can compute or approximate nondeterministic representations p (for f) and q (for ¬f), the algorithm queries hitting sets of max-degree monomials to strictly reduce degree at each step, guaranteeing termination in at most O(rdeg(f)) iterations and query depth ≤ 4·sdeg(f)^2·rdeg(f)^2.
    • For approximate variants, use the extension to approximate nondeterministic degree (ndeg_ε) to trade exactness for fewer queries (Theorem on approximate versions).
    • Assumptions/dependencies:
    • Requires access to or approximation of p/q or ndeg structures; computing exact ndeg/rdeg can be hard in general.
    • Best suited to structured logic (e.g., rule engines, policy checks) where polynomial representations are tractable.
    • Applies to total Boolean functions; partial functions are outside the guaranteed regime (counterexample in the paper).
  • Influence-based complexity auditing for models and rule systems
    • Sector: machine learning governance, MLOps, software auditing
    • What: An “Influence-to-Rational-Degree Estimator” that uses empirical influence measurements to lower bound rational degree (rdeg) and infer variable dependence complexity (Theorem log n and inequality combining influence with rdeg).
    • How:
    • Empirically estimate Inf_i[f] by flipping bits and measuring changes; use Inf_i[f] ≥ 2^{-2·rdeg(f)} and ∑_i Inf_i[f] ≤ deg(f) to infer that rdeg(f) = Ω(log n) for functions depending on n variables.
    • Use this to certify that a deployed rule set or model has irreducible complexity beyond trivial (e.g., cannot depend on many variables with very low rdeg).
    • Assumptions/dependencies:
    • Requires reliable black-box access to the function and enough samples to estimate influences.
    • Gives lower bounds, not exact rdeg; complements other structure-based estimates.
  • Algebraic certificates for disjoint zero-sets on the hypercube
    • Sector: formal methods (SAT/SMT), program verification, symbolic computation
    • What: A “Hypercube Nullstellensatz Certificate Generator” that produces polynomial Bezout-type identities with bounded degrees when two multilinear polynomials’ zero-sets partition the Boolean cube (Effective Hypercube Nullstellensatz).
    • How:
    • Given g1, g2 with disjoint zero-sets and g1(x)·g2(x)=0 on {0,1}^n, construct h1, h2 s.t. h1·g1 + h2·g2 ≡ 1 on the cube and deg(h_i·g_i) bounded by O(deg(g1)^2·deg(g2)^2) after multilinearization.
    • Use as compact certificates in proof systems to attest separation/coverage or unsatisfiability on the cube.
    • Assumptions/dependencies:
    • Requires the zero-product and disjoint-zero-set conditions; extensions to arbitrary subsets or multiple polynomials need further research (and fail in general for partial functions per the paper’s counterexample).
  • Benchmarks and sanity checks in quantum query complexity research
    • Sector: quantum computing (theory), algorithm design
    • What: A “Postselected Query Complexity Analyzer” leveraging the equivalence rdeg(f) = PostQ_0(f) to benchmark problem instances and sanity-check proposed algorithms (the paper’s restated fact with proof sketch).
    • How:
    • Use rational degree bounds to characterize exact postselected query complexity and compare against proposed quantum procedures.
    • Provide quick feasibility checks: if deg(f) is small, rdeg(f) cannot be arbitrarily small due to deg(f) ≤ O(sdeg(f)^2·rdeg(f)^2).
    • Assumptions/dependencies:
    • Postselection is a theoretical model; results inform algorithmic lower/upper bounds and design constraints rather than near-term hardware implementations.
  • Educational and research tooling for Boolean complexity
    • Sector: education, academic research
    • What: Lightweight libraries or notebooks implementing:
    • Symmetrization routines (Bernoulli and Minsky-Papert symmetrization).
    • Influence estimation and lower bound checks.
    • Hitting-set construction for decision tree synthesis.
    • How:
    • Package the paper’s techniques into reproducible examples for courses on analysis of Boolean functions and quantum query complexity.
    • Assumptions/dependencies:
    • Educational value is immediate; practical optimization value depends on problem structure and availability of polynomial representations.
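
The influence-to-rdeg workflow in the auditing item above can be prototyped in a few lines. The Monte-Carlo estimator and sample size below are our own assumptions; only the inequality Inf_i[f] ≥ 2^{-2·rdeg(f)} is taken from the paper:

```python
import random
from math import log2

def estimate_influence(f, n, i, samples=20000, seed=0):
    """Monte-Carlo estimate of Inf_i[f] = Pr_x[f(x) != f(x with bit i flipped)]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        x = [rng.randint(0, 1) for _ in range(n)]
        xi = x[:]
        xi[i] ^= 1
        hits += f(tuple(x)) != f(tuple(xi))
    return hits / samples

def rdeg_lower_bound(inf_i):
    """Invert Inf_i[f] >= 2^(-2*rdeg(f)): rdeg(f) >= log2(1/Inf_i)/2."""
    return log2(1 / inf_i) / 2 if inf_i > 0 else float("inf")

AND4 = lambda x: int(all(x))
inf0 = estimate_influence(AND4, 4, 0)  # true value is 2^-3 = 0.125
print(inf0, rdeg_lower_bound(inf0))
```

As the text notes, this yields only a lower bound on rdeg(f) (here roughly 1.5 for AND on 4 bits), and its reliability depends on how well the influences are estimated.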

Long-Term Applications

These items require further research, scaling, algorithmic development, or engineering to be viable in production settings. They build on the paper’s core contributions and identified conjectures.

  • Automated derivation of nondeterministic and rational polynomial representations for real-world rule systems
    • Sector: software engineering (rule engines), policy analysis, decision support systems
    • What: Compilers that convert high-level boolean rules/policies into nondeterministic polynomials p/q or rational representations, enabling the hitting-set decision tree synthesis pipeline end-to-end.
    • Potential workflow:
    • Parse rules → symbolic boolean simplification → construct/approximate ndeg representations → apply the paper’s iterative hitting-set decision tree builder → deploy minimized testers.
    • Assumptions/dependencies:
    • Automated construction of ndeg/rdeg representations at scale is an open challenge; heuristics and domain-specific templates may be needed.
    • Robustness to noisy or partial specifications must be addressed (partial function limitations highlighted by the paper).
  • SAT/SMT proof systems enhanced by hypercube-effective Nullstellensatz
    • Sector: formal verification, automated theorem proving
    • What: Proof engines that exploit bounded-degree hypercube Nullstellensatz certificates to accelerate proofs for specialized Boolean constraints (e.g., combinatorial designs, parity-like structures).
    • Potential products:
    • “Hypercube Algebraic Prover” integrating bounded-degree Bezout identities into SAT/SMT solvers, with fallback to conventional resolution.
    • Assumptions/dependencies:
    • Efficient identification of g1,…,gm satisfying the required conditions; extending to the multi-polynomial case is conjectural.
    • Handling general partial constraints remains nontrivial (paper’s counterexamples show limits).
  • Active learning and query-efficient diagnostics via degree-guided strategies
    • Sector: machine learning (active learning), healthcare diagnostics, industrial troubleshooting
    • What: Algorithms that adaptively query features/tests guided by degree/hitting-set insights to minimize the number of queries needed to classify or diagnose.
    • Potential workflow:
    • Build approximate nondeterministic representations (ndeg_ε) for classifiers → derive hitting-set-based query sequences → deploy in constrained testing environments (e.g., medical triage checklists, equipment diagnostics).
    • Assumptions/dependencies:
    • Requires models amenable to polynomial approximation with interpretable monomials.
    • Must handle noise and probabilistic outcomes; approximate versions (ndeg_ε) are promising but need robust pipelines.
  • Hardware verification and IC testing optimized by block sensitivity and hitting sets
    • Sector: semiconductors, hardware verification
    • What: Test pattern generators that plan group flips (blocks) and targeted probes based on block sensitivity lower/upper bounds and monomial hitting sets to reduce test time.
    • Assumptions/dependencies:
    • Mapping hardware behaviors to Boolean functions with tractable ndeg/rdeg approximations is nontrivial.
    • Industrial viability hinges on scalable inference of polynomial structure from circuit descriptions.
  • Influence-calibrated feature selection and sample complexity planning
    • Sector: data science, experimental design
    • What: Tooling that uses the paper’s influence-to-degree relationships to set expectations for the minimum number of samples needed to detect variable dependence (and thereby plan data collection and feature selection).
    • Assumptions/dependencies:
    • Influence estimation must be reliable; connecting influence bounds to actionable sample sizes requires statistical modeling around the paper’s worst-case inequalities.
  • Quantum algorithm design heuristics using rational degree as a design knob
    • Sector: quantum computing (algorithm prototyping)
    • What: Construct oracle problems or subroutines with targeted rdeg profiles to explore postselected algorithm behaviors (theoretically), potentially informing practical, non-postselected heuristics.
    • Assumptions/dependencies:
    • Postselection is not a physical resource; translation from postselected insights to realizable quantum procedures requires further theory.

Notes on assumptions and dependencies across applications

  • Constructing ndeg, sdeg, or rdeg exactly is often computationally hard; many applications assume we can approximate or bound these degrees via structural analysis, heuristics, or empirical measurements (e.g., influences).
  • The paper’s results are for total Boolean functions; extensions to partial functions are generally false (explicit counterexamples provided), so workflows must ensure domain-totality or handle partiality carefully.
  • Postselection equivalence (rdeg(f) = PostQ_0(f)) is a theoretical bridge for quantum complexity; practical quantum applications will use it primarily for benchmarking or design constraints.
  • Gains depend on function structure: random or adversarial Boolean functions typically have large degrees, reducing the advantage of degree-guided strategies.
  • Effective Hypercube Nullstellensatz requires disjoint zero-sets and the zero-product condition on the cube; broader Nullstellensatz generalizations are conjectural and currently not guaranteed.

Glossary

  • Address function: A Boolean function that returns the bit of an input specified by an index; often used to show tightness of bounds. "Note that this bound is tight for the address function \cite{survey_buhrman_2002}."
  • AND-OR tree: A Boolean formula arranged in a balanced tree alternating AND and OR gates. "The balanced $\ANDOR$ tree on $m^2$ variables also simultaneously separates sign degree, rational degree, and degree: $\sdeg=O(\sqrt{m\log m})$ \cite{apxdeg_bun_2022}, $\rdeg=m$ \cite{rdeg_iyer_2025}, $\deg=m^2$."
  • Bernoulli symmetrization: A technique that maps a multivariate polynomial to a univariate polynomial by averaging over a Bernoulli product distribution. "Then, for all $y \in [0,1]$, $P(y) = \expect_{x \sim \ber_y^n}[ \, p(x) \, ]$."
  • Block sensitivity: For a Boolean function at input x, the maximum number of disjoint blocks of variables whose flips each change the output. "The block sensitivity of ff at xx, denoted $\bs_x(f)$, is the maximum number of disjoint sensitive blocks of ff at xx."
  • Decision tree complexity: The minimum depth of a decision tree that computes the function exactly on all inputs. "The decision tree complexity of $f\colon \{0,1\}^n \to \{0,1\}$, denoted $D(f)$, is the minimum depth of a decision tree that, for all $x\in \{0,1\}^n$, queries bits of $x$ to exactly compute $f(x)$ at a leaf."
  • Effective Hypercube Nullstellensatz: A constructive Nullstellensatz on the Boolean hypercube giving degree bounds for representations of 1 as a combination of polynomials. "The main result of this work can be framed as an effective Hypercube Nullstellensatz."
  • Exact quantum query complexity: The minimum number of queries a quantum algorithm needs to compute a Boolean function with zero error. "where $Q_E$ denotes the exact quantum query complexity of $f$."
  • Generic oracle: In complexity theory, an oracle chosen generically (e.g., at random) to study relativized statements. "Polynomially relating rational degree to degree also implies $\mathbf{P} = (\text{C}_{=}\mathbf{P} \cap \text{co-C}_{=}\mathbf{P})$ with respect to generic oracles \cite{oracle_fortnow_2003}."
  • Gotsman-Linial conjecture: A conjecture relating total influence to the threshold/sign degree of Boolean functions. "The long-standing Gotsman-Linial conjecture~\cite{spectral_gotsman_1994} posits that $\Inf[f] \leq O(\sqrt{n} \sdeg(f))$."
  • Hitting set (of a polynomial): A set of variable indices intersecting every max-degree monomial (maxonomial) of the polynomial. "We say that $H\subseteq [n]$ is a hitting set of $p$ if $H \cap M$ is nonempty for all $M \in M(p)$."
  • Influence (of a Boolean function): The probability that flipping a particular input bit changes the function’s value; total influence sums this over all bits. "For $f\colon \{0,1\}^n \to \{0,1\}$ and $i\in [n]$, the $i$th influence of $f$ is defined by $\Inf_i[f] \coloneqq \Pr[f(x) \neq f(x^i)]$, where $x^i$ is $x$ with the $i$th bit flipped, and the probability is over uniformly random $x\in \{0,1\}^n$."
  • Markov's inequality (for polynomials): A bound on the derivative of a real polynomial constrained on an interval (distinct from the probabilistic inequality). "Our proof involves using polynomial symmetrization followed by Markov's inequality \cite{question_markov_1890}."
  • Maxonomial: A monomial of maximum degree present in a polynomial. "its set of maxonomials is the set $M(p) \coloneqq \{M \subseteq [n] \colon |M| = \deg(p), \, c_M\neq 0\}$."
  • Minsky-Papert symmetrization: A symmetrization technique converting multivariate polynomials over the hypercube to univariate polynomials in the Hamming weight. "The first, Minsky-Papert \cite{perceptrons_minsky_1969} symmetrization, is used to show the impossibility of generalizing our main result to all partial Boolean functions."
  • Multilinearization: The process of reducing a polynomial modulo the relations $X_i^2 = X_i$ to obtain a unique multilinear representative on the hypercube. "where the overline denotes multilinearization using the relations $X_1^2=X_1,\dots,X_n^2=X_n$."
  • Nondeterministic degree: The minimum degree of a polynomial that is nonzero exactly on the 1-inputs (or equivalently, zero exactly on the 0-inputs) of a Boolean function. "The nondeterministic degree of $f$, denoted $\ndeg(f)$, is the minimum value of $\deg(p)$ over all nondeterministic representations of $f$."
  • Nullstellensatz (for the hypercube): An algebraic principle ensuring combinations of polynomials equal 1 on {0,1}n under disjoint zero sets, with attention to degree bounds. "the relation between rational degree and degree governs the effectiveness of a natural Nullstellensatz for the hypercube, in the sense of \cite{bounds_brownawell_1987,sharp_kollar_1988,nullstellensatz_alon_1999,effective_jelonek_2005}."
  • One-sided 0-approximate degree: The minimum degree of a polynomial that is small on 0-inputs and at least 1 on 1-inputs (one-sided approximation). "equals the one-sided $0$-approximate degree of ff, up to a factor of $2$, as defined by Sherstov \cite{breaking_sherstov_2018}."
  • Pointer function: A constructed Boolean function used to separate decision tree and quantum query complexities via embedded pointers. "or the ``pointer function'' that quadratically separates $D$ from exact quantum query complexity \cite{pointer_ambainis_2017}."
  • Polynomial method (in quantum query complexity): A technique connecting quantum query algorithms to low-degree polynomial representations of Boolean functions. "Then, using the polynomial method \cite{polynomial_beals_2001} following \cite[Proof of Theorem 1]{postqe_mahadev_2015}, we deduce that there exists integer $m \geq 2$ and complex multilinear polynomials $\alpha_s \in \mathbb{C}[X_1,\dots,X_n]$..."
  • Postselected quantum query complexity: The query complexity of quantum algorithms that are allowed to postselect on measurement outcomes. "We also record the fact that rational degree exactly equals the $\epsilon$-approximate postselected quantum query complexity $\PostQ_\epsilon$ with $\epsilon = 0$ as defined in \cite{postqe_mahadev_2015}."
  • Postselected randomized query complexity: The classical analogue allowing postselection; equal to certificate complexity. "The analogous randomized query measure, exact postselected randomized query complexity, equals the certificate complexity \cite[Theorem 16]{thesis_cade_2020}."
  • Rational degree: The minimum of the maximum degrees of numerator and denominator in a rational polynomial representation that exactly matches a Boolean function on the hypercube. "Rational degree characterizes the exact postselected quantum query complexity \cite{postqe_mahadev_2015}."
  • Rational representation: Expressing a Boolean function as $p/q$, where $p$ and $q$ are polynomials that agree with the function on $\{0,1\}^n$ and the denominator is nonzero on the hypercube. "We say that $p/q\in \mathbb{R}(X_1,\dots,X_n)$, where $p,q$ are multilinear, is a rational representation of $f\colon \{0,1\}^n \to \{0,1\}$ if for all $x\in \{0,1\}^n$, $q(x)\neq 0$ and $p(x)/q(x)=f(x)$."
  • Sign degree (threshold degree): The minimum degree of a polynomial whose sign agrees with $(-1)^{f(x)}$ on the hypercube; also called threshold degree. "We have $\sdeg(f) \leq 2\rdeg(f)$ by squaring and shifting $p$, where $\sdeg(f)$ is the sign degree (or threshold degree) of $f$, that is, the minimum degree of a real polynomial that agrees in sign with $(-1)^{f(x)}$ for all $x\in \{0,1\}^n$."
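The Bernoulli symmetrization identity quoted above can be checked numerically. The sketch below is a minimal sanity check under stated assumptions: the toy polynomial $p(x) = x_1 x_2 + x_3$ and the function name `bernoulli_symmetrize` are illustrative, not from the paper. Since $p$ is multilinear and the coordinates of $x \sim \ber_y^n$ are independent, the symmetrization is obtained by substituting $y$ for each variable, giving $P(y) = y^2 + y$.

```python
from itertools import product

def bernoulli_symmetrize(p, n, y):
    # P(y) = E_{x ~ Ber_y^n}[p(x)], computed by exact enumeration (small n only):
    # each x in {0,1}^n gets probability y^{|x|} (1-y)^{n-|x|}.
    total = 0.0
    for x in product((0, 1), repeat=n):
        weight = 1.0
        for bit in x:
            weight *= y if bit else 1.0 - y
        total += weight * p(x)
    return total

# Toy multilinear polynomial p(x) = x1*x2 + x3. By independence of the
# coordinates, symmetrization replaces each variable by y: P(y) = y^2 + y.
p = lambda x: x[0] * x[1] + x[2]
print(bernoulli_symmetrize(p, 3, 0.5))  # 0.75 = 0.5**2 + 0.5
```

The same enumeration approach can be used to spot-check other hypercube identities in the glossary for small n, e.g. that a claimed rational representation $p/q$ matches $f$ pointwise.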
