Rational degree is polynomially related to degree
Abstract: We prove that $\mathrm{deg}(f) \leq 2 \, \mathrm{rdeg}(f)^4$ for every Boolean function $f$, where $\mathrm{deg}(f)$ is the degree of $f$ and $\mathrm{rdeg}(f)$ is the rational degree of $f$. This resolves the second of the three open problems stated by Nisan and Szegedy, and attributed to Fortnow, in 1994.
Explain it Like I'm 14
Overview
This paper is about two ways to measure how “complicated” a yes/no rule on bits is. Imagine you have n light switches (each 0 or 1), and a rule f decides yes/no from the switch positions. One way to measure complexity is the degree of a single polynomial that exactly equals f on all 0–1 inputs. Another (more flexible) way is the rational degree, where you can use a fraction of two polynomials to equal f. The big question (asked in 1994) was: Can using a fraction make things dramatically simpler, or are these measures always within a reasonable (polynomial) factor?
The authors resolve this open problem. They prove that for every such rule f, the usual degree is at most a fixed constant times the fourth power of the rational degree:
- Main result: deg(f) ≤ 2·rdeg(f)⁴
So, using fractions can help, but not more than a polynomial (fourth power) amount.
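For intuition, here is a tiny worked example of our own (not from the paper): two-bit parity PARITY₂ needs degree 2 as a single polynomial (x + y − 2xy), yet admits a degree-1 rational representation, so rdeg = 1 and deg = 2·rdeg⁴ holds with equality.

```python
from itertools import product

# Toy witness that rdeg(PARITY_2) <= 1: the degree-1 fraction p/q with
# p = x - y and q = 1 - 2y agrees with x XOR y on every 0/1 input,
# and the denominator q is nonzero everywhere on {0,1}^2.
p = lambda x, y: x - y
q = lambda x, y: 1 - 2 * y

for x, y in product((0, 1), repeat=2):
    assert q(x, y) != 0                # denominator never vanishes on the cube
    assert p(x, y) / q(x, y) == x ^ y  # fraction equals parity exactly
print("rdeg(PARITY_2) <= 1 witnessed")
```

Since PARITY₂ is nonconstant, rdeg ≥ 1, so the witness pins down rdeg = 1 exactly.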
What questions did the authors ask?
In simple terms, they asked:
- If a Boolean function f can be written as “one polynomial divided by another” with low degree (small rdeg), how small can the usual single-polynomial degree be in comparison?
- Is there always a polynomial relationship between those two measures?
- Can this connection help us understand other complexity ideas (like decision trees, quantum query models with postselection, and some algebraic identities on the 0–1 cube)?
How did they approach the problem?
They combine ideas from algebra, probability, and combinatorics. Here are the core ingredients, each with an everyday analogy:
- Block sensitivity (fragile groups): For any input, look for disjoint groups of switches such that flipping all switches in any one group flips the answer. The more such groups, the more “fragile” the function is at that input.
- Hitting sets for big terms (cut the tallest trees): A polynomial is a sum of terms; the tallest “trees” are the highest-degree terms. A hitting set picks at least one variable from each tallest term. If you learn (query) these variables’ values, you force the polynomial’s degree to drop when you plug those values in.
- A decision tree (20 questions): You can think of computing f by asking questions about input bits. The number of questions needed (in the best strategy) is the decision-tree complexity D(f).
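The block-sensitivity ingredient is easy to make concrete by brute force. Below is a hedged sketch of our own (the OR function and the exhaustive search over disjoint block families are illustrative choices, not the paper's construction):

```python
from itertools import chain, combinations

def OR(x):  # x is a tuple of bits
    return int(any(x))

def flip(x, block):
    """Flip every bit of x whose index lies in block."""
    return tuple(b ^ 1 if i in block else b for i, b in enumerate(x))

def block_sensitivity_at(f, x):
    """bs_x(f): max number of disjoint blocks whose flips each change f(x)."""
    n = len(x)
    sensitive = [frozenset(b) for b in
                 chain.from_iterable(combinations(range(n), k)
                                     for k in range(1, n + 1))
                 if f(flip(x, b)) != f(x)]
    best = 0
    def search(i, used, count):  # exhaustive search over disjoint families
        nonlocal best
        best = max(best, count)
        for j in range(i, len(sensitive)):
            if not (sensitive[j] & used):
                search(j + 1, used | sensitive[j], count + 1)
    search(0, frozenset(), 0)
    return best

print(block_sensitivity_at(OR, (0, 0, 0)))  # → 3: the three singleton blocks
print(block_sensitivity_at(OR, (1, 0, 0)))  # → 1: only flipping bit 0 works
```

At the all-zeros input OR is maximally fragile (every singleton flip changes the answer), while at an input with a single 1 it is not.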
They also use two key math tools:
- Symmetrization (make it fair): Average the polynomial so that all variables are treated equally; this gives a one-variable summary that’s easier to analyze.
- Markov’s inequality for polynomials (speed limit for slopes): If a polynomial stays within a vertical band on an interval, there’s a limit on how steep it can be. This helps bound degrees by how much a curve must change.
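Symmetrization can be demonstrated by brute force: averaging a function over a Bernoulli product distribution collapses it to a univariate polynomial in the bias y. This toy setup is our own; for OR on 3 bits the result is P(y) = 1 − (1 − y)³.

```python
from itertools import product

def symmetrize(f, n, y):
    """Exact E_{x ~ Ber(y)^n}[f(x)]: a univariate polynomial in y of degree <= n."""
    total = 0.0
    for x in product((0, 1), repeat=n):
        w = 1.0
        for b in x:
            w *= y if b else (1 - y)  # probability of drawing this input
        total += w * f(x)
    return total

OR3 = lambda x: float(any(x))
print(symmetrize(OR3, 3, 0.5))  # 1 - (1 - 0.5)^3 = 0.875
```

Checking endpoints: P(0) = 0 (all-zeros input, OR is 0) and P(1) = 1, as the closed form predicts.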
Putting it all together, their strategy looks like this:
- From a special “sign” polynomial for f, they find an input where the function is not too fragile (low block sensitivity).
- From special “nondeterministic” polynomials for f and for its opposite (not f), they build small hitting sets that, when queried, strictly reduce the polynomials’ degrees.
- They repeat: each round queries only a limited number of bits (controlled by the function’s “sign degree”) and guarantees the polynomial degrees drop (controlled by the “rational/nondeterministic” degrees).
- After a bounded number of rounds, the function becomes constant on what remains, so they’ve computed f with a limited number of questions.
- Finally, they use the known fact that deg(f) ≤ D(f) to turn this into a bound on deg(f) in terms of rdeg(f).
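The degree-reduction loop at the heart of this strategy can be sketched in miniature. This is our own toy code with a greedy hitting-set heuristic standing in for the paper's exact construction; polynomials are dicts mapping monomials (frozensets of variable indices) to coefficients.

```python
def degree(poly):
    return max((len(m) for m, c in poly.items() if c != 0), default=0)

def maxonomials(poly):
    """Monomials of maximum degree (the 'tallest trees')."""
    d = degree(poly)
    return [m for m, c in poly.items() if c != 0 and len(m) == d]

def greedy_hitting_set(monos):
    """Pick variables until every maxonomial contains a picked variable."""
    hs, remaining = set(), list(monos)
    while remaining:
        v = max({v for m in remaining for v in m},
                key=lambda v: sum(v in m for m in remaining))
        hs.add(v)
        remaining = [m for m in remaining if v not in m]
    return hs

def restrict(poly, assignment):
    """Plug in queried bits; every hit max-degree term loses a variable."""
    out = {}
    for m, c in poly.items():
        if any(assignment.get(v) == 0 for v in m):
            continue  # a factor is 0, so the term vanishes
        m2 = frozenset(v for v in m if v not in assignment)
        out[m2] = out.get(m2, 0) + c
    return out

# toy polynomial x0*x1*x2 + x0*x3 + x2
poly = {frozenset({0, 1, 2}): 1, frozenset({0, 3}): 1, frozenset({2}): 1}
hs = greedy_hitting_set(maxonomials(poly))      # hits every degree-3 term
reduced = restrict(poly, {v: 1 for v in hs})    # query those bits (say, all 1s)
print(degree(poly), degree(reduced))            # degree strictly drops
```

Because every maximum-degree monomial loses at least one variable, the degree drops by at least one per round, which is what guarantees termination.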
A key intermediate bound they prove is:
- D(f) ≤ 4·sdeg(f)²·rdeg(f)², where sdeg(f) is the “sign degree” (the degree of a polynomial that is negative exactly when f = 1 and positive exactly when f = 0). Since sdeg(f) ≤ 2·rdeg(f), this gives deg(f) ≤ D(f) ≤ 16·rdeg(f)⁴. With a more refined argument, they strengthen the constant to 2, which is their headline result.
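Written out, the chain behind the weaker constant-16 bound is:

```latex
\deg(f) \;\le\; D(f)
\;\le\; 4\,\mathrm{sdeg}(f)^2\,\mathrm{rdeg}(f)^2
\;\le\; 4\,\bigl(2\,\mathrm{rdeg}(f)\bigr)^2\,\mathrm{rdeg}(f)^2
\;=\; 16\,\mathrm{rdeg}(f)^4
```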
What did they find, and why is it important?
Main findings:
- The usual polynomial degree and the rational degree are polynomially related: deg(f) ≤ 2·rdeg(f)⁴.
- In particular, using a fraction of polynomials never makes the problem exponentially easier. The worst possible speedup is at most a fourth-power factor.
Why this matters:
- It resolves a 30-year-old open problem posed by Nisan and Szegedy (attributed to Fortnow).
- The rational degree matches an exact measure in a model of quantum algorithms with postselection. So this result ties a quantum complexity measure tightly to classical polynomial degree.
- Degree is connected to many other complexity measures (decision trees, sensitivities, influences, certificate complexities). Bounding degree in terms of rational degree immediately bounds these other measures too, showing a unified picture: lots of different ways to measure “how hard f is” are all polynomially connected.
Implications and potential impact
Here are several clean takeaways and what they could lead to:
- A better understanding of “questions to answers”: Their decision-tree bound D(f) ≤ 4·sdeg(f)²·rdeg(f)² shows how “threshold-like behavior” (sign degree) and “fractional representability” (rational degree) combine to limit the number of questions needed to compute f. This bridges algebraic and algorithmic views.
- Algebra on the 0–1 cube (effective Nullstellensatz): They derive a concrete bound for when two polynomials never vanish together on {0,1}ⁿ but one of them vanishes at every 0-input and the other at every 1-input. In plain terms: you can build an identity h1·g1 + h2·g2 = 1 on the cube with controlled degrees. This is a strong algebraic consequence with neat applications.
- Limits of generalization: They show that their kind of algebraic identity cannot be pushed to “partial” functions (where f is not defined on all inputs) without losing control of degrees. So their theorem is sharp in an important sense.
- Universal lower bound on rational degree: Any function that truly depends on n inputs must have rdeg(f) at least on the order of log n. So you can’t represent a genuinely n-variable function with an extremely tiny rational degree.
- Open directions:
- Is the fourth power necessary? The authors conjecture the quartic relation may be tight (some functions might really need deg ≈ rdeg⁴).
- Extend and sharpen connections to other measures (like influences), and to approximate versions (where tiny errors are allowed).
- Quantum angle: since rdeg equals exact postselected quantum query complexity, these results inform how powerful postselection is compared to classical measures.
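One of the takeaways above, the universal Ω(log n) lower bound on rational degree, can be sketched by combining two influence inequalities quoted elsewhere on this page, Inf_i[f] ≥ 2^{-2·rdeg(f)} and ∑_i Inf_i[f] ≤ deg(f), with the main theorem:

```latex
n \cdot 2^{-2\,\mathrm{rdeg}(f)}
  \;\le\; \sum_{i=1}^{n} \mathrm{Inf}_i[f]
  \;\le\; \deg(f)
  \;\le\; 2\,\mathrm{rdeg}(f)^4
\quad\Longrightarrow\quad
2^{2\,\mathrm{rdeg}(f)} \cdot 2\,\mathrm{rdeg}(f)^4 \;\ge\; n
\quad\Longrightarrow\quad
\mathrm{rdeg}(f) = \Omega(\log n)
```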
In short, the paper draws a strong, precise line between two fundamental ways of representing Boolean functions (single polynomials vs. polynomial fractions), settles a long-standing question, and opens new paths linking algebra, combinatorics, and both classical and quantum computation.
Knowledge Gaps
Knowledge gaps, limitations, and open questions
Below is a concise list of gaps and open directions left unresolved by the paper. Each item is framed to be actionable for future research.
- Tightness of the quartic exponent: Determine whether the upper bound deg(f) ≤ O(rdeg(f)⁴) is optimal. Either:
- Construct an explicit family with D(f) ≥ Ω(rdeg(f)⁴) (as conjectured), implying deg(f) can be quartically larger than rdeg(f); or
- Improve the upper bound to deg(f) ≤ O(rdeg(f)³) or deg(f) ≤ O(rdeg(f)²), or prove such improvements are impossible.
- Tightness and role of sign degree in the decision-tree bound: Assess whether D(f) ≤ 4·sdeg(f)²·rdeg(f)² can be improved (e.g., to linear dependence in sdeg(f)). Identify functions that make this sdeg-dependent factor necessary or provide constructions showing the bound is not tight.
- Approximate nondeterministic degree—combinatorial side: Extend Lemma 3.5 (hitting-set upper bound) to the approximate setting to match Theorem 6.1 analytically, thereby proving D(f) ≤ O(ndeg_ε(f)²·ndeg_ε(¬f)²) for constant ε. Clarify whether approximate hitting sets exist with size bounded in terms of degree and min-block sensitivity.
- Generalization of the Effective Hypercube Nullstellensatz: Prove the conjectured extension for m ≥ 2 polynomials g₁,…,g_m with the product-zero condition on {0,1}ⁿ, establishing h₁,…,h_m such that ∑ h_i·g_i ≡ 1 on {0,1}ⁿ, with the degree of each multilinearized product h_i·g_i bounded by a polynomial in deg(g_i). Determine the optimal degree dependence (e.g., linear vs. quadratic vs. higher).
- Nullstellensatz beyond product-zero: Characterize promise classes (subsets of the hypercube) for which an effective Nullstellensatz (polynomial degree bounds for Bezout-type certificates) holds. The paper’s counterexample shows the general statement fails even for m = 2; identify structural conditions under which it succeeds.
- Influence-based bounds and Gotsman–Linial variants: Resolve the conjecture Inf[f] ≤ O(√n * ndeg(f)) and quantify its consequences for rdeg(f). Alternatively, develop different techniques to strengthen unconditional lower bounds on rdeg(f) for functions depending on n variables beyond Ω(log n).
- Number of relevant variables versus rdeg: Improve the upper bound that a Boolean function of rational degree d can depend on at most O(d⁴·2^(2d)) variables. Aim for sharper dependence (e.g., O(poly(d)·2ᵈ)) and produce matching lower bounds.
- Properties of minimum block sensitivity: Systematically study min_x bs_x(f) introduced in this work:
- Tighten the inequality min_x bs_x(f) ≤ 2·sdeg(f)² (e.g., to linear or near-linear dependence).
- Establish its relationships to other measures (sensitivity, certificate complexity, approximate degree, etc.).
- Characterize its behavior under restrictions and composition.
- Constants and tight examples: Beyond PARITY₂, identify functions that make deg(f) = 2·rdeg(f)⁴ or D(f) = 4·sdeg(f)²·rdeg(f)² tight. Clarify whether the bound deg(f) ≤ 16·rdeg(f)⁴ (via sign degree) is ever tight, and if not, improve the constants.
- Polynomial relations to other measures: The quartic relation replaces question-mark entries with “4” in known tables of complexity-measure relations (e.g., Iyer 2025). Derive matching lower bounds or improved upper bounds to pin down tight exponents and constants relating rdeg to:
- Certificate complexity, sensitivity, block sensitivity, approximate degree, and exact quantum query complexity Q_E.
- Explore whether Q_E(f) ≤ O(ndeg(f)·ndeg(¬f)) (de Wolf’s conjecture) holds, and its implications for rdeg–degree relations.
- Postselected quantum query complexity for ε > 0: Establish whether the factor-2 slack in rdeg(f) ≤ 2·PostQ_ε(f) (for ε ∈ (0, 1/2)) can be eliminated or is inherent. Characterize exact constants in the relationship between rational degree and postselected quantum query complexity for ε > 0.
- Composition behavior of rational degree: Develop general composition theorems (beyond AND-composition) for rdeg, ndeg, and sdeg. Use these to build explicit separations, optimize upper bounds, or obtain lower bounds via structured function compositions.
- Strengthening analytic tools: Improve Corollary 2.4 (√(n/2) ≤ deg(p) under specific sign/maximum conditions). Determine if the √(n/2) constant can be sharpened, generalized to broader constraints, or leveraged to reduce exponents in the main bounds.
- Optimizing the decision-tree construction: Design alternative query strategies that avoid or reduce the sdeg(f)² factor in Theorem 4.1 (e.g., via refined hitting-set selection or structural properties of nondeterministic polynomials), potentially yielding D(f) ≤ O(rdeg(f)²) or better.
- Partial functions—polynomial relation criteria: Since general polynomial relation between rdeg and degree fails for partial functions, develop necessary and sufficient conditions (e.g., promise structure, regularity, monotonicity) under which deg(f) and rdeg(f) remain polynomially related in the partial setting.
Practical Applications
Immediate Applications
The items below outline practical use cases that can be deployed today, based directly on the paper’s proven bounds, constructions, and techniques. Each item notes relevant sectors, what a tool/product/workflow could look like, and assumptions or dependencies that affect feasibility.
- Query-efficient testing of Boolean decision logic
- Sector: software engineering (QA), security (policy testing), embedded systems
- What: A “Polynomial Hitting-Set Test Planner” that builds deterministic decision trees for black-box Boolean logic using the paper’s iterative hitting-set construction (Theorem D, Corollary Final).
- How:
- If you can compute or approximate nondeterministic representations p (for f) and q (for ¬f), the algorithm queries hitting sets of max-degree monomials to strictly reduce degree at each step, guaranteeing termination in at most O(rdeg(f)) iterations and query depth ≤ 4·sdeg(f)²·rdeg(f)².
- For approximate variants, use the extension to approximate nondeterministic degree (ndeg_ε) to trade exactness for fewer queries (the paper’s theorem on approximate versions).
- Assumptions/dependencies:
- Requires access to or approximation of p/q or ndeg structures; computing exact ndeg/rdeg can be hard in general.
- Best suited to structured logic (e.g., rule engines, policy checks) where polynomial representations are tractable.
- Applies to total Boolean functions; partial functions are outside the guaranteed regime (counterexample in the paper).
- Influence-based complexity auditing for models and rule systems
- Sector: machine learning governance, MLOps, software auditing
- What: An “Influence-to-Rational-Degree Estimator” that uses empirical influence measurements to lower bound rational degree (rdeg) and infer variable-dependence complexity (via the paper’s Ω(log n) theorem and its inequality combining influence with rdeg).
- How:
- Empirically estimate Inf_i[f] by flipping bits and measuring changes; use Inf_i[f] ≥ 2^{-2·rdeg(f)} and ∑_i Inf_i[f] ≤ deg(f) to infer that rdeg(f) = Ω(log n) for functions depending on n variables.
- Use this to certify that a deployed rule set or model has irreducible complexity (e.g., it cannot depend on many variables while having very low rdeg).
- Assumptions/dependencies:
- Requires reliable black-box access to the function and enough samples to estimate influences.
- Gives lower bounds, not exact rdeg; complements other structure-based estimates.
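The influence-estimation step above takes only a few lines. A hedged Monte Carlo sketch of our own (function names and sample counts are illustrative choices):

```python
import random

def estimate_influences(f, n, samples=2000, seed=0):
    """Monte Carlo estimate of Inf_i[f] = Pr_x[f(x) != f(x with bit i flipped)]."""
    rng = random.Random(seed)
    counts = [0] * n
    for _ in range(samples):
        x = [rng.randint(0, 1) for _ in range(n)]
        fx = f(x)
        for i in range(n):
            x[i] ^= 1              # flip bit i ...
            if f(x) != fx:
                counts[i] += 1
            x[i] ^= 1              # ... and flip it back
    return [c / samples for c in counts]

parity3 = lambda x: x[0] ^ x[1] ^ x[2]
print(estimate_influences(parity3, 3, samples=500))  # → [1.0, 1.0, 1.0]
```

Parity is a sanity check: flipping any bit always changes the output, so every influence is exactly 1 regardless of sampling.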
- Algebraic certificates for disjoint zero-sets on the hypercube
- Sector: formal methods (SAT/SMT), program verification, symbolic computation
- What: A “Hypercube Nullstellensatz Certificate Generator” that produces polynomial Bezout-type identities with bounded degrees when two multilinear polynomials’ zero-sets partition the Boolean cube (Effective Hypercube Nullstellensatz).
- How:
- Given g1, g2 with disjoint zero-sets and g1(x)·g2(x) = 0 on {0,1}ⁿ, construct h1, h2 such that h1·g1 + h2·g2 ≡ 1 on the cube, with deg(h_i·g_i) bounded by O(deg(g1)²·deg(g2)²) after multilinearization.
- Use as compact certificates in proof systems to attest separation/coverage or unsatisfiability on the cube.
- Assumptions/dependencies:
- Requires the zero-product and disjoint-zero-set conditions; extensions to arbitrary subsets or multiple polynomials need further research (and fail in general for partial functions per the paper’s counterexample).
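Verifying such a certificate is straightforward to script. The XOR-based polynomials below are our own toy instance, not from the paper: g1 = x − y vanishes exactly on the 0-inputs of two-bit XOR, g2 = 1 − x − y + 2xy exactly on its 1-inputs, and h1 = x − y, h2 = 1 witness the Bezout-type identity.

```python
from itertools import product

# Toy Hypercube Nullstellensatz certificate for f = XOR on two bits.
g1 = lambda x, y: x - y               # zero exactly where XOR = 0
g2 = lambda x, y: 1 - x - y + 2*x*y   # zero exactly where XOR = 1
h1 = lambda x, y: x - y
h2 = lambda x, y: 1

for x, y in product((0, 1), repeat=2):
    assert g1(x, y) * g2(x, y) == 0                            # product-zero condition
    assert h1(x, y) * g1(x, y) + h2(x, y) * g2(x, y) == 1      # Bezout identity on the cube
print("certificate verified on all of {0,1}^2")
```

Note h1·g1 = (x − y)², which equals XOR on 0/1 inputs, so the identity is just XOR + (1 − XOR) = 1 pointwise.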
- Benchmarks and sanity checks in quantum query complexity research
- Sector: quantum computing (theory), algorithm design
- What: A “Postselected Query Complexity Analyzer” leveraging the equivalence rdeg(f) = PostQ_0(f) to benchmark problem instances and sanity-check proposed algorithms (a fact the paper restates with a proof sketch).
- How:
- Use rational degree bounds to characterize exact postselected query complexity and compare against proposed quantum procedures.
- Provide quick feasibility checks: if deg(f) is large, rdeg(f) cannot be arbitrarily small, since deg(f) ≤ O(sdeg(f)²·rdeg(f)²).
- Assumptions/dependencies:
- Postselection is a theoretical model; results inform algorithmic lower/upper bounds and design constraints rather than near-term hardware implementations.
- Educational and research tooling for Boolean complexity
- Sector: education, academic research
- What: Lightweight libraries or notebooks implementing:
- Symmetrization routines (Bernoulli and Minsky-Papert symmetrization).
- Influence estimation and lower bound checks.
- Hitting-set construction for decision tree synthesis.
- How:
- Package the paper’s techniques into reproducible examples for courses on analysis of Boolean functions and quantum query complexity.
- Assumptions/dependencies:
- Educational value is immediate; practical optimization value depends on problem structure and availability of polynomial representations.
Long-Term Applications
These items require further research, scaling, algorithmic development, or engineering to be viable in production settings. They build on the paper’s core contributions and identified conjectures.
- Automated derivation of nondeterministic and rational polynomial representations for real-world rule systems
- Sector: software engineering (rule engines), policy analysis, decision support systems
- What: Compilers that convert high-level Boolean rules/policies into nondeterministic polynomials p/q or rational representations, enabling the hitting-set decision tree synthesis pipeline end-to-end.
- Potential workflow:
- Parse rules → symbolic Boolean simplification → construct/approximate ndeg representations → apply the paper’s iterative hitting-set decision tree builder → deploy minimized testers.
- Assumptions/dependencies:
- Automated construction of ndeg/rdeg representations at scale is an open challenge; heuristics and domain-specific templates may be needed.
- Robustness to noisy or partial specifications must be addressed (partial-function limitations highlighted by the paper).
- SAT/SMT proof systems enhanced by hypercube-effective Nullstellensatz
- Sector: formal verification, automated theorem proving
- What: Proof engines that exploit bounded-degree hypercube Nullstellensatz certificates to accelerate proofs for specialized Boolean constraints (e.g., combinatorial designs, parity-like structures).
- Potential products:
- “Hypercube Algebraic Prover” integrating bounded-degree Bezout identities into SAT/SMT solvers, with fallback to conventional resolution.
- Assumptions/dependencies:
- Efficient identification of g1,…,gm satisfying the required conditions; extending to the multi-polynomial case is conjectural.
- Handling general partial constraints remains nontrivial (the paper’s counterexamples show limits).
- Active learning and query-efficient diagnostics via degree-guided strategies
- Sector: machine learning (active learning), healthcare diagnostics, industrial troubleshooting
- What: Algorithms that adaptively query features/tests guided by degree/hitting-set insights to minimize the number of queries needed to classify or diagnose.
- Potential workflow:
- Build approximate nondeterministic representations (ndeg_ε) for classifiers → derive hitting-set-based query sequences → deploy in constrained testing environments (e.g., medical triage checklists, equipment diagnostics).
- Assumptions/dependencies:
- Requires models amenable to polynomial approximation with interpretable monomials.
- Must handle noise and probabilistic outcomes; approximate versions (ndeg_ε) are promising but need robust pipelines.
- Hardware verification and IC testing optimized by block sensitivity and hitting sets
- Sector: semiconductors, hardware verification
- What: Test pattern generators that plan group flips (blocks) and targeted probes based on block sensitivity lower/upper bounds and monomial hitting sets to reduce test time.
- Assumptions/dependencies:
- Mapping hardware behaviors to Boolean functions with tractable ndeg/rdeg approximations is nontrivial.
- Industrial viability hinges on scalable inference of polynomial structure from circuit descriptions.
- Influence-calibrated feature selection and sample complexity planning
- Sector: data science, experimental design
- What: Tooling that uses the paper’s influence-to-degree relationships to set expectations for the minimum number of samples needed to detect variable dependence (and thereby plan data collection and feature selection).
- Assumptions/dependencies:
- Influence estimation must be reliable; connecting influence bounds to actionable sample sizes requires statistical modeling around the paper’s worst-case inequalities.
- Quantum algorithm design heuristics using rational degree as a design knob
- Sector: quantum computing (algorithm prototyping)
- What: Construct oracle problems or subroutines with targeted rdeg profiles to explore postselected algorithm behaviors (theoretically), potentially informing practical, non-postselected heuristics.
- Assumptions/dependencies:
- Postselection is not a physical resource; translation from postselected insights to realizable quantum procedures requires further theory.
Notes on assumptions and dependencies across applications
- Constructing ndeg, sdeg, or rdeg exactly is often computationally hard; many applications assume we can approximate or bound these degrees via structural analysis, heuristics, or empirical measurements (e.g., influences).
- The paper’s results are for total Boolean functions; extensions to partial functions are generally false (explicit counterexamples provided), so workflows must ensure domain-totality or handle partiality carefully.
- Postselection equivalence (rdeg(f) = PostQ_0(f)) is a theoretical bridge for quantum complexity; practical quantum applications will use it primarily for benchmarking or design constraints.
- Gains depend on function structure: random or adversarial Boolean functions typically have large degrees, reducing the advantage of degree-guided strategies.
- Effective Hypercube Nullstellensatz requires disjoint zero-sets and the zero-product condition on the cube; broader Nullstellensatz generalizations are conjectural and currently not guaranteed.
Glossary
- Address function: A Boolean function that returns the bit of an input specified by an index; often used to show tightness of bounds. "Note that this bound is tight for the address function \cite{survey_buhrman_2002}."
- AND-OR tree: A Boolean formula arranged in a balanced tree alternating AND and OR gates. "The balanced $\ANDOR$ tree on variables also simultaneously separates sign degree, rational degree, and degree: $\sdeg=O(\sqrt{m\log m})$ \cite{apxdeg_bun_2022}, $\rdeg=m$ \cite{rdeg_iyer_2025}, ."
- Bernoulli symmetrization: A technique that maps a multivariate polynomial to a univariate polynomial by averaging over a Bernoulli product distribution. "Then, for all , \begin{equation} P(y) = \expect_{x \sim \ber_y^n}[ \, p(x) \, ]. \end{equation}"
- Block sensitivity: For a Boolean function at input x, the maximum number of disjoint blocks of variables whose flips each change the output. "The block sensitivity of at , denoted $\bs_x(f)$, is the maximum number of disjoint sensitive blocks of at ."
- Decision tree complexity: The minimum depth of a decision tree that computes the function exactly on all inputs. "The decision tree complexity of , denoted , is the minimum depth of a decision tree that, for all , queries bits of to exactly compute at a leaf."
- Effective Hypercube Nullstellensatz: A constructive Nullstellensatz on the Boolean hypercube giving degree bounds for representations of 1 as a combination of polynomials. "The main result of this work can be framed as an effective Hypercube Nullstellensatz."
- Exact quantum query complexity: The minimum number of queries a quantum algorithm needs to compute a Boolean function with zero error. "where denotes the exact quantum query complexity of ."
- Generic oracle: In complexity theory, an oracle chosen generically (e.g., at random) to study relativized statements. "Polynomially relating rational degree to degree also implies with respect to generic oracles \cite{oracle_fortnow_2003}."
- Gotsman-Linial conjecture: A conjecture relating total influence to the threshold/sign degree of Boolean functions. "The long-standing Gotsman-Linial conjecture~\cite{spectral_gotsman_1994} posits that $\Inf[f] \leq O(\sqrt{n} \sdeg(f))$."
- Hitting set (of a polynomial): A set of variable indices intersecting every max-degree monomial (maxonomial) of the polynomial. "We say that is a hitting set of if is nonempty for all ."
- Influence (of a Boolean function): The probability that flipping a particular input bit changes the function’s value; total influence sums this over all bits. "For and , the th influence of is defined by $\Inf_i[f] \coloneqq \Pr[f(x) \neq f(x^i)]$, where is with the th bit flipped, and the probability is over uniformly random ."
- Markov's inequality (for polynomials): A bound on the derivative of a real polynomial constrained on an interval (distinct from the probabilistic inequality). "Our proof involves using polynomial symmetrization followed by Markov's inequality \cite{question_markov_1890}."
- Maxonomial: A monomial of maximum degree present in a polynomial. "its set of maxonomials is the set ."
- Minsky-Papert symmetrization: A symmetrization technique converting multivariate polynomials over the hypercube to univariate polynomials in the Hamming weight. "The first, Minsky-Papert \cite{perceptrons_minsky_1969} symmetrization, is used to show the impossibility of generalizing our main result to all partial Boolean functions."
- Multilinearization: The process of reducing a polynomial modulo the relations X_i² = X_i to obtain a unique multilinear representative on the hypercube. "where the overline denotes multilinearization using the relations ."
- Nondeterministic degree: The minimum degree of a polynomial that is nonzero exactly on the 1-inputs (or equivalently, zero exactly on the 0-inputs) of a Boolean function. "The nondeterministic degree of , denoted $\ndeg(f)$, is the minimum value of over all nondeterministic representations of ."
- Nullstellensatz (for the hypercube): An algebraic principle ensuring combinations of polynomials equal 1 on {0,1}n under disjoint zero sets, with attention to degree bounds. "the relation between rational degree and degree governs the effectiveness of a natural Nullstellensatz for the hypercube, in the sense of \cite{bounds_brownawell_1987,sharp_kollar_1988,nullstellensatz_alon_1999,effective_jelonek_2005}."
- One-sided 0-approximate degree: The minimum degree of a polynomial that is small on 0-inputs and at least 1 on 1-inputs (one-sided approximation). "equals the one-sided $0$-approximate degree of , up to a factor of $2$, as defined by Sherstov \cite{breaking_sherstov_2018}."
- Pointer function: A constructed Boolean function used to separate decision tree and quantum query complexities via embedded pointers. "or the ``pointer function'' that quadratically separates from exact quantum query complexity \cite{pointer_ambainis_2017}."
- Polynomial method (in quantum query complexity): A technique connecting quantum query algorithms to low-degree polynomial representations of Boolean functions. "Then, using the polynomial method \cite{polynomial_beals_2001} following \cite[Proof of Theorem 1]{postqe_mahadev_2015}, we deduce that there exists integer and complex multilinear polynomials ..."
- Postselected quantum query complexity: The query complexity of quantum algorithms that are allowed to postselect on measurement outcomes. "We also record the fact that rational degree exactly equals the $\epsilon$-approximate postselected quantum query complexity $\PostQ_\epsilon$ with as defined in \cite{postqe_mahadev_2015}."
- Postselected randomized query complexity: The classical analogue allowing postselection; equal to certificate complexity. "The analogous randomized query measure, exact postselected randomized query complexity, equals the certificate complexity \cite[Theorem 16]{thesis_cade_2020}."
- Rational degree: The minimum of the maximum degrees of numerator and denominator in a rational polynomial representation that exactly matches a Boolean function on the hypercube. "Rational degree characterizes the exact postselected quantum query complexity \cite{postqe_mahadev_2015}."
- Rational representation: Expressing a Boolean function as p/q, where p and q are polynomials that agree with the function on {0,1}ⁿ and the denominator is nonzero on the hypercube. "We say that , where are multilinear, is a rational representation of if for all , and ."
- Sign degree (threshold degree): The minimum degree of a polynomial whose sign agrees with on the hypercube; also called threshold degree. "We have $\sdeg(f) \leq 2\rdeg(f)$ by squaring and shifting , where $\sdeg(f)$ is the sign degree (or threshold degree) of , that is, the minimum degree of a real polynomial that agrees in sign with for all ."