
Average-Case Hardness Conjectures

Updated 17 April 2026
  • Average-case hardness conjectures are formal claims that some problems remain intractable for typical instances, thereby offering a rigorous foundation for cryptographic security and quantum sampling tasks.
  • Methodologies such as random self-reducibility and worst-to-average-case reductions are central, enabling the translation of worst-case complexity into average-case settings across domains like high-dimensional statistics and tensor networks.
  • The survey identifies key challenges and open problems, including tightening additive-error thresholds in quantum sampling and developing robust average-case proxies beyond low-degree heuristics.

Average-case hardness conjectures assert the existence of computational problems that remain intractable for efficient algorithms on typical, rather than worst-case, instances. Such conjectures are central to cryptography, complexity theory, high-dimensional statistics, quantum computing, and the study of fine-grained computational barriers. They provide a theoretical foundation for cryptographically secure constructions, explain observed statistical-computational gaps in inference, and underlie the evidence for quantum advantage in sampling tasks. This article surveys the major forms, methodologies, and implications of average-case hardness conjectures across domains, emphasizing precise formal statements, evidence, and outstanding open questions.

1. Definitions and Canonical Problem Templates

A problem exhibits average-case hardness if, for a natural input distribution or random ensemble, there is no polynomial-time algorithm that solves it with high probability. Precise statements vary by domain and distributional setting:

  • Planted Clique and Hypergraphic Planted Clique (HPC): In the classical planted clique problem, one distinguishes G(n,1/2) from G(n,1/2,k), in which k vertices are chosen to form a clique. The hypergraphic extension replaces the adjacency matrix with an r-uniform adjacency tensor A ∈ {0,1}^{n^r}, and the goal is to detect a planted clique of size k in an r-uniform random hypergraph. The average-case hardness conjecture posits that no polynomial-time algorithm can distinguish the two hypotheses when k = o(n^{1/2−τ}) for any fixed τ > 0 (Luo et al., 2020); a schematic formalization of this detection template is given after this list.
  • Fine-grained Parity Problems and k-SUM: For parity versions of problems such as k-SUM, k-XOR, and k-OV, random inputs are drawn from structured distributions (e.g., integers chosen i.i.d. from an interval, or uniformly random Boolean vectors), and one must compute the parity of the number of k-tuples satisfying the relevant constraint (summing to the target, XOR-ing to zero, or being orthogonal). Under natural distributional assumptions, these problems are conjectured to retain their worst-case fine-grained hardness even on average (Dalirrooyfard et al., 27 Mar 2025).
  • BosonSampling and Quantum Sampling Problems: For BosonSampling, the average-case conjectures ask for the hardness of computing the output probabilities, or of sampling from the output distribution, of linear-optical interferometers whose circuit unitary is drawn Haar-randomly or from low-depth architectures. The "Gaussian Permanent Estimation" (GPE±) conjecture formalizes this for output probabilities: estimating |Per(X)|² to small additive error, where X has i.i.d. complex Gaussian entries, is conjectured to be #P-hard (Bouland et al., 2024, Go et al., 2024).
  • PEPS and Tensor Network Contraction: Contracting a random 2D Projected Entangled Pair State (PEPS) tensor network is shown to be as hard on average as in the worst case, i.e., #P-complete for almost all random choices of local tensors (Haferkamp et al., 2018).
  • Average-case Hardness in Proof Complexity: For coNP-complete languages such as TAUT (the set of tautologies), the existence of "dense hard sequences"—families of inputs requiring superpolynomial time with positive upper density—implies average-case hardness under balanced distributions related by polynomial-time isomorphisms (Monroe, 2022, Monroe, 2023).
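
For concreteness, the detection template behind these examples can be written schematically as follows; the notation and the o(1)-advantage convention below are generic conventions, not quotations from the cited works.

    % Generic average-case detection template: null ensemble Q_n vs. planted ensemble P_n.
    % Average-case hardness: every polynomial-time test A has vanishing advantage,
    \[
      \sup_{A \in \mathsf{poly\text{-}time}}
      \Bigl|\, \Pr_{X \sim P_n}[A(X)=1] \;-\; \Pr_{X \sim Q_n}[A(X)=1] \,\Bigr| \;=\; o(1).
    \]
    % Planted clique instantiation: Q_n = G(n,1/2), P_n = G(n,1/2,k), with the
    % conjectured hard regime k = o(n^{1/2-\tau}) for every fixed \tau > 0.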

2. Key Conjectures, Formal Barriers, and Evidence

A representative set of average-case hardness conjectures includes:

Domain | Conjecture / Barrier | Formal Regime / Model
Planted Clique | G(n,1/2,k) is undetectable in polynomial time | k = o(n^{1/2−τ}); G(n,1/2) vs G(n,1/2,k)
HPC Equivalence | average-case equivalence of PC(k) and HPC(k) detection | random r-uniform hypergraphs
k-SUM | average-case algorithms faster than conjectured would yield worst-case SIVP algorithms faster than currently known | random integer instances (Brakerski et al., 2020)
BosonSampling | approximating output probabilities to small additive error is #P-hard | GPE±, Haar-random interferometers
Parity Problems | fine-grained average-case hardness | parity-k-OV/XOR/SUM, explicit distributions (Dalirrooyfard et al., 27 Mar 2025)
Proof Complexity | dense hard sequences imply TAUT ∉ AvgP | balanced distributions, p-isomorphisms (Monroe, 2022, Monroe, 2023)

Substantial evidence supports these conjectures for specific computational models (e.g., Sum-of-Squares, Statistical Query, low-degree polynomials) or via average-case reductions from conjecturally hard problems. For example, average-case intractability of parity-k-OV on explicit structured distributions is established under the Strong Exponential Time Hypothesis via randomized self-reduction frameworks (Dalirrooyfard et al., 27 Mar 2025).
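
As a purely illustrative baseline (a minimal sketch under simplifying assumptions, not code from the cited papers; the function names and parameter choices are hypothetical), the following numpy snippet samples the null and planted ensembles from Section 1 and applies the simplest counting statistic, the total edge count. Such low-complexity tests only begin to separate the two hypotheses once k grows well past √n, consistent with the conjectured hard regime k = o(n^{1/2−τ}).

    import numpy as np

    def sample_gnp(n, p=0.5, rng=None):
        """Adjacency matrix of an Erdos-Renyi graph G(n, p)."""
        rng = rng or np.random.default_rng()
        upper = np.triu(rng.random((n, n)) < p, k=1)
        return (upper | upper.T).astype(np.int8)

    def plant_clique(A, k, rng=None):
        """Plant a clique on k random vertices of an existing adjacency matrix."""
        rng = rng or np.random.default_rng()
        S = rng.choice(A.shape[0], size=k, replace=False)
        A = A.copy()
        A[np.ix_(S, S)] = 1
        np.fill_diagonal(A, 0)
        return A

    def edge_count_test(A, n, threshold_sd=3.0):
        """Flag 'planted' if the edge count exceeds the G(n,1/2) mean by a few s.d."""
        pairs = n * (n - 1) / 2
        m = np.triu(A, k=1).sum()
        return m > 0.5 * pairs + threshold_sd * np.sqrt(0.25 * pairs)

    rng = np.random.default_rng(0)
    n, k = 2000, 120                  # k is a few multiples of sqrt(n): easy regime
    print(edge_count_test(sample_gnp(n, rng=rng), n))                            # False w.h.p.
    print(edge_count_test(plant_clique(sample_gnp(n, rng=rng), k, rng=rng), n))  # True w.h.p.

No polynomial-time algorithm is known to succeed well below the √n scale, which is precisely what the planted clique conjecture asserts cannot be done.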

Convexity and polynomial-interpolation barriers arise in quantum sampling: worst-to-average-case reductions based on polynomial or rational interpolation are limited by degree-induced robustness blow-ups. Recent work overcomes these for BosonSampling and Random Circuit Sampling via coefficient-extraction and dilution techniques, removing prior noise-invariance limitations and matching average- and worst-case hardness up to improved additive-error tolerances (Bouland et al., 2024).

3. Methodologies: Worst-to-Average-Case Reductions and Amplification

Random self-reducibility and related techniques are central. The canonical pipeline:

  1. Random Self-Reducibility: For problems like the permanent, PEPS contraction, or output probabilities of certain quantum circuits, any instance can be written as a low-degree function of a mixing parameter between a worst-case and an average-case input. Success on a constant fraction of random instances yields success everywhere through polynomial interpolation or Berlekamp–Welch decoding (Haferkamp et al., 2018, Movassagh, 2018, Bouland et al., 2024); a minimal interpolation sketch follows this list.
  2. Reduction Frameworks: In high-dimensional statistics, average-case reductions propagate computational lower bounds between learning and detection problems (e.g., planted clique → sparse PCA → robust sparse mean estimation), often utilizing Gaussianization, rejection kernels, and block-wise rotations to transfer planted structure into inference tasks without increasing distinguishing advantage (Brennan et al., 2019, Brennan et al., 2019).
  3. Group-Theoretic Amplification: For the parity of k-cliques and similar counting problems, group actions and orbit-stabilizer analysis amplify average-case bias to worst-case solvability for nearly all oracles, achieving fine-grained hardness even against solvers that are correct on only a small fraction of instances (Nareddy et al., 2024).
  4. Average-Case Reductions in Fine-Grained Complexity: Randomized self-reductions for parity-k-OV/XOR/SUM, exploiting partiteness and low-degree polynomial structure, link hardness on explicit distributions to worst-case conjectures such as rETH, k-SUM, or k-Clique (Dalirrooyfard et al., 27 Mar 2025).
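
To make step 1 concrete, here is a minimal sketch (under simplifying assumptions, not the construction of any one cited paper) of the classic random self-reduction for the permanent over a prime field F_p. Because per(A + tR) is a polynomial of degree n in t, and A + tR is uniformly distributed for every fixed t ≠ 0 when R is uniform, an oracle correct on most random matrices yields a worst-case solver by querying n+1 points on the line and interpolating back to t = 0; Berlekamp–Welch decoding replaces plain interpolation when the oracle errs on a constant fraction of inputs. The modulus and the exact-permanent "oracle" below are illustrative stand-ins.

    import itertools
    import numpy as np

    P = 1_000_003  # an illustrative prime modulus

    def permanent_mod(A, p=P):
        """Exact permanent over F_p via the naive expansion (fine for small n)."""
        n = A.shape[0]
        total = 0
        for perm in itertools.permutations(range(n)):
            prod = 1
            for i, j in enumerate(perm):
                prod = (prod * int(A[i, j])) % p
            total = (total + prod) % p
        return total

    def average_case_oracle(A):
        """Stand-in for an oracle assumed correct on most uniformly random matrices.
        Here it is simply exact; in the reduction it may err on a small fraction."""
        return permanent_mod(A)

    def interpolate_at_zero(xs, ys, p=P):
        """Value at t=0 of the unique degree-(len(xs)-1) polynomial through (xs, ys) mod p."""
        result = 0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            num, den = 1, 1
            for j, xj in enumerate(xs):
                if j != i:
                    num = (num * (-xj)) % p
                    den = (den * (xi - xj)) % p
            result = (result + yi * num * pow(den, -1, p)) % p
        return result

    def worst_case_permanent(A, rng, p=P):
        """Worst-to-average reduction: query the oracle along the random line A + t*R."""
        n = A.shape[0]
        R = rng.integers(0, p, size=(n, n))
        ts = list(range(1, n + 2))                           # n+1 distinct nonzero points
        ys = [average_case_oracle((A + t * R) % p) for t in ts]
        return interpolate_at_zero(ts, ys, p)                # per(A + 0*R) = per(A)

    rng = np.random.default_rng(1)
    A = rng.integers(0, P, size=(5, 5))                      # an arbitrary "worst-case" input
    print(worst_case_permanent(A, rng) == permanent_mod(A))  # True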

Open barriers in these methodologies often concern the existence of non-adaptive worst-to-average-case reductions for NP-complete problems, which are ruled out unless the polynomial hierarchy collapses (Bogdanov–Trevisan; Holenstein et al., 2013). Adaptive or non-black-box reductions, while plausible, remain largely unexplored outside lattice-based cryptography and certain quantum problems.

4. Statistical-Computational Phase Transitions and Universality

Average-case hardness conjectures are intimately tied to statistical-computational phase transitions, where inference shifts from statistically tractable to computationally intractable as parameters vary. Notable phenomena:

  • k-to-k² Gaps: In robust sparse mean estimation and a universality class of sparse mixture models, reductions from planted clique establish average-case hardness in the regime where the sample complexity of polynomial-time algorithms must jump from roughly k to k² as a function of the sparsity k (Brennan et al., 2019, Brennan et al., 2019); a schematic statement of this kind of gap follows this list.
  • Tensor Problems and Planted Structure: Average-case hardness for detecting rank-1 spikes or planted communities in random tensors often reduces to hypergraphic planted clique detection; equivalence of the latter with the classical planted clique conjecture would unify several tensor-inference hardness results (Luo et al., 2020).
  • Quantum Glassiness: In random p-local Pauli Hamiltonians, replica symmetry breaking induces glassy phases in which Gibbs-state clusters are macroscopically separated, directly yielding average-case lower bounds for constant-time stable quantum algorithms (Zlokapa et al., 9 Oct 2025).
  • Proof Complexity and Unprovable Dense Families: Dense hard sequences related to Kolmogorov-random string assertions give rise to average-case hardness for propositional proof systems under natural, balanced distributions, reflecting a deep connection between noncomputability and computational lower bounds (Monroe, 2022, Monroe, 2023).
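
A schematic way to state such a gap (generic notation; the k versus k² scaling is the regime described above, and constants and logarithmic factors depend on the specific model):

    % n_stat: minimal sample size achievable by any estimator (information-theoretic);
    % n_comp: minimal sample size conjectured necessary for polynomial-time estimators.
    \[
      n_{\mathrm{stat}}(k,d) \;\asymp\; k\,\mathrm{polylog}(d)
      \qquad \text{vs.} \qquad
      n_{\mathrm{comp}}(k,d) \;\gtrsim\; k^{2},
    \]
    % so for  k\,\mathrm{polylog}(d) \ll n \ll k^2  the task is statistically solvable
    % but conjectured computationally intractable.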

5. Refinements, Limitations, and Counterexamples

Recent work demonstrates that commonly used proxies for average-case hardness may not suffice:

  • Low-Degree Analogues Insufficiency: Vanishing low-degree advantage (LDA) between planted and null distributions, even under symmetry and noise-tolerance assumptions, does not universally imply average-case hardness for all randomized algorithms. Counterexamples exploit list-decoding in codeword constructions and spectral properties of matrix ensembles (Buhai et al., 23 May 2025); the standard definition of the low-degree advantage is recalled after this list.
  • Non-adaptive Reduction Barriers for NP: Non-adaptive black-box reductions from worst-case NP problems to average-case distributional NP problems imply a collapse of PH (Holenstein et al., 2013). Adaptive or algebraic reductions, as used in lattice-based cryptography (SIS/LWE), are an exception, but only for problems in NP ∩ coNP.
  • Distributional Dependence and Rare-Case Hardness: For parity-counting and subgraph-counting problems, hardness amplification reductions often apply only to "almost all" error sets, and may fail for a small subclass of "bad" oracles, showing that average-case hardness bounds can rest on subtle distributional properties (Nareddy et al., 2024).
  • Robustness Gaps in Quantum Sampling: While exact average-case #P-hardness is established for random circuit sampling, closing the robustness gap between the additive-error tolerance achieved by current reductions and the conjectured thresholds remains open for both BosonSampling and quantum circuits (Bouland et al., 2024, Go et al., 2024).
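
For reference, the low-degree advantage invoked above is standardly defined through the projection of the likelihood ratio onto low-degree polynomials (generic notation from the low-degree-heuristic literature, not quoted from the cited paper):

    % Degree-D advantage between a planted distribution P_n and a null distribution Q_n:
    \[
      \mathrm{Adv}_{\le D}(P_n, Q_n)
      \;=\;
      \max_{\substack{f \neq 0 \\ \deg f \le D}}
      \frac{\mathbb{E}_{x \sim P_n}[f(x)]}{\sqrt{\mathbb{E}_{x \sim Q_n}[f(x)^2]}},
    \]
    % i.e., the L^2(Q_n)-norm of the degree-\le D projection of dP_n/dQ_n.
    % The heuristic treats Adv_{\le D} = O(1) for D = polylog(n) as evidence of
    % average-case hardness; the counterexamples above show this implication can fail.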

6. Applications, Open Problems, and Outlook

Average-case hardness conjectures underpin foundational advances in cryptography, learning theory, quantum computing, and statistical inference.

Current applications include:

  • Cryptographic constructions whose security rests on average-case assumptions, most prominently lattice-based schemes supported by worst-to-average-case reductions (SIS/LWE).
  • Evidence for quantum computational advantage in sampling tasks such as BosonSampling and Random Circuit Sampling.
  • Explanations of statistical-computational gaps in high-dimensional inference, e.g., sparse PCA and robust sparse mean estimation.
  • Fine-grained average-case lower bounds for counting and parity problems on explicit distributions.
  • Average-case lower bounds for propositional proof systems under balanced distributions.

Open problems and directions:

  • Proving or refuting conjectured average-case equivalence between planted clique and hypergraphic planted clique detection.
  • Tightening the additive-error hardness thresholds for average-case quantum sampling and extending them to the polynomial-precision regime.
  • Derandomizing group-theoretic reductions for parity-counting to remove reliance on random relabelings or rare-case instance probabilities (Nareddy et al., 2024).
  • Developing a theory of average-case fine-grained complexity beyond explicit structured distributions, possibly encompassing more natural or uniform random ensembles (Dalirrooyfard et al., 27 Mar 2025, Brakerski et al., 2020).
  • Designing more robust average-case proxies or sufficient conditions—potentially transcending low-degree heuristics—that accurately predict algorithmic hardness.

Average-case hardness conjectures, by mapping out the computational landscape of typical instances, remain at the heart of understanding and leveraging the limits of efficient computation across domains.
