Average-Case Hardness Conjectures
- Average-case hardness conjectures are formal claims that some problems remain intractable for typical instances, thereby offering a rigorous foundation for cryptographic security and quantum sampling tasks.
- Methodologies such as random self-reducibility and worst-to-average-case reductions are central, enabling the translation of worst-case complexity into average-case settings across domains like high-dimensional statistics and tensor networks.
- The survey identifies key challenges and open problems, including tightening additive-error thresholds in quantum sampling and developing robust average-case proxies beyond low-degree heuristics.
Average-case hardness conjectures assert the existence of computational problems that remain intractable for efficient algorithms on typical, rather than worst-case, instances. Such conjectures are central to cryptography, complexity theory, high-dimensional statistics, quantum computing, and the study of fine-grained computational barriers. They provide a theoretical foundation for cryptographically secure constructions, explain observed statistical-computational gaps in inference, and underlie the evidence for quantum advantage in sampling tasks. This article surveys the major forms, methodologies, and implications of average-case hardness conjectures across domains, emphasizing precise formal statements, evidence, and outstanding open questions.
1. Definitions and Canonical Problem Templates
A problem exhibits average-case hardness if, for a natural input distribution or random ensemble, there is no polynomial-time algorithm that solves it with high probability. Precise statements vary by domain and distributional setting:
- Planted Clique and Hypergraphic Planted Clique (HPC): In the classical planted clique problem, one distinguishes the Erdős–Rényi graph $G(n, 1/2)$ from $G(n, 1/2, k)$, in which $k$ vertices are chosen uniformly at random and connected as a clique. The hypergraphic extension replaces the adjacency matrix with an $s$-uniform adjacency tensor, and the goal is to detect a planted clique of size $k$ in an $s$-uniform random hypergraph. The average-case hardness conjecture posits that no polynomial-time algorithm can distinguish the two hypotheses when $k = O(n^{1/2-\epsilon})$ for any fixed $\epsilon > 0$ (Luo et al., 2020). A minimal numerical illustration of the detection problem appears after this list.
- Fine-grained Parity Problems and $k$-SUM: For counting or parity versions of $k$-SUM, $k$-XOR, and $k$-OV, random inputs are sampled from structured distributions (e.g., integers chosen i.i.d. from an interval, or uniformly random vectors over $\mathbb{F}_2$), and one must compute the parity of the number of $k$-tuples satisfying the target relation (e.g., summing to zero). Under natural distributional assumptions, these problems are conjectured to require essentially their worst-case fine-grained running time even on average (Dalirrooyfard et al., 27 Mar 2025).
- BosonSampling and Quantum Sampling Problems: For BosonSampling, the average-case conjectures concern the hardness of computing output probabilities, or of sampling from the output distribution, of linear-optical interferometers whose unitary is drawn Haar-randomly or from low-depth architectures. The "Gaussian Permanent Estimation" (GPE±) conjecture formalizes this for output probabilities proportional to $|\mathrm{Per}(X)|^2$, where $X$ is an $n \times n$ matrix with i.i.d. complex Gaussian entries (Bouland et al., 2024, Go et al., 2024).
- PEPS and Tensor Network Contraction: Contracting a random 2D Projected Entangled Pair State (PEPS) tensor network is shown to be as hard on average as in the worst case, i.e., #P-complete for almost all random choices of local tensors (Haferkamp et al., 2018).
- Average-case Hardness in Proof Complexity: For coNP-complete languages such as TAUT (the set of tautologies), the existence of "dense hard sequences"—families of inputs requiring superpolynomial time with positive upper density—implies average-case hardness under balanced distributions related by polynomial-time isomorphisms (Monroe, 2022, Monroe, 2023).
2. Key Conjectures, Formal Barriers, and Evidence
A representative set of average-case hardness conjectures includes:
| Domain | Conjecture/Barrier | Formal Regime / Model |
|---|---|---|
| Planted Clique | Detection at $k = o(\sqrt{n})$ is conjectured impossible in poly-time | $G(n, 1/2)$ vs $G(n, 1/2, k)$ |
| HPC Equivalence | PC $\Leftrightarrow$ HPC($s$) (conjectured average-case equivalence) | Random $s$-uniform hypergraphs (Luo et al., 2020) |
| $k$-SUM | Nontrivially fast average-case algorithms would yield faster worst-case SIVP algorithms | Reduction from worst-case lattice problems (Brakerski et al., 2020) |
| BosonSampling | Approximating output probabilities to exponentially small additive error is #P-hard | GPE±, Haar-random unitary |
| Parity Problems | Fine-grained average-case hardness matching worst-case conjectures | Parity-$k$-OV/XOR/SUM, explicit distributions (Dalirrooyfard et al., 27 Mar 2025) |
| Proof Complexity | Dense hard sequences $\Rightarrow$ TAUT $\notin$ AvgP | Balanced distributions, p-isomorphisms (Monroe, 2022, Monroe, 2023) |
Substantial evidence supports these conjectures within specific restricted computational models (e.g., Sum-of-Squares, Statistical Query, low-degree polynomials) or via average-case reductions from conjecturally hard problems. For example, fine-grained average-case intractability of parity-$k$-OV under the Strong Exponential Time Hypothesis is established for structured distributions by randomized self-reduction frameworks (Dalirrooyfard et al., 27 Mar 2025).
Convexity and polynomial-interpolation barriers arise in quantum sampling: worst-to-average-case reductions based on polynomial or rational interpolation are limited by degree-induced robustness blow-ups, as illustrated below. Recent work overcomes these for BosonSampling and Random Circuit Sampling via coefficient extraction and dilution techniques, removing prior noise-invariance limitations and matching average- and worst-case hardness up to exponentially small additive errors (Bouland et al., 2024).
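The degree-induced blow-up can be seen in a toy numerical experiment (an assumed illustration, not the actual reductions of the cited works): extrapolating a degree-$d$ interpolant from slightly noisy evaluations back to the worst-case point amplifies the per-query error by a factor that grows rapidly with $d$.

```python
# Toy illustration of the robustness blow-up in interpolation-based reductions:
# extrapolating a degree-d interpolant from noisy evaluations on [1, 2] back to
# t = 0 amplifies each query's error by a factor growing rapidly with d.
import random

def extrapolate_to_zero(ts, vals):
    """Lagrange extrapolation to t = 0 of the interpolant through (ts, vals)."""
    out = 0.0
    for i, ti in enumerate(ts):
        w = 1.0
        for j, tj in enumerate(ts):
            if i != j:
                w *= (0.0 - tj) / (ti - tj)   # Lagrange basis weight at t = 0
        out += vals[i] * w
    return out

d = 10                                        # degree of the interpolant
ts = [1 + i / d for i in range(d + 1)]        # query points, all in [1, 2]
noise = 1e-6                                  # per-query additive error
vals = [1.0 + random.uniform(-noise, noise) for _ in ts]   # true polynomial: f(t) = 1
err = abs(extrapolate_to_zero(ts, vals) - 1.0)
print(f"per-query error {noise:.0e} -> error at t = 0 roughly {err:.1e}")
```

The error at $t = 0$ typically exceeds the per-query noise by many orders of magnitude; this is the kind of loss that the coefficient-extraction and dilution techniques cited above are designed to circumvent.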
3. Methodologies: Worst-to-Average-Case Reductions and Amplification
Random self-reducibility and related techniques are central. The canonical pipeline:
- Random Self-Reducibility: For problems like the permanent, PEPS contraction, or output probabilities of certain quantum circuits, the value on any fixed instance can be written as a low-degree polynomial in a mixing parameter interpolating between a worst-case and a random input. An algorithm that succeeds on a constant fraction of random instances then yields success on every instance through polynomial interpolation or Berlekamp–Welch decoding (Haferkamp et al., 2018, Movassagh, 2018, Bouland et al., 2024); a minimal sketch appears after this list.
- Reduction Frameworks: In high-dimensional statistics, average-case reductions propagate computational lower bounds between detection and learning problems (e.g., planted clique → sparse PCA → robust sparse mean estimation), often utilizing Gaussianization, rejection kernels, and block-wise rotations to transfer planted structure into inference tasks while approximately preserving the null and planted distributions in total variation (Brennan et al., 2019, Brennan et al., 2019).
- Group-Theoretic Amplification: For the parity of the number of $k$-cliques and similar counting problems, group actions and orbit-stabilizer analysis amplify weak average-case correctness to worst-case solvability for nearly all oracles, achieving fine-grained hardness even against solvers that are correct on only a small fraction of random instances (Nareddy et al., 2024).
- Average-Case Reductions in Fine-Grained Complexity: Randomized self-reductions for parity-$k$-OV/XOR/SUM, exploiting partiteness and low-degree polynomial structure, link hardness on explicit distributions to worst-case conjectures such as rETH and the $k$-SUM and $k$-Clique hypotheses (Dalirrooyfard et al., 27 Mar 2025).
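The following sketch makes the random-self-reducibility step concrete for the permanent over a prime field (a minimal illustration with hypothetical parameters; the exact Ryser-formula routine stands in for an average-case oracle, and real reductions additionally use Berlekamp–Welch decoding and repetition to tolerate oracle errors).

```python
# Minimal sketch of Lipton-style random self-reducibility for the permanent
# over a prime field F_p. An exact routine stands in for the average-case oracle.
import random

def permanent_mod_p(M, p):
    """Exact permanent mod p via Ryser's formula (stand-in for an oracle)."""
    n = len(M)
    total = 0
    for S in range(1, 1 << n):
        sign = (-1) ** (n - bin(S).count("1"))
        prod = 1
        for i in range(n):
            row_sum = sum(M[i][j] for j in range(n) if (S >> j) & 1) % p
            prod = prod * row_sum % p
        total = (total + sign * prod) % p
    return total % p

def permanent_via_random_line(A, p, oracle):
    """Recover per(A) from oracle values on the random line A + t*B.

    per(A + t*B) is a polynomial of degree <= n in t, so n+1 evaluations at
    distinct nonzero points determine it; extrapolating to t = 0 gives per(A).
    Each query A + t*B with t != 0 is a uniformly random matrix, so an oracle
    correct on most random matrices suffices (with repetition / decoding)."""
    n = len(A)
    B = [[random.randrange(p) for _ in range(n)] for _ in range(n)]
    ts = random.sample(range(1, p), n + 1)       # distinct nonzero evaluation points
    vals = []
    for t in ts:
        M = [[(A[i][j] + t * B[i][j]) % p for j in range(n)] for i in range(n)]
        vals.append(oracle(M, p))
    # Lagrange extrapolation of the degree-n polynomial to t = 0.
    result = 0
    for i, ti in enumerate(ts):
        num, den = 1, 1
        for j, tj in enumerate(ts):
            if i != j:
                num = num * (-tj) % p
                den = den * (ti - tj) % p
        result = (result + vals[i] * num * pow(den, -1, p)) % p
    return result

if __name__ == "__main__":
    p, n = 101, 4
    A = [[random.randrange(p) for _ in range(n)] for _ in range(n)]
    assert permanent_via_random_line(A, p, permanent_mod_p) == permanent_mod_p(A, p)
    print("worst-case permanent recovered from random-line queries")
```

Because each query is a uniformly random matrix, an oracle that is only correct on most random instances already suffices (after error correction), which is the essence of worst-to-average-case equivalence for the permanent.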
Open barriers in these methodologies often concern the existence of non-adaptive worst-to-average-case reductions for NP-complete problems, which are ruled out unless the polynomial hierarchy collapses (Bogdanov–Trevisan; Holenstein et al., 2013). Adaptive or non-black-box reductions, while plausible, remain largely unexplored outside lattice-based cryptography and certain quantum problems.
4. Statistical-Computational Phase Transitions and Universality
Average-case hardness conjectures are intimately tied to statistical-computational phase transitions, where inference shifts from statistically tractable to computationally intractable as parameters vary. Notable phenomena:
- $k$-to-$k^2$ Gaps: In robust sparse mean estimation and a universality class of sparse mixture models, reductions from planted clique establish average-case hardness in the regime where the sample complexity of polynomial-time algorithms must jump from $\tilde{O}(k)$ to $\tilde{\Omega}(k^2)$ as a function of the sparsity $k$ (Brennan et al., 2019, Brennan et al., 2019).
- Tensor Problems and Planted Structure: Average-case hardness for detecting rank-1 spikes or planted communities in random tensors often reduces to hypergraphic planted clique detection; equivalence of the latter with classical planted clique conjecture would unify several tensor inference hardness results (Luo et al., 2020).
- Quantum Glassiness: In random $p$-local Pauli Hamiltonians, replica symmetry breaking at small locality $p$ induces glassy phases in which clusters of Gibbs states are macroscopically separated, directly yielding average-case lower bounds for constant-time stable quantum algorithms (Zlokapa et al., 9 Oct 2025).
- Proof Complexity and Unprovable Dense Families: Dense hard sequences related to Kolmogorov-random string assertions give rise to average-case hardness for propositional proof systems under natural, balanced distributions, reflecting a deep connection between noncomputability and computational lower bounds (Monroe, 2022, Monroe, 2023).
5. Refinements, Limitations, and Counterexamples
Recent work demonstrates that commonly used proxies for average-case hardness may not suffice:
- Insufficiency of the Low-Degree Advantage: A vanishing low-degree advantage (LDA) between planted and null distributions, even under symmetry and noise-tolerance assumptions, does not universally imply average-case hardness for all randomized algorithms. Counterexamples exploit list-decoding in codeword constructions and spectral properties of matrix ensembles (Buhai et al., 23 May 2025); the standard definition of the low-degree advantage is recalled after this list.
- Non-adaptive Reduction Barriers for NP: Non-adaptive black-box reductions from worst-case NP problems to distributional (average-case) NP problems would imply a collapse of the polynomial hierarchy (Holenstein et al., 2013). Adaptive or algebraic reductions, as used in lattice-based cryptography (SIS/LWE), are an exception, but only for problems in NP ∩ coNP.
- Distributional Dependence and Rare-Case Hardness: For parity-counting and subgraph-counting problems, hardness-amplification reductions often apply only to "almost all" error sets and may fail for a small subclass of "bad" oracles, showing that average-case hardness bounds can rest on subtle distributional properties (Nareddy et al., 2024).
- Robustness Gaps in Quantum Sampling: While exact average-case #P-hardness is established for random circuit sampling, closing the robustness gap between the currently provable additive-error tolerance and the conjectured thresholds remains open for both BosonSampling and quantum circuits (Bouland et al., 2024, Go et al., 2024).
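For reference, the quantity underlying the first item above is the standard low-degree advantage (notation is generic, following the low-degree likelihood-ratio literature rather than any single cited paper):

$$\mathrm{Adv}_{\le D}(\mathbb{P}_n, \mathbb{Q}_n) \;=\; \max_{f:\ \deg f \le D} \frac{\mathbb{E}_{x \sim \mathbb{P}_n}[f(x)]}{\sqrt{\mathbb{E}_{x \sim \mathbb{Q}_n}[f(x)^2]}} \;=\; \bigl\lVert L_n^{\le D} \bigr\rVert_{L^2(\mathbb{Q}_n)}, \qquad L_n = \frac{d\mathbb{P}_n}{d\mathbb{Q}_n},$$

where $L_n^{\le D}$ denotes the projection of the likelihood ratio onto polynomials of degree at most $D$. The low-degree heuristic predicts average-case hardness when this quantity remains bounded as $n \to \infty$ for $D = \omega(\log n)$; the counterexamples above show that this prediction, while broadly reliable, is not a formal guarantee.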
6. Applications, Open Problems, and Outlook
Average-case hardness conjectures underpin foundational advances in cryptography, learning theory, quantum computing, and statistical inference.
Current applications include:
- Provable security for cryptographic primitives such as lattice-based one-way functions and hash families (Ushakov, 2024); a minimal SIS-style sketch follows this list.
- Complexity-theoretic evidence for quantum supremacy and quantum advantage in BosonSampling, Random Circuit Sampling, and IQP circuits (Bouland et al., 2024, Bremner et al., 2015, Movassagh, 2018, Go et al., 2024).
- Explanation of statistical-computational trade-offs in unsupervised learning and high-dimensional robust estimation (Brennan et al., 2019, Brennan et al., 2019).
- Structural limitations in proof systems and theorem proving (Monroe, 2022, Monroe, 2023).
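As a concrete instance of the first application, the sketch below gives a toy version of Ajtai-style SIS hashing (illustrative, insecure parameters chosen for readability; this is the generic construction, not code from the cited work): any collision of the compressing map $h_A(x) = Ax \bmod q$ on binary inputs yields a short nonzero solution to $Az \equiv 0 \pmod q$, an instance of the Short Integer Solution problem whose average-case hardness follows from worst-case lattice assumptions.

```python
# Toy sketch of an Ajtai/SIS-style compressing hash (illustrative, NOT secure):
# h_A(x) = A x mod q on x in {0,1}^m with m > n*log2(q), so collisions exist,
# and any collision x != x' gives a short nonzero z = x - x' with A z = 0 mod q.
import random

q, n, m = 13, 8, 64                      # toy parameters; real schemes are far larger

A = [[random.randrange(q) for _ in range(m)] for _ in range(n)]   # public random matrix

def sis_hash(A, x, q):
    """Compressing map from {0,1}^m to Z_q^n; collision-resistance reduces to SIS."""
    return tuple(sum(a * xi for a, xi in zip(row, x)) % q for row in A)

if __name__ == "__main__":
    x = [random.randrange(2) for _ in range(m)]
    print("digest:", sis_hash(A, x, q))
```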
Open problems and directions:
- Proving or refuting conjectured average-case equivalence between planted clique and hypergraphic planted clique detection.
- Tightening the additive-error hardness thresholds for average-case quantum sampling and extending them to polynomial-precision.
- Derandomizing group-theoretic reductions for parity-counting to remove reliance on random relabelings or rare-case instance probabilities (Nareddy et al., 2024).
- Developing a theory of average-case fine-grained complexity beyond explicit structured distributions, possibly encompassing more natural or uniform random ensembles (Dalirrooyfard et al., 27 Mar 2025, Brakerski et al., 2020).
- Designing more robust average-case proxies or sufficient conditions—potentially transcending low-degree heuristics—that accurately predict algorithmic hardness.
Average-case hardness conjectures, by mapping out the computational landscape of typical instances, remain at the heart of understanding and leveraging the limits of efficient computation across domains.