Exponential-Time Hypothesis Overview

Updated 18 October 2025
  • Exponential-Time Hypothesis (ETH) is a central conjecture in fine-grained complexity, asserting that problems like 3-SAT require time exp(cn) for some constant c > 0, where n is the number of variables.
  • Reductions such as parsimonious reductions, gadget constructions, and the sparsification lemma leverage ETH to establish tight lower bounds across decision, counting, and parameterized problems.
  • ETH influences optimal algorithm design in exact, approximation, and parameterized settings and extends its implications to areas like graph problems, cryptography, and quantum complexity.

The Exponential-Time Hypothesis (ETH) is a central conjecture in fine-grained complexity theory, positing precise exponential lower bounds for the computational complexity of fundamental NP-complete and counting problems. Its assertion and implications underpin a significant fraction of current lower-bound research, explaining the empirical optimality of known algorithms for a wide array of combinatorial optimization and counting tasks. ETH and its variants (including the counting ETH and SETH) function as benchmarks for ruling out subexponential or even polynomial improvements on classic algorithms in theoretical computer science, parameterized complexity, counting complexity, and beyond.

1. Formal Statement and Standard Variants

ETH was introduced to capture the belief that the best algorithms for k-SAT require exponential time in the number of variables. Formally, ETH asserts that there exists a constant c > 0 such that no deterministic algorithm solves 3-SAT in time exp(cn), where n is the number of variables. In notation:

 ∃ c > 0: 3-SAT on n variables cannot be solved in exp(cn) time

A closely related (and strictly stronger) statement is the Strong Exponential-Time Hypothesis (SETH), which posits that for every ε < 1, there exists k such that k-CNF-SAT cannot be solved in O(2^{εn}) time (Cygan et al., 2011). The counting versions, denoted #ETH and #SETH, apply these bounds to the counting variants of SAT (e.g., #3-SAT).
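
The 2^n exhaustive-search baseline that these hypotheses address can be sketched as follows (a minimal illustration; the function name and clause encoding are our own, not from the cited papers):

```python
from itertools import product

def count_satisfying(n, clauses):
    """Count satisfying assignments of a CNF formula by exhaustive search.

    Literal +i (resp. -i) means variable i is set true (resp. false).
    Enumerating all 2^n assignments takes Theta(2^n * m) time for m
    clauses -- the baseline that ETH (decision) and #ETH (counting)
    assert cannot be improved to exp(o(n)) on 3-CNF inputs.
    """
    count = 0
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            count += 1
    return count
```

For example, the formula (x1 ∨ x2) ∧ (¬x1 ∨ x2) has exactly the two satisfying assignments in which x2 is true; the decision variant simply asks whether the count is nonzero.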

Table: ETH and Variants

| Acronym | Formal Statement | Applies To |
|---------|------------------|------------|
| ETH | ∃ c > 0: 3-SAT cannot be solved in exp(cn) time | Decision problems (e.g., 3-SAT) |
| #ETH | ∃ c > 0: #3-SAT cannot be solved in exp(cn) time | Counting problems (e.g., #3-SAT) |
| SETH | ∀ ε < 1, ∃ k: k-SAT cannot be solved in O(2^{εn}) time | Decision problems, all k ≥ 3 |

2. Reductions and Methodological Foundations

ETH is pivotal in the design of reductions that aim to rule out faster (subexponential) algorithms for various problems. Through carefully crafted, parameter-preserving reductions, lower bounds for SAT propagate to other domains. The central mechanisms include:

  • Parsimonious Reductions: Transform instances of SAT (or #SAT) into target problems (e.g., Hitting Set, Set Splitting, NAE-SAT) such that improvements for the target would imply unexpected improvements for SAT, refuting ETH or SETH (Cygan et al., 2011).
  • Block Interpolation: For counting problems, the block interpolation framework converts #P-hardness (classically via polynomial interpolation) into tight #ETH-based exponential lower bounds for polynomial and graph-related counting problems (Curticapean, 2015).
  • Sparsification Lemma: The sparsification lemma (appropriately extended to the counting setting (Dell et al., 2012)) allows reductions to "sparse" formulas or graphs while controlling blow-up, ensuring the ETH-based lower bounds remain tight in both the number of variables and clauses or edges.
  • Gadget Constructions: Reductions often employ gadgets, especially in problems involving graph parameters, so that the problem's hardness is retained under parameterized complexity considerations (Husfeldt et al., 2010).

3. ETH in Counting and Graph Problems

In the context of counting complexity, ETH and #ETH enable nearly tight lower bounds for canonical #P-hard problems. Noteworthy examples include:

  • Permanent and Tutte Polynomial: Under #ETH, computing the permanent of an n×n 0-1 matrix cannot be done in time exp(o(n)), matching Ryser's inclusion-exclusion-based upper bound (Dell et al., 2012, Curticapean, 2015). For the Tutte polynomial, conditional lower bounds extend to almost all evaluation points in the (x, y)-plane for simple or sparse graphs, i.e., the time required is at least exp(Ω(n/polylog(n))), except at certain "easy" points (Dell et al., 2012).
  • Graph Reliability: ETH-based reductions, notably exploiting "bounce graphs" and careful multivariate interpolation, show that computing the all-terminal reliability function REL(G, p) for a simple graph on m edges requires time at least exp(Ω(m/log² m)) (Husfeldt et al., 2010).
  • Constraint Satisfaction and Optimization: Problems parameterized by treewidth or treedepth (e.g., Feedback Vertex Set, Connected Dominating Set) are shown to have essentially optimal algorithmic dependence on these parameters under SETH/ETH. For example, single-exponential c^tw · n^O(1) algorithms for many connectivity-type problems are optimal up to the base c (Cygan et al., 2011, Hegerfeld et al., 2020).
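
Ryser's formula, which achieves the exp(O(n)) upper bound for the permanent matched by the #ETH lower bound above, can be sketched as follows (illustrative code, not taken from the cited papers):

```python
from itertools import combinations

def ryser_permanent(a):
    """Permanent of an n x n matrix via Ryser's inclusion-exclusion:

        perm(A) = (-1)^n * sum over column subsets S of
                  (-1)^|S| * prod_i (sum of row i over columns in S).

    Runs in O(2^n * n^2) time; under #ETH no exp(o(n))-time algorithm
    exists, so this exponent is essentially the best possible.
    """
    n = len(a)
    total = 0
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            prod = 1
            for row in a:
                prod *= sum(row[j] for j in subset)
            total += (-1) ** r * prod
    return (-1) ** n * total
```

For the n×n all-ones matrix this returns n!, the number of permutations, as expected from the definition of the permanent.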

4. Parameterized Complexity and ETH-based Lower Bounds

ETH provides precise boundaries for what is fixed-parameter tractable (FPT) or not in parameterized complexity:

  • 2-CSP and Embedding Lower Bounds: Reductions exploiting expander embeddings show that for 2-CSP with k constraints, no algorithm runs in f(k) · n^{o(k/log k)} time unless ETH fails (S. et al., 2023). This fine-grained lower bound is transferred to many other parameterized problems through gadget-based reductions.
  • Logical Frameworks and Problems over Bounded Width: Modal-logical frameworks and algebraic characterizations show that a broad class of optimization problems and CSPs (including VCSP, Max-Ones, etc.) are solvable in single-exponential time, with ETH implying this is optimal (Jonsson et al., 2014, Jonsson et al., 2017, Pilipczuk, 2011).

5. Algorithmic Implications and Tightness Results

ETH rigorously explains the empirical optimality of many well-known exponential-time algorithms:

  • Exact Algorithms: For problems ranging from SAT, Hitting Set, and Set Cover to various CSPs and graph problems (Hamiltonian Path, Steiner Tree, Subset Sum), the best known exponential-time algorithms cannot be significantly improved without falsifying ETH or SETH (Cygan et al., 2011). A key quantitative manifestation: writing σ_k for the infimum of all c such that k-CNF-SAT is solvable in O(2^{cn}) time, SETH asserts the "optimal growth rate" lim_{k→∞} σ_k = 1 (Cygan et al., 2011).
  • Approximation Algorithms: Even in the exponential-time approximation regime, ETH-based lower bounds guide which tradeoffs between solution quality (r-approximation) and running time (e.g., O*(2^{n/r})) are achievable. Recent results "shave" factors from the exponent but demonstrate that further progress would refute even stronger hypotheses such as Gap-ETH (Bansal et al., 2017).
  • Counting and Meta-complexity: In the meta-complexity field, ETH-based arguments yield superpolynomial lower bounds for naturally restricted computational models (e.g., minimization of branching program size (Glinskih et al., 5 Jul 2024)) and reinforce the hardness of compressing circuit and branching program representations.
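
As an example of a best-known exponential algorithm of the kind discussed above, Subset Sum admits a meet-in-the-middle approach running in O*(2^{n/2}) time (a sketch in the Horowitz–Sahni style; the code and names are illustrative, not from the cited papers):

```python
def subset_sum(nums, target):
    """Decide Subset Sum by meet-in-the-middle.

    Split the n items into two halves, enumerate all 2^{n/2} subset
    sums of each half, and check whether some left sum pairs with a
    right sum to hit the target.  Total time O*(2^{n/2}), versus
    Theta(2^n) for naive enumeration -- improving this exponent is a
    long-standing open problem tied to SETH-style hypotheses.
    """
    half = len(nums) // 2
    left, right = nums[:half], nums[half:]

    def all_sums(arr):
        sums = {0}
        for x in arr:
            sums |= {x + s for s in sums}
        return sums

    right_sums = all_sums(right)
    return any(target - s in right_sums for s in all_sums(left))
```

Here set membership plays the role of the sorted-array binary search in the classical formulation; the asymptotics are the same.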

6. Foundational and Cross-disciplinary Consequences

ETH's implications extend to space complexity and quantum computation, as well as cryptography:

  • Space Complexity Barriers: Under SETH, deterministic algorithms for NL-complete problems, such as graph connectivity, require at least O(log² n) space, matching Savitch's upper bound and separating L from NL (Czerwinski, 2023).
  • Lattice Problems and Cryptographic Hardness: Reductions from ETH-hard CSPs to closest vector, shortest vector, and bounded distance decoding problems on lattices establish that these fundamental problems in lattice-based cryptography are not solvable in time 2^{o(n)} for lattice dimension n, supporting cryptographic security assumptions (Aggarwal et al., 3 Apr 2025).
  • Quantum Complexity: Quantum analogues of ETH (QSETH) have been formulated, transferring classical ETH-based lower bounds to quantum settings and establishing, for instance, that edit distance remains hard for quantum algorithms, with time lower bounds matching or even exceeding the classical case under quantum SETH assumptions (Buhrman et al., 2019).
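
The O(log² n) space bound from Savitch's theorem referenced above arises from a midpoint recursion, sketched here (illustrative code; the edge-set encoding and function name are our own):

```python
def savitch_reach(nodes, edges, u, v, k):
    """Is there a path from u to v of length at most 2**k?

    Savitch's idea: any path of length <= 2**k splits at a midpoint w
    into two halves of length <= 2**(k-1).  With k = O(log n), the
    recursion depth is O(log n) and each stack frame stores O(log n)
    bits, giving O(log^2 n) space overall -- at the cost of
    exponential time, the tradeoff the SETH-based argument addresses.
    """
    if k == 0:
        return u == v or (u, v) in edges
    return any(savitch_reach(nodes, edges, u, w, k - 1)
               and savitch_reach(nodes, edges, w, v, k - 1)
               for w in nodes)
```

Calling it with k = ceil(log2 n) decides reachability in a directed graph on n vertices within the stated space bound.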

7. Limits, Extensions, and Open Directions

While ETH-based results are robust and widely applied, some lines of research address potential limitations or necessary refinements:

  • Limits of Subexponential Algorithms: Results concerning treewidth, treedepth, and domain size show that even as structural graph parameters change (e.g., treewidth → treedepth, domain size in CSP), ETH implies precise lower bounds that cannot be improved except possibly for domain-size increases (Jonsson et al., 2017, Hegerfeld et al., 2020).
  • Complex Weighted CSP: Sharp dichotomies under #ETH hold for complex weighted #CSPs, classifying tractable cases in terms of closure properties (affine or product type functions) and showing that outside these, subexponential-time algorithms are impossible even for bounded-degree cases (Liu, 2022).
  • Parameterization and Approximation Hypotheses: The Parameterized Inapproximability Hypothesis (PIH), previously only derived from stronger assumptions (Gap-ETH), is now obtainable from ETH via gap-producing reductions and structured PCPs, providing a more natural baseline for inapproximability in parameterized complexity (Guruswami et al., 2023).
