Lean 4 Framework for Poly-Time Reductions

Updated 24 January 2026
  • The Lean 4 framework is a mechanized formal system that rigorously verifies Karp reductions and complexity claims using dependent type theory.
  • It provides tactic-based reduction composition and explicit polynomial-time bookkeeping to certify membership in NP, coNP, and Σ₂^P.
  • The framework streamlines large-scale complexity proofs through modular abstractions and precise management of polynomial bounds.

A Lean 4 framework for polynomial-time reductions provides a mechanized setting for formalizing reductions and complexity-theoretic proofs in a dependent type theory based on Lean 4. It enables rigorous verification of computational-complexity statements and automates many aspects of both reduction construction and complexity-class membership certification. The framework centralizes the notion of a Karp (many-one) reduction, incorporates explicit polynomial-time witnesses, establishes compositionality, supplies instance-derivation mechanisms for standard classes (NP, coNP, Σ₂^P), and encodes combinators and tactics for scalable proofs.

1. Core Structure: PolyReduction

The foundational abstraction is the Lean structure PolyReduction, which formalizes a Karp (many-one) reduction between countable types α and β. A PolyReduction instance consists of:

  • `reduce : α → β`, a function mapping instances of the source problem to instances of the target,
  • `time_bound`, a bundled witness of polynomial-time computability:

    PolyReduction (α, β) : { reduce : α → β, ∃ p : ℕ → ℕ, polynomial p ∧ ∀ x, runtime (reduce x) ≤ p (size x) }

Here, `runtime` statically reflects the computational cost of evaluating `reduce x` (as a natural number), and `size` is the canonical measure derived from the `Countable` encoding of α. The predicate `polynomial p` ensures that p is bounded by some polynomial, satisfying complexity-theoretic requirements (Simas, 22 Jan 2026).

This design strictly mandates that every reduction not only be functionally correct but also be accompanied by explicit, verifiable polynomial-time bounds.
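The bundle just described can be sketched as a Lean structure. This is a hedged illustration, assuming `runtime`, `size`, and a `polynomial` predicate as primitives; the paper's actual definitions may differ:

```lean
-- Sketch only: a Karp reduction packaged with its polynomial-time witness.
structure PolyReduction (α β : Type) [Countable α] [Countable β] where
  reduce : α → β
  time_bound : ∃ p : ℕ → ℕ,
    polynomial p ∧ ∀ x, runtime (reduce x) ≤ p (size x)

-- Notation matching the snippets below (assumed).
infixr:25 " ↠ₚ " => PolyReduction
```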
2. Reduction Composition and Tactic Automation

Polynomial-time reductions compose: if f : α ↠ₚ β and g : β ↠ₚ γ are polynomial-time reductions, then so is their composite α ↠ₚ γ. The core combination theorem is:
    theorem comp
      {α β γ : Type} [Countable α] [Countable β] [Countable γ]
      (f : α ↠ₚ β) (g : β ↠ₚ γ) : α ↠ₚ γ
with the composed runtime bound p(n) = p₂(p₁(n)), where p₁ and p₂ are the polynomials certifying the runtimes of f and g, respectively. The correctness of compositionality is established both at the level of the functions and in the arithmetic of the polynomial witnesses. Tactic support in `AlgorithmComplexity.lean` includes `by poly_reduce`, which auto-applies reductions and discharges polynomial-bound subgoals, and leverages auxiliary lemmas such as `polynomial.add`, `polynomial.comp`, and monotonicity constructs for inequalities.

These features provide a scalable and reusable infrastructure for assembling complex reductions from modular components.
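As a hedged usage sketch, assembling a chain of reductions might look as follows; the problem names `SAT`, `SETCOVER`, and `SUFFCHECK` are placeholders, not necessarily the paper's identifiers:

```lean
-- Composing two reductions; `comp` certifies the bound p₂ ∘ p₁ internally.
example (f : SAT ↠ₚ SETCOVER) (g : SETCOVER ↠ₚ SUFFCHECK) :
    SAT ↠ₚ SUFFCHECK :=
  comp f g

-- Or, using the automation described above:
example (f : SAT ↠ₚ SETCOVER) (g : SETCOVER ↠ₚ SUFFCHECK) :
    SAT ↠ₚ SUFFCHECK := by
  poly_reduce
```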
3. Certifying Complexity Class Membership

Membership in NP, coNP, and Σ₂^P is encoded as Lean classes (`InNP`, `InCoNP`, `InSigma2`), each consisting of:

  • a witness type (or types, for Σ₂^P),
  • a verification/refutation predicate (of the form `witnessType × α → Bool` or similar),
  • explicit polynomial-time bounds for the verifier,
  • and a logical specification relating solutions to witnesses.

For example, `InCoNP` requires

    ∀ x, ¬P(x) ↔ ∃ w, refute(w, x) = true

with all computations in polynomial time.

A distinctive aspect is that, once a reduction from an established complete problem (e.g., TAUTOLOGY for coNP, SET-COVER for NP) to Q is encoded as a `PolyReduction`, membership in the relevant class is inherited automatically, greatly reducing the burden of boilerplate proofs:
    theorem Q_inCoNP [h : TAUTOLOGY ↠ₚ Q] : InCoNP Q :=
      CoNP_of_reduction TAUTOLOGY_inCoNP h
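The class encoding described in this section might take the following shape; the field names here are assumptions for illustration, not the paper's exact interface:

```lean
-- Sketch of a coNP-membership certificate for a predicate Q on α.
class InCoNP {α : Type} [Countable α] (Q : α → Prop) where
  Witness : Type                  -- type of refutation witnesses
  refute  : Witness → α → Bool    -- polynomial-time refuter
  refute_poly : ∃ p : ℕ → ℕ,
    polynomial p ∧ ∀ w x, runtime (refute w x) ≤ p (size x)
  spec : ∀ x, ¬ Q x ↔ ∃ w, refute w x = true
```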
Similar mechanisms exist for NP and Σ₂^P, accelerating the classification of new decision problems.

4. Explicit Polynomial-Time Bookkeeping

The framework requires precise management of polynomial-time witnesses for every reduction or verifier. The idiom is to construct an explicit p(n), prove `polynomial p`, and establish the inequality ∀ x, runtime(·) ≤ p(size x). This granular control extends to fine details such as composition, additive constants, and the use of monotonicity lemmas for propagating the arithmetic of sizes and running times:
    let p := fun n => p₂ (p₁ n) + 3 * n + 5
    -- then refine ⟨polynomial.add (polynomial.comp hp₂ hp₁) (polynomial.add ...), ...⟩
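Discharging the `polynomial` side goal for such a bound might proceed as in this hedged sketch; the lemma names `polynomial.mul_const`, `polynomial.id`, and `polynomial.const` are assumed for illustration, as only `polynomial.add` and `polynomial.comp` are named in the text:

```lean
-- Closure of the polynomial predicate under composition and addition.
example (hp₁ : polynomial p₁) (hp₂ : polynomial p₂) :
    polynomial (fun n => p₂ (p₁ n) + 3 * n + 5) :=
  polynomial.add (polynomial.comp hp₂ hp₁)
    (polynomial.add (polynomial.mul_const 3 polynomial.id)
      (polynomial.const 5))
```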
Intermediate reasoning unfolds in Lean via `calc` blocks and numeric coercions to maintain correctness and transparency throughout polynomial-time bound construction.

This explicit bookkeeping reinforces the foundational correctness guarantees of the formalization and ensures verifiability of all complexity claims.

5. Example Reduction: Set-Cover to Sufficiency-Check

To illustrate, the reduction from SET-COVER to SUFFICIENCY-CHECK is formalized as a `PolyReduction`:
    def redSetCoverToSuffCheck : SetCoverInst ↠ₚ SuffCheckInst where
      reduce := fun ⟨coverFam, …⟩ => {…}
      time_bound := by
        let p := fun n => n^3 + 10*n^2 + 5
        use p
        ...
Here, the concretely described `reduce` function repackages SET-COVER instances as SUFFICIENCY-CHECK problems via appropriate encodings of actions, coordinates, and query sets. The explicit polynomial p(n) = n³ + 10n² + 5 bounds the runtime of all steps, ensuring the reduction is correct within the formal framework.

Once such reductions are established, the framework automatically derives that SUFFICIENCY-CHECK is coNP-complete (TAUTOLOGY → SET-COVER → SUFFICIENCY-CHECK), with the compositional infrastructure handling all requisite instance constructions (Simas, 22 Jan 2026). The result is a formally verified complexity landscape for the problem of identifying decision-relevant information.
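Under the naming assumptions used so far, that derivation chain might be assembled as follows; the hypothesis name `tautToSetCover` is illustrative:

```lean
-- coNP-membership of SUFFICIENCY-CHECK via the composed reduction chain.
example (tautToSetCover : TAUTOLOGY ↠ₚ SetCoverInst) :
    InCoNP SuffCheckInst :=
  CoNP_of_reduction TAUTOLOGY_inCoNP
    (comp tautToSetCover redSetCoverToSuffCheck)
```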
6. Workflow and Formalization Scale

The full workflow comprises:

  1. Defining `structure PolyReduction` with explicit reduction and runtime-bound fields.
  2. Proving combinators such as `comp`, `id`, and others for building new reductions.
  3. Building tactic support to streamline reduction assembly and polynomial-bound verification.
  4. Defining class-level encodings for NP, coNP, and Σ₂^P along with instance constructors.
  5. For a new problem Q, either producing an explicit reduction from a known complete problem or directly building a verifier or Σ₂ witness.
  6. Letting Lean’s type-class and instance inference propagate memberships automatically.
The mechanized artifact spans approximately 5,600 lines of Lean 4 code across 36 files, supporting 230+ theorems, with all polynomial bounds and complexity claims fully explicit and machine-checked.
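The final inference step can be pictured as in this hedged sketch: once a composed reduction is registered as an instance, membership is recovered by `inferInstance`. The instance name `compositeReduction` is an assumption for illustration:

```lean
-- Registering the reduction lets class inference derive membership.
instance : TAUTOLOGY ↠ₚ SuffCheckInst := compositeReduction

example : InCoNP SuffCheckInst := inferInstance
```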

7. Significance and Scope of the Framework

The Lean 4 framework enables exhaustive and trustworthy formalization of classical and novel complexity-theoretic reductions, systematically capturing coNP-completeness, hardness of approximation, exponential lower bounds (ETH), and parameterized complexity (W[2]-hardness) for problems such as SUFFICIENCY-CHECK. The dichotomy between explicit and succinct encodings is directly formalized, and reduction correctness together with time bounds are mechanically verified. This strongly facilitates research in complexity theory by removing ambiguity and human error from reduction-based arguments and by enabling large-scale, reusable mechanized libraries for complexity-theoretic reasoning (Simas, 22 Jan 2026).
