LNDEQ-TRS: Equality Term Rewriting
- LNDEQ-TRS is a framework integrating labelled natural deduction and linear term rewriting to manage equality across Boolean logic and type theory.
- It employs explicit computational paths and rigorous rewrite rules to ensure termination, confluence, and polynomially bound derivation lengths.
- Interconnections with Boolean function theory and graph theory validate its expressiveness while exposing inherent computational complexity.
The LNDEQ-TRS ("Labelled Natural Deduction Equality Term Rewriting System") framework encompasses a family of term-rewriting systems designed for expressing, managing, and reasoning about equality and inference rules within formal logic and type theory. These systems leverage linear rewriting rules and explicit computational paths, enabling precise formulation and manipulation of equalities and inference derivations across Boolean logic, labelled deduction, and homotopy type theory. LNDEQ-TRS serves as a nexus for techniques from term rewriting, Boolean function theory, graph theory, and type-theoretic labelled deduction, yielding powerful results on expressiveness, complexity, and proof-theoretic properties.
1. Linear Rewriting Rules in Boolean Logic
LNDEQ-TRS for Boolean logic is constructed over the signature $\{\wedge, \vee\}$ and a countable set of propositional variables equipped with a fixed involution $x \mapsto \bar{x}$ to handle negation in negation normal form (NNF) (Das et al., 2016). A rewrite rule is an expression $l \to r$ where $l$ and $r$ are terms with $\mathrm{var}(r) \subseteq \mathrm{var}(l)$, instantiated in context as $C[l\sigma] \to C[r\sigma]$ for arbitrary one-hole contexts $C$ and substitutions $\sigma$.
Linearity is defined as:
- Left-linearity: every variable occurs at most once in $l$
- Right-linearity: every variable occurs at most once in $r$
- Linear rules: both of the above
A linear term rewriting system (TRS) is a set of such linear rules, with the critical property that the one-step reduction relation is decidable in time polynomial in the term size.
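The linearity conditions above are purely syntactic and easy to check mechanically. The following sketch (our own illustration, with terms encoded as nested tuples such as `('and', 'x', ('or', 'y', 'z'))`) makes them concrete:

```python
# Illustrative sketch, not code from the cited papers: terms are nested
# tuples over {'and', 'or'}; variables are strings.

def variables(t):
    """Return the list of variable occurrences in a term (with repeats)."""
    if isinstance(t, str):
        return [t]
    occs = []
    for sub in t[1:]:
        occs += variables(sub)
    return occs

def is_linear_in(t):
    """True iff every variable occurs at most once in t."""
    occs = variables(t)
    return len(occs) == len(set(occs))

def is_linear_rule(lhs, rhs):
    """A rule l -> r is linear iff it is both left- and right-linear."""
    return is_linear_in(lhs) and is_linear_in(rhs)

# The "switch" inference x /\ (y \/ z) -> (x /\ y) \/ z is a linear rule:
switch_l = ('and', 'x', ('or', 'y', 'z'))
switch_r = ('or', ('and', 'x', 'y'), 'z')
print(is_linear_rule(switch_l, switch_r))  # True
```

Because matching a linear left-hand side never requires comparing subterms for equality, one-step reduction stays polynomial in the term size.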
Boolean-logic LNDEQ-TRS considers systems whose rules are all linear and satisfy the semantic soundness condition: for the monotone Boolean functions $f$ and $g$ computed by $l$ and $r$, $f \le g$ pointwise (i.e., $f(\vec{x}) \le g(\vec{x})$ for every assignment $\vec{x} \in \{0,1\}^n$). The set of all sound linear inferences is then coNP-complete.
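The pointwise soundness condition can be checked directly by enumerating all assignments; the exponential cost of this brute-force check is consistent with the coNP-hardness of the general problem. A minimal sketch (our own illustration), using the same tuple encoding of NNF terms:

```python
from itertools import product

def evaluate(t, env):
    """Evaluate an NNF term over {and, or} under a 0/1 assignment."""
    if isinstance(t, str):
        return env[t]
    op, a, b = t
    va, vb = evaluate(a, env), evaluate(b, env)
    return va & vb if op == 'and' else va | vb

def is_sound(lhs, rhs, vs):
    """Check f <= g pointwise over all 2^n assignments (brute force)."""
    for bits in product((0, 1), repeat=len(vs)):
        env = dict(zip(vs, bits))
        if evaluate(lhs, env) > evaluate(rhs, env):
            return False
    return True

# The switch rule is a sound linear inference; its converse is not:
l = ('and', 'x', ('or', 'y', 'z'))
r = ('or', ('and', 'x', 'y'), 'z')
print(is_sound(l, r, ['x', 'y', 'z']))  # True
print(is_sound(r, l, ['x', 'y', 'z']))  # False (e.g. x=0, z=1)
```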
2. Structure and Syntax of Computational Paths
In intensional Martin-Löf type theory, for terms $a, b : A$, the identity type $\mathrm{Id}_A(a, b)$ is refined in LNDEQ-TRS by explicit "computational paths" witnessing the equality (Veras et al., 2020). A computational path $s$ from $a$ to $b$, denoted $a =_s b$, consists of a finite, reversible sequence of definitional-equality steps (rewrites and variable renamings), starting at $a$ and ending at $b$. The constructors for paths include:
- Reflexivity ($\rho$)
- Symmetry ($\sigma$)
- Transitivity ($\tau$)
- $\beta$, $\eta$: primitive reduction steps for $\Pi$-types
- $\mu$, $\nu$, $\xi$: congruence operations
Judgments in labelled natural deduction take the form $a =_s b : A$, supplementing type- and term-formation with explicit evidence of equality. Term formation for identity types and their introduction ($\mathrm{Id}$-I) and elimination ($\mathrm{Id}$-E) rules carry path labels, supporting elimination/redex reduction via $\beta$ and $\eta$ conversions.
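As an illustration of the labelled judgment format (notation sketched in the style of the computational-paths presentation, not reproduced verbatim from the paper), the introduction rule for the identity type can be written as:

```latex
% Id-introduction with an explicit path label s (sketch):
% from evidence s that a equals b in A, form a canonical
% inhabitant s(a,b) of the identity type.
\frac{a =_s b : A}{s(a, b) : \mathrm{Id}_A(a, b)}\;(\mathrm{Id}\text{-}I)
```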
3. Rewrite Systems: Rules and Hierarchies
LNDEQ-TRS employs a rich set of rewrite rules for computational paths (Veras et al., 2020):
- 39 original LND-TRS rules resolve path redundancies and administer the interactions between symmetry, transitivity, and congruence operations on computational paths.
- 7 additional TRS rules address redundancies at the rw-equality level (sequences of lower-level rewrites).
- A "cd" independence-of-choice rule identifies different reduction sequences (left- or right-oriented) producing the same outcome.
Table: Example Path Construct Rewrite Rules in LNDEQ-TRS (representative instances)

| Rule | Pattern | Result |
|---|---|---|
| Symmetry | $\sigma(\sigma(s))$ | $s$ |
| Transitivity | $\tau(s, \sigma(s))$ | $\rho$ |
| Congruence | $\mu(\rho)$ | $\rho$ |
This exhaustive rewriting system ensures termination and confluence: every rewrite sequence is finite and every computational path possesses a unique normal form under rw-reduction.
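Termination and confluence can be illustrated on a small, hand-picked subset of the rules (a sketch with our own encoding; the full system has many more rules than the six shown):

```python
# Toy normalizer over an assumed subset of the LNDEQ-TRS path rules:
#   sigma(rho)        -> rho     symmetry of reflexivity
#   sigma(sigma(s))   -> s       double symmetry cancels
#   tau(s, sigma(s))  -> rho     composing a path with its inverse
#   tau(sigma(s), s)  -> rho
#   tau(rho, s)       -> s       reflexivity is a unit for composition
#   tau(s, rho)       -> s
# Paths are tuples; atomic paths and 'rho' are strings.

def step(p):
    """Perform one rewrite step; return None if p is a normal form."""
    if isinstance(p, str):
        return None
    head = p[0]
    if head == 'sigma':
        (s,) = p[1:]
        if s == 'rho':
            return 'rho'
        if isinstance(s, tuple) and s[0] == 'sigma':
            return s[1]
    if head == 'tau':
        s, t = p[1:]
        if s == 'rho':
            return t
        if t == 'rho':
            return s
        if t == ('sigma', s) or s == ('sigma', t):
            return 'rho'
    # otherwise rewrite inside a subterm
    for i, sub in enumerate(p[1:], start=1):
        r = step(sub)
        if r is not None:
            return p[:i] + (r,) + p[i + 1:]
    return None

def normalize(p):
    """Iterate steps to the unique normal form (terminates on this subset)."""
    while (q := step(p)) is not None:
        p = q
    return p

# tau(sigma(sigma(r)), sigma(r)) reduces to rho: the left component
# double-negates back to r... composed against its own inverse.
print(normalize(('tau', ('sigma', ('sigma', 'r')), ('sigma', 'r'))))
```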
4. Proof-Theoretic and Complexity Properties
A central result for Boolean-logic LNDEQ-TRS is the polynomial bound on nontrivial derivation lengths (Das et al., 2016). Any strictly increasing sequence $t_1 \to t_2 \to \cdots \to t_k$ of constant-free NNF linear terms in $n$ variables representing a nontrivial inference has length $k$ bounded by a polynomial in $n$. The proof employs the "relation web" of a term $t$: the complete graph on the variables of $t$, with each edge labeled by the least common connective of its endpoints in the formula tree. Maximal $\wedge$-cliques define minterms, while maximal $\vee$-cliques define maxterms.
The intersection lemma yields, along any nontrivial derivation, critical chains of shrinking minterms $p_1 \supseteq p_2 \supseteq \cdots$ and growing maxterms $q_1 \subseteq q_2 \subseteq \cdots$. Edge changes in the relation web correspond to minterm shrinkage and maxterm growth; since each kind of change is monotone, the total number of changes, and hence the length of the derivation, is polynomially bounded.
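The relation web construction itself is straightforward to implement; the sketch below (our own illustration, reusing the tuple encoding of NNF terms) labels each pair of variables by their least common connective and extracts minterms as maximal $\wedge$-cliques by brute force:

```python
from itertools import combinations

def web(t, edges=None):
    """Relation web of a constant-free NNF linear term: label each pair
    of variables with the connective at their meet in the formula tree."""
    if edges is None:
        edges = {}
    if isinstance(t, str):
        return [t], edges
    op, a, b = t
    va, _ = web(a, edges)
    vb, _ = web(b, edges)
    for x in va:
        for y in vb:
            edges[frozenset((x, y))] = op  # op is the least common connective
    return va + vb, edges

def minterms(t):
    """Maximal and-cliques of the web (exhaustive; fine for small terms)."""
    vs, edges = web(t)
    cliques = []
    for k in range(len(vs), 0, -1):        # largest cliques first
        for c in combinations(vs, k):
            if all(edges.get(frozenset(p)) == 'and'
                   for p in combinations(c, 2)):
                if not any(set(c) <= q for q in cliques):  # maximality
                    cliques.append(set(c))
    return cliques

# x /\ (y \/ z): the minterms are {x, y} and {x, z}
print(minterms(('and', 'x', ('or', 'y', 'z'))))
```

Maxterms are obtained symmetrically as maximal $\vee$-cliques.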
Combining this with proof-theoretic complexity arguments, the set of sound linear rules is shown to be coNP-complete:
- Soundness checking of a rule $l \to r$ reduces to the validity of the implication $l \supset r$
- Any Boolean tautology can be polynomially encoded as a linear inference
No polynomial-time decidable, sound, and complete LNDEQ-TRS exists for Boolean logic unless coNP = NP.
5. Coinductive and Circular Proof Systems
Advancing from purely syntactic rewriting, logically constrained term rewriting systems (LCTRS) in the LNDEQ-TRS setting enable coinductive semantics and circular proofs (Ciobâcă et al., 2018). Rewrite rules carry logical constraints: $l \to r \text{ if } \phi$, with $l$ and $r$ terms of the same sort and $\phi$ a constraint formula over a background signature.
The proof system for reachability (DSTEP) defines goals $p \Rightarrow q$, meaning that all terminating rewrite sequences from instances of $p$ reach $q$:
- [Axiom]: For unsatisfiable constraints
- [Subsumption]: Eliminates goals covered by successors
- [Derive]: Symbolic successor derivation
DSTEP is sound and complete: $\vdash_{\mathrm{DSTEP}} p \Rightarrow q$ iff $\vDash p \Rightarrow q$, with demonic validity as the coinductive semantic criterion.
To finitely represent infinite coinductive proofs, the circularity rule ([circ]) is introduced in the system DCC, allowing guarded, finite circular proof branches. Soundness of circularity is preserved under appropriate guardedness conditions.
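The demonic semantic criterion can be illustrated on a ground, finite rewrite system (a deliberately reduced sketch: the LCTRS machinery with symbolic constraints, subsumption, and the [circ] rule is far richer):

```python
# Toy check of demonic reachability p => q on a ground rewrite system:
# every terminating (maximal) rewrite sequence from p must pass through q.
# This only illustrates the semantic criterion, not the DSTEP/DCC calculi.

def demonic_reaches(p, q, rules, seen=None):
    """True iff every maximal rewrite sequence from p passes through q."""
    if p == q:
        return True
    succs = [r for (l, r) in rules if l == p]
    if not succs:
        return False     # p is a normal form that never reached q
    seen = seen or set()
    if p in seen:
        # Revisiting p means a cycle; its terminating continuations were
        # already explored at the first visit, so nothing new to check.
        return True
    return all(demonic_reaches(s, q, rules, seen | {p}) for s in succs)

# a -> b, a -> c, b -> d, c -> d: every path from a terminates at d.
rules = [('a', 'b'), ('a', 'c'), ('b', 'd'), ('c', 'd')]
print(demonic_reaches('a', 'd', rules))  # True: both branches reach d
print(demonic_reaches('a', 'b', rules))  # False: the c-branch misses b
```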
6. Worked Examples and Practical Implementation
In LND-TRS, explicit computational paths are constructed between complex terms, such as reductions in the lambda calculus (Veras et al., 2020). For example, the equality between a term and its $\beta\eta$-normal form can be witnessed by a composite computational path combining $\beta$- and $\eta$-reductions, each step labeled and chained by transitivity. The rewrite calculus explicitly records each transition and, by confluence and termination, assigns the path a unique normal form.
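Such a composite path can be sketched as follows (the concrete term is our own illustration, not taken from the paper): a $\beta$-step followed by an $\eta$-step, chained by transitivity $\tau$.

```latex
% beta-step, then eta-step, composed via transitivity (illustrative):
(\lambda x.\,(\lambda y.\,x\,y))\,v
  \;=_{\beta}\; \lambda y.\,v\,y
  \;=_{\eta}\; v
\qquad\leadsto\qquad
(\lambda x.\,(\lambda y.\,x\,y))\,v \;=_{\tau(\beta,\,\eta)}\; v
```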
A non-trivial equality between composite paths, for instance between a congruence applied to a composition, $\nu(\tau(r, s))$, and the composition of the congruences, $\tau(\nu(r), \nu(s))$, for paths $r$ and $s$, is derived using the higher-level rule 42 from the TRS, establishing the congruence of path composition under function application.
For LCTRS, practical reachability analysis can be performed with tools such as the "RMT" system (Ciobâcă et al., 2018), which utilizes DAG-based term representation, maximal sharing, and an SMT-solver backend (default: Z3). Proof search is bounded or goal-directed, yielding either infinite coinductive DSTEP trees or finite circular DCC trees. Verified domains range from arithmetic algorithms (e.g., compositeness, greatest common divisors, summation, and powers) to operational semantics for imperative and functional programming languages.
7. Interconnections: Term Rewriting, Boolean Function Theory, and Graph Theory
LNDEQ-TRS integrates concepts from term rewriting theory (rules, contexts, modular equational deduction), Boolean function theory (monotone functions, minterms, maxterms, read-once functions), and graph theory (relation webs, $P_4$-free graphs) (Das et al., 2016). Soundness and triviality of linear inference rules are characterized via minterm and maxterm inclusions.
Gurvich's theorem on read-once Boolean functions underpins the structural analysis: every minterm meets every maxterm in exactly one variable, a property leveraged to bound derivation lengths. Graph-theoretical invariants, such as clique sizes and edge-flips in relation webs, precisely translate into Boolean function and rewriting measures. Canonical rules like "medial" are characterized graph-theoretically, e.g., as those which preserve $\wedge$-edges in the rewriting process.
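The intersection property from Gurvich's theorem can be verified directly on a small read-once term (a brute-force sketch of our own; minterms and maxterms are computed semantically from the monotone function the term denotes):

```python
from itertools import combinations

def evaluate(t, env):
    """Evaluate an NNF term over {and, or} under a 0/1 assignment."""
    if isinstance(t, str):
        return env[t]
    op, a, b = t
    va, vb = evaluate(a, env), evaluate(b, env)
    return va & vb if op == 'and' else va | vb

def minterms(t, vs):
    """Minimal variable sets whose assignment to 1 (rest 0) makes t true."""
    found = []
    for k in range(len(vs) + 1):
        for c in combinations(vs, k):
            if any(p <= set(c) for p in found):
                continue  # a subset already suffices: c is not minimal
            if evaluate(t, {v: int(v in c) for v in vs}):
                found.append(set(c))
    return found

def maxterms(t, vs):
    """Minimal variable sets whose assignment to 0 (rest 1) makes t false."""
    found = []
    for k in range(len(vs) + 1):
        for c in combinations(vs, k):
            if any(p <= set(c) for p in found):
                continue
            if not evaluate(t, {v: int(v not in c) for v in vs}):
                found.append(set(c))
    return found

# x /\ (y \/ z) is read-once: each variable occurs exactly once, so every
# minterm should meet every maxterm in exactly one variable.
t = ('and', 'x', ('or', 'y', 'z'))
vs = ['x', 'y', 'z']
ok = all(len(p & q) == 1 for p in minterms(t, vs) for q in maxterms(t, vs))
print(minterms(t, vs), maxterms(t, vs), ok)
```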
This suggests that LNDEQ-TRS achieves a unique intersection of expressiveness, polynomially-bounded derivational complexity, and deep interconnectivity among foundational areas, while demonstrating inherent computational limitations dictated by complexity-theoretic boundaries such as coNP-completeness.