Parity-Consistent Decomposition Method
- Parity-Consistent Decomposition (PCD) is a unified combinatorial and algebraic method that leverages parity symmetry and recursive partitioning to guarantee uniformity in structural and probabilistic analyses.
- The framework replaces brute-force computations with recursively constructed, parity-controlled decompositions, enhancing efficiency in graph theory, coding, quantum information, and distributed decision problems.
- PCD provides actionable benefits such as analytic simplicity, complexity reduction, and distributed optimization, with proven applications in matching theory, LP decoding, and quantum Schmidt decompositions.
Parity-Consistent Decomposition (PCD) is a unified combinatorial and algebraic framework predicated on parity symmetry, recursive partitioning, and equivalence relations that guarantee structural or probabilistic uniformity across domains. First developed in graph theory to analyze modular statistics for counts of graph substructures in random graphs (DeMarco et al., 2012), PCD has since been extended and adapted for global-optimal decoding in codes (Barman et al., 2012, Liu et al., 12 Jan 2026), entanglement decomposition in quantum information (Guerrero et al., 2023), and distributed decision problems (Farjoun et al., 20 Oct 2025). PCD is notable for its ability to replace brute-force or highly correlated computations with recursive, parity-controlled decompositions yielding analytic simplicity, complexity reduction, or invariance.
1. Foundational Definition and General Structure
In its foundational incarnation, PCD is a combinatorial graph-theoretic construct. For a fixed disconnected graph $H = H_1 \cup \cdots \cup H_k$ with pairwise non-isomorphic components $H_i$, a "gluing" is a tuple $(F; \phi_1, \ldots, \phi_k)$ where $F$ is connected, each $\phi_i$ embeds $H_i$ into $F$, and the edge sets $\phi_i(E(H_i))$ cover $E(F)$. The decomposition is parity-consistent (tree-like) if $F$ is uniquely decomposable and its structure graph is a tree (DeMarco et al., 2012).
The central algebraic operation is the expansion of counting functions (e.g., $N(H, G)$, the number of occurrences of $H$ in a host graph $G$) into signed sums over connected gluings:

$$\prod_{i=1}^{k} N(H_i, G) = \sum_{F} c_F \, N(F, G),$$

where each coefficient $c_F$ is determined by Möbius inversion on partitions. Parity-consistency guarantees $c_F = \pm 1$ for tree-like gluings, forcing invertibility modulo any $q$.
2. Uniformity and Independence: Graphs and Random Structures
Kolaitis–Kopparty’s theorem ensures that for any distinct connected graphs $F_1, \ldots, F_m$ and any modulus $q$, the vector of counts $(N(F_1, G), \ldots, N(F_m, G)) \bmod q$ is asymptotically uniform, with asymptotically independent coordinates, for $G \sim G(n, 1/2)$ (DeMarco et al., 2012). PCD leverages this by constructing tree-like gluings that force the main term in the inclusion-exclusion expansion to have coefficient $\pm 1$, ensuring that $N(H, G) \bmod q$ is $o(1)$-close to uniform for any $q$. This mechanism generalizes to infinite families, including all disconnected graphs whose components are 2-connected and all subgraph-free families.
Algorithmic realization uses diameter-ordered gluings: start with $H_1$, iteratively glue each subsequent $H_i$ at diametral vertices, then verify unique decomposability (e.g., via block-degree or cut-vertex checks).
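The uniformity phenomenon itself (though not the gluing construction) is easy to probe numerically. The sketch below, a plain-Python illustration with hypothetical parameters ($n = 12$, $q = 3$), samples $G(n, 1/2)$ repeatedly and tabulates triangle counts modulo $q$; by the theorem above, each residue class should carry roughly $1/q$ of the mass.

```python
import random
from collections import Counter

def random_graph(n, p=0.5):
    """Sample an Erdos-Renyi graph G(n, p) as a list of neighbour sets."""
    adj = [set() for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def triangle_count(adj):
    """Count unordered triangles i < j < k with all three edges present."""
    n = len(adj)
    total = 0
    for i in range(n):
        for j in adj[i]:
            if j > i:
                # common neighbours above j close each triangle exactly once
                total += sum(1 for k in adj[i] & adj[j] if k > j)
    return total

random.seed(0)
q, trials = 3, 2000
hist = Counter(triangle_count(random_graph(12)) % q for _ in range(trials))
freqs = [hist[r] / trials for r in range(q)]  # each entry should be near 1/q
```

This is a sanity check on the asymptotic statement, not an implementation of PCD; the convergence rate for small $n$ is simply observed, not proved here.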
3. Parity-Consistent Decomposition in Factor-Connected Graphs and Matching Theory
The general Kotzig–Lovász decomposition for minimum joins in grafts (Kita, 2017) is a direct generalization of the classical Kotzig–Lovász decomposition widely applied in matching theory. Given a graft $(G, T)$, a $T$-join is a subset of edges with odd degree at every vertex of $T$ and even degree elsewhere. PCD for joins proceeds by defining allowed edges (those present in at least one minimum join), establishing factor-connectivity, and an equivalence relation

$$u \sim v \iff d_J(u, v) = 0,$$

where $d_J$ is the minimum-weight path metric derived from a fixed minimum join $J$ and an associated weight function.
The set of equivalence classes defines the PCD of the graft; each class refines the classical decomposition, and the scheme specializes to perfect matchings when $T = V(G)$. The construction algorithm finds a minimum join, extracts allowed edges and factor-components, and applies union-find over zero-distance vertex pairs to label classes; time complexity is polynomial, matching specialized $T$-join algorithms.
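The class-labeling step can be sketched with a standard union-find structure; the zero-distance pairs below are hypothetical inputs, standing in for vertex pairs found to be at join-distance zero.

```python
class DSU:
    """Disjoint-set union with path halving."""
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

def pcd_classes(n_vertices, zero_distance_pairs):
    """Group vertices whose join-distance is zero into equivalence classes."""
    dsu = DSU(n_vertices)
    for u, v in zero_distance_pairs:
        dsu.union(u, v)
    labels, classes = {}, []
    for v in range(n_vertices):
        r = dsu.find(v)
        if r not in labels:
            labels[r] = len(classes)
            classes.append([])
        classes[labels[r]].append(v)
    return classes

# hypothetical zero-distance pairs on 6 vertices
classes = pcd_classes(6, [(0, 1), (1, 2), (4, 5)])
```

Computing the zero-distance pairs in the first place requires the minimum join and the metric $d_J$, which this sketch takes as given.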
4. Recursive Decoding and Weight Distributions: Polar Codes
In coding theory, the recursive PCD method enables efficient computation of the Hamming weight distribution (WD) for polar codes, even after pre-transformation by an upper-triangular matrix (Liu et al., 12 Jan 2026). Key steps:
- Identify "matched pairs" and partition the code as a union of PCD-cosets.
- Develop the expanded information set to eliminate bit dependencies: whenever one bit depends on another via the pre-transform, expand the code into the two cosets fixing that bit to each of its values.
- Recursively compute the WD via convolution, halving the code length in each recursion.
Equivalence-class theory further optimizes complexity: codes with equivalent WDs admit multiple pre-transforms, and choosing one that minimizes the number of coset expansions drastically reduces the exponential search space, with substantial speedups observed numerically. The overall complexity of PCD is exponential only in the number of expanded bits, typically far below the naive enumeration required by previous approaches.
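The convolution step rests on a simple fact: when a code decomposes into independent halves, codeword weights add, so weight distributions convolve. The sketch below illustrates this on the direct-sum case only, not the full polar recursion with its coset bookkeeping.

```python
def wd_convolve(wd_a, wd_b):
    """Weight distribution of the direct sum C_a x C_b.
    wd[w] = number of codewords of Hamming weight w; since weights of
    independent halves add, the distributions convolve."""
    out = [0] * (len(wd_a) + len(wd_b) - 1)
    for i, a in enumerate(wd_a):
        for j, b in enumerate(wd_b):
            out[i + j] += a * b
    return out

# [1, 0, 1] is the WD of the length-2 repetition code {00, 11};
# two independent copies give the length-4 code {0000, 0011, 1100, 1111}
wd4 = wd_convolve([1, 0, 1], [1, 0, 1])
```

The recursive algorithm applies this convolution at each halving step, summing the results over the PCD-cosets produced by the expansion.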
5. Convex Geometry and Distributed Decoding: LP Decoding via ADMM
For LP decoding of LDPC codes, the PCD method incorporates a decomposition via the two-slice lemma for the parity polytope (Barman et al., 2012), enabling distributed optimization with ADMM. Each parity check of degree $d$ projects onto the parity polytope $\mathbb{PP}_d$, the convex hull of binary vectors of length $d$ with even Hamming weight. The two-slice lemma states that any $v \in \mathbb{PP}_d$ is a convex combination of points in two adjacent even-weight slices $\mathbb{P}_r$ and $\mathbb{P}_{r+2}$, where $r$ is the maximal even integer below $\|v\|_1$.
Projection onto $\mathbb{PP}_d$ then reduces to a one-dimensional search for a unique mixing parameter, marching through at most $2d$ breakpoints.
ADMM proceeds by alternating replica-averaging updates (clipped to $[0,1]$), local updates projecting onto $\mathbb{PP}_d$ at each check, and dual-variable updates. The decomposition is fully distributed and parallelizable and ensures convergence to the global LP optimum; practical performance matches belief propagation in speed, with superior high-SNR behavior and no error floor.
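A lightweight complement to the projection step is a membership test for $\mathbb{PP}_d$ via its standard facet description: $v \in [0,1]^d$ lies in the polytope iff for every odd-size subset $S$, $\sum_{i \in S} v_i - \sum_{i \notin S} v_i \le |S| - 1$, and the worst subset of each odd size consists of the largest coordinates. The sketch below is a checker only, not the ADMM projection itself.

```python
def in_parity_polytope(v, tol=1e-9):
    """Membership test for PP_d = conv{x in {0,1}^d : sum(x) even}.
    For each odd size s, the most-violated facet takes the s largest
    coordinates of v as the subset S, so sorting once suffices."""
    if any(x < -tol or x > 1 + tol for x in v):
        return False  # must lie in the unit hypercube
    s_sorted = sorted(v, reverse=True)
    total = sum(v)
    prefix = 0.0
    for s in range(1, len(v) + 1):
        prefix += s_sorted[s - 1]
        if s % 2 == 1:
            # facet: sum over S minus sum over complement <= |S| - 1
            if prefix - (total - prefix) > s - 1 + tol:
                return False
    return True
```

Such a check is useful for validating a projection routine: the projected point must pass it, and any point failing it identifies a violated facet.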
6. Quantum Schmidt Decomposition and Parity-Adapted States
PCD extends naturally to quantum information, specifically to the decomposition of Schrödinger-cat (parity-adapted) coherent states for symmetric multi-quDit systems (Guerrero et al., 2023). Here, the relevant parity group is a finite abelian group of parity operations on the quDit levels, and its projectors select definite parity sectors in the symmetric Hilbert space of bosonic quDits.
For a pure parity-cat state, the reduced density matrix obtained by partial trace is diagonalized via the Schmidt decomposition

$$|\psi\rangle = \sum_k \sqrt{\lambda_k}\, |u_k\rangle_A \otimes |v_k\rangle_B,$$

with explicit formulas for the Schmidt coefficients $\lambda_k$ in terms of normalization overlaps. Asymptotic regimes (single and double thermodynamic limit) demonstrate robustness and unification with photonic beam-splitter decoherence models.
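Numerically, the Schmidt coefficients of any bipartite pure state follow from an SVD of its coefficient matrix. The sketch below illustrates this on a two-qubit even-parity Bell state, a stand-in for (not an implementation of) the multi-quDit cat states of the paper.

```python
import numpy as np

def schmidt_coefficients(psi, d_a, d_b):
    """Schmidt coefficients lambda_k of |psi> in C^{d_a} x C^{d_b}:
    reshape the amplitude vector into a d_a x d_b matrix and square
    its singular values (these are the eigenvalues of rho_A)."""
    m = np.asarray(psi).reshape(d_a, d_b)
    return np.linalg.svd(m, compute_uv=False) ** 2

# even-parity two-qubit cat-like state (|00> + |11>)/sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
lam = schmidt_coefficients(psi, 2, 2)  # maximally entangled: both 1/2
```

The analytic formulas in the paper express these $\lambda_k$ in closed form via normalization overlaps; the SVD route is the generic numerical counterpart.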
7. Strategyproof Mechanism Design and Social Choice
The Proportional Circle Distance (PCD) mechanism for facility location with agents on a circle (Farjoun et al., 20 Oct 2025) formalizes the selection rule: the facility is placed at agent $i$'s reported point with probability proportional to the length of the arc in front of agent $i$. For an odd number of agents $n$, a parity-consistent partition coloring arcs odd/even proves strategyproofness, and an intricate algebraic reduction (median-optimal agent, arc constraints) establishes a tight worst-case approximation ratio. Conjectures for general odd $n$ locate the extremal regime in two-cluster-plus-outlier profiles.
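The selection rule can be sketched under stated assumptions: circumference 1, at least two distinct reported points, and "in front of" taken to mean the arc from a point to the next reported point in increasing coordinate order (the direction convention is an assumption here, not taken from the paper).

```python
import random

def pcd_mechanism(points, circumference=1.0):
    """Choose a reported point with probability equal to the arc length
    'in front of' it (up to the next point in sorted circular order),
    divided by the circumference. The arcs partition the circle, so the
    probabilities sum to one."""
    pts = sorted(points)
    n = len(pts)
    arcs = [(pts[(i + 1) % n] - pts[i]) % circumference for i in range(n)]
    r = random.uniform(0, circumference)
    acc = 0.0
    for p, a in zip(pts, arcs):
        acc += a
        if r < acc:
            return p
    return pts[-1]  # guard against floating-point shortfall

random.seed(0)
# arcs in front of 0.0, 0.5, 0.6 have lengths 0.5, 0.1, 0.4 respectively
freq = sum(pcd_mechanism([0.0, 0.5, 0.6]) == 0.0 for _ in range(4000)) / 4000
```

The strategyproofness and approximation analysis concern this distribution, not the sampling mechanics, which are routine.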
8. Examples and Infinite Families
PCD provides uniformity for two-component graphs (excluding trivial cases), for subgraph-free families, and for graph families with 2-connected components, and it admits explicit algorithmic construction and verification. However, fully generic recursive recipes do not exist for arbitrary disconnected graphs; uniqueness of decomposition and parity-tree structure cannot be guaranteed in all cases (DeMarco et al., 2012).
9. Connections and Unification Principles
PCD’s unifying feature is its control over dependencies via parity and recursive partitioning, with rigorous algebraic justification. In all contexts—graph counting, decoding, quantum state decomposition, mechanism design—the method introduces either independence, uniformity, or analytic tractability, often by reducing global constraints to tractable local or recursive structure with parity-symmetric guarantees. The method also serves as the core for more sophisticated decompositions, such as the basilica decomposition in matching theory (Kita, 2017).
Table: Summary of PCD Instantiations
| Domain | Core PCD Concept | Key Guarantee/Benefit |
|---|---|---|
| Random Graphs | Tree-like gluing | Uniformity mod $q$ for graph counts |
| Matching Theory | Factor-connected classes | Canonical decomposition/refinement |
| Coding Theory | Recursive coset partition | Efficient weight distribution |
| Convex Geometry | Two-slice parity polytope | Fast distributed decoding via ADMM |
| Quantum Info | Parity-adapted Schmidt | Exact entanglement structure |
| Soc. Choice | Parity/arc partition | Strategyproofness for an odd number of agents |
In all applicative forms, the existence and verification of parity-consistency—whether in combinatorial gluings, join equivalence classes, or convex slices—drives analytic tractability, independence regimes, or algorithmic efficiency. The method’s scope spans combinatorics, optimization, coding, quantum theory, and social choice, and remains a central algebraic-combinatorial paradigm for modern probabilistic and distributed systems.