
Grammar-Constrained Decoding

Updated 14 September 2025
  • Grammar-Constrained Decoding is a process that enforces formal syntactic constraints by limiting outputs to those derivable from specified grammars, such as CFGs.
  • It leverages different grammar classes, including linear grammars, to reduce computational complexity from cubic to quadratic time in many decoding and propagation tasks.
  • The method supports compound constraints, enabling integration (e.g., EDITDISTANCE with REGULAR constraints) for practical applications in scheduling, biology, and language processing.

Grammar-constrained decoding is the process of strictly enforcing syntactic constraints, defined by formal grammars, during the generation of candidate sequences in tasks such as combinatorial search, sequence modeling, and structured prediction. In this paradigm, the set of permissible outputs is limited to those derivable by a target grammar—typically a context-free grammar (CFG) or one of its subclasses such as linear grammars or regular languages—thus ensuring outputs conform to complex, often application-specific structure. At its core, grammar-constrained decoding enables both the expression and enforcement of global dependencies among variables or tokens through the explicit representation of the language of acceptable outputs.

1. Role of Context-Free Grammars in Decoding

A context-free grammar $G = (N, T, P, S)$, where $N$ is the set of non-terminals, $T$ the set of terminals, $P$ the productions, and $S$ the start symbol, provides the generative mechanism for the set of strings $L(G)$. In grammar-constrained decoding, the task is to ensure that the sequence $[X_1, X_2, \dots, X_n]$ of variable assignments yields $w = x_1 x_2 \cdots x_n \in L(G)$. This gives rise to the global GRAMMAR constraint:

$$\text{GRAMMAR}([X_1, \dots, X_n], G) \iff x_1 x_2 \cdots x_n \in L(G)$$

In practical decoding frameworks, such as those encountered in parsing, scheduling, and language processing, one might be tempted to restrict $G$ to classes with desirable properties (e.g., deterministic or unambiguous grammars in Greibach normal form, or LL(1) grammars), on the intuition that restricted grammars could enable more efficient constraint propagation or reduced complexity. However, as rigorously shown in Theorem 1 and Corollary 1 of the reference, detecting whether any variable assignment yields a string in $L(G)$ (i.e., testing disentailment of the constraint) is, for deterministic and unambiguous CFGs, as hard as unrestricted CFG parsing. This hardness persists even if $G$ is in Greibach form (a simple, deterministic grammar). Any attempt to exploit the structure of these restricted classes for asymptotically faster propagation therefore runs into the lower bounds for context-free parsing, whose complexity is tightly coupled to that of Boolean matrix multiplication.
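The membership test underlying the GRAMMAR constraint can be sketched with standard CYK parsing, which runs in $O(n^3|G|)$ time for a grammar in Chomsky normal form. The grammar below (deriving $a^n b^n$) and all names are hypothetical illustrations, not the reference's implementation.

```python
def cyk_member(w, terminal_rules, binary_rules, start):
    """Return True iff w is in L(G) for a CNF grammar G."""
    n = len(w)
    if n == 0:
        return False
    # table[length-1][i] = non-terminals deriving the substring w[i:i+length]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, c in enumerate(w):
        table[0][i] = {A for (A, a) in terminal_rules if a == c}
    for span in range(2, n + 1):              # substring length
        for i in range(n - span + 1):         # start position
            for k in range(1, span):          # split point
                for (A, B, C) in binary_rules:
                    if B in table[k - 1][i] and C in table[span - k - 1][i + k]:
                        table[span - 1][i].add(A)
    return start in table[n - 1][0]

# Hypothetical CNF grammar for a^n b^n (n >= 1):
#   S -> A X | A B,  X -> S B,  A -> a,  B -> b
terminal_rules = [("A", "a"), ("B", "b")]
binary_rules = [("S", "A", "X"), ("S", "A", "B"), ("X", "S", "B")]

assert cyk_member("aabb", terminal_rules, binary_rules, "S")
assert not cyk_member("aab", terminal_rules, binary_rules, "S")
```

The triple loop over span, start position, and split point is the source of the cubic factor that the hardness results above tie to Boolean matrix multiplication.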

2. Linear Grammars and Propagation Complexity

Linear grammars, CFGs in which every production contains at most one non-terminal on the right-hand side, occupy an important intermediate expressiveness class: strictly more expressive than regular languages, but less so than general CFGs. Their crucial property in the context of grammar-constrained decoding is that membership testing, as well as generalized propagation for the GRAMMAR constraint, can be performed in $O(n^2|G|)$ time.

This is achieved through a variant of the CYK parsing algorithm tailored to linear grammars:

  • Bottom-up phase: A dynamic programming table is constructed over substrings, with fewer candidate derivations per cell due to the restriction of productions, leading to quadratic instead of cubic complexity.
  • Top-down phase: Pruning is applied to eliminate unsupported values, ensuring domain (generalized arc) consistency.

The significance of this quadratic-time propagator is formalized in Theorem 3. The propagation algorithm's complexity crucially depends on the property that a linear grammar's production rules limit the number of derivations per substring, thus bounding the computational effort.
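The quadratic bound can be seen in a minimal membership sketch: because each production has at most one non-terminal, every table cell is filled from a constant number of neighboring cells rather than all $O(n)$ split points. The grammar below (odd-length palindromes over $\{a, b\}$) and the rule encoding are hypothetical illustrations, not the reference's propagator.

```python
def linear_member(w, unit_rules, lin_rules, start):
    """Membership test for a linear grammar in O(n^2 |G|) time.

    unit_rules: (A, a) for A -> a.
    lin_rules:  (A, x, B, y) for A -> x B y, with x, y terminal strings
                (possibly empty) and B the single non-terminal.
    """
    n = len(w)
    derives = {}                              # (i, j) -> non-terminals deriving w[i:j]
    for length in range(1, n + 1):            # shorter substrings first
        for i in range(n - length + 1):
            j = i + length
            cell = set()
            for (A, a) in unit_rules:         # A -> a
                if length == 1 and w[i] == a:
                    cell.add(A)
            for (A, x, B, y) in lin_rules:    # A -> x B y
                lo, hi = i + len(x), j - len(y)
                if lo > hi:
                    continue
                if w[i:lo] == x and w[hi:j] == y and B in derives.get((lo, hi), set()):
                    cell.add(A)
            derives[(i, j)] = cell
    return start in derives.get((0, n), set())

# Hypothetical linear grammar: S -> a S a | b S b | a | b
unit_rules = [("S", "a"), ("S", "b")]
lin_rules = [("S", "a", "S", "a"), ("S", "b", "S", "b")]

assert linear_member("abbba", unit_rules, lin_rules, "S")
assert not linear_member("abab", unit_rules, lin_rules, "S")
```

There are $O(n^2)$ cells and each costs $O(|G|)$ work, matching the Theorem 3 bound; a full propagator would additionally run the top-down pruning pass over this table to restore domain consistency.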

3. Applications: Constraint Encoding and Combined Constraints

One major application is the encoding of the EDITDISTANCE constraint, which measures the minimum number of edit operations needed to transform one string into another, as a weighted GRAMMAR constraint. For sequences $X$ and $Y$, the approach is as follows:

  • Construct a weighted linear grammar $G_{ed}$ with productions:
    • $S \rightarrow d\,S\,d$ (weight 0): matching characters.
    • $S \rightarrow d_1\,S\,d_2$ (weight 1): substitution, $d_1 \neq d_2$.
    • $S \rightarrow d\,S$ or $S \rightarrow S\,d$ (weight 1): insertion/deletion.
    • $S \rightarrow \#$ (weight 0): sentinel.
  • Form the sequence $X_1 \dots X_m\,\#\,Y_n \dots Y_1$.
  • A minimum-weight derivation of weight $\leq N$ encodes the constraint $\text{EDITDISTANCE}(X, Y, N)$.
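The encoding above can be sketched as a minimum-weight parse of $X\,\#\,\mathrm{reverse}(Y)$: each production of $G_{ed}$ becomes one case of the recurrence, and the cheapest derivation of $S$ over the whole sequence equals the edit distance. Function names are illustrative; this is a sketch of the reduction, not the reference's weighted propagator.

```python
from functools import lru_cache

def edit_distance_via_grammar(x, y):
    w = x + "#" + y[::-1]                    # the sequence X_1..X_m # Y_n..Y_1

    @lru_cache(maxsize=None)
    def cost(i, j):                          # min weight for S =>* w[i:j]
        if i >= j:
            return float("inf")
        if j - i == 1:                       # S -> #  (weight 0)
            return 0 if w[i] == "#" else float("inf")
        best = float("inf")
        if w[i] != "#" and w[j - 1] != "#":
            # S -> d S d (weight 0) if the outer symbols match,
            # S -> d1 S d2 (weight 1) otherwise: substitution.
            best = min(best, cost(i + 1, j - 1) + (0 if w[i] == w[j - 1] else 1))
        if w[i] != "#":
            best = min(best, cost(i + 1, j) + 1)   # S -> d S  (deletion)
        if w[j - 1] != "#":
            best = min(best, cost(i, j - 1) + 1)   # S -> S d  (insertion)
        return best

    return cost(0, len(w))

assert edit_distance_via_grammar("kitten", "sitting") == 3
assert edit_distance_via_grammar("abc", "abc") == 0
```

Because $G_{ed}$ is linear, this DP ranges over $O(n^2)$ substrings with constant work each, so the weighted GRAMMAR propagation inherits the quadratic bound of the previous section.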

Furthermore, grammar constraints can be conjoined; e.g., intersecting a linear grammar (capturing EDITDISTANCE) with a regular grammar (capturing a finite-automaton-based REGULAR constraint). Via the "triple construction" technique, the resulting composite grammar's size is quadratic in the automaton's size, and the combined propagation complexity remains quadratic in the sequence length (adjusted for automaton size). This allows for the efficient composition of multiple sequence constraints, such as $\text{REGULAR}(X, R_1) \land \text{REGULAR}(Y, R_2) \land \text{EDITDISTANCE}(X, Y, N)$, with a global quadratic time bound.
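A minimal sketch of the triple construction for a linear grammar and a DFA: each non-terminal $A$ becomes a triple $(q, A, q')$ asserting that $A$ derives a string driving the DFA from state $q$ to $q'$. The DFA (accepting strings that start with "a"), the palindrome grammar, and the rule encoding are hypothetical examples chosen for illustration.

```python
def run(delta, q, s):
    """Extended transition function: read string s from state q."""
    for c in s:
        q = delta[(q, c)]
    return q

def intersect_linear_with_dfa(unit_rules, lin_rules, start, states, delta, q0, finals):
    """Triple construction: product of a linear grammar with a DFA."""
    new_unit, new_lin = [], []
    for q in states:
        for (A, a) in unit_rules:            # A -> a becomes (q, A, delta(q, a)) -> a
            new_unit.append(((q, A, delta[(q, a)]), a))
    for q in states:                         # one copy of each rule per state pair:
        for q2 in states:                    # the quadratic blow-up in |Q|
            for (A, x, B, y) in lin_rules:   # A -> x B y
                q1 = run(delta, q, x)        # x drives q to q1; B spans q1 -> q2
                q3 = run(delta, q2, y)       # y drives q2 to q3
                new_lin.append(((q, A, q3), x, (q1, B, q2), y))
    new_starts = [(q0, start, f) for f in finals]
    return new_unit, new_lin, new_starts

# Hypothetical DFA over {a, b} accepting strings that start with 'a'
states = ["q0", "qa", "qr"]
delta = {("q0", "a"): "qa", ("q0", "b"): "qr",
         ("qa", "a"): "qa", ("qa", "b"): "qa",
         ("qr", "a"): "qr", ("qr", "b"): "qr"}
unit_rules = [("S", "a"), ("S", "b")]
lin_rules = [("S", "a", "S", "a"), ("S", "b", "S", "b")]

u, l, s = intersect_linear_with_dfa(unit_rules, lin_rules, "S", states, delta, "q0", ["qa"])
# Linear rule count grows quadratically in the number of DFA states:
assert len(l) == len(lin_rules) * len(states) ** 2
```

The product grammar is still linear, so the quadratic-in-$n$ propagator applies to it directly, with $|G|$ replaced by the quadratically larger product grammar.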

4. Mathematical Formulation and Technical Realization

Key formal elements include:

  • CFG definition: $G = (N, T, P, S)$.
  • The GRAMMAR constraint is operationalized as $\text{GRAMMAR}([X_1, \dots, X_n], G) = \left(x_1 x_2 \cdots x_n \in L(G)\right)$.
  • The weighted grammar for EDITDISTANCE employs the following production scheme, paired with derivation weights: $S \rightarrow d\,S\,d$ ($w=0$), $S \rightarrow d_1\,S\,d_2$ ($w=1$), $S \rightarrow d\,S$ or $S \rightarrow S\,d$ ($w=1$), $S \rightarrow \#$ ($w=0$).
  • For combining with REGULAR, non-terminals in the intersection grammar take the form $(F, A, F')$, where $F$ and $F'$ index states/non-terminals of the automaton and $A$ is a non-terminal of $G$.

Algorithmically, the core propagator for a linear grammar is based on CYK-style dynamic programming, supplemented by a weighting mechanism for soft constraints and an intersection construction for conjoining multiple constraints.

5. Practical Impact and Empirical Results

The implications of these results are direct for various real-world structured generation, verification, and search tasks:

  • In scheduling domains such as nurse rostering, global temporal requirements can be natively embedded via CFGs.
  • Encoding error-correcting (e.g., minimum edit distance), string matching, or similarity constraints as weighted GRAMMAR constraints supports tasks in computational biology and string correction.
  • When constraints are specified as linear grammars (or their intersection with regular languages), the theoretical quadratic time complexity matches practical performance—experimentally confirmed in the cited work via competitive runtimes in Table 1.

An important finding is negative: deterministic and unambiguous CFGs do not offer asymptotic improvements in propagation over arbitrary CFGs. Only when restricting to linear grammars does a computational advantage manifest. This demarcates the boundary of tractability in grammar-constrained decoding for constraint satisfaction and propagation, guiding practitioners in the choice of grammar formalisms.

6. Concluding Insights

"Restricted Global Grammar Constraints" demonstrates that the boundary between efficiently enforceable and intractable grammar constraints for decoding aligns with the class of linear grammars. Efficient ($O(n^2|G|)$) propagators exist there; for deterministic/unambiguous grammars, propagation is as hard as general parsing. The weighted/conjoined constraint construction further exemplifies the flexibility and power of grammar-constrained decoding, especially in compound sequence tasks requiring both soft and hard constraints. This informs both the theoretical design and practical implementation of constraint propagation systems, string processing engines, and sequence modeling frameworks that utilize global grammar constraints.
