
Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale (2203.00633v2)

Published 1 Mar 2022 in cs.CL

Abstract: We introduce Transformer Grammars (TGs), a novel class of Transformer language models that combine (i) the expressive power, scalability, and strong performance of Transformers and (ii) recursive syntactic compositions, which here are implemented through a special attention mask and deterministic transformation of the linearized tree. We find that TGs outperform various strong baselines on sentence-level language modeling perplexity, as well as on multiple syntax-sensitive language modeling evaluation metrics. Additionally, we find that the recursive syntactic composition bottleneck which represents each sentence as a single vector harms perplexity on document-level language modeling, providing evidence that a different kind of memory mechanism -- one that is independent of composed syntactic representations -- plays an important role in current successful models of long text.
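To make the "special attention mask over a linearized tree" idea concrete, here is a minimal, hypothetical sketch (not the authors' code). It assumes the sentence is linearized with opening and closing nonterminal tokens, and that a closing nonterminal is restricted to attend only to the constituent it closes, approximating a recursive composition step, while all other positions keep ordinary causal attention. The token format and mask convention are illustrative assumptions; the paper's actual mechanism additionally duplicates closing tokens into separate compose and stack roles.

```python
# Hypothetical sketch of a TG-style attention mask over a linearized parse tree.
# Assumptions (not from the paper verbatim): tokens like "(NP" open a constituent,
# tokens like ")NP" close it, and a closing token may attend only to the span it closes.
import numpy as np

def tg_style_attention_mask(tokens):
    """Return a boolean (n x n) mask; mask[i, j] == True means position i may attend to j."""
    n = len(tokens)
    mask = np.tril(np.ones((n, n), dtype=bool))   # default: standard causal attention
    open_stack = []                                # positions of currently open constituents
    for i, tok in enumerate(tokens):
        if tok.startswith("("):                    # opening nonterminal: push its position
            open_stack.append(i)
        elif tok.startswith(")"):                  # closing nonterminal: compose the span
            start = open_stack.pop()
            row = np.zeros(n, dtype=bool)
            row[start:i + 1] = True                # attend only to the constituent being closed
            mask[i] = row
    return mask

if __name__ == "__main__":
    toks = ["(S", "(NP", "the", "dog", ")NP", "(VP", "barks", ")VP", ")S"]
    print(tg_style_attention_mask(toks).astype(int))
```

In this sketch the closing token acts as a bottleneck: downstream positions can reach the constituent's content only through it, which is the kind of single-vector composition the abstract argues helps sentence-level but hurts document-level modeling.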

Citations (45)
