
StructFormer: Joint Unsupervised Induction of Dependency and Constituency Structure from Masked Language Modeling (2012.00857v3)

Published 1 Dec 2020 in cs.CL, cs.AI, and cs.LG

Abstract: There are two major classes of natural language grammar -- the dependency grammar that models one-to-one correspondences between words and the constituency grammar that models the assembly of one or several corresponded words. While previous unsupervised parsing methods mostly focus on only inducing one class of grammars, we introduce a novel model, StructFormer, that can simultaneously induce dependency and constituency structure. To achieve this, we propose a new parsing framework that can jointly generate a constituency tree and dependency graph. Then we integrate the induced dependency relations into the transformer, in a differentiable manner, through a novel dependency-constrained self-attention mechanism. Experimental results show that our model can achieve strong results on unsupervised constituency parsing, unsupervised dependency parsing, and masked language modeling at the same time.
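
To make the dependency-constrained self-attention idea concrete, below is a minimal sketch of one way a soft dependency graph can gate attention weights. This is not the authors' released implementation: the `parent_prob` input (a soft head distribution per token) and the gate-then-renormalize scheme are illustrative assumptions, standing in for the distributions StructFormer derives from its induced structure.

```python
# Illustrative sketch of dependency-constrained self-attention (not the
# authors' code). `parent_prob[b, i, j]` is assumed to be the probability
# that token j is the syntactic head of token i; in the paper this
# distribution is produced by the joint parsing network.

import torch
import torch.nn.functional as F

def dependency_constrained_attention(q, k, v, parent_prob, eps=1e-9):
    """Scaled dot-product attention gated by a soft dependency graph.

    q, k, v:      (batch, seq, dim) query/key/value projections
    parent_prob:  (batch, seq, seq) soft head distribution; each row sums to 1
    """
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5           # (batch, seq, seq)
    # Let each token attend to its probable heads and its probable dependents.
    dep_mask = parent_prob + parent_prob.transpose(-2, -1)
    weights = F.softmax(scores, dim=-1) * dep_mask        # gate by dependencies
    weights = weights / (weights.sum(dim=-1, keepdim=True) + eps)  # renormalize
    return weights @ v                                    # (batch, seq, dim)

# Toy usage with a random soft dependency graph:
b, n, d = 2, 5, 16
q, k, v = (torch.randn(b, n, d) for _ in range(3))
parent_prob = F.softmax(torch.randn(b, n, n), dim=-1)
out = dependency_constrained_attention(q, k, v, parent_prob)
print(out.shape)  # torch.Size([2, 5, 16])
```

Because the gating mask is a product of differentiable quantities, gradients from the masked-language-modeling loss can flow back into whatever network produces `parent_prob`, which is what lets the paper train the parser and the transformer jointly.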

Authors (6)
  1. Yikang Shen (62 papers)
  2. Yi Tay (94 papers)
  3. Che Zheng (8 papers)
  4. Dara Bahri (30 papers)
  5. Donald Metzler (49 papers)
  6. Aaron Courville (201 papers)
Citations (40)
