Neural Combinatory Constituency Parsing (2106.06689v1)

Published 12 Jun 2021 in cs.CL

Abstract: We propose two fast neural combinatory models for constituency parsing: binary and multi-branching. Our models decompose the bottom-up parsing process into 1) classification of tags, labels, and binary orientations or chunks and 2) vector composition based on the computed orientations or chunks. These models have theoretical sub-quadratic complexity and empirical linear complexity. The binary model achieves an F1 score of 92.54 on Penn Treebank at a speed of 1327.2 sents/sec. Both models with XLNet provide near state-of-the-art accuracies for English. Syntactic branching tendency and headedness of a language are observed during the training and inference processes for Penn Treebank, Chinese Treebank, and Keyaki Treebank (Japanese).
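The two-step decomposition in the abstract (orientation classification, then vector composition) can be sketched roughly as follows. This is a toy illustration, not the paper's implementation: `classify_orientations`, `compose`, and the sign-based merge rule are all hypothetical stand-ins for the learned neural classifiers and composition functions.

```python
def classify_orientations(vectors):
    """Stub for step 1: in the real model a neural classifier predicts
    a binary orientation per node; here a toy sign rule stands in."""
    return ['right' if sum(v) >= 0 else 'left' for v in vectors]

def compose(left, right):
    """Stub for step 2: elementwise average stands in for the learned
    vector composition function."""
    return [(a + b) / 2 for a, b in zip(left, right)]

def parse_layer(vectors):
    """One bottom-up layer: merge each adjacent pair whose orientations
    point toward each other ('right' followed by 'left')."""
    orientations = classify_orientations(vectors)
    merged, i = [], 0
    while i < len(vectors):
        if (i + 1 < len(vectors)
                and orientations[i] == 'right'
                and orientations[i + 1] == 'left'):
            merged.append(compose(vectors[i], vectors[i + 1]))
            i += 2
        else:
            merged.append(vectors[i])
            i += 1
    return merged

def parse(vectors):
    """Apply layers bottom-up until one root vector remains
    (or no further merge fires)."""
    while len(vectors) > 1:
        nxt = parse_layer(vectors)
        if len(nxt) == len(vectors):  # no pair merged; stop
            break
        vectors = nxt
    return vectors
```

Because each layer scans the nodes once and every merge shrinks the sequence, a run of this sketch mirrors the empirically linear behavior the abstract claims for the binary model, though the sketch itself makes no complexity guarantee.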

Authors (4)
  1. Zhousi Chen (1 paper)
  2. Longtu Zhang (4 papers)
  3. Aizhan Imankulova (6 papers)
  4. Mamoru Komachi (40 papers)
Citations (2)
