Representations of Syntax [MASK] Useful: Effects of Constituency and Dependency Structure in Recursive LSTMs (2005.00019v1)

Published 30 Apr 2020 in cs.CL

Abstract: Sequence-based neural networks show significant sensitivity to syntactic structure, but they still perform less well on syntactic tasks than tree-based networks. Such tree-based networks can be provided with a constituency parse, a dependency parse, or both. We evaluate which of these two representational schemes more effectively introduces biases for syntactic structure that increase performance on the subject-verb agreement prediction task. We find that a constituency-based network generalizes more robustly than a dependency-based one, and that combining the two types of structure does not yield further improvement. Finally, we show that the syntactic robustness of sequential models can be substantially improved by fine-tuning on a small amount of constructed data, suggesting that data augmentation is a viable alternative to explicit constituency structure for imparting the syntactic biases that sequential models are lacking.
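The evaluation the abstract describes, subject-verb agreement prediction, can be sketched as a simple forced-choice harness: the model scores the grammatical and ungrammatical verb forms after a sentence prefix, and accuracy is how often the correct form wins. The example sentences and the `nearest_noun_scorer` baseline below are illustrative assumptions, not the authors' code or data.

```python
# Sketch of the subject-verb agreement prediction task (hypothetical
# examples and scorer; not the paper's actual evaluation code).

def agreement_accuracy(examples, score):
    """Fraction of examples where the correct verb form outscores the wrong one.

    Each example is (prefix, correct_verb, incorrect_verb); score(prefix, verb)
    is any model's preference for continuing `prefix` with `verb`.
    """
    correct = sum(
        score(prefix, good) > score(prefix, bad)
        for prefix, good, bad in examples
    )
    return correct / len(examples)

# Toy examples with "attractor" nouns between subject and verb -- the hard
# case such evaluations probe. A scorer that ignores the true subject
# should get these wrong; a syntax-aware model should not.
examples = [
    ("The keys to the cabinet", "are", "is"),
    ("The picture of the mountains", "is", "are"),
]

# Baseline scorer: agree with the noun nearest the verb (a known failure mode).
PLURAL = {"cabinets", "mountains", "keys"}

def nearest_noun_scorer(prefix, verb):
    last_word = prefix.split()[-1]
    wants_plural = last_word in PLURAL
    return 1.0 if (verb == "are") == wants_plural else 0.0

print(agreement_accuracy(examples, nearest_noun_scorer))  # prints 0.0
```

On these attractor sentences the nearest-noun baseline scores 0.0, illustrating why the task separates models with genuine syntactic biases from ones relying on surface cues.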

Authors (3)
  1. Tal Linzen (73 papers)
  2. R. Thomas McCoy (33 papers)
  3. Michael A. Lepori (14 papers)
Citations (11)
