Tree-structured composition in neural networks without tree-structured architectures (1506.04834v3)

Published 16 Jun 2015 in cs.CL and cs.LG

Abstract: Tree-structured neural networks encode a particular tree geometry for a sentence in the network design. However, these models have at best only slightly outperformed simpler sequence-based models. We hypothesize that neural sequence models like LSTMs are in fact able to discover and implicitly use recursive compositional structure, at least for tasks with clear cues to that structure in the data. We demonstrate this possibility using an artificial data task for which recursive compositional structure is crucial, and find that an LSTM-based sequence model can indeed learn to exploit the underlying tree structure. However, its performance consistently lags behind that of tree models, even on large training sets, suggesting that tree-structured models are more effective at exploiting recursive structure.
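
To make the contrast concrete, here is a minimal PyTorch sketch (not the authors' implementation; the class names, the 16-dimensional hidden size, and the toy vocabulary are all hypothetical) of the two model families the abstract compares: a tree-structured network that composes vectors bottom-up along an externally supplied binary parse, and an LSTM that reads the same tokens as a flat sequence.

```python
# Hypothetical sketch contrasting tree-structured composition with a
# sequential LSTM; not the paper's code.
import torch
import torch.nn as nn

DIM = 16  # hypothetical embedding/hidden size

class TreeRNN(nn.Module):
    """Composes a vector for each node by combining its children's vectors,
    following a binary parse tree given as nested tuples."""
    def __init__(self, dim=DIM):
        super().__init__()
        self.compose = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh())

    def forward(self, tree, embed):
        # A leaf is a token string; an internal node is a (left, right) pair.
        if isinstance(tree, str):
            return embed(tree)
        left, right = tree
        return self.compose(torch.cat(
            [self.forward(left, embed), self.forward(right, embed)], dim=-1))

class SeqLSTM(nn.Module):
    """Reads the leaves left to right, with no access to the parse tree."""
    def __init__(self, dim=DIM):
        super().__init__()
        self.lstm = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, tokens, embed):
        x = torch.stack([embed(t) for t in tokens]).unsqueeze(0)  # [1, T, DIM]
        _, (h, _) = self.lstm(x)
        return h[0, 0]  # final hidden state as the sentence vector

# Toy usage: the same sentence, once as a parse and once as a flat sequence.
vocab = {"a": 0, "and": 1, "b": 2}
emb = nn.Embedding(len(vocab), DIM)
embed = lambda tok: emb(torch.tensor(vocab[tok]))

tree = (("a", "and"), "b")             # hypothetical binary parse
tokens = ["a", "and", "b"]

print(TreeRNN()(tree, embed).shape)    # torch.Size([16])
print(SeqLSTM()(tokens, embed).shape)  # torch.Size([16])
```

The point of the contrast is that the tree model receives the recursive structure explicitly, while the LSTM would have to discover any such structure from token order alone, which is the ability the paper probes.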

Authors (3)
  1. Samuel R. Bowman (103 papers)
  2. Christopher D. Manning (169 papers)
  3. Christopher Potts (113 papers)
Citations (74)
