
The Importance of Being Recurrent for Modeling Hierarchical Structure (1803.03585v2)

Published 9 Mar 2018 in cs.CL

Abstract: Recent work has shown that recurrent neural networks (RNNs) can implicitly capture and exploit hierarchical information when trained to solve common natural language processing tasks such as language modeling (Linzen et al., 2016) and neural machine translation (Shi et al., 2016). In contrast, the ability to model structured data with non-recurrent neural networks has received little attention despite their success in many NLP tasks (Gehring et al., 2017; Vaswani et al., 2017). In this work, we compare the two architectures, recurrent versus non-recurrent, with respect to their ability to model hierarchical structure and find that recurrency is indeed important for this purpose.
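
The kind of comparison the abstract describes can be illustrated with a small experiment. The sketch below is not the authors' code; it trains an LSTM classifier and a Transformer-encoder classifier on a toy subject-verb-agreement-style task (predict the number of the leading "subject" token despite intervening distractors). The task construction, model sizes, and hyperparameters are all assumptions chosen for brevity.

```python
# Illustrative sketch only (hypothetical task and models, not the paper's
# actual experimental setup): compare a recurrent (LSTM) and a
# non-recurrent (Transformer) classifier on a toy agreement task.
import torch
import torch.nn as nn

torch.manual_seed(0)
VOCAB, SEQ_LEN, DIM = 20, 12, 64  # toy vocabulary and model sizes

def make_batch(batch_size=64):
    """Token 0/1 encodes a singular/plural 'subject' at position 0; the
    rest of the sequence is random distractor tokens from 2..VOCAB-1."""
    subj = torch.randint(0, 2, (batch_size,))
    rest = torch.randint(2, VOCAB, (batch_size, SEQ_LEN - 1))
    x = torch.cat([subj.unsqueeze(1), rest], dim=1)
    return x, subj  # label = subject number

class LSTMClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)
        self.rnn = nn.LSTM(DIM, DIM, batch_first=True)
        self.out = nn.Linear(DIM, 2)

    def forward(self, x):
        h, _ = self.rnn(self.emb(x))
        return self.out(h[:, -1])  # classify from the final hidden state

class TransformerClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)
        self.pos = nn.Parameter(torch.randn(1, SEQ_LEN, DIM) * 0.02)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.enc = nn.TransformerEncoder(layer, num_layers=2)
        self.out = nn.Linear(DIM, 2)

    def forward(self, x):
        h = self.enc(self.emb(x) + self.pos)
        return self.out(h.mean(dim=1))  # classify from mean-pooled states

def train_and_eval(model, steps=300):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        x, y = make_batch()
        loss = nn.functional.cross_entropy(model(x), y)
        opt.zero_grad(); loss.backward(); opt.step()
    x, y = make_batch(1000)
    with torch.no_grad():
        acc = (model(x).argmax(1) == y).float().mean().item()
    return acc

for name, m in [("LSTM", LSTMClassifier()),
                ("Transformer", TransformerClassifier())]:
    print(f"{name}: held-out accuracy = {train_and_eval(m):.3f}")
```

This toy is far easier than the paper's actual tasks (subject-verb agreement in natural text and logical inference), but it shows the experimental pattern: hold the training data and objective fixed, swap the encoder architecture, and compare how well each recovers a dependency that spans intervening material.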

Authors (3)
  1. Ke Tran (12 papers)
  2. Arianna Bisazza (43 papers)
  3. Christof Monz (54 papers)
Citations (149)
