
Learning Operations on a Stack with Neural Turing Machines (1612.00827v1)

Published 2 Dec 2016 in cs.LG

Abstract: Multiple extensions of Recurrent Neural Networks (RNNs) have been proposed recently to address the difficulty of storing information over long time periods. In this paper, we experiment with the capacity of Neural Turing Machines (NTMs) to deal with these long-term dependencies on well-balanced strings of parentheses. We show that not only does the NTM emulate a stack with its heads and learn an algorithm to recognize such words, but it is also capable of strongly generalizing to much longer sequences.
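For reference, the classical algorithm the NTM is shown to emulate can be sketched with an explicit stack. This is a minimal illustration of recognizing well-balanced parenthesis strings, not the paper's learned model:

```python
def is_balanced(s: str) -> bool:
    """Recognize well-balanced strings of parentheses using an
    explicit stack, the structure the NTM emulates with its heads."""
    stack = []
    for ch in s:
        if ch == "(":
            stack.append(ch)          # push an open parenthesis
        elif ch == ")":
            if not stack:             # close with nothing to match
                return False
            stack.pop()               # match the most recent open
        else:
            return False              # reject non-parenthesis symbols
    return not stack                  # balanced iff nothing left open
```

The paper's point is that an NTM trained on short strings learns this push/pop behavior well enough to generalize strongly to much longer sequences.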

Authors (2)
  1. Tristan Deleu (31 papers)
  2. Joseph Dureau (9 papers)
Citations (8)
