
Prototypical Recurrent Unit (1611.06530v2)

Published 20 Nov 2016 in cs.LG

Abstract: Despite the great successes of deep learning, the effectiveness of deep neural networks has not been understood at any theoretical depth. This work is motivated by the thrust of developing a deeper understanding of recurrent neural networks, particularly LSTM/GRU-like networks. As the highly complex structure of the recurrent unit in LSTM and GRU networks makes them difficult to analyze, our methodology in this research theme is to construct an alternative recurrent unit that is as simple as possible and yet still captures the key components of LSTM/GRU recurrent units. Such a unit can then be used for the study of recurrent networks, and its structural simplicity may allow easier analysis. Towards that goal, we take a system-theoretic perspective to design a new recurrent unit, which we call the prototypical recurrent unit (PRU). Besides having minimal complexity, the PRU is demonstrated experimentally to have performance comparable to GRU and LSTM units. This establishes PRU networks as a prototype for future study of LSTM/GRU-like recurrent networks. This paper also studies the memorization abilities of LSTM, GRU and PRU networks, motivated by the folk belief that such networks possess long-term memory. For this purpose, we design a simple and controllable task, called the "memorization problem", where the networks are trained to memorize certain targeted information. We show that the memorization performance of all three networks depends on the amount of targeted information, the amount of "interfering" information, and the state-space dimension of the recurrent unit. Experiments are also performed for another controllable task, the adding problem, and similar conclusions are obtained.
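
The abstract names the adding problem as one of its controllable benchmarks. As a minimal sketch, the snippet below generates data for the classic form of that task (two marked values in a random sequence; the target is their sum). This is an assumption about the setup: the paper's exact variant, and its separately defined "memorization problem", may differ, and the function name and parameters here are illustrative, not taken from the paper.

```python
import numpy as np

def adding_problem_batch(batch_size, seq_len, rng=None):
    """Generate one batch of the classic adding problem.

    Each input step is a (value, marker) pair. Exactly two steps per
    sequence carry marker 1; the regression target is the sum of the
    two marked values.
    """
    rng = rng or np.random.default_rng()
    values = rng.uniform(0.0, 1.0, size=(batch_size, seq_len))
    markers = np.zeros((batch_size, seq_len))
    for i in range(batch_size):
        # Place one cue in each half so the two cues are far apart,
        # forcing the recurrent state to carry information long-term.
        a = rng.integers(0, seq_len // 2)
        b = rng.integers(seq_len // 2, seq_len)
        markers[i, a] = 1.0
        markers[i, b] = 1.0
    inputs = np.stack([values, markers], axis=-1)   # (batch, seq_len, 2)
    targets = (values * markers).sum(axis=1)        # (batch,)
    return inputs, targets
```

Because sequence length directly controls how long the cues must be retained, a generator like this lets one vary the amount of "interfering" information, which is the kind of controllability the abstract attributes to both tasks.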

Authors (3)
  1. Dingkun Long (23 papers)
  2. Richong Zhang (47 papers)
  3. Yongyi Mao (45 papers)
Citations (6)
