Fast Weight Long Short-Term Memory (1804.06511v1)

Published 18 Apr 2018 in cs.NE and cs.LG

Abstract: Associative memory using fast weights is a short-term memory mechanism that substantially improves the memory capacity and time scale of recurrent neural networks (RNNs). As recent studies introduced fast weights only to regular RNNs, it is unknown whether fast weight memory is beneficial to gated RNNs. In this work, we report a significant synergy between long short-term memory (LSTM) networks and fast weight associative memories. We show that this combination, in learning associative retrieval tasks, results in much faster training and lower test error, a performance boost most prominent at high memory task difficulties.
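The fast-weight short-term memory the abstract describes can be illustrated with a minimal NumPy sketch of the standard fast-weight rule (exponential decay plus Hebbian outer-product storage, followed by an inner retrieval loop). This is a simplified illustration of the general mechanism, not the paper's exact FW-LSTM architecture; the function name, decay rate `lam`, learning rate `eta`, and the single inner step are all assumptions for the example.

```python
import numpy as np

def fast_weight_step(A, h, lam=0.95, eta=0.5, inner_steps=1):
    """One fast-weight associative-memory update (illustrative sketch).

    A: fast weight matrix of shape (d, d), storing recent associations.
    h: current hidden state of shape (d,), e.g. an RNN/LSTM hidden vector.
    """
    # Decay old associations and store the outer product of the new state.
    A = lam * A + eta * np.outer(h, h)
    # Retrieval: iterate the state through the fast weights; the additive
    # h term keeps the original query present at every inner step.
    s = h.copy()
    for _ in range(inner_steps):
        s = np.tanh(A @ s + h)
    return A, s

d = 4
A = np.zeros((d, d))
h = np.ones(d) / np.sqrt(d)  # toy unit-norm hidden state
A, s = fast_weight_step(A, h)
print(A.shape, s.shape)
```

In the paper's setting, an update of this kind augments a gated recurrent cell: the slow (gradient-trained) weights learn the task, while the fast weights store and retrieve associations over the time scale of a single sequence.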

Authors (3)
  1. T. Anderson Keller (15 papers)
  2. Sharath Nittur Sridhar (16 papers)
  3. Xin Wang (1307 papers)
Citations (1)
