$\infty$-former: Infinite Memory Transformer (2109.00301v3)

Published 1 Sep 2021 in cs.CL

Abstract: Transformers are unable to model long-term memories effectively, since the amount of computation they need to perform grows with the context length. While variations of efficient transformers have been proposed, they all have a finite memory capacity and are forced to drop old information. In this paper, we propose the $\infty$-former, which extends the vanilla transformer with an unbounded long-term memory. By making use of a continuous-space attention mechanism to attend over the long-term memory, the $\infty$-former's attention complexity becomes independent of the context length, trading off memory length with precision. In order to control where precision is more important, the $\infty$-former maintains "sticky memories," being able to model arbitrarily long contexts while keeping the computation budget fixed. Experiments on a synthetic sorting task, language modeling, and document-grounded dialogue generation demonstrate the $\infty$-former's ability to retain information from long sequences.
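
The core idea is that the long-term memory is represented as a continuous signal rather than a sequence of discrete slots, so attending over it costs the same regardless of how many past states were written into it. The following is a minimal NumPy sketch of that idea, not the authors' implementation: a long memory is compressed into a fixed number of basis coefficients by ridge regression, and a query attends over it with a Gaussian density on $[0,1]$. The Gaussian RBF basis, the function names, and the numerical-integration scheme are illustrative assumptions.

```python
import numpy as np

def rbf_basis(t, centers, width):
    """Gaussian radial basis functions psi(t), one column per basis center."""
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def compress_memory(X, num_basis=32, ridge=1e-3):
    """Fit coefficients B so that psi(t_i)^T B approximates memory row x_i.
    X: (L, d) past hidden states; returns B: (num_basis, d).
    The cost of later attention depends only on num_basis, not on L."""
    L, _ = X.shape
    t = np.linspace(0.0, 1.0, L)                        # positions rescaled to [0, 1]
    centers = np.linspace(0.0, 1.0, num_basis)
    Psi = rbf_basis(t, centers, width=1.0 / num_basis)  # (L, num_basis)
    A = Psi.T @ Psi + ridge * np.eye(num_basis)         # ridge regression normal equations
    return np.linalg.solve(A, Psi.T @ X)                # (num_basis, d)

def continuous_attention(B, mu, sigma, num_basis=32, num_points=512):
    """Context vector c = E_{t ~ N(mu, sigma^2)}[ psi(t)^T B ],
    approximated by discretizing the density on [0, 1]."""
    t = np.linspace(0.0, 1.0, num_points)
    centers = np.linspace(0.0, 1.0, num_basis)
    Psi = rbf_basis(t, centers, width=1.0 / num_basis)  # (num_points, num_basis)
    density = np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    density /= density.sum()                            # normalized discrete density
    return density @ (Psi @ B)                          # (d,)

# Usage: compress a 10k-step memory into 32 coefficients, then attend near the end.
X = np.random.randn(10_000, 64)
B = compress_memory(X, num_basis=32)
c = continuous_attention(B, mu=0.9, sigma=0.05, num_basis=32)
print(c.shape)  # (64,)
```

In the paper the Gaussian parameters $(\mu, \sigma)$ are predicted from the query, and "sticky memories" re-sample the signal so that heavily attended regions keep more resolution when new states are appended; both are omitted here for brevity.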

Authors (3)
  1. Pedro Henrique Martins (11 papers)
  2. Zita Marinho (15 papers)
  3. André F. T. Martins (113 papers)
Citations (11)