Learning a Fourier Transform for Linear Relative Positional Encodings in Transformers (2302.01925v2)

Published 3 Feb 2023 in cs.LG

Abstract: We propose a new class of linear Transformers called FourierLearner-Transformers (FLTs), which incorporate a wide range of relative positional encoding mechanisms (RPEs). These include regular RPE techniques applied for sequential data, as well as novel RPEs operating on geometric data embedded in higher-dimensional Euclidean spaces. FLTs construct the optimal RPE mechanism implicitly by learning its spectral representation. As opposed to other architectures combining efficient low-rank linear attention with RPEs, FLTs remain practical in terms of their memory usage and do not require additional assumptions about the structure of the RPE mask. In addition, FLTs allow for applying certain structural inductive bias techniques to specify masking strategies, e.g., they provide a way to learn the so-called local RPEs introduced in this paper, and give accuracy gains as compared with several other linear Transformers for language modeling. We also thoroughly test FLTs on other data modalities and tasks, such as image classification, 3D molecular modeling, and learnable optimizers. To the best of our knowledge, for 3D molecular data, FLTs are the first Transformer architectures providing linear attention and incorporating RPE masking.
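
To make the spectral-RPE idea concrete, here is a minimal NumPy sketch, not the authors' implementation: it uses a simple elu(x)+1 feature map as a stand-in for the paper's low-rank attention kernel, and represents the RPE mask f(i - j) by a small set of cosine frequencies and weights (which would be learned in an actual FLT). Because cos(a - b) = cos(a)cos(b) + sin(a)sin(b), the mask factorizes over positions, so the masked attention stays linear in sequence length. All names and parameter choices below are illustrative.

```python
import numpy as np

def fourier_rpe_linear_attention(Q, K, V, freqs, weights):
    """Illustrative FLT-style linear attention with a spectral RPE mask.

    The RPE mask is f(i - j) = sum_m weights[m] * cos(2*pi*freqs[m]*(i - j)).
    Since cos(a - b) = cos(a)cos(b) + sin(a)sin(b), the mask factorizes
    into per-position features, so phi(q_i)^T phi(k_j) * f(i - j) can be
    computed in O(L) rather than O(L^2).
    """
    L, d = Q.shape
    pos = np.arange(L)

    # Positive feature map (elu(x) + 1), a simple stand-in for the
    # random-feature maps used with Performer-style linear attention.
    phi = lambda X: np.where(X > 0, X + 1.0, np.exp(X))

    # Rank-2M factorization of the mask: f(i - j) = a_i . b_j.
    ang = 2.0 * np.pi * np.outer(pos, freqs)                   # (L, M)
    a = np.concatenate([np.cos(ang) * weights,
                        np.sin(ang) * weights], axis=1)        # (L, 2M)
    b = np.concatenate([np.cos(ang), np.sin(ang)], axis=1)     # (L, 2M)

    # Fold the positional factors into the query/key features.
    Qf = (phi(Q)[:, :, None] * a[:, None, :]).reshape(L, -1)   # (L, 2dM)
    Kf = (phi(K)[:, :, None] * b[:, None, :]).reshape(L, -1)   # (L, 2dM)

    # Linear attention: never materialize the L x L attention matrix.
    num = Qf @ (Kf.T @ V)                                      # (L, d_v)
    den = Qf @ Kf.sum(axis=0)                                  # (L,)
    return num / den[:, None]

# Toy usage; freqs/weights stand in for the learned spectral representation.
rng = np.random.default_rng(0)
L, d, M = 128, 16, 8
Q, K, V = rng.normal(size=(3, L, d))
# A dominant weight at frequency 0 keeps this toy mask strictly positive;
# an FLT instead learns the spectrum (and can encode e.g. local RPEs).
freqs = np.concatenate([[0.0], rng.uniform(0.0, 0.1, size=M - 1)])
weights = np.concatenate([[1.0], rng.uniform(0.0, 0.1, size=M - 1)])
out = fourier_rpe_linear_attention(Q, K, V, freqs, weights)
print(out.shape)  # (128, 16)
```

The design point this sketch tries to capture is that any mask of the form f(i - j) represented by M learned frequencies factorizes into rank-2M positional features, so the cost scales with the number of frequencies rather than with the square of the sequence length, and no structural assumptions about the mask itself are needed.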

Authors (10)
  1. Krzysztof Marcin Choromanski (3 papers)
  2. Shanda Li (15 papers)
  3. Valerii Likhosherstov (25 papers)
  4. Kumar Avinava Dubey (5 papers)
  5. Shengjie Luo (20 papers)
  6. Di He (108 papers)
  7. Yiming Yang (151 papers)
  8. Tamas Sarlos (40 papers)
  9. Thomas Weingarten (2 papers)
  10. Adrian Weller (150 papers)
Citations (5)