
Conv-Basis: A New Paradigm for Efficient Attention Inference and Gradient Computation in Transformers (2405.05219v2)

Published 8 May 2024 in cs.LG and cs.AI

Abstract: The self-attention mechanism is the key to the success of transformers in recent LLMs. However, the quadratic computational cost $O(n^2)$ in the input sequence length $n$ is a notorious obstacle for further improvement and scalability in longer contexts. In this work, we leverage the convolution-like structure of attention matrices to develop an efficient approximation method for attention computation using convolution matrices. We propose a $\mathsf{conv}$ basis system, analogous to the rank basis, and show that any lower triangular matrix can always be decomposed as a sum of structured convolution matrices in this basis. We then design a fast algorithm to approximate the attention matrix via a sum of such $k$ convolution matrices. This allows us to compute the attention {\it inference} via Fast Fourier Transforms (FFT) in $O(knd \log n)$ time, where $d$ is the hidden dimension, and thus achieve almost linear time $n^{1+o(1)}$ in the practical scenario where $kd = n^{o(1)}$. Furthermore, the attention {\it training forward} and {\it backward gradient} can be computed in $n^{1+o(1)}$ as well. We provide theoretical guarantees on the run time and approximation error and conduct preliminary experiments to evaluate its effectiveness. We hope our new paradigm for accelerating attention computation in transformer models can help their application to longer contexts.
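The core speedup described in the abstract comes from a standard fact: multiplying a lower-triangular convolution (causal Toeplitz) matrix by a vector is a causal convolution, computable in $O(n \log n)$ via FFT. The following is a minimal NumPy sketch of that step only, assuming the $k$ basis vectors have already been found (the paper's contribution of actually decomposing the attention matrix into this basis is not shown here); function names and shapes are illustrative, not the authors' code.

```python
import numpy as np

def conv_matrix(a):
    # Dense lower-triangular convolution matrix H(a) whose first
    # column is a, so that (H @ v)[t] = sum_{s <= t} a[t - s] * v[s].
    n = len(a)
    H = np.zeros((n, n))
    for j in range(n):
        H[j:, j] = a[: n - j]
    return H

def conv_apply_fft(basis, V):
    # Apply (sum_i H(a_i)) @ V without forming any n x n matrix.
    # Each term is a causal convolution: zero-pad to length 2n to
    # avoid circular wrap-around, multiply in frequency domain,
    # and keep the first n outputs. Cost: O(k * n * d * log n).
    n, d = V.shape
    m = 2 * n
    V_hat = np.fft.rfft(V, m, axis=0)
    out = np.zeros((n, d))
    for a in basis:
        a_hat = np.fft.rfft(a, m)
        out += np.fft.irfft(a_hat[:, None] * V_hat, m, axis=0)[:n]
    return out
```

For small sizes the FFT route can be checked against the dense sum of convolution matrices; the point of the paper is that the left-hand side never materializes the quadratic-size attention matrix.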

Authors (6)
  1. Yingyu Liang (107 papers)
  2. Heshan Liu (6 papers)
  3. Zhenmei Shi (60 papers)
  4. Zhao Song (253 papers)
  5. Junze Yin (26 papers)
  6. Zhuoyan Xu (8 papers)
Citations (14)