
Efficient Diffusion Transformer with Step-wise Dynamic Attention Mediators (2408.05710v1)

Published 11 Aug 2024 in cs.CV

Abstract: This paper identifies significant redundancy in the query-key interactions within self-attention mechanisms of diffusion transformer models, particularly during the early stages of denoising diffusion steps. In response to this observation, we present a novel diffusion transformer framework incorporating an additional set of mediator tokens to engage with queries and keys separately. By modulating the number of mediator tokens during the denoising generation phases, our model initiates the denoising process with a precise, non-ambiguous stage and gradually transitions to a phase enriched with detail. Concurrently, integrating mediator tokens simplifies the attention module's complexity to a linear scale, enhancing the efficiency of global attention processes. Additionally, we propose a time-step dynamic mediator token adjustment mechanism that further decreases the required computational FLOPs for generation, simultaneously facilitating the generation of high-quality images within the constraints of varied inference budgets. Extensive experiments demonstrate that the proposed method can improve the generated image quality while also reducing the inference cost of diffusion transformers. When integrated with the recent work SiT, our method achieves a state-of-the-art FID score of 2.01. The source code is available at https://github.com/LeapLabTHU/Attention-Mediators.
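The core idea in the abstract — routing attention through a small set of mediator tokens that interact with queries and keys separately — can be sketched as a two-stage attention. The snippet below is a minimal illustrative sketch in plain NumPy, not the authors' implementation; the function name, the use of a standard scaled-dot-product form at both stages, and the absence of multi-head structure are all assumptions for clarity. With `n_m` mediators and `N` sequence tokens, the cost is O(N · n_m · d) instead of O(N² · d), which is linear in N when `n_m` is fixed.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mediator_attention(q, k, v, m):
    """Illustrative two-stage attention through mediator tokens (hypothetical sketch).

    q, k, v : (N, d)  query / key / value tokens
    m       : (n_m, d) mediator tokens, with n_m << N

    Stage 1: mediators attend to keys and pool the values.
    Stage 2: queries attend only to the n_m mediator summaries.
    """
    d = q.shape[-1]
    # Stage 1: each mediator aggregates the full sequence -> (n_m, d)
    mediator_values = softmax(m @ k.T / np.sqrt(d)) @ v
    # Stage 2: each query mixes mediator summaries -> (N, d)
    return softmax(q @ m.T / np.sqrt(d)) @ mediator_values
```

The step-wise dynamic adjustment described in the abstract would then amount to choosing `n_m` per denoising timestep (fewer mediators early, more as fine detail emerges), trading FLOPs against fidelity at each step.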

Authors (12)
  1. Yifan Pu
  2. Zhuofan Xia
  3. Jiayi Guo
  4. Dongchen Han
  5. Qixiu Li
  6. Duo Li
  7. Yuhui Yuan
  8. Ji Li
  9. Yizeng Han
  10. Shiji Song
  11. Gao Huang
  12. Xiu Li
Citations (5)
