
Enhancing Efficiency in Vision Transformer Networks: Design Techniques and Insights (2403.19882v1)

Published 28 Mar 2024 in eess.IV, cs.CV, and cs.LG

Abstract: Intrigued by the inherent ability of the human visual system to identify salient regions in complex scenes, attention mechanisms have been seamlessly integrated into various Computer Vision (CV) tasks. Building upon this paradigm, Vision Transformer (ViT) networks exploit attention mechanisms for improved efficiency. This review navigates the landscape of redesigned attention mechanisms within ViTs, aiming to enhance their performance. This paper provides a comprehensive exploration of techniques and insights for designing attention mechanisms, systematically reviewing recent literature in the field of CV. This survey begins with an introduction to the theoretical foundations and fundamental concepts underlying attention mechanisms. We then present a systematic taxonomy of various attention mechanisms within ViTs, employing redesigned approaches. A multi-perspective categorization is proposed based on their application, objectives, and the type of attention applied. The analysis includes an exploration of the novelty, strengths, weaknesses, and an in-depth evaluation of the different proposed strategies. This culminates in the development of taxonomies that highlight key properties and contributions. Finally, we gather the reviewed studies along with their available open-source implementations in our GitHub repository (https://github.com/xmindflow/Awesome-Attention-Mechanism-in-Medical-Imaging), which we aim to update regularly with the most recent relevant papers.
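For readers unfamiliar with the mechanism the survey builds on: the core operation underlying ViT attention is scaled dot-product attention, where each token's output is a similarity-weighted average of all token values. The NumPy sketch below is a minimal single-head illustration for context, not an implementation from the paper; the function and variable names are our own.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Minimal single-head scaled dot-product attention.

    q, k, v: arrays of shape (seq_len, d); no batching or projections,
    purely illustrative of the operation ViT variants redesign.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # pairwise similarities, scaled by sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over keys: rows sum to 1
    return weights @ v                             # each output is a weighted sum of values

# Self-attention over 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(x, x, x)
```

The quadratic cost of the `q @ k.T` similarity matrix in sequence length is precisely what many of the redesigned attention mechanisms surveyed here aim to reduce.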

Authors (11)
  1. Moein Heidari (18 papers)
  2. Reza Azad (52 papers)
  3. Sina Ghorbani Kolahi (4 papers)
  4. René Arimond (2 papers)
  5. Leon Niggemeier (2 papers)
  6. Alaa Sulaiman (4 papers)
  7. Afshin Bozorgpour (17 papers)
  8. Ehsan Khodapanah Aghdam (13 papers)
  9. Amirhossein Kazerouni (19 papers)
  10. Ilker Hacihaliloglu (38 papers)
  11. Dorit Merhof (75 papers)
Citations (2)
