Transformer for Graphs: An Overview from Architecture Perspective (2202.08455v1)

Published 17 Feb 2022 in cs.LG and cs.AI

Abstract: Recently, the Transformer model, which has achieved great success in many artificial intelligence fields, has demonstrated great potential in modeling graph-structured data. To date, a great variety of Transformers have been proposed to adapt to graph-structured data. However, a comprehensive literature review and systematic evaluation of these Transformer variants for graphs are still unavailable. It is imperative to sort out the existing Transformer models for graphs and systematically investigate their effectiveness on various graph tasks. In this survey, we provide a comprehensive review of various Graph Transformer models from the architectural design perspective. We first disassemble the existing models and identify three typical ways to incorporate graph information into the vanilla Transformer: 1) GNNs as Auxiliary Modules, 2) Improved Positional Embedding from Graphs, and 3) Improved Attention Matrix from Graphs. Furthermore, we implement the representative components in the three groups and conduct a comprehensive comparison on various well-known graph benchmarks to investigate the real performance gain of each component. Our experiments confirm the benefits of current graph-specific modules on the Transformer and reveal their advantages on different kinds of graph tasks.
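To make the third category concrete, here is a minimal NumPy sketch of graph-biased attention: pairwise shortest-path distances are added as a bias to the attention logits, in the spirit of Graphormer-style models the survey covers. All function and variable names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def graph_biased_attention(X, W_q, W_k, W_v, dist):
    """Scaled dot-product attention with a graph-derived additive bias.

    A sketch of "Improved Attention Matrix from Graphs": the pairwise
    shortest-path distance matrix `dist` penalizes attention between
    distant nodes, so nearer neighbors receive more weight.
    """
    d = W_q.shape[1]
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(d)      # (n, n) attention logits
    scores = scores - dist             # bias: closer nodes attend more
    # Row-wise softmax (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

# Toy example: a 3-node path graph 0 - 1 - 2
rng = np.random.default_rng(0)
n, d = 3, 4
X = rng.normal(size=(n, d))
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
dist = np.array([[0, 1, 2],
                 [1, 0, 1],
                 [2, 1, 0]], dtype=float)
out = graph_biased_attention(X, W_q, W_k, W_v, dist)
print(out.shape)
```

The other two categories differ mainly in *where* graph structure enters: category 1 runs a GNN before or alongside the attention layers, and category 2 adds graph-derived positional encodings (e.g. Laplacian eigenvectors) to the node features instead of to the attention matrix.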

Authors (10)
  1. Erxue Min (8 papers)
  2. Runfa Chen (6 papers)
  3. Yatao Bian (60 papers)
  4. Tingyang Xu (55 papers)
  5. Kangfei Zhao (18 papers)
  6. Wenbing Huang (95 papers)
  7. Peilin Zhao (127 papers)
  8. Junzhou Huang (137 papers)
  9. Sophia Ananiadou (72 papers)
  10. Yu Rong (146 papers)
Citations (126)