On the Feasibility of Simple Transformer for Dynamic Graph Modeling (2401.14009v2)

Published 25 Jan 2024 in cs.SI

Abstract: Dynamic graph modeling is crucial for understanding complex structures in web graphs, spanning applications in social networks, recommender systems, and more. Most existing methods primarily emphasize structural dependencies and their temporal changes. However, these approaches often overlook detailed temporal aspects or struggle with long-term dependencies. Furthermore, many solutions overly complicate the process by emphasizing intricate module designs to capture dynamic evolutions. In this work, we harness the strength of the Transformer's self-attention mechanism, known for adeptly handling long-range dependencies in sequence modeling. Our approach offers a simple Transformer model, called SimpleDyG, tailored for dynamic graph modeling without complex modifications. We re-conceptualize dynamic graphs as a sequence modeling challenge and introduce a novel temporal alignment technique. This technique not only captures the inherent temporal evolution patterns within dynamic graphs but also streamlines the modeling process of their evolution. To evaluate the efficacy of SimpleDyG, we conduct extensive experiments on four real-world datasets from various domains. The results demonstrate the competitive performance of SimpleDyG in comparison to a series of state-of-the-art approaches despite its simple design.
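The abstract's core idea, recasting a dynamic graph as a token sequence with temporal alignment, can be sketched as follows. This is an illustrative example only: the token names (`<hist>`, `<time_k>`, `node_i`) and the binning scheme are assumptions for demonstration, not the paper's exact tokenization.

```python
from collections import defaultdict

def to_sequence(ego, edges, num_steps):
    """Serialize the timestamped interactions of node `ego` into a token
    sequence. Interactions are grouped into `num_steps` coarse time steps,
    each opened by a special time token so that sequences from different
    nodes are aligned on a shared timeline (a stand-in for the paper's
    temporal alignment technique)."""
    if not edges:
        return ["<hist>", "</hist>"]
    t_min = min(t for _, _, t in edges)
    t_max = max(t for _, _, t in edges)
    span = max(t_max - t_min, 1)

    # Bucket the ego node's neighbors by coarse time step.
    by_step = defaultdict(list)
    for src, dst, t in edges:
        if src != ego and dst != ego:
            continue
        neighbor = dst if src == ego else src
        step = min(int((t - t_min) / span * num_steps), num_steps - 1)
        by_step[step].append((t, neighbor))

    # Emit one flat token sequence, ready for a vanilla Transformer.
    tokens = ["<hist>"]
    for step in range(num_steps):
        tokens.append(f"<time_{step}>")
        for _, neighbor in sorted(by_step[step]):
            tokens.append(f"node_{neighbor}")
    tokens.append("</hist>")
    return tokens

edges = [(0, 1, 1.0), (0, 2, 2.0), (3, 0, 5.0), (1, 2, 3.0)]
print(to_sequence(0, edges, num_steps=2))
# → ['<hist>', '<time_0>', 'node_1', 'node_2', '<time_1>', 'node_3', '</hist>']
```

Once histories are flattened this way, next-link prediction reduces to next-token prediction, which is why an unmodified Transformer suffices.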

Authors (3)
  1. Yuxia Wu (10 papers)
  2. Yuan Fang (146 papers)
  3. Lizi Liao (44 papers)
Citations (9)