
Anomaly Detection in Dynamic Graphs via Transformer (2106.09876v2)

Published 18 Jun 2021 in cs.LG

Abstract: Detecting anomalies in dynamic graphs has drawn increasing attention due to wide applications in social networks, e-commerce, and cybersecurity. Recent deep learning-based approaches have shown promising results over shallow methods. However, they fail to address two core challenges of anomaly detection in dynamic graphs: the lack of informative encoding for unattributed nodes and the difficulty of learning discriminative knowledge from coupled spatial-temporal dynamic graphs. To overcome these challenges, in this paper, we present a novel Transformer-based Anomaly Detection framework for DYnamic graphs (TADDY). Our framework constructs a comprehensive node encoding strategy to better represent each node's structural and temporal roles in an evolving graph stream. Meanwhile, TADDY captures informative representations from dynamic graphs with coupled spatial-temporal patterns via a dynamic graph transformer model. Extensive experimental results demonstrate that our proposed TADDY framework outperforms state-of-the-art methods by a large margin on six real-world datasets.
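The two ingredients the abstract names, a spatial-temporal node encoding for unattributed nodes and a transformer that scores edges from those encodings, can be illustrated with a toy sketch. This is not TADDY's actual architecture: the bucket sizes, embedding dimension, single-head attention, and random weights are all assumptions for illustration; in the paper these components are learned end-to-end.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def encode_nodes(distances, time_gaps, d_model=8, rng=None):
    """Hypothetical spatial-temporal node encoding: each node in a target
    edge's contextual neighborhood is embedded from (a) its structural
    distance to that edge and (b) the time gap of its interaction.
    Bucket count (10) and d_model are illustrative assumptions."""
    rng = rng or np.random.default_rng(0)
    dist_table = rng.normal(size=(10, d_model))  # distance-bucket embeddings
    time_table = rng.normal(size=(10, d_model))  # time-gap-bucket embeddings
    return dist_table[distances] + time_table[time_gaps]

def edge_anomaly_score(X, rng=None):
    """Single-head self-attention over node encodings, mean-pooled and
    squashed to (0, 1) -- a toy stand-in for the dynamic graph transformer
    plus scoring head described in the abstract."""
    rng = rng or np.random.default_rng(1)
    d = X.shape[-1]
    Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(d))      # attention among context nodes
    pooled = (A @ V).mean(axis=0)          # pool to one edge representation
    w = rng.normal(size=d)
    return 1.0 / (1.0 + np.exp(-pooled @ w))  # sigmoid -> anomaly probability

# Score one candidate edge from a 3-node contextual neighborhood.
score = edge_anomaly_score(encode_nodes(np.array([0, 1, 2]),
                                        np.array([0, 1, 1])))
```

In the trained model, edges whose score exceeds a threshold would be flagged as anomalous interactions in the graph stream.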

Authors (7)
  1. Yixin Liu (108 papers)
  2. Shirui Pan (198 papers)
  3. Yu Guang Wang (59 papers)
  4. Fei Xiong (8 papers)
  5. Liang Wang (512 papers)
  6. Qingfeng Chen (7 papers)
  7. Vincent CS Lee (13 papers)
Citations (76)
