
Understanding Graph Neural Networks from Graph Signal Denoising Perspectives (2006.04386v1)

Published 8 Jun 2020 in cs.LG and stat.ML

Abstract: Graph neural networks (GNNs) have attracted much attention because of their excellent performance on tasks such as node classification. However, there is inadequate understanding of how and why GNNs work, especially for node representation learning. This paper aims to provide a theoretical framework to understand GNNs, specifically, spectral graph convolutional networks and graph attention networks, from graph signal denoising perspectives. Our framework shows that GNNs are implicitly solving graph signal denoising problems: spectral graph convolutions work as denoising node features, while graph attentions work as denoising edge weights. We also show that a linear self-attention mechanism is able to compete with the state-of-the-art graph attention methods. Our theoretical results further lead to two new models, GSDN-F and GSDN-EF, which work effectively for graphs with noisy node features and/or noisy edges. We validate our theoretical findings and also the effectiveness of our new models by experiments on benchmark datasets. The source code is available at \url{https://github.com/fuguoji/GSDN}.
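The denoising view in the abstract can be made concrete with a standard graph signal denoising objective, minimizing ||F - X||_F^2 + c * tr(F^T L F) over denoised features F, where L is the graph Laplacian and c trades off fidelity against smoothness. The sketch below is illustrative only and not the authors' implementation: the toy graph, feature matrix, and the weight `c` are assumed for the example. It computes the closed-form minimizer and shows that a single gradient step from F = X gives a Laplacian-based propagation of the form X - c*L@X, which is the kind of operation the paper identifies inside spectral graph convolutions.

```python
import numpy as np

# Toy path graph 0-1-2-3 (hypothetical example, not from the paper).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A  # unnormalized graph Laplacian

# Noisy node features: a smooth signal plus Gaussian noise.
rng = np.random.default_rng(0)
X = np.array([[1.0], [1.0], [-1.0], [-1.0]]) + 0.3 * rng.standard_normal((4, 1))

c = 0.5  # smoothness weight (assumed hyperparameter)

# Closed-form minimizer of ||F - X||_F^2 + c * tr(F^T L F):
#   setting the gradient 2(F - X) + 2c L F to zero gives (I + cL) F = X.
F_star = np.linalg.solve(np.eye(4) + c * L, X)

# One gradient-descent step from F = X (step size 1/2) yields
#   F = X - c * L @ X, i.e. Laplacian-based feature propagation
# resembling the smoothing performed by a spectral GNN layer.
F_one_step = X - c * L @ X

def smoothness(F):
    # Laplacian quadratic form tr(F^T L F): small for smooth signals.
    return float(F.T @ L @ F)

print("noisy smoothness:   ", smoothness(X))
print("denoised smoothness:", smoothness(F_star))
```

Both the closed-form solution and the one-step update reduce the Laplacian quadratic form, which is the sense in which graph convolution "denoises" node features under this framework.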

Authors (6)
  1. Guoji Fu (14 papers)
  2. Yifan Hou (27 papers)
  3. Jian Zhang (544 papers)
  4. Kaili Ma (9 papers)
  5. Barakeel Fanseu Kamhoua (1 paper)
  6. James Cheng (76 papers)
Citations (18)
