
Graph Convolutional Neural Networks via Motif-based Attention (1811.08270v2)

Published 11 Nov 2018 in cs.LG

Abstract: Many real-world problems can be represented as graph-based learning problems. In this paper, we propose a novel framework for learning spatial, attention-based convolutional neural networks on arbitrary graphs. Unlike previous convolutional neural networks on graphs, we first design a motif-matching guided subgraph normalization method to capture neighborhood information. We then implement subgraph-level self-attention layers to learn the different importance of different subgraphs for graph classification. Analogous to image-based attentional convolution networks that operate on locally connected and weighted regions of the input, we also extend graph normalization from a one-dimensional node sequence to a two-dimensional node grid by leveraging motif matching, and we design self-attention layers that do not require costly prior knowledge of the graph structure. Our results on both bioinformatics and social network datasets show that we significantly improve graph classification benchmarks over traditional graph kernels and existing deep models.
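The subgraph-level self-attention described in the abstract can be illustrated with a minimal sketch: each motif-matched, normalized subgraph is embedded as a feature vector, a learned scoring vector assigns each subgraph an importance weight via softmax, and the graph-level representation is the weighted sum. The function names and the single-vector attention parameterization here are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def subgraph_attention_pool(subgraph_feats, attn_vec):
    """Pool subgraph embeddings into one graph-level embedding.

    subgraph_feats: (num_subgraphs, dim) array, one row per
        motif-matched, normalized subgraph.
    attn_vec: (dim,) scoring vector (hypothetical parameterization;
        in practice this would be learned end to end).
    Returns the attention-weighted graph representation of shape (dim,).
    """
    scores = subgraph_feats @ attn_vec   # one relevance score per subgraph
    weights = softmax(scores)            # normalized subgraph importances
    return weights @ subgraph_feats      # weighted sum over subgraphs

# Toy usage: three subgraph embeddings of dimension 4.
rng = np.random.default_rng(0)
feats = rng.normal(size=(3, 4))
a = rng.normal(size=4)
graph_emb = subgraph_attention_pool(feats, a)
print(graph_emb.shape)  # (4,)
```

The resulting graph embedding would then feed a standard classifier head; the key point is that the attention weights let the model emphasize the subgraphs most informative for the classification task.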

Authors (6)
  1. Hao Peng (291 papers)
  2. Jianxin Li (128 papers)
  3. Qiran Gong (4 papers)
  4. Senzhang Wang (57 papers)
  5. Yuanxing Ning (4 papers)
  6. Philip S. Yu (592 papers)
Citations (12)
