
Graph Structure Learning with Variational Information Bottleneck (2112.08903v1)

Published 16 Dec 2021 in cs.LG and cs.AI

Abstract: Graph Neural Networks (GNNs) have shown promising results on a broad spectrum of applications. Most empirical studies of GNNs directly take the observed graph as input, assuming the observed structure perfectly depicts the accurate and complete relations between nodes. However, real-world graphs are inevitably noisy or incomplete, which can degrade the quality of graph representations. In this work, we propose a novel Variational Information Bottleneck guided Graph Structure Learning framework, namely VIB-GSL, from the perspective of information theory. VIB-GSL advances the Information Bottleneck (IB) principle for graph structure learning, providing a more elegant and universal framework for mining underlying task-relevant relations. VIB-GSL learns an informative and compressive graph structure to distill the actionable information for specific downstream tasks. VIB-GSL deduces a variational approximation for irregular graph data to form a tractable IB objective function, which facilitates training stability. Extensive experimental results demonstrate the superior effectiveness and robustness of VIB-GSL.
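
For context, the Information Bottleneck principle referenced in the abstract seeks a representation Z that stays predictive of the task label Y while compressing away the rest of the input, i.e. minimizing -I(Z; Y) + beta * I(Z; X) for a trade-off weight beta. The sketch below shows how such an objective is commonly made tractable with a variational bound (in the style of the Deep VIB literature); it is not the paper's implementation, and the tensor names (`mu`, `logvar`, `logits`) and the Gaussian posterior/prior choice are illustrative assumptions.

```python
# Minimal sketch of a variational IB-style loss (generic VIB, not VIB-GSL's code).
# Assumes an encoder that outputs a Gaussian posterior q(z|x) = N(mu, diag(exp(logvar)))
# and a classifier head that produces `logits` from a sampled z.
import torch
import torch.nn.functional as F

def reparameterize(mu, logvar):
    """Sample z ~ q(z|x) with the reparameterization trick so gradients flow through mu/logvar."""
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)

def vib_loss(logits, labels, mu, logvar, beta=0.01):
    """Task term (cross-entropy, a variational lower bound on I(Z; Y))
    plus beta times the KL to a standard normal prior (an upper bound on I(Z; X))."""
    task = F.cross_entropy(logits, labels)
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1).mean()
    return task + beta * kl
```

In VIB-GSL this kind of trade-off is applied to a learned graph structure for irregular graph data rather than to a plain feature vector, but the balance between the prediction term and the compression term is the same.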

Authors (7)
  1. Qingyun Sun (46 papers)
  2. Jianxin Li (128 papers)
  3. Hao Peng (291 papers)
  4. Jia Wu (93 papers)
  5. Xingcheng Fu (26 papers)
  6. Cheng Ji (40 papers)
  7. Philip S. Yu (592 papers)
Citations (125)
