
Graph Representation Learning Beyond Node and Homophily (2203.01564v1)

Published 3 Mar 2022 in cs.LG and cs.SI

Abstract: Unsupervised graph representation learning aims to distill various graph information into a downstream task-agnostic dense vector embedding. However, existing graph representation learning approaches are designed mainly under the node homophily assumption (connected nodes tend to have similar labels) and optimize performance on node-centric downstream tasks. This design contradicts the task-agnostic principle and generally yields poor performance on tasks, e.g., edge classification, that demand feature signals beyond the node view and the homophily assumption. To condense different feature signals into the embeddings, this paper proposes PairE, a novel unsupervised graph embedding method that uses two paired nodes as the basic unit of embedding, retaining the high-frequency signals between nodes to support both node-related and edge-related tasks. Accordingly, a multi-self-supervised autoencoder is designed to fulfill two pretext tasks: one better retains the high-frequency signal, and the other enhances the representation of commonality. Extensive experiments on a diverse set of benchmark datasets show that PairE outperforms unsupervised state-of-the-art baselines, with up to 101.1% relative improvement on edge classification tasks that rely on both the high- and low-frequency signals in the pair, and up to 82.5% relative performance gain on node classification tasks.
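The abstract's central idea is to embed node pairs rather than single nodes, training a shared encoder against two reconstruction-style pretext tasks. The sketch below illustrates that idea in PyTorch; the class name `PairAutoencoder`, the concatenation-based pair features, the MSE losses, and all layer sizes are illustrative assumptions rather than the paper's actual architecture.

```python
# Hypothetical sketch of a pair-based, multi-task autoencoder.
# The paper's real feature construction, decoders, and losses are not given
# in the abstract; everything here is an illustrative assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PairAutoencoder(nn.Module):
    def __init__(self, feat_dim: int, embed_dim: int = 128):
        super().__init__()
        # Shared encoder: maps a pair representation (the two endpoints'
        # features concatenated) to a dense embedding.
        self.encoder = nn.Sequential(
            nn.Linear(2 * feat_dim, embed_dim),
            nn.ReLU(),
            nn.Linear(embed_dim, embed_dim),
        )
        # Two decoder heads for two pretext tasks:
        #  - "self" head reconstructs the endpoints' own features
        #    (intended to keep the high-frequency signal between the pair),
        #  - "context" head reconstructs aggregated neighborhood features
        #    (intended to emphasize the commonality / low-frequency signal).
        self.self_head = nn.Linear(embed_dim, 2 * feat_dim)
        self.context_head = nn.Linear(embed_dim, 2 * feat_dim)

    def forward(self, pair_feats: torch.Tensor, pair_context: torch.Tensor):
        z = self.encoder(pair_feats)
        loss_self = F.mse_loss(self.self_head(z), pair_feats)
        loss_context = F.mse_loss(self.context_head(z), pair_context)
        return z, loss_self + loss_context


# Usage sketch: pair_feats is [num_pairs, 2 * feat_dim], built by concatenating
# the two endpoints' features; pair_context aggregates their neighborhoods.
model = PairAutoencoder(feat_dim=64)
pair_feats = torch.randn(32, 128)
pair_context = torch.randn(32, 128)
embedding, loss = model(pair_feats, pair_context)
```

A pair-level embedding like this can then be pooled back to nodes for node classification or used directly for edge classification, which is consistent with the abstract's claim that the method serves both task families.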

Authors (4)
  1. You Li (58 papers)
  2. Bei Lin (2 papers)
  3. Binli Luo (3 papers)
  4. Ning Gui (16 papers)
Citations (14)
