Unsupervised node clustering via contrastive hard sampling (2409.07718v1)
Abstract: This paper introduces a fine-grained contrastive learning scheme for unsupervised node clustering. Previous clustering methods focus only on a small subset of features (class-dependent features) that exhibit explicit clustering characteristics, ignoring the rest of the feature space (class-invariant features). This paper exploits class-invariant features via graph contrastive learning to discover additional high-quality features for unsupervised clustering. We formulate a novel node-level fine-grained augmentation framework for self-supervised learning, which iteratively identifies competitive contrastive samples from the whole feature space in the form of positive and negative examples of node relations. While positive examples of node relations are usually expressed as edges under graph homophily, negative examples are only implicit, as they share no direct edge. We show, however, that simply sampling nodes beyond the local neighborhood yields less competitive negative pairs that are less effective for contrastive learning. Inspired by counterfactual augmentation, we instead sample competitive negative node relations by creating virtual nodes that inherit (in a self-supervised fashion) class-invariant features while altering class-dependent features, producing contrasting pairs that lie closer to the decision boundary and offer stronger contrast. Consequently, our experiments demonstrate significant improvements in unsupervised node clustering over six baselines on six real-world social network datasets.
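To make the counterfactual negative sampling idea concrete, the sketch below is an illustrative interpretation rather than the paper's implementation: the function names, the boolean feature-dimension mask separating class-dependent from class-invariant dimensions, and the single-hard-negative InfoNCE-style loss are all assumptions. It builds a virtual negative for each node by keeping that node's class-invariant dimensions and swapping in another node's class-dependent dimensions, while graph edges supply the positive pairs.

```python
# Illustrative sketch (assumed, not the paper's code): counterfactual hard
# negatives for graph contrastive learning.
import torch
import torch.nn.functional as F


def counterfactual_negatives(z, dep_mask, perm=None):
    """Create virtual negative nodes: keep each node's class-invariant
    dimensions, replace its class-dependent dimensions with those of a
    donor node.

    z        : (N, D) node embeddings
    dep_mask : (D,) boolean mask, True for class-dependent dimensions
               (assumed to be identified elsewhere, e.g. by a gating module)
    perm     : optional (N,) indices of donor nodes; random if omitted
    """
    if perm is None:
        perm = torch.randperm(z.size(0), device=z.device)
    z_neg = z.clone()
    z_neg[:, dep_mask] = z[perm][:, dep_mask]  # alter class-dependent part only
    return z_neg


def contrastive_loss(z, edge_index, dep_mask, tau=0.5):
    """InfoNCE-style objective: edges give positive pairs, counterfactual
    virtual nodes give hard negatives (one negative per anchor here)."""
    z = F.normalize(z, dim=-1)
    src, dst = edge_index                       # positive pairs from graph edges
    z_neg = F.normalize(counterfactual_negatives(z, dep_mask), dim=-1)

    pos = (z[src] * z[dst]).sum(-1) / tau       # similarity to linked neighbor
    neg = (z[src] * z_neg[src]).sum(-1) / tau   # similarity to virtual negative
    return -torch.log(pos.exp() / (pos.exp() + neg.exp())).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    z = torch.randn(8, 16, requires_grad=True)               # toy embeddings
    edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])  # toy edges
    dep_mask = torch.zeros(16, dtype=torch.bool)
    dep_mask[:4] = True        # pretend the first 4 dims are class-dependent
    loss = contrastive_loss(z, edge_index, dep_mask)
    loss.backward()
    print(float(loss))
```

Because the virtual negative shares a node's class-invariant content but differs in its class-dependent content, it sits near the cluster boundary, which is what makes it a harder and more informative contrast than a randomly sampled distant node.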