
Structure-Aware Label Smoothing for Graph Neural Networks (2112.00499v1)

Published 1 Dec 2021 in cs.LG, cs.AI, and cs.SI

Abstract: Representing labels as one-hot vectors is common practice when training node classification models. However, the one-hot representation may not adequately reflect a node's semantic relationship to different classes, as some nodes may be semantically close to neighbors belonging to other classes. This causes over-confidence, since models are encouraged to assign full probability to a single class for every node. While training with label smoothing can ease this problem to some degree, it still fails to capture the nodes' semantic characteristics implied by the graph structure. In this work, we propose SALS (Structure-Aware Label Smoothing), a novel enhancement component for popular node classification models. SALS leverages the graph structure to capture semantic correlations between connected nodes and generates a structure-aware label distribution that replaces the original one-hot label vectors, improving node classification performance without additional inference cost. Extensive experiments on seven node classification benchmark datasets demonstrate the effectiveness of SALS for both transductive and inductive node classification. Empirical results show that SALS is superior to standard label smoothing and enables node classification models to outperform the baseline methods.
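
The core idea, replacing hard one-hot targets with a label distribution informed by each node's neighborhood, can be illustrated with a minimal sketch. The function below is a hypothetical instantiation rather than the paper's exact SALS formulation: it mixes a node's one-hot label with the empirical label distribution of its neighbors, with a coefficient `alpha` controlling how much probability mass moves to the neighborhood. The helper name, edge-list format, and default values are assumptions for illustration only.

```python
import numpy as np

def structure_aware_smooth_labels(edge_index, labels, num_classes, alpha=0.1):
    """Hypothetical sketch: mix each node's one-hot label with the
    empirical label distribution of its neighbors (not the paper's
    exact method). `alpha` is the smoothing coefficient."""
    num_nodes = len(labels)
    one_hot = np.eye(num_classes)[labels]            # (N, C) one-hot targets

    # Accumulate neighbor label counts from a directed edge list (src -> dst).
    neighbor_counts = np.zeros((num_nodes, num_classes))
    for src, dst in edge_index:
        neighbor_counts[dst] += one_hot[src]

    # Normalize counts to a distribution; isolated nodes keep their own label.
    row_sums = neighbor_counts.sum(axis=1, keepdims=True)
    neighbor_dist = np.divide(neighbor_counts, row_sums,
                              out=one_hot.copy(), where=row_sums > 0)

    # Soft targets: (1 - alpha) * own label + alpha * neighborhood distribution.
    return (1.0 - alpha) * one_hot + alpha * neighbor_dist

# Toy usage: a 4-node path graph (0-1-2-3) with two classes.
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (2, 3), (3, 2)]
labels = np.array([0, 0, 1, 1])
print(structure_aware_smooth_labels(edges, labels, num_classes=2, alpha=0.2))
```

The resulting soft targets would then be used with a cross-entropy loss against the model's predicted class distribution, which is why the approach adds no cost at inference time.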

Citations (3)
