SSFG: Stochastically Scaling Features and Gradients for Regularizing Graph Convolutional Networks

Published 20 Feb 2021 in cs.LG and cs.CV (arXiv:2102.10338v2)

Abstract: Graph convolutional networks have been successfully applied to various graph-based tasks. In a typical graph convolutional layer, node features are updated by aggregating neighborhood information. Repeatedly applying graph convolutions can cause the oversmoothing issue, i.e., node features at deep layers converge to similar values. Previous studies have suggested that oversmoothing is one of the major issues restricting the performance of graph convolutional networks. In this paper, we propose a stochastic regularization method to tackle the oversmoothing problem. In the proposed method, we stochastically scale features and gradients (SSFG) by a factor sampled from a probability distribution during training. Explicitly applying a scaling factor to break feature convergence alleviates the oversmoothing issue. We show that stochastic scaling at the gradient level complements scaling at the feature level in improving overall performance. Our method does not increase the number of trainable parameters, and when used together with ReLU, SSFG can be seen as a stochastic ReLU activation function. We experimentally validate our SSFG regularization method on three commonly used types of graph networks. Extensive experimental results on seven benchmark datasets for four graph-based tasks demonstrate that SSFG regularization is effective in improving the overall performance of the baseline graph networks.
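
To make the mechanism concrete, below is a minimal PyTorch sketch of stochastic feature/gradient scaling. It assumes one scalar factor per call, sampled from a Beta distribution shifted to have mean 1; the abstract only states that factors are drawn from a probability distribution, so the distribution choice, the `concentration` parameter, and the `ssfg` helper are illustrative assumptions, not the authors' exact implementation.

```python
import torch


def _sample_factor(concentration: float) -> torch.Tensor:
    # Illustrative assumption: Beta(c, c) shifted to (0.5, 1.5),
    # so the factor has mean 1 and fluctuates around the identity.
    beta = torch.distributions.Beta(concentration, concentration)
    return 0.5 + beta.sample()


class _SSFG(torch.autograd.Function):
    """Scale features on the forward pass and gradients on the
    backward pass by independently sampled factors, so the two
    levels of scaling are decoupled as the abstract describes."""

    @staticmethod
    def forward(ctx, x, concentration):
        ctx.concentration = concentration
        s_feat = _sample_factor(concentration).to(x.device)
        return x * s_feat  # perturb features to break convergence

    @staticmethod
    def backward(ctx, grad_output):
        # Independently sampled factor for the gradient path.
        s_grad = _sample_factor(ctx.concentration).to(grad_output.device)
        return grad_output * s_grad, None  # no grad for concentration


def ssfg(x: torch.Tensor, concentration: float = 4.0,
         training: bool = True) -> torch.Tensor:
    """Identity at evaluation time; stochastic scaling while training.
    Adds no trainable parameters, matching the abstract's claim."""
    if not training:
        return x
    return _SSFG.apply(x, concentration)
```

In a graph convolutional layer this would typically wrap the activation, e.g. `h = ssfg(torch.relu(gconv(x, adj)), training=self.training)` (with `gconv` a hypothetical graph convolution), which reflects the abstract's view of SSFG combined with ReLU as a stochastic ReLU activation.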

Citations (8)
