
A systematic approach to random data augmentation on graph neural networks

Published 8 Dec 2021 in cs.LG, cs.AI, and stat.ML (arXiv:2112.04314v2)

Abstract: Random data augmentations (RDAs) are state of the art regarding practical graph neural networks that are provably universal. There is great diversity regarding terminology, methodology, benchmarks, and evaluation metrics used among existing RDAs. Not only does this make it increasingly difficult for practitioners to decide which technique to apply to a given problem, but it also stands in the way of systematic improvements. We propose a new comprehensive framework that captures all previous RDA techniques. On the theoretical side, among other results, we formally prove that under natural conditions all instantiations of our framework are universal. On the practical side, we develop a method to systematically and automatically train RDAs. This in turn enables us to impartially and objectively compare all existing RDAs. New RDAs naturally emerge from our approach, and our experiments demonstrate that they improve the state of the art.
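To make the idea of a random data augmentation (RDA) concrete, the sketch below illustrates one common instance of the general technique: appending i.i.d. random features to each node before message passing, so that structurally symmetric nodes become distinguishable. This is a minimal, hypothetical illustration assuming only NumPy; it is not the framework proposed in the paper, and all function names are made up for this example.

# Minimal sketch of one well-known RDA: random node feature initialization.
# Illustrative only; not the paper's framework. Assumes NumPy.
import numpy as np

def augment_with_random_features(x, num_random_dims=8, rng=None):
    """Concatenate i.i.d. random features to each node's feature vector.

    x: (num_nodes, num_features) node feature matrix.
    Returns a (num_nodes, num_features + num_random_dims) matrix.
    """
    rng = np.random.default_rng() if rng is None else rng
    r = rng.standard_normal((x.shape[0], num_random_dims))
    return np.concatenate([x, r], axis=1)

def mean_neighbor_aggregation(adj, x):
    """One round of mean aggregation over neighbors (a simple message-passing step)."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    return adj @ x / deg

if __name__ == "__main__":
    # Toy graph: a 4-cycle with constant node features.
    adj = np.array([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=float)
    x = np.ones((4, 2))

    # Without augmentation, all nodes of the cycle stay indistinguishable under
    # aggregation. With random features, symmetric nodes receive distinct inputs,
    # which is what lets randomized GNNs exceed WL-style expressiveness limits.
    x_aug = augment_with_random_features(x, num_random_dims=4,
                                         rng=np.random.default_rng(0))
    print(mean_neighbor_aggregation(adj, x_aug))

The paper's contribution, per the abstract, is a framework that subsumes augmentations of this kind, proves their universality under natural conditions, and trains them systematically rather than hand-tuning the random features as above.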
