Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks (2011.07267v2)

Published 14 Nov 2020 in cs.LG

Abstract: Self-supervised learning is currently gaining a lot of attention, as it allows neural networks to learn robust representations from large quantities of unlabeled data. Additionally, multi-task learning can further improve representation learning by training networks simultaneously on related tasks, leading to significant performance improvements. In this paper, we propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion. Since Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points, we use them as a building block to achieve competitive results on standard semi-supervised graph classification tasks.
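The abstract only outlines the high-level setup: a shared GCN encoder trained jointly on a semi-supervised classification objective and several self-supervised auxiliary objectives. The sketch below is an illustration of that multi-task wiring, not the paper's method: the single auxiliary task shown (adjacency reconstruction) and all layer sizes, loss weights, and helper names are assumptions, since the abstract does not specify the three proposed tasks.

```python
# Minimal sketch (assumed, not the paper's exact method): a two-layer GCN encoder
# shared between a semi-supervised node-classification head and one illustrative
# self-supervised auxiliary head, trained with a weighted multi-task loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        return F.relu(self.lin(a_hat @ h))


class MultiTaskGCN(nn.Module):
    """Shared GCN encoder with a main classification head and an auxiliary head."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.enc1 = GCNLayer(in_dim, hid_dim)
        self.enc2 = GCNLayer(hid_dim, hid_dim)
        self.cls_head = nn.Linear(hid_dim, n_classes)   # supervised task
        self.aux_head = nn.Linear(hid_dim, hid_dim)     # auxiliary (illustrative) task

    def forward(self, a_hat, x):
        z = self.enc2(a_hat, self.enc1(a_hat, x))
        logits = self.cls_head(z)                       # node-class logits
        adj_logits = self.aux_head(z) @ z.t()           # reconstruct adjacency from embeddings
        return logits, adj_logits


def normalize_adjacency(adj):
    """Symmetric normalization of A + I, as in the standard GCN formulation."""
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)


if __name__ == "__main__":
    # Tiny random graph, only to show how the joint loss is assembled.
    n, d, c = 8, 16, 3
    adj = (torch.rand(n, n) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float()
    x = torch.randn(n, d)
    y = torch.randint(0, c, (n,))
    labeled = torch.tensor([0, 1, 2])                   # semi-supervised: few labeled nodes

    model = MultiTaskGCN(d, 32, c)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    a_hat = normalize_adjacency(adj)

    for _ in range(50):
        logits, adj_logits = model(a_hat, x)
        loss_main = F.cross_entropy(logits[labeled], y[labeled])
        loss_aux = F.binary_cross_entropy_with_logits(adj_logits, adj)
        loss = loss_main + 0.5 * loss_aux               # assumed auxiliary weight
        opt.zero_grad()
        loss.backward()
        opt.step()
```

The point of the sketch is the shared encoder: gradients from both heads update the same GCN layers, which is the multi-task mechanism the abstract credits for the improved representations.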

Authors (2)
  1. Franco Manessi (8 papers)
  2. Alessandro Rozza (13 papers)
Citations (40)
