DoubleMatch: Improving Semi-Supervised Learning with Self-Supervision (2205.05575v1)

Published 11 May 2022 in cs.LG, cs.CV, and stat.ML

Abstract: Following the success of supervised learning, semi-supervised learning (SSL) is becoming increasingly popular. SSL is a family of methods that, in addition to a labeled training set, also use a sizable collection of unlabeled data for fitting a model. Most recent successful SSL methods are based on pseudo-labeling: letting confident model predictions act as training labels. While these methods have shown impressive results on many benchmark datasets, a drawback of this approach is that not all unlabeled data are used during training. We propose a new SSL algorithm, DoubleMatch, which combines the pseudo-labeling technique with a self-supervised loss, enabling the model to utilize all unlabeled data in the training process. We show that this method achieves state-of-the-art accuracies on multiple benchmark datasets while also reducing training times compared to existing SSL methods. Code is available at https://github.com/walline/doublematch.
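The core idea in the abstract is to pair a confidence-masked pseudo-label loss (which only confident unlabeled examples contribute to) with a self-supervised term applied to every unlabeled example. The following is a minimal numpy sketch of that combination; the function name, the cosine-similarity choice for the self-supervised term, and the hyperparameters `tau` and `w_self` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def doublematch_style_unlabeled_loss(logits_weak, logits_strong,
                                     feat_weak, feat_strong,
                                     tau=0.95, w_self=1.0):
    """Illustrative combined loss on one batch of unlabeled examples.

    - Pseudo-label term: cross-entropy on the strong-view logits against
      the argmax of the weak-view predictions, masked so that only
      examples with confidence >= tau contribute.
    - Self-supervised term (assumed here: negative cosine similarity
      between features of the two views), applied to ALL examples,
      so no unlabeled data is wasted.
    """
    probs_w = softmax(logits_weak)
    conf = probs_w.max(axis=1)          # confidence of weak-view prediction
    pseudo = probs_w.argmax(axis=1)     # hard pseudo-labels
    mask = (conf >= tau).astype(float)  # 1.0 only for confident examples

    log_probs_s = np.log(softmax(logits_strong) + 1e-12)
    ce = -log_probs_s[np.arange(len(pseudo)), pseudo]
    loss_pl = (mask * ce).mean()

    # Normalize features, then take negative cosine similarity per example
    fw = feat_weak / np.linalg.norm(feat_weak, axis=1, keepdims=True)
    fs = feat_strong / np.linalg.norm(feat_strong, axis=1, keepdims=True)
    loss_self = -(fw * fs).sum(axis=1).mean()

    return loss_pl + w_self * loss_self
```

With uniform (unconfident) weak-view logits the mask zeroes out the pseudo-label term, yet the self-supervised term still produces a gradient signal, which is exactly the property the abstract highlights.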

Authors (4)
  1. Erik Wallin (21 papers)
  2. Lennart Svensson (81 papers)
  3. Fredrik Kahl (39 papers)
  4. Lars Hammarstrand (17 papers)
Citations (11)
