
On the Importance of Hyperparameters and Data Augmentation for Self-Supervised Learning (2207.07875v1)

Published 16 Jul 2022 in cs.LG, cs.AI, and cs.CV

Abstract: Self-Supervised Learning (SSL) has become a very active area of Deep Learning research where it is heavily used as a pre-training method for classification and other tasks. However, the rapid pace of advancements in this area comes at a price: training pipelines vary significantly across papers, which presents a potentially crucial confounding factor. Here, we show that, indeed, the choice of hyperparameters and data augmentation strategies can have a dramatic impact on performance. To shed light on these neglected factors and help maximize the power of SSL, we hyperparameterize these components and optimize them with Bayesian optimization, showing improvements across multiple datasets for the SimSiam SSL approach. Realizing the importance of data augmentations for SSL, we also introduce a new automated data augmentation algorithm, GroupAugment, which considers groups of augmentations and optimizes the sampling across groups. In contrast to algorithms designed for supervised learning, GroupAugment achieved consistently high linear evaluation accuracy across all datasets we considered. Overall, our results indicate the importance and likely underestimated role of data augmentation for SSL.
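The abstract describes GroupAugment as sampling across groups of augmentations with optimized group probabilities. The paper's actual algorithm is not reproduced here; the following is a minimal illustrative sketch of that idea, where the group names, weights, and toy string "augmentations" are all hypothetical placeholders (in practice the weights would be tuned, e.g. by Bayesian optimization, and the augmentations would be image transforms).

```python
import random

# Hypothetical augmentation groups; real groups would contain image
# transforms (color jitter, random crop, blur, ...), not string tags.
AUGMENTATION_GROUPS = {
    "color":    [lambda x: x + "|jitter", lambda x: x + "|grayscale"],
    "geometry": [lambda x: x + "|crop",   lambda x: x + "|flip"],
    "noise":    [lambda x: x + "|blur"],
}

def group_augment(image, group_weights, rng=random):
    """Sample a group according to its weight, then apply an
    augmentation chosen uniformly from within that group."""
    groups = list(AUGMENTATION_GROUPS)
    weights = [group_weights[g] for g in groups]
    group = rng.choices(groups, weights=weights, k=1)[0]
    aug = rng.choice(AUGMENTATION_GROUPS[group])
    return aug(image)

# Illustrative weights, e.g. as a search might tune them.
weights = {"color": 0.5, "geometry": 0.4, "noise": 0.1}
augmented = group_augment("img", weights)
```

The key design point the paper's description implies is the two-level sampling: the distribution over *groups* is the optimized quantity, decoupled from the choice of individual augmentation within a group.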

Authors (6)
  1. Diane Wagner (3 papers)
  2. Fabio Ferreira (22 papers)
  3. Danny Stoll (9 papers)
  4. Robin Tibor Schirrmeister (18 papers)
  5. Samuel Müller (31 papers)
  6. Frank Hutter (177 papers)
Citations (15)
