
On the Memorization Properties of Contrastive Learning (2107.10143v1)

Published 21 Jul 2021 in cs.LG and stat.ML

Abstract: Memorization studies of deep neural networks (DNNs) help us understand what patterns DNNs learn and how they learn them, and motivate improvements to DNN training approaches. In this work, we investigate the memorization properties of SimCLR, a widely used contrastive self-supervised learning approach, and compare them to the memorization behavior of supervised learning and random-labels training. We find that both training objects and augmentations may vary in complexity, in the sense of how easily SimCLR learns them. Moreover, we show that SimCLR is similar to random-labels training in terms of the distribution of training object complexity.
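For context, SimCLR trains by pulling together the embeddings of two augmented views of the same image and pushing apart embeddings of different images, using the NT-Xent (normalized temperature-scaled cross-entropy) loss. The sketch below is a minimal NumPy illustration of that objective, not the paper's code; the function name, shapes, and default temperature are illustrative assumptions:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Minimal sketch of SimCLR's NT-Xent contrastive loss.

    z1, z2: (N, d) arrays holding embeddings of two augmented views
    of the same N images; row i of z1 and row i of z2 form a positive pair.
    """
    z = np.concatenate([z1, z2], axis=0)               # (2N, d) stacked views
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize embeddings
    sim = (z @ z.T) / temperature                      # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    n = z1.shape[0]
    # The positive partner of row i is row i+n (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    # Cross-entropy: positive-pair similarity vs. all other pairs.
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)
    return loss.mean()
```

Under this objective, "complexity" of a training object can be read as how hard it is for the network to align that object's augmented views: identical views yield a low loss, while unrelated embeddings yield a high one.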

Authors (3)
  1. Ildus Sadrtdinov (4 papers)
  2. Nadezhda Chirkova (25 papers)
  3. Ekaterina Lobacheva (17 papers)
Citations (2)