
Semi-supervised Learning with Contrastive Predicative Coding (1905.10514v1)

Published 25 May 2019 in cs.LG and stat.ML

Abstract: Semi-supervised learning (SSL) provides a powerful framework for leveraging unlabeled data when labels are limited or expensive to obtain. SSL algorithms based on deep neural networks have recently proven successful on standard benchmark tasks. However, many of them have thus far been either inflexible, inefficient, or non-scalable. This paper explores the recently developed contrastive predictive coding (CPC) technique to improve the discriminative power of deep learning models when a large portion of labels is absent. Two models, cpc-SSL and a class-conditional variant (ccpc-SSL), are presented. They effectively exploit the unlabeled data by extracting shared information between different parts of the (high-dimensional) data. The proposed approaches are inductive and scale well to very large datasets like ImageNet, making them good candidates for real-world large-scale applications.
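The contrastive objective at the core of CPC can be illustrated with a small sketch. The snippet below is an assumed, minimal NumPy version of the InfoNCE loss (the standard CPC objective): a context vector should score its true target higher than negative samples drawn from other parts of the data. Function names and shapes here are illustrative, not the paper's implementation.

```python
import numpy as np

def info_nce_loss(context, positive, negatives):
    """Minimal InfoNCE loss as used in contrastive predictive coding.

    context:   1-D embedding of the observed context.
    positive:  1-D embedding of the true target to be predicted.
    negatives: list of 1-D embeddings sampled from other data regions.
    Returns -log softmax-probability assigned to the positive.
    """
    # Similarity scores: positive first, then the negatives.
    scores = np.array([context @ positive] + [context @ n for n in negatives])
    scores = scores - scores.max()                 # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum())
    return -log_probs[0]

# Toy example: the positive is aligned with the context, negatives are random.
rng = np.random.default_rng(0)
c = np.array([1.0, 0.0])
pos = np.array([0.9, 0.1])
negs = [rng.standard_normal(2) for _ in range(5)]
loss = info_nce_loss(c, pos, negs)
```

Minimizing this loss over many (context, target) pairs is what lets the model extract information shared between different parts of the input without labels; the paper's semi-supervised variants then combine such an unsupervised term with a supervised loss on the labeled subset.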

Authors (5)
  1. Jiaxing Wang (16 papers)
  2. Yin Zheng (23 papers)
  3. Xiaoshuang Chen (28 papers)
  4. Junzhou Huang (137 papers)
  5. Jian Cheng (127 papers)
