Doubly Contrastive Deep Clustering

Published 9 Mar 2021 in cs.CV (arXiv:2103.05484v1)

Abstract: Deep clustering provides more effective features than conventional methods and has thus become an important technique in unsupervised learning. However, most deep clustering methods ignore the informative positive and negative pairs introduced by data augmentation, and hence the benefits of contrastive learning, which leads to suboptimal performance. In this paper, we present a novel Doubly Contrastive Deep Clustering (DCDC) framework, which constructs contrastive losses over both sample and class views to obtain more discriminative features and competitive results. Specifically, for the sample view, we treat the class distributions of an original sample and its augmented version as a positive pair, and pair each sample with the other augmented samples in the mini-batch to form negative pairs. A sample-wise contrastive loss then pulls positive pairs together and pushes negative pairs apart. Similarly, for the class view, we build positive and negative pairs from the sample distributions of the classes. In this way, the two contrastive losses constrain the clustering results of mini-batch samples at both the sample and class levels. Extensive experiments on six benchmark datasets demonstrate the superiority of the proposed model over state-of-the-art methods. In particular, on the challenging Tiny-ImageNet dataset, our method outperforms the most recent competing method by 5.6%. Our code will be available at https://github.com/ZhiyuanDang/DCDC.
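The two views can be illustrated with a short sketch. The following is a minimal PyTorch rendering of the doubly contrastive idea described in the abstract, not the authors' released implementation: it assumes `p_a` and `p_b` are the softmax class-probability matrices of a mini-batch and its augmented view (shape N x C), and it uses a standard InfoNCE loss with cosine similarity and a temperature `tau`; the names `info_nce`, `dcdc_loss`, and `tau` are illustrative choices, not from the paper.

```python
# Illustrative sketch of a doubly contrastive clustering loss (assumed
# form, not the paper's exact code). p_a, p_b: (N, C) softmax outputs
# for two augmented views of the same mini-batch.
import torch
import torch.nn.functional as F

def info_nce(x, y, tau=0.5):
    """InfoNCE over paired rows: row i of x and row i of y form the
    positive pair; all other rows in the concatenated 2N set act as
    negatives."""
    n = x.size(0)
    z = F.normalize(torch.cat([x, y], dim=0), dim=1)  # (2N, D), unit rows
    sim = z @ z.t() / tau                             # cosine similarities
    sim.fill_diagonal_(float('-inf'))                 # exclude self-pairs
    # the positive of row i is row i + n, and vice versa
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)]).to(x.device)
    return F.cross_entropy(sim, targets)

def dcdc_loss(p_a, p_b, tau=0.5):
    # Sample view: each row is one sample's class distribution.
    sample_loss = info_nce(p_a, p_b, tau)
    # Class view: each column is one class's distribution over the batch,
    # so contrast the transposed matrices.
    class_loss = info_nce(p_a.t(), p_b.t(), tau)
    return sample_loss + class_loss

# Toy usage: a batch of 8 samples clustered into 4 classes.
p_a = torch.softmax(torch.randn(8, 4), dim=1)
p_b = torch.softmax(torch.randn(8, 4), dim=1)
print(dcdc_loss(p_a, p_b))
```

Transposing the probability matrices is what makes the loss "doubly" contrastive: the row-wise term aligns each sample's class distribution across augmentations, while the column-wise term aligns each class's assignment pattern over the batch, discouraging degenerate solutions where all samples collapse into one cluster.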

Citations (14)
