Improving Tail-Class Representation with Centroid Contrastive Learning (2110.10048v2)

Published 19 Oct 2021 in cs.CV

Abstract: In the vision domain, large-scale natural datasets typically exhibit a long-tailed distribution with large class imbalance between head and tail classes. This distribution makes it difficult to learn good representations for tail classes. Recent work has shown that a good long-tailed model can be learned by decoupling training into representation learning and classifier balancing. However, these works pay insufficient attention to the effect of the long-tailed distribution on representation learning itself. In this work, we propose interpolative centroid contrastive learning (ICCL) to improve long-tailed representation learning. ICCL interpolates two images, one from a class-agnostic sampler and one from a class-aware sampler, and trains the model such that the representation of the interpolated image can be used to retrieve the centroids of both source classes. We demonstrate the effectiveness of our approach on multiple long-tailed image classification benchmarks. Our results show a significant accuracy gain of 2.8% on the iNaturalist 2018 dataset, which has a real-world long-tailed distribution.
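The core idea in the abstract — interpolate two images and train their shared representation to retrieve both source-class centroids — can be sketched as a soft-target contrastive loss over class centroids. The following is a minimal NumPy sketch, not the authors' implementation: the function name `icc_loss`, the temperature value, and the use of cosine similarity to centroids are assumptions for illustration.

```python
import numpy as np

def icc_loss(z, centroids, ya, yb, lam, tau=0.1):
    """Hypothetical sketch of an interpolative centroid contrastive loss.

    z         : (B, D) embeddings of interpolated images, where each input was
                x_mix = lam * x_a + (1 - lam) * x_b, with x_a drawn from a
                class-agnostic sampler and x_b from a class-aware sampler.
    centroids : (C, D) one centroid vector per class.
    ya, yb    : (B,) class indices of x_a and x_b respectively.
    lam       : interpolation coefficient shared by inputs and targets.
    tau       : softmax temperature (value assumed, not from the paper).
    """
    # L2-normalize so the dot product is cosine similarity.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)

    logits = z @ c.T / tau  # (B, C) similarity of each embedding to each centroid
    # Log-softmax over classes (numerically stabilized).
    logits = logits - logits.max(axis=1, keepdims=True)
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Interpolated target: weight lam on class ya, (1 - lam) on class yb,
    # so the embedding must retrieve the centroids of BOTH source classes.
    rows = np.arange(z.shape[0])
    loss = -(lam * logp[rows, ya] + (1 - lam) * logp[rows, yb])
    return loss.mean()

# Toy usage: 3 classes with orthogonal centroids.
cents = np.eye(3)
# An embedding aligned with an interpolation of centroids 0 and 2 (lam = 0.7)
# should incur lower loss than one aligned with the unrelated centroid 1.
z_good = np.array([[0.7, 0.0, 0.3]])
z_bad = np.array([[0.0, 1.0, 0.0]])
ya, yb = np.array([0]), np.array([2])
assert icc_loss(z_good, cents, ya, yb, 0.7) < icc_loss(z_bad, cents, ya, yb, 0.7)
```

In this reading, the class-aware sampler gives tail classes more influence on the interpolated targets, so tail centroids receive gradient signal more often than their raw frequency would allow.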

Authors (6)
  1. Anthony Meng Huat Tiong (7 papers)
  2. Junnan Li (56 papers)
  3. Guosheng Lin (157 papers)
  4. Boyang Li (106 papers)
  5. Caiming Xiong (337 papers)
  6. Steven C. H. Hoi (94 papers)
Citations (12)
