
Few-shot Text Classification with Dual Contrastive Consistency (2209.15069v1)

Published 29 Sep 2022 in cs.CL, cs.AI, and cs.LG

Abstract: In this paper, we explore how to utilize a pre-trained language model to perform few-shot text classification, where only a few annotated examples are given for each class. Since fine-tuning the language model with the traditional cross-entropy loss in this scenario causes serious overfitting and leads to sub-optimal generalization, we adopt supervised contrastive learning on the few labeled data and consistency regularization on vast unlabeled data. Moreover, we propose a novel contrastive consistency to further boost model performance and refine sentence representations. After conducting extensive experiments on four datasets, we demonstrate that our model (FTCC) outperforms state-of-the-art methods and has better robustness.
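The abstract's first ingredient is supervised contrastive learning over the few labeled examples: embeddings of same-class sentences are pulled together and different-class ones pushed apart. As a rough illustration (not the paper's implementation), here is a minimal NumPy sketch of the supervised contrastive loss; the function name, shapes, and temperature value are illustrative assumptions.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Illustrative supervised contrastive loss.

    features: (N, D) sentence embeddings (will be L2-normalized here).
    labels:   (N,) integer class labels for the few labeled examples.
    """
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature      # pairwise cosine similarities
    np.fill_diagonal(sim, -np.inf)                 # exclude self-comparisons
    # log-softmax over each row (self term contributes exp(-inf) = 0)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    n = len(labels)
    total = 0.0
    for i in range(n):
        positives = (labels == labels[i]) & (np.arange(n) != i)
        if positives.any():
            # average negative log-probability over same-class anchors
            total += -log_prob[i, positives].mean()
    return total / n
```

Intuitively, the loss is small when same-class embeddings dominate each row's softmax, which is exactly the clustering behavior the paper relies on instead of a pure cross-entropy fine-tune.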

Authors (2)
  1. Liwen Sun (4 papers)
  2. Jiawei Han (263 papers)
