
PAUSE: Positive and Annealed Unlabeled Sentence Embedding (2109.03155v1)

Published 7 Sep 2021 in cs.CL, cs.AI, and cs.LG

Abstract: Sentence embedding refers to a set of effective and versatile techniques for converting raw text into numerical vector representations that can be used in a wide range of NLP applications. The majority of these techniques are either supervised or unsupervised. Compared to the unsupervised methods, the supervised ones make fewer assumptions about optimization objectives and usually achieve better results. However, their training requires a large number of labeled sentence pairs, which are not available in many industrial scenarios. To that end, we propose a generic and end-to-end approach -- PAUSE (Positive and Annealed Unlabeled Sentence Embedding), capable of learning high-quality sentence embeddings from a partially labeled dataset. We experimentally show that PAUSE achieves, and sometimes surpasses, state-of-the-art results using only a small fraction of labeled sentence pairs on various benchmark tasks. When applied to a real industrial use case where labeled samples are scarce, PAUSE lets us extend our dataset without the burden of extensive manual annotation work.
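The abstract's key idea, learning from a partially labeled dataset, is commonly handled with positive-unlabeled (PU) learning, where the "negative" risk is estimated from unlabeled data and down-weighted early in training. The paper's exact objective is not reproduced here; the sketch below is a generic illustration of a non-negative PU risk with an annealing weight (in the style of Kiryo et al.'s nnPU), where the class prior `prior` and the schedule for `anneal` are assumptions for the example, not values from the paper.

```python
import numpy as np

def pu_loss(scores_pos, scores_unl, prior, anneal):
    """Illustrative positive-unlabeled risk with an annealing weight.

    scores_pos : model scores for labeled positive pairs
    scores_unl : model scores for unlabeled pairs
    prior      : assumed class prior (fraction of positives among unlabeled)
    anneal     : weight in [0, 1], typically ramped up over training
    """
    # Sigmoid surrogate losses: penalize low scores for positives,
    # high scores for (estimated) negatives.
    loss_pos = lambda s: np.log1p(np.exp(-s))
    loss_neg = lambda s: np.log1p(np.exp(s))

    risk_pos = prior * loss_pos(scores_pos).mean()
    # Negative risk estimated from the unlabeled set, corrected for the
    # positives hidden inside it, and clamped at zero (non-negative PU).
    risk_neg = loss_neg(scores_unl).mean() - prior * loss_neg(scores_pos).mean()
    return risk_pos + anneal * max(risk_neg, 0.0)

# Example: a small batch, with the annealing weight at mid-schedule.
loss = pu_loss(np.array([2.0, 1.5]), np.array([-1.0, 0.5, -2.0]),
               prior=0.3, anneal=0.5)
```

In practice, `anneal` starts near zero so the model first fits the scarce labeled positives, then grows so the unlabeled-derived negative risk increasingly shapes the embedding space.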

Authors (6)
  1. Lele Cao (28 papers)
  2. Emil Larsson (1 paper)
  3. Vilhelm von Ehrenheim (8 papers)
  4. Dhiana Deva Cavalcanti Rocha (1 paper)
  5. Anna Martin (10 papers)
  6. Sonja Horn (2 papers)
Citations (5)