
Sen2Pro: A Probabilistic Perspective to Sentence Embedding from Pre-trained Language Model (2306.02247v1)

Published 4 Jun 2023 in cs.CL and cs.AI

Abstract: Sentence embedding is one of the most fundamental tasks in Natural Language Processing and plays an important role in many downstream tasks. The recent breakthrough in sentence embedding has been achieved by pre-trained language models (PLMs). Despite this success, an embedded vector (Sen2Vec) is a point estimate and does not naturally express uncertainty in a task-agnostic way. This paper therefore proposes an efficient framework for probabilistic sentence embedding (Sen2Pro) from PLMs, which represents a sentence as a probability density in an embedding space so as to reflect both model uncertainty and data uncertainty (i.e., the many-to-one nature of sentences) in the representation. The framework works in a plug-and-play manner without retraining the PLM, is easy to implement, and can be applied generally on top of any PLM. The superiority of Sen2Pro over Sen2Vec is verified theoretically and illustrated empirically on different NLP tasks.
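The abstract does not spell out how the density is estimated, but a common plug-and-play way to obtain model uncertainty from a frozen PLM is to keep dropout active and treat repeated stochastic forward passes as samples, then summarize them with a mean and variance. The sketch below follows that assumption purely for illustration; the model name, pooling choice, and sample count are hypothetical, and the paper's actual estimators for model and data uncertainty may differ.

```python
# Hedged sketch: a Gaussian-style probabilistic sentence embedding obtained by
# MC-dropout sampling from a frozen PLM. This is an assumed estimator for
# illustration, not the paper's exact Sen2Pro procedure.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # any PLM; used plug-and-play, no retraining
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.train()  # keep dropout active so each forward pass is a stochastic sample

@torch.no_grad()
def sen2pro(sentence: str, n_samples: int = 20):
    """Return (mean, variance) over sampled sentence embeddings."""
    inputs = tokenizer(sentence, return_tensors="pt")
    samples = []
    for _ in range(n_samples):
        hidden = model(**inputs).last_hidden_state      # (1, seq_len, dim)
        samples.append(hidden.mean(dim=1).squeeze(0))   # mean-pool over tokens
    stacked = torch.stack(samples)                      # (n_samples, dim)
    return stacked.mean(dim=0), stacked.var(dim=0)

mu, var = sen2pro("Sentence embeddings can carry uncertainty.")
print(mu.shape, var.shape)  # e.g. torch.Size([768]) torch.Size([768])
```

Data uncertainty (the many-to-one nature mentioned in the abstract) could analogously be probed by sampling perturbed inputs, e.g. random token dropout or paraphrases, and pooling their embeddings, though again this is an assumption rather than the paper's stated method.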

Authors (4)
  1. Lingfeng Shen (18 papers)
  2. Haiyun Jiang (34 papers)
  3. Lemao Liu (62 papers)
  4. Shuming Shi (126 papers)
Citations (2)
