Enhancing Sentence Embedding with Generalized Pooling (1806.09828v1)

Published 26 Jun 2018 in cs.CL, cs.AI, and cs.LG

Abstract: Pooling is an essential component of a wide variety of sentence representation and embedding models. This paper explores generalized pooling methods to enhance sentence embedding. We propose vector-based multi-head attention that includes the widely used max pooling, mean pooling, and scalar self-attention as special cases. The model benefits from properly designed penalization terms that reduce redundancy across attention heads. We evaluate the proposed model on three different tasks: natural language inference (NLI), author profiling, and sentiment classification. The experiments show that the proposed model achieves significant improvement over strong sentence-encoding-based methods, resulting in state-of-the-art performance on four datasets. The proposed approach can be readily applied to problems beyond those discussed in this paper.
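To make the abstract's core idea concrete, here is a minimal PyTorch sketch of vector-based multi-head attention pooling. It is an illustration of the technique as described in the abstract, not the authors' released implementation: for each head, a small MLP assigns a weight to every dimension of every token's hidden state, a softmax over the token axis normalizes those weights into per-dimension attention distributions, and the pooled vector is the weighted sum of the hidden states. The class and parameter names (VectorMultiHeadPooling, d_hidden, num_heads) are illustrative assumptions, and the paper's redundancy penalization terms are omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorMultiHeadPooling(nn.Module):
    """Sketch of vector-based multi-head attention pooling.

    Each head scores every dimension of every token independently,
    so attention weights form a (seq_len, d_model) map per head
    rather than a single scalar per token.
    """

    def __init__(self, d_model: int, d_hidden: int, num_heads: int):
        super().__init__()
        # One two-layer MLP per head produces per-dimension scores.
        self.heads = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.ReLU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_heads)
        ])

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, d_model) encoder hidden states.
        pooled = []
        for head in self.heads:
            scores = head(h)                      # (batch, seq_len, d_model)
            attn = F.softmax(scores, dim=1)       # normalize over tokens, per dimension
            pooled.append((attn * h).sum(dim=1))  # weighted sum -> (batch, d_model)
        # Concatenate heads into the final sentence embedding.
        return torch.cat(pooled, dim=-1)          # (batch, num_heads * d_model)

# Toy usage with random stand-ins for BiLSTM outputs.
pool = VectorMultiHeadPooling(d_model=300, d_hidden=64, num_heads=4)
states = torch.randn(2, 20, 300)  # (batch=2, seq_len=20, d_model=300)
embedding = pool(states)          # (2, 1200)
```

This formulation shows the sense in which the pooling is "generalized": tying all per-dimension weights within a head recovers scalar self-attention, uniform weights over tokens recover mean pooling, and a one-hot weight per dimension recovers max pooling.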

Authors (3)
  1. Qian Chen (264 papers)
  2. Zhen-Hua Ling (114 papers)
  3. Xiaodan Zhu (94 papers)
Citations (67)
