Learning Phrase Embeddings from Paraphrases with GRUs (1710.05094v1)

Published 13 Oct 2017 in cs.CL

Abstract: Learning phrase representations has been widely explored in many NLP tasks (e.g., Sentiment Analysis, Machine Translation) and has shown promising improvements. Previous studies either learn non-compositional phrase representations with general word embedding learning techniques or learn compositional phrase representations based on syntactic structures, which either require huge amounts of human annotation or cannot be easily generalized to all phrases. In this work, we propose to take advantage of a large-scale paraphrase database and present a pair-wise gated recurrent units (pairwise-GRU) framework to generate compositional phrase representations. Our framework can be re-used to generate representations for any phrase. Experimental results show that our framework achieves state-of-the-art results on several phrase similarity tasks.
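The abstract describes composing a phrase representation by running word vectors through a GRU and training on paraphrase pairs. The sketch below is not the authors' code: it shows only the general idea of a shared GRU encoder producing a phrase vector (final hidden state) whose similarity to another phrase's vector could be compared; dimensions, initialization, and the cosine-similarity comparison are all assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (illustrative; hyperparameters are assumptions)."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        # Update gate, reset gate, and candidate weights for input and hidden.
        self.Wz = rng.normal(0, s, (hidden_dim, input_dim))
        self.Uz = rng.normal(0, s, (hidden_dim, hidden_dim))
        self.Wr = rng.normal(0, s, (hidden_dim, input_dim))
        self.Ur = rng.normal(0, s, (hidden_dim, hidden_dim))
        self.Wh = rng.normal(0, s, (hidden_dim, input_dim))
        self.Uh = rng.normal(0, s, (hidden_dim, hidden_dim))
        self.hidden_dim = hidden_dim

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)               # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)               # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))   # candidate state
        return (1 - z) * h + z * h_tilde

def phrase_embedding(word_vectors, cell):
    """Compose a sequence of word vectors into one phrase vector
    by returning the GRU's final hidden state."""
    h = np.zeros(cell.hidden_dim)
    for x in word_vectors:
        h = cell.step(x, h)
    return h

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy usage: both phrases share one encoder. In a pairwise training setup,
# paraphrase pairs would be pushed toward high similarity and random pairs
# toward low similarity; here we only compute the similarity of two
# randomly embedded phrases.
cell = GRUCell(input_dim=8, hidden_dim=16)
rng = np.random.default_rng(1)
phrase_a = [rng.normal(size=8) for _ in range(3)]  # 3-word phrase
phrase_b = [rng.normal(size=8) for _ in range(2)]  # 2-word phrase
u = phrase_embedding(phrase_a, cell)
v = phrase_embedding(phrase_b, cell)
sim = cosine(u, v)
```

Because the encoder is shared across phrases of any length, the same trained cell can produce a representation for any new phrase, which matches the paper's claim of reusability.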

Authors (3)
  1. Zhihao Zhou (13 papers)
  2. Lifu Huang (92 papers)
  3. Heng Ji (267 papers)
Citations (9)