Retrofitting Word Vectors to Semantic Lexicons (1411.4166v4)

Published 15 Nov 2014 in cs.CL

Abstract: Vector space word representations are learned from distributional information of words in large corpora. Although such statistics are semantically informative, they disregard the valuable information contained in semantic lexicons such as WordNet, FrameNet, and the Paraphrase Database. This paper proposes a method for refining vector space representations using relational information from semantic lexicons by encouraging linked words to have similar vector representations, and it makes no assumptions about how the input vectors were constructed. Evaluated on a battery of standard lexical semantic evaluation tasks in several languages, the method obtains substantial improvements starting from a variety of word vector models. This refinement method outperforms prior techniques for incorporating semantic lexicons into word vector training algorithms.
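
The abstract describes retrofitting as a post-processing step: keep each vector close to its original distributional estimate while pulling lexicon-linked words toward each other. The update that minimizes this trade-off has a simple closed form, sketched below in Python. This is a minimal illustration rather than the authors' released implementation; the function name `retrofit`, the uniform edge weights, and the default `alpha = 1.0` are assumptions made for the sketch.

```python
import numpy as np

def retrofit(word_vecs, lexicon, n_iters=10, alpha=1.0):
    """Pull lexicon-linked words together while staying near the originals.

    word_vecs: dict word -> np.ndarray, the pre-trained vectors (q_hat)
    lexicon:   dict word -> list of linked words (edges from WordNet,
               FrameNet, PPDB, etc.)
    """
    new_vecs = {w: v.copy() for w, v in word_vecs.items()}
    vocab = set(word_vecs)
    for _ in range(n_iters):
        for word in vocab:
            neighbours = [n for n in lexicon.get(word, []) if n in vocab]
            if not neighbours:
                continue  # no lexicon edges: leave the original vector alone
            # With edge weights 1/degree the neighbour terms reduce to an
            # average; each assignment is the closed-form minimizer for this
            # word's vector with all other vectors held fixed.
            neighbour_avg = sum(new_vecs[n] for n in neighbours) / len(neighbours)
            new_vecs[word] = (neighbour_avg + alpha * word_vecs[word]) / (1.0 + alpha)
    return new_vecs

# Toy example (hypothetical data): synonyms drawn toward each other
vecs = {"happy": np.array([1.0, 0.0]), "joyful": np.array([0.0, 1.0])}
lexicon = {"happy": ["joyful"], "joyful": ["happy"]}
print(retrofit(vecs, lexicon))
```

Because every update is a convex combination of fixed inputs, the iteration converges quickly; a handful of passes (the paper uses ten) suffices, which is what makes the method a cheap post-processing step on top of any pre-trained vectors.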

Authors (6)
  1. Manaal Faruqui (39 papers)
  2. Jesse Dodge (45 papers)
  3. Sujay K. Jauhar (1 paper)
  4. Chris Dyer (91 papers)
  5. Eduard Hovy (115 papers)
  6. Noah A. Smith (224 papers)