
Entity-aware ELMo: Learning Contextual Entity Representation for Entity Disambiguation (1908.05762v2)

Published 14 Aug 2019 in cs.CL, cs.IR, cs.LG, and stat.ML

Abstract: We present a new local entity disambiguation system. The key to our system is a novel approach for learning entity representations. We learn an entity-aware extension of Embeddings from Language Models (ELMo), which we call Entity-ELMo (E-ELMo). Given a paragraph containing one or more named entity mentions, each mention is first encoded as a function of the entire paragraph (including the other mentions), and this representation is then used to predict the referent entity. Using E-ELMo for local entity disambiguation, we outperform state-of-the-art local and global models on the popular benchmarks, improving micro-average accuracy by about 0.5% on AIDA test-b with the YAGO candidate set. The training data and candidate set in our evaluation match those of our baselines for fair comparison.

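The abstract describes encoding each mention as a function of the entire paragraph with a contextual (ELMo-style) encoder and then scoring candidate entities locally. Below is a minimal sketch of that pattern, not the authors' implementation: the BiLSTM stand-in for the ELMo encoder, the mean-pooled mention span, the dot-product scoring, and all names and dimensions are illustrative assumptions.

```python
# Sketch of local entity disambiguation with a paragraph-conditioned mention encoder.
# NOT the paper's code; encoder choice, pooling, and dimensions are assumptions.
import torch
import torch.nn as nn

class LocalEntityDisambiguator(nn.Module):
    def __init__(self, vocab_size, num_entities, dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)
        # Bidirectional LSTM standing in for the contextual (ELMo-like) encoder.
        self.encoder = nn.LSTM(dim, dim // 2, bidirectional=True, batch_first=True)
        self.entity_emb = nn.Embedding(num_entities, dim)

    def forward(self, paragraph_ids, mention_span, candidate_ids):
        # paragraph_ids: (1, seq_len) token ids for the whole paragraph
        # mention_span: (start, end) token indices of the mention
        # candidate_ids: (num_candidates,) entity ids from the candidate set
        hidden, _ = self.encoder(self.word_emb(paragraph_ids))   # (1, seq_len, dim)
        start, end = mention_span
        mention_vec = hidden[0, start:end].mean(dim=0)           # mention as a function of the paragraph
        cand_vecs = self.entity_emb(candidate_ids)               # (num_candidates, dim)
        scores = cand_vecs @ mention_vec                         # dot-product compatibility scores
        return scores.argmax(), scores

# Toy usage with random ids
model = LocalEntityDisambiguator(vocab_size=1000, num_entities=500)
paragraph = torch.randint(0, 1000, (1, 20))
best, scores = model(paragraph, mention_span=(4, 6), candidate_ids=torch.tensor([3, 17, 42]))
```

At inference time the highest-scoring candidate is taken as the predicted referent; the paper's contribution lies in how the contextual encoder itself is trained to be entity-aware, which this sketch does not attempt to reproduce.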
Authors (5)
  1. Hamed Shahbazi (5 papers)
  2. Xiaoli Z. Fern (18 papers)
  3. Reza Ghaeini (9 papers)
  4. Rasha Obeidat (2 papers)
  5. Prasad Tadepalli (33 papers)
Citations (21)