
ESA: Entity Summarization with Attention (1905.10625v4)

Published 25 May 2019 in cs.CL and cs.AI

Abstract: Entity summarization aims at creating brief but informative descriptions of entities from knowledge graphs. While previous work mostly focused on traditional techniques such as clustering algorithms and graph models, we ask how to apply deep learning methods to this task. In this paper we propose ESA, a neural network with a supervised attention mechanism for entity summarization. Specifically, we calculate attention weights for the facts of each entity and rank the facts to generate reliable summaries. We explore techniques to solve the difficult learning problems presented by ESA, and demonstrate the effectiveness of our model in comparison with state-of-the-art methods. Experimental results show that our model improves the quality of entity summaries in both F-measure and MAP.
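The core idea of the abstract, scoring an entity's facts with attention weights and ranking them to form a summary, can be illustrated with a minimal sketch. This is not the authors' architecture (ESA encodes facts with a neural network and trains the attention supervision end-to-end); here the fact embeddings, the query vector, and the function name `rank_facts` are all hypothetical stand-ins:

```python
import numpy as np

def rank_facts(fact_embeddings, query, k=5):
    """Score each fact embedding against a (hypothetical) learned query
    vector, softmax-normalize the scores into attention weights, and
    return the indices of the top-k facts as the entity summary."""
    scores = fact_embeddings @ query               # one score per fact
    weights = np.exp(scores - scores.max())        # stable softmax
    weights /= weights.sum()                       # attention weights sum to 1
    top_k = np.argsort(-weights)[:k]               # highest-weight facts first
    return top_k, weights

# Toy example: 6 facts with 4-dimensional embeddings (random placeholders).
rng = np.random.default_rng(0)
facts = rng.normal(size=(6, 4))
query = rng.normal(size=4)
top, w = rank_facts(facts, query, k=3)
```

The returned indices pick out the facts the attention mechanism considers most salient; in ESA proper, the attention weights are learned under supervision from gold-standard summaries rather than computed against a fixed query.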

Authors (7)
  1. Dongjun Wei
  2. Yaxin Liu
  3. Fuqing Zhu
  4. Liangjun Zang
  5. Wei Zhou
  6. Jizhong Han
  7. Songlin Hu
Citations (13)
