Learning Neural Textual Representations for Citation Recommendation (2007.04070v1)

Published 8 Jul 2020 in cs.CL

Abstract: With the rapid growth of the scientific literature, manually selecting appropriate citations for a paper is becoming increasingly challenging and time-consuming. While several approaches for automated citation recommendation have been proposed in recent years, effective document representations for citation recommendation remain largely elusive. For this reason, in this paper we propose a novel approach to citation recommendation which leverages a deep sequential representation of the documents (Sentence-BERT) cascaded with Siamese and triplet networks in a submodular scoring function. To the best of our knowledge, this is the first approach to combine deep representations and submodular selection for the task of citation recommendation. Experiments have been carried out on a popular benchmark dataset - the ACL Anthology Network corpus - and evaluated against baselines and a state-of-the-art approach using metrics such as MRR and F1-at-k. The results show that the proposed approach outperforms all compared approaches on every measured metric.
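To make the pipeline described in the abstract more concrete, here is a minimal sketch of how Sentence-BERT embeddings could feed a submodular (facility-location style) greedy selection of candidate citations. It uses an off-the-shelf Sentence-BERT checkpoint and a generic relevance-plus-coverage objective; the model name, the scoring weights, and the `recommend_citations` helper are illustrative assumptions, not the authors' fine-tuned Siamese/triplet encoder or their exact scoring function.

```python
# Hypothetical sketch: rank candidate papers for a query manuscript using
# Sentence-BERT embeddings and greedy submodular (facility-location style)
# selection. Model and scoring details are illustrative, not the paper's.
import numpy as np
from sentence_transformers import SentenceTransformer

def recommend_citations(query_text, candidate_texts, k=5,
                        model_name="all-MiniLM-L6-v2"):
    model = SentenceTransformer(model_name)  # off-the-shelf SBERT encoder
    # Encode query and candidates; normalized vectors make dot product = cosine.
    q = model.encode([query_text], normalize_embeddings=True)[0]
    C = model.encode(candidate_texts, normalize_embeddings=True)

    rel = C @ q      # relevance of each candidate to the query
    sim = C @ C.T    # pairwise candidate similarities (coverage term)

    selected, covered = [], np.zeros(len(candidate_texts))
    for _ in range(min(k, len(candidate_texts))):
        # Marginal gain: query relevance plus the improvement in how well the
        # selected set "covers" every candidate (facility-location objective).
        gains = np.array([
            rel[j] + np.maximum(covered, sim[j]).sum() - covered.sum()
            if j not in selected else -np.inf
            for j in range(len(candidate_texts))
        ])
        best = int(np.argmax(gains))
        selected.append(best)
        covered = np.maximum(covered, sim[best])
    return selected  # indices of recommended citations, in selection order
```

Under this setup, the returned ranking would then be scored with MRR and F1-at-k against the papers the query actually cites, in line with the evaluation described in the abstract.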

Authors (5)
  1. Binh Thanh Kieu (1 paper)
  2. Inigo Jauregi Unanue (13 papers)
  3. Son Bao Pham (7 papers)
  4. Hieu Xuan Phan (1 paper)
  5. Massimo Piccardi (21 papers)
Citations (5)
