Navigation-Based Candidate Expansion and Pretrained Language Models for Citation Recommendation (2001.08687v1)

Published 23 Jan 2020 in cs.IR and cs.DL

Abstract: Citation recommendation systems for the scientific literature, to help authors find papers that should be cited, have the potential to speed up discoveries and uncover new routes for scientific exploration. We treat this task as a ranking problem, which we tackle with a two-stage approach: candidate generation followed by re-ranking. Within this framework, we adapt to the scientific domain a proven combination based on "bag of words" retrieval followed by re-scoring with a BERT model. We experimentally show the effects of domain adaptation, both in terms of pretraining on in-domain data and exploiting in-domain vocabulary. In addition, we introduce a novel navigation-based document expansion strategy to enrich the candidate documents processed by our neural models. On three different collections from different scientific disciplines, we achieve the best-reported results in the citation recommendation task.
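
The abstract describes a two-stage pipeline: "bag of words" candidate generation followed by re-scoring with a BERT model adapted to the scientific domain. The sketch below illustrates that shape with off-the-shelf components; it is not the authors' implementation. The BM25 library (`rank_bm25`), the cross-encoder checkpoint, and the toy corpus are all assumptions standing in for the paper's domain-adapted, fine-tuned re-ranker, and the navigation-based document expansion step is omitted.

```python
# Minimal sketch of a two-stage citation recommender:
# (1) bag-of-words candidate generation with BM25, then
# (2) re-ranking the candidates with a BERT-style cross-encoder.
# Checkpoint and libraries are placeholders, not the paper's exact setup.
import torch
from rank_bm25 import BM25Okapi
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Toy candidate pool: titles of papers that could be cited.
corpus = [
    "BERT: pre-training of deep bidirectional transformers for language understanding",
    "Okapi BM25: a probabilistic ranking function for document retrieval",
    "Neural collaborative filtering for item recommendation",
]

# Stage 1: BM25 retrieval over whitespace-tokenized candidates.
bm25 = BM25Okapi([doc.lower().split() for doc in corpus])
query = "re-ranking retrieved documents with pretrained transformers"
candidates = bm25.get_top_n(query.lower().split(), corpus, n=2)

# Stage 2: score each (query, candidate) pair with a cross-encoder.
# Assumed public checkpoint; the paper instead uses an in-domain
# pretrained BERT fine-tuned for citation recommendation.
name = "cross-encoder/ms-marco-MiniLM-L-6-v2"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)
model.eval()

with torch.no_grad():
    inputs = tokenizer([query] * len(candidates), candidates,
                       padding=True, truncation=True, return_tensors="pt")
    scores = model(**inputs).logits.squeeze(-1)

# Final ranking: candidates ordered by cross-encoder score.
reranked = [doc for _, doc in sorted(zip(scores.tolist(), candidates), reverse=True)]
print(reranked)
```

In the paper's setting, stage 1 would also be enriched by the proposed navigation-based document expansion, which augments each candidate document before the neural re-ranker scores it.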

Authors (4)
  1. Rodrigo Nogueira (70 papers)
  2. Zhiying Jiang (27 papers)
  3. Kyunghyun Cho (292 papers)
  4. Jimmy Lin (208 papers)
Citations (18)
