
Modeling and Predicting Citation Count via Recurrent Neural Network with Long Short-Term Memory

Published 6 Nov 2018 in cs.DL and physics.soc-ph | (1811.02129v2)

Abstract: The rapid evolution of scientific research creates a huge volume of publications every year. Among the many quantitative measures of scientific impact, citation count stands out for its frequent use in the research community. Although the peer review process remains the most reliable way of assessing a paper's future impact, the ability to foresee lasting impact from citation records is increasingly important for scientific impact analysis in the era of big data. This paper focuses on long-term citation count prediction for individual publications, an emerging and challenging applied research topic. Based on four key phenomena confirmed independently in previous studies of long-term scientific impact quantification — the intrinsic quality of publications, the aging effect, the Matthew effect, and the recency effect — we unify the formulations of all these observations. Building on these formulations, we propose a long-term citation count prediction model for individual papers based on a recurrent neural network with long short-term memory (LSTM) units. Extensive experiments on a large real-world citation dataset demonstrate that the proposed model consistently outperforms existing methods and achieves a significant performance improvement.
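To make the core idea concrete, here is a minimal sketch of feeding a paper's yearly citation history through an LSTM and projecting the final hidden state to a scalar prediction. This is an illustrative toy (pure NumPy, randomly initialized, untrained), not the authors' actual architecture; the names `LSTMCell` and `predict_next_count` and the hidden size are assumptions for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell, forward pass only (illustrative sketch)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        z = hidden_size
        # Weights for the input, forget, candidate, and output gates,
        # stacked into one matrix acting on [x; h].
        self.W = rng.normal(0.0, 0.1, (4 * z, input_size + z))
        self.b = np.zeros(4 * z)
        self.hidden_size = z

    def step(self, x, h, c):
        z = self.hidden_size
        v = self.W @ np.concatenate([x, h]) + self.b
        i = sigmoid(v[0:z])          # input gate
        f = sigmoid(v[z:2 * z])      # forget gate
        g = np.tanh(v[2 * z:3 * z])  # candidate cell state
        o = sigmoid(v[3 * z:4 * z])  # output gate
        c_new = f * c + i * g        # update long-term memory
        h_new = o * np.tanh(c_new)   # expose short-term state
        return h_new, c_new

def predict_next_count(cell, w_out, history):
    """Run yearly citation counts through the LSTM and map the
    final hidden state to a scalar citation-count prediction."""
    h = np.zeros(cell.hidden_size)
    c = np.zeros(cell.hidden_size)
    for count in history:
        h, c = cell.step(np.array([float(count)]), h, c)
    return float(w_out @ h)

cell = LSTMCell(input_size=1, hidden_size=8)
w_out = np.ones(8)  # hypothetical untrained readout weights
prediction = predict_next_count(cell, w_out, [1, 3, 5, 8, 12])
```

In practice the gate weights and the readout would be trained (e.g. by backpropagation through time) on many papers' citation histories; the recurrence is what lets the model capture the aging and recency dynamics the paper formalizes.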

Citations (14)
