Do disruption index indicators measure what they propose to measure? The comparison of several indicator variants with assessments by peers (1911.08775v1)

Published 20 Nov 2019 in cs.DL and physics.soc-ph

Abstract: Recently, Wu, Wang, and Evans (2019) and Bu, Waltman, and Huang (2019) proposed a new family of indicators, which measure whether a scientific publication is disruptive to a field or tradition of research. Such disruptive influences are characterized by citations to a focal paper, but not to its cited references. In this study, we are interested in the question of convergent validity, i.e., whether these indicators of disruption are able to measure what they propose to measure ('disruptiveness'). We used external criteria of newness to examine convergent validity: in the post-publication peer review system of F1000Prime, experts assess whether the research reported in papers fulfills these criteria (e.g., reports new findings). This study is based on 120,179 papers from F1000Prime published between 2000 and 2016. In the first part of the study we discuss the indicators. Based on the insights from the discussion, we propose alternative variants of disruption indicators. In the second part, we investigate the convergent validity of the indicators and the (possibly) improved variants. Although the results of a factor analysis show that the different variants measure similar dimensions, the results of regression analyses reveal that one variant (DI5) performs slightly better than the others.
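To make the "citations to a focal paper, but not to its cited references" idea concrete, the sketch below computes a disruption index on a toy citation network. It assumes the standard formulation from Wu, Wang, and Evans (2019): for a focal paper F, n_i counts citing papers that cite F but none of F's references, n_j counts citing papers that cite F and at least one of its references, n_k counts papers that cite at least one of F's references but not F, and DI = (n_i - n_j) / (n_i + n_j + n_k). The `j_threshold` parameter is a hypothetical knob used here to mimic a DI5-style variant (a citing paper must cite at least 5 of F's references to count toward n_j); the exact DI5 definition used in the paper may differ, and all identifiers below are illustrative.

```python
# Minimal sketch of a disruption index computation (assumptions noted above).
from typing import Dict, Set


def disruption_index(
    focal: str,
    cited_refs: Set[str],
    citations: Dict[str, Set[str]],  # paper id -> set of paper ids it cites
    j_threshold: int = 1,            # 1 -> original DI; 5 -> DI5-style variant (assumed)
) -> float:
    n_i = n_j = 0
    citers_of_focal = {p for p, refs in citations.items() if focal in refs}
    for p in citers_of_focal:
        # How many of the focal paper's references does this citer also cite?
        overlap = len(citations[p] & cited_refs)
        if overlap >= j_threshold:
            n_j += 1  # cites focal paper and (enough of) its references
        else:
            n_i += 1  # cites focal paper but ignores its references
    # Papers citing at least one of the focal paper's references but not the focal paper.
    n_k = sum(
        1
        for p, refs in citations.items()
        if p != focal and focal not in refs and refs & cited_refs
    )
    denom = n_i + n_j + n_k
    return (n_i - n_j) / denom if denom else 0.0


if __name__ == "__main__":
    # Toy citation network with hypothetical paper ids.
    cites = {
        "A": {"F", "R1"},   # cites focal and a reference -> n_j
        "B": {"F"},         # cites focal only            -> n_i
        "C": {"R1", "R2"},  # cites references only       -> n_k
    }
    print(disruption_index("F", {"R1", "R2"}, cites))     # original DI
    print(disruption_index("F", {"R1", "R2"}, cites, 5))  # DI5-style variant
```

On the toy data, the original DI is 0.0 (one disruptive citer, one consolidating citer, one reference-only citer), while raising the threshold reclassifies paper "A" as disruptive and pushes the score toward 1, illustrating how threshold-based variants shift the indicator.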

Citations (9)
