Journal article

Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data

  • Xu, Shuqi Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu 610054, PR China
  • Mariani, Manuel Sebastian Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu 610054, PR China - URPP Social Networks, University of Zurich, 8050 Zurich, Switzerland
  • Lü, Linyuan Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu 610054, PR China - Alibaba Research Center for Complexity Sciences, Hangzhou Normal University, 311121 Hangzhou, PR China
  • Medo, Matúš Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu 610054, PR China - Department of Radiation Oncology, Inselspital, Bern University Hospital and University of Bern, 3010 Bern, Switzerland - Department of Physics, University of Fribourg, 1700 Fribourg, Switzerland
Date
  • 04.02.2020
Published in:
  • Journal of Informetrics. - 2020, vol. 14, no. 1, p. 101005
Abstract
Despite the increasing use of citation-based metrics for research evaluation purposes, we do not yet know which metrics best deliver on their promise to gauge the significance of a scientific paper or a patent. We assess 17 network-based metrics by their ability to identify milestone papers and patents in three large citation datasets. We find that traditional information-retrieval evaluation metrics are strongly affected by the interplay between the age distribution of the milestone items and the age biases of the evaluated metrics. Outcomes of these evaluation metrics are therefore not representative of the metrics’ ranking ability. We argue in favor of a modified evaluation procedure that explicitly penalizes biased metrics and allows us to reveal performance patterns that are consistent across the datasets. PageRank and LeaderRank turn out to be the best-performing ranking metrics when their age bias is suppressed by a simple transformation of the scores that they produce, whereas other popular metrics, including citation count, HITS, and Collective Influence, produce significantly worse ranking results.
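The abstract refers to "a simple transformation of the scores" that suppresses a metric's age bias. Below is a minimal sketch of one such transformation, assuming it is a z-score rescaling of each paper's PageRank against a cohort of papers of similar age (as in the rescaled-PageRank approach); the function name rescaled_scores, the window size, and the teleportation parameter alpha=0.5 are illustrative assumptions, not necessarily the authors' exact procedure.

import networkx as nx
import numpy as np

def rescaled_scores(graph, years, window=100):
    """Rescale PageRank scores against papers of similar age.

    graph : nx.DiGraph where an edge u -> v means "paper u cites paper v".
    years : dict mapping node -> publication year (used to order papers by age).
    window: number of papers on each side that define a node's age cohort.
    """
    # Low teleportation values are often used for citation networks (assumption).
    scores = nx.pagerank(graph, alpha=0.5)
    # Order papers by publication time; each cohort is a window in this ordering.
    nodes = sorted(graph.nodes, key=lambda n: years[n])
    raw = np.array([scores[n] for n in nodes])
    rescaled = {}
    for i, n in enumerate(nodes):
        lo, hi = max(0, i - window), min(len(nodes), i + window + 1)
        cohort = raw[lo:hi]
        mu, sigma = cohort.mean(), cohort.std()
        # z-score within the cohort removes the systematic age dependence.
        rescaled[n] = (raw[i] - mu) / sigma if sigma > 0 else 0.0
    return rescaled

After this rescaling, scores of papers from different periods become directly comparable, so a ranking by rescaled score no longer systematically favors old (or recent) papers.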
Faculty
Faculty of Science and Medicine
Department
Department of Physics
Language
  • English
Classification
Physics
License
License undefined
Identifiers
Persistent URL
https://folia.unifr.ch/unifr/documents/308764
Statistics
  • Document views: 21
  • File downloads: med_uer.pdf: 24