Open Access Metrics: Use REF2014 to Validate Metrics for REF2020

Stevan Harnad harnad at ECS.SOTON.AC.UK
Wed Dec 17 12:34:39 EST 2014

On Dec 17, 2014, at 12:15 PM, Taylor, Nicholas K <N.K.Taylor at> wrote:
> Why should the REF 2014 panellists become the Gold Standard? No disrespect but I would venture to suggest that even they would not see themselves as the optimal arbiters for all time.

REF 2014 is certainly not the gold standard for all, or for all time — but it’s clearly going to be the standard for doling out the 2014 REF gold!

And as such, it is just fine for also initializing the weights on a first approximation to a REF 2020 metric equation:

REF2020Rank =

w1(pubcount) + w2(JIF) + w3(cites) + w4(art-age) + w5(art-growth) + w6(hits) + w7(cite-peak-latency) + w8(hit-peak-latency) + w9(cite-decay) + w10(hit-decay) + w11(hub-score) + w12(authority-score) + w13(h-index) + w14(prior-funding) + w15(bookcites) + w16(student-counts) + w17(co-cites) + w18(co-hits) + w19(co-authors) + w20(endogamy) + w21(exogamy) + w22(co-text) + w23(tweets) + w24(tags) + w25(comments) + w26(acad-likes) etc. etc.

There’s plenty of time between now and 2020 to keep optimizing the equation, discipline by discipline, updating the initial weights as well as adding more metrics to the battery.
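To make the proposal concrete, here is a minimal sketch of that calibration step, using ordinary least squares with numpy. The metric names, the number of submissions, and all data values are illustrative placeholders, not real REF2014 figures; a real analysis would use the panel rankings per discipline and the full metric battery.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical battery of five metrics for 100 submissions in one discipline.
metrics = ["pubcount", "JIF", "cites", "hits", "h-index"]
X = rng.random((100, len(metrics)))

# Hypothetical REF2014 panel ranking: the external criterion to predict.
# Here it is simulated as a noisy weighted sum, purely for illustration.
true_w = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
y = X @ true_w + rng.normal(0.0, 0.01, size=100)

# Fit the weights w1..w5 by multiple regression (least squares).
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Joint correlation between the metric equation and the panel rankings:
# this is the validation figure to be maximized, discipline by discipline.
r = np.corrcoef(X @ w, y)[0, 1]
print(dict(zip(metrics, np.round(w, 2))), round(r, 3))
```

With six years of data between REF2014 and REF2020, the same fit can be re-run as each new metric is added to the battery, keeping only those whose weights improve the joint correlation.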


> From: Council of Professors and Heads of Computing in UK universities [mailto:cphc-members at JISCMAIL.AC.UK <mailto:cphc-members at JISCMAIL.AC.UK>] On Behalf Of Stevan Harnad
> Sent: 17 December 2014 14:27
> To: cphc-members at JISCMAIL.AC.UK <mailto:cphc-members at JISCMAIL.AC.UK>
> Subject: Open Access Metrics: Use REF2014 to Validate Metrics for REF2020
> Steven Hill of HEFCE has posted “an overview of the work HEFCE are currently commissioning which they are hoping will build a robust evidence base for research assessment” in LSE Impact Blog 12(17) 2014 entitled Time for REFlection: HEFCE look ahead to provide rounded evaluation of the REF <>
> Let me add a suggestion, updated for REF2014, that I have made before (unheeded):
> Scientometric predictors of research performance need to be validated by showing that they have a high correlation with the external criterion they are trying to predict. The UK Research Excellence Framework (REF) -- together with the growing movement toward making the full-texts of research articles freely available on the web -- offers a unique opportunity to test and validate a wealth of old and new scientometric predictors, through multiple regression analysis: publications, journal impact factors, citations, co-citations, citation chronometrics (age, growth, latency to peak, decay rate), hub/authority scores, h-index, prior funding, student counts, co-authorship scores, endogamy/exogamy, textual proximity, downloads/co-downloads and their chronometrics, tweets, tags, etc. can all be tested and validated jointly, discipline by discipline, against their REF panel rankings in REF2014. The weights of each predictor can be calibrated to maximize the joint correlation with the rankings. Open Access Scientometrics will provide powerful new means of navigating, evaluating, predicting and analyzing the growing Open Access database, as well as powerful incentives for making it grow faster.
> Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise <>. Scientometrics 79(1). Also in: Torres-Salinas, D. and Moed, H. F. (Eds.), Proceedings of the 11th Annual Meeting of the International Society for Scientometrics and Informetrics 11(1), pp. 27-33, Madrid, Spain (2007).
> See also:
> The Only Substitute for Metrics is Better Metrics <> (2014)
> and
> On Metrics and Metaphysics <> (2008)


More information about the SIGMETRICS mailing list