The Need to Cross-Validate and Initialize Multiple Metrics Jointly Against Peer Ratings
Stevan Harnad
amsciforum at GMAIL.COM
Thu Mar 19 09:18:31 EDT 2009
The Times Higher Education Supplement (THES) has reported
<http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=405831&c=1>
the results of a study it commissioned from Evidence Ltd
<http://www.evidence.co.uk/>, which found that the ranking criteria for
assessing and rewarding research performance in the UK Research Assessment
Exercise (RAE) changed between RAE 2001 <http://www.hero.ac.uk/rae/> and RAE
2008 <http://www.rae.ac.uk/>. As a result, citation counts, which correlated
highly with the RAE 2001 rankings, correlated less highly with the RAE 2008
rankings, so a number of universities whose citation counts had decreased were
rewarded more in 2008, and a number of universities whose citation counts had
increased were rewarded less.
(1) Citation counts are only one (though an important one) among many
potential metrics of research performance.
(2) If the RAE peer panel raters' criteria for ranking the universities
varied or were inconsistent between RAE 2001 and RAE 2008, then that is a
problem with the peer ratings rather than with the metrics (which, being
objective, remain consistent).
(3) Despite that variability and inconsistency, peer ratings are the only way
to initialize the weights on the metrics: metrics first have to be jointly
validated against expert peer evaluation by measuring their correlation with
the peer rankings, discipline by discipline; then the metrics' respective
weights can be updated and fine-tuned, discipline by discipline, in
conjunction with expert judgment of the resulting rankings and of continuing
research activity.
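To make that initialization step concrete: one simple way to derive initial,
discipline-specific weights is an ordinary least-squares fit of the peer
panel's scores on the candidate metrics. The sketch below (in Python) is only
an illustration of that idea under assumed data, not the RAE's actual
procedure; the departments, metrics and numbers are hypothetical.

    import numpy as np

    def initialize_weights(metrics, peer_scores):
        """Fit one weight per metric so that the weighted sum of metrics
        best predicts the peer panel's scores (least-squares fit)."""
        X = np.column_stack([np.ones(len(peer_scores)), metrics])  # add intercept
        coeffs, *_ = np.linalg.lstsq(X, peer_scores, rcond=None)
        return coeffs[1:]  # drop the intercept; one weight per metric

    # Hypothetical discipline: 5 departments x 3 metrics
    # (citations, downloads, co-citations) and their peer scores.
    metrics = np.array([[120., 900., 15.],
                        [ 80., 400.,  9.],
                        [200., 150., 22.],
                        [ 60., 700.,  5.],
                        [150., 300., 18.]])
    peer_scores = np.array([3.1, 2.4, 3.8, 2.0, 3.3])

    weights = initialize_weights(metrics, peer_scores)
    print(weights)  # initial weights, to be fine-tuned with expert judgment

The fitted weights are only a starting point; as argued above, they would
then be fine-tuned, discipline by discipline, alongside the panels'
continuing judgment.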
(4) If only one metric (e.g., citation counts) is used, there is the risk that
expert ratings will simply echo it. But if a rich and diverse battery of
multiple metrics is jointly validated and initialized against the RAE 2008
expert ratings, then this will create an assessment-assistant tool whose
initial weights can be calibrated and used in an exploratory way to generate
different rankings, to be compared by the peer panels with previous rankings
as well as with new, evolving criteria of research productivity, uptake,
importance, influence, excellence and impact.
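As a picture of that exploratory use: apply a set of calibrated weights to
the battery, derive a metrics-based ranking, and report its rank agreement
with the peer ranking for the panel to inspect. Again a minimal sketch with
hypothetical data and weights; the Spearman correlation is just one
convenient agreement measure.

    import numpy as np
    from scipy.stats import spearmanr

    def to_ranks(scores):
        """Convert scores to ranks (1 = highest score)."""
        ranks = np.empty(len(scores), dtype=int)
        ranks[np.argsort(-scores)] = np.arange(1, len(scores) + 1)
        return ranks

    # Hypothetical discipline: 5 departments x 3 metrics, with weights
    # taken to have been initialized against the RAE 2008 peer scores.
    metrics = np.array([[120., 900., 15.],
                        [ 80., 400.,  9.],
                        [200., 150., 22.],
                        [ 60., 700.,  5.],
                        [150., 300., 18.]])
    weights = np.array([0.010, 0.001, 0.050])
    peer_scores = np.array([3.1, 2.4, 3.8, 2.0, 3.3])

    metric_ranks = to_ranks(metrics @ weights)
    peer_ranks = to_ranks(peer_scores)
    rho, _ = spearmanr(metric_ranks, peer_ranks)
    print(rho)  # rank agreement for the panel to weigh against its own criteria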
(5) The dawning era of Open Access (free web access) to peer-reviewed
research is providing a wealth of new metrics to be included, tested and
assigned initial weights in the joint battery of metrics. These include
download counts, citation and download growth and decay rates, hub and
authority scores, interdisciplinarity scores, co-citations, tag counts,
comment counts, link counts, data-usage, and many other openly accessible
and measurable properties of the growth of knowledge in our evolving
"Cognitive Commons."