Future UK RAEs to be Metrics-Based
Stevan Harnad
harnad at ECS.SOTON.AC.UK
Tue Mar 28 08:13:32 EST 2006
The UK has a "dual" funding system: (1) conventional direct research
grant applications, with peer review of competitive proposals (RCUK)
and (2) top-sliced funding accorded to departments (not individuals)
based on past departmental research performance (RAE). The RAE has
been a monstrously expensive and time-consuming exercise, requiring
the collection and submission, on paper, of all kinds of performance
markers, including 4 full-text papers per researcher, for
peer-re-review by RAE panels. It
turned out that the RAE's outcome -- each departmental RAE "rank"
from 1 to 5*, with top-sliced funding given according to the rank and
number of researchers submitted -- was highly correlated with total
citation counts for the department's submitted researchers (r = .7
to .9+) and even more highly correlated with prior RCUK funding (.98).
So RAE rank correlates highly with prior RCUK (and European) funding
and almost as highly with citations (and with other metrics, such as
the number of doctorates awarded). The RAE rank is based on the data
received and evaluated by the panel -- not through multiple
regression, but through some sort of subjective weighting, including
a "peer-re-review" of already published, already peer-reviewed
articles. I very much doubt that many of those articles are actually
read: the panels are not specialist experts in the articles' subject
matter, as the original journal peer-reviewers were meant to be. It
is far more likely that the panels' ranking of the articles is based
on the reputation of the journals in which they were published, and
there is certainly pressure within departments to submit
preferentially those articles that have appeared in high-quality,
high-impact journals.
So what is counted explicitly is prior funding, doctorates, and a few
other measures; in addition, there is the "peer-re-review" --
whatever that amounts to -- which is no doubt *implicitly* influenced
by journal reputations and impact factors. However, neither journal
impact factors nor article/author citations are actually counted
*explicitly*; indeed, counting citations is expressly forbidden for
the RAE. That makes the high correlation of the RAE outcome with
citation counts all the more remarkable -- and it makes the even
higher correlation with prior funding, which *is* counted explicitly,
much less remarkable.
The multiple-regression ("metrics") method is not yet in use at all.
It will now be tried out in parallel with the next RAE (2008), which
will be conducted in the usual way, with the metrics calculated
alongside for comparison.
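Just to make that parallel comparison concrete: it amounts to nothing
more exotic than the following sketch (in Python; every figure below
is an invented placeholder of my own, not real RAE or RCUK data, and
the 1-to-5* panel ranks are simply coded numerically):

    import numpy as np

    # Hypothetical department-level figures (purely illustrative, not real RAE data).
    panel_rank  = np.array([5.5, 4.0, 5.0, 3.0, 4.5, 2.0, 5.0, 3.5])       # RAE panel outcome, 5* coded as 5.5
    citations   = np.array([4100, 1800, 3200, 900, 2500, 400, 3600, 1200]) # total citations of submitted staff
    prior_funds = np.array([5.0, 2.2, 4.1, 1.0, 3.0, 0.5, 4.5, 1.4])       # prior RCUK funding, GBP millions

    # How closely does each candidate metric track the panel's verdict?
    r_citations = np.corrcoef(panel_rank, citations)[0, 1]
    r_funding   = np.corrcoef(panel_rank, prior_funds)[0, 1]
    print(f"rank vs citations:     r = {r_citations:.2f}")
    print(f"rank vs prior funding: r = {r_funding:.2f}")

The same comparison, run on the real 2008 panel outcomes and the real
metric values, is what would show whether the metrics can safely
stand in for the panels.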
Prior funding counts are no doubt causal in the present RAE outcome
(since they are explicitly counted), but that is not the same as
saying that research funding is causal in generating research
performance quality! Funding is certainly a necessary precondition
for research quality, because without funding one cannot do research.
But to what extent prior funding levels in and of themselves cause
variance in research quality -- over and above being a Matthew Effect
or self-fulfilling prophecy -- is an empirical question: a question
about how good a predictor individual research-proposal peer review
is for allotting departmental top-sliced funding so as to reward and
foster research performance.
Hence the causality question is, in a sense, a question about the
causal efficacy of the UK's dual funding system itself, and about the
relative independence of its two components. For if the two
components are indeed measuring and rewarding the very same thing,
then the RAE and the dual system may as well be scrapped, and
individual RCUK proposal funding simply scaled up proportionately
with the redirected funds.
I am not at all convinced that the dual system itself should be
scrapped, however; just that the present costly and wasteful
implementation of the RAE component should be replaced by metrics.
And those metrics should certainly not be restricted to prior
funding, even though it was so highly correlated with RAE ranking.
They should be enriched by many other metric variables in a regression
equation, composed and calibrated according to each discipline's
peculiar profile as well as its internal and external validation
results. And let us supplement conservative metrics with the many
richer and more diverse ones that will be afforded by an online, open-
access full-text corpus, citation-interlinked, tagged, and usage-
monitored.
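To illustrate -- and only to illustrate -- what such a calibrated,
validated regression equation might look like, here is a minimal
sketch in Python. The metric battery, the weights and the
departmental data are all invented stand-ins of mine; the point is
simply that the weights are estimated per discipline from the data
rather than fixed a priori, and are then tested on departments held
out of the calibration (external validation):

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy metric battery for 40 hypothetical departments in one discipline:
    # prior funding, total citations, doctorates awarded, downloads (all standardised).
    X = rng.normal(size=(40, 4))
    true_w = np.array([0.9, 0.7, 0.3, 0.2])              # invented weights, for illustration only
    rank = X @ true_w + rng.normal(scale=0.3, size=40)   # stand-in for the panel's RAE rank

    # Calibrate the discipline's regression equation on 30 departments...
    X_fit, X_val = X[:30], X[30:]
    y_fit, y_val = rank[:30], rank[30:]
    w, *_ = np.linalg.lstsq(np.c_[np.ones(30), X_fit], y_fit, rcond=None)

    # ...and validate it externally on the 10 departments held out.
    pred = np.c_[np.ones(10), X_val] @ w
    r = np.corrcoef(y_val, pred)[0, 1]
    print(f"held-out correlation with panel rank: r = {r:.2f}")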
Stevan Harnad
On 28-Mar-06, at 6:39 AM, Loet Leydesdorff wrote:
>> To repeat: The RAE itself is a predictor, in want of
>> validation. Prior funding correlates 0.98 with this predictor
>> (in some fields, and is hence virtually identical with it),
>> but is itself in want of validation.
>
> Do you wish to say that both the RAE and the multivariate regression
> method correlate highly with prior funding? Is the latter perhaps
> causal for research quality, in your opinion?
>
> The policy conclusion would then be that both indicators are very
> conservative. Perhaps that is not a bad thing, but one may wish to
> state it straightforwardly.