RAE Questions

Loet Leydesdorff loet at LEYDESDORFF.NET
Tue Apr 4 16:07:39 EDT 2006


Oh, I thought that you were advocating a metrics research project instead of
the RAE. But that all now seems to be hot air. Good luck with your project in
Canada.

Best,  Loet

> -----Original Message-----
> From: ASIS&T Special Interest Group on Metrics
> [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Stevan Harnad
> Sent: Tuesday, April 04, 2006 9:34 PM
> To: SIGMETRICS at listserv.utk.edu
> Subject: Re: [SIGMETRICS] RAE Questions
>
> Administrative info for SIGMETRICS (for example unsubscribe):
> http://web.utk.edu/~gwhitney/sigmetrics.html
>
> On Tue, 4 Apr 2006, Loet Leydesdorff wrote:
>
> > Now that we have amply discussed the political side of the RAE, let us
> > turn to your research program of replacing the RAE with metrics.
>
> Loet, we can discuss my research program if you like, but we
> were not discussing that. We were discussing the UK
> government's proposed policy of replacing the RAE with
> metrics. That has nothing to do with my research program.
> They decided to switch from the present hybrid system (of
> re-reviewing published articles plus some metrics) to metrics
> alone because metrics alone are already so highly correlated
> with the current RAE outcomes (in many, though not
> necessarily all fields). No critique of metrics over-rides
> that decision where the two are already so highly correlated.
> It would be pure superstition to continue going through the
> ergonomically and economically wasteful motions of the
> re-review when the outcome is already there in the metrics.
>
> > Two problems have been mentioned which cannot easily be solved:
> >
> > 1. the skewness of the distributions
>
> I think there are ways to adjust for this.
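>
> A minimal sketch, with invented citation counts, of two standard ways
> the skew could be tamed before correlating a metric with RAE grades:
>
>     import numpy as np
>     from scipy import stats
>
>     # hypothetical, heavily skewed citation counts for seven departments
>     citations = np.array([3.0, 7.0, 12.0, 15.0, 40.0, 250.0, 1100.0])
>
>     # option 1: a log transform pulls in the long tail
>     log_cites = np.log1p(citations)
>
>     # option 2: keep only the ranks (what Spearman's rho does implicitly)
>     ranks = stats.rankdata(citations)
>
>     print("skew before:", stats.skew(citations))
>     print("skew after log:", stats.skew(log_cites))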
>
> > 2. the heterogeneity of department as units of analysis
>
> That is a separate matter. The proposal to swap metrics alone
> for a redundant, expensive, time-consuming hybrid process
> that yields the same outcome was based on the units of
> analysis as they now are. The units too could be revised, and
> perhaps should be, but that is an independent question.
>
> > The first problem can be solved by using non-parametric regression
> > analysis (probit or logit) instead of multivariate regression
> > analysis of the LISREL type. However, will this provide you with a
> > ranking? I cannot judge, because I have never done it myself.
>
> The present RAE outcome (rankings) is highly correlated with
> metrics already. If we correct the metrics for skewness, this
> may continue to give the same highly correlated outcome, or
> another one. RAE can then decide which one it wants to trust
> more, and why, but either way, it has no bearing on the
> validity of the decision to scrap re-reviews for metrics when
> they give almost the same outcome anyway.
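>
> For concreteness, here is a minimal sketch (with invented grades and
> metric scores) of how that agreement can be checked. Spearman's rho
> compares the two rankings directly, so the skew of the raw counts
> does not matter:
>
>     import numpy as np
>     from scipy import stats
>
>     # hypothetical RAE panel grades and metric scores, eight departments
>     rae_grade = np.array([5, 5, 4, 4, 3, 3, 2, 1], dtype=float)
>     metric    = np.array([820, 640, 300, 410, 120, 90, 40, 12], dtype=float)
>
>     rho, p = stats.spearmanr(rae_grade, metric)
>     print("Spearman rho:", round(rho, 2), "p-value:", round(p, 4))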
>
> > Stephen Bensman also mentioned the
> > instability of these skewed curves over time. In any case, I would be
> > worried about comparisons over time because of auto-correlation
> > (auto-covariance) effects.
>
> Whatever their skewness, temporal variability and
> auto-correlation, the rankings based on metrics are very
> similar to the rankings based on re-review. The starting
> point is to have a metric that does *at least as
> well* as the re-review did, and then to start work on
> optimizing it. Let us not forget the real alternatives at
> issue. As I said, it would be superstitious and absurd to go
> back from cheap metrics to profligate re-reviews because of
> putative blemishes in the metrics *when both yield the same outcome*.
>
> > I have run into these problems before, and therefore I am a big fan
> > of entropy statistics. But policy makers tend not to understand the
> > results, even when one tries to teach them something about "reduction
> > of the uncertainty". They want firm numbers to legitimate their
> > decisions.
>
> If policy makers have been content to rank the departments
> and shell out the money in proportion to the ranks for two
> decades now, and those ranks are derivable from cheap metrics
> instead of costly re-reviews, they will understand enough to
> know they should go with metrics. Then you can give them a
> course on how to improve on their metrics with "entropy statistics".
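>
> As a rough illustration of what "reduction of the uncertainty" means
> here (the joint distribution below is invented, and the metric is
> crudely binned into terciles):
>
>     import numpy as np
>
>     def H(p):
>         """Shannon entropy, in bits, of a probability vector."""
>         p = p[p > 0]
>         return -(p * np.log2(p)).sum()
>
>     # hypothetical joint distribution: RAE grade (rows) x metric tercile (cols)
>     joint = np.array([[0.20, 0.05, 0.00],
>                       [0.05, 0.25, 0.05],
>                       [0.00, 0.10, 0.30]])
>
>     h_grade = H(joint.sum(axis=1))                      # H(grade)
>     h_cond  = H(joint.ravel()) - H(joint.sum(axis=0))   # H(grade | metric)
>
>     print("H(grade)          =", round(h_grade, 3), "bits")
>     print("H(grade | metric) =", round(h_cond, 3), "bits")
>     print("reduction         =", round(h_grade - h_cond, 3), "bits")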
>
> > The second problem arises because you will have institutional
> > units of analysis which may be composed of different disciplinary
> > affiliations, and to a varying extent.
>
> That is already true, and it is true regardless of whether
> the RAE does or does not do the re-review over and above the
> metrics which are already highly correlated with the outcome.
> If rejuggling units improves the equity and predictivity of
> the rankings, by all means rejuggle them. But in and of
> itself that has nothing to do with the obvious good sense of
> scrapping profligate re-review in favour of parsimonious
> metrics when they yield the same outcome -- even with the
> present unit structure.
>
> > For example, I am myself misplaced in a unit of communication studies.
> > In other cases, universities will have set up "interdisciplinary
> > units" on purpose while individual scholars continue to affiliate
> > themselves with their original disciplines. We know that publication
> > and citation practices vary among disciplines. Thus, one should not
> > compare apples with oranges.
>
> It sounds worth remedying, but the question is orthogonal to
> the question of whether to retain wasteful re-review or to
> rely on metrics that give the same outcome at a fraction of
> the cost in lost time and money (that could have been devoted
> to funding research instead of just rating it).
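>
> A minimal sketch of the usual remedy -- normalising each count by its
> own field's baseline -- with invented field names and averages:
>
>     # hypothetical citation counts and field baselines (mean cites per
>     # paper in each field); everything here is made up for illustration
>     papers = [
>         {"field": "cell biology",  "cites": 45},
>         {"field": "mathematics",   "cites": 6},
>         {"field": "communication", "cites": 11},
>     ]
>     field_mean = {"cell biology": 30.0,
>                   "mathematics": 4.0,
>                   "communication": 8.0}
>
>     # field-normalised impact: 1.0 means "at the field average"
>     for p in papers:
>         score = p["cites"] / field_mean[p["field"]]
>         print(p["field"], round(score, 2))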
>
> > I would be inclined to advise against embarking on this research
> > project before one has an idea of how to handle these two problems.
> > Fortunately, I was not the reviewer :-).
>
> I am not sure which research project you are talking about.
> (I was just funded for a metrics project in Canada, but it
> has nothing to do with the RAE. The RAE, in contrast, has
> elected to scrap re-review in favour of the metrics that
> already yield the same outcome, but that has nothing to do
> with my research project.)
>
> Stevan Harnad
> American Scientist Open Access Forum
> http://amsci-forum.amsci.org/archives/American-Scientist-Open-Access-Forum.html
>
> Chaire de recherche du Canada           Professor of Cognitive Science
> Ctr. de neuroscience de la cognition    Dpt. Electronics & Computer Science
> Université du Québec à Montréal         University of Southampton
> Montréal, Québec                        Highfield, Southampton
> Canada  H3C 3P8                         SO17 1BJ United Kingdom
> http://www.crsc.uqam.ca/
> http://www.ecs.soton.ac.uk/~harnad/
>
