Future UK RAEs to be Metrics-Based (fwd)

Stevan Harnad harnad at ECS.SOTON.AC.UK
Tue Sep 19 12:17:00 EDT 2006


---------- Forwarded message ----------
Date: Tue, 19 Sep 2006 16:27:38 +0100
From: C.Oppenheim <C.Oppenheim at LBORO.AC.UK>
To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM at LISTSERVER.SIGMAXI.ORG
Subject: Re: Future UK RAEs to be Metrics-Based

My answer is that it is statistically significantly correlated with the RAE
results, which are themselves based upon long, intensive peer assessment by a
group of experts.  As Stevan Harnad has frequently commented, what is
remarkable is that the correlation still holds even though journals are
relatively unimportant in the humanities.

I stress that my approach is purely pragmatic.  I'm not suggesting a cause
and effect, simply that there is a strong correlation.
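
As a purely illustrative sketch of the kind of calculation involved (the
figures below are invented, and scipy is assumed; the real analyses,
described further down this thread, use departmental citation totals,
per-staff averages and RAE scores):

from scipy.stats import pearsonr, spearmanr

# Hypothetical departments: citation counts over an RAE period and the
# RAE score awarded after peer review.  All numbers are invented.
total_citations = [412, 350, 298, 180, 95, 60]
per_staff_citations = [20.6, 17.5, 14.9, 9.0, 6.3, 4.0]
rae_scores = [5, 5, 4, 4, 3, 3]

for label, counts in [("total citations", total_citations),
                      ("citations per staff member", per_staff_citations)]:
    r, p_r = pearsonr(counts, rae_scores)
    rho, p_rho = spearmanr(counts, rae_scores)
    print(f"{label}: Pearson r = {r:.2f} (p = {p_r:.3f}), "
          f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")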

Charles

Professor Charles Oppenheim
Head
Department of Information Science
Loughborough University
Loughborough
Leics LE11 3TU

Tel 01509-223065
Fax 01509-223053
e mail C.Oppenheim at lboro.ac.uk
----- Original Message -----
From: <l.hurtado at ED.AC.UK>
To: <AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM at LISTSERVER.SIGMAXI.ORG>
Sent: Tuesday, September 19, 2006 3:22 PM
Subject: Re: Future UK RAEs to be Metrics-Based


Sorry, Charles (if I may), but your response betrays a misunderstanding
of my concern and question.  Please try to hear me before giving an
answer:
You're assuming that picking up items that happen to be cited in a
selection of journals is somehow adequate, and it's THIS that I'm
concerned about.  What is your BASIS for your assumption about my field?
Yes, of course, you can collect everything cited in a given set of
journals, in principle.  But is that the same thing as a representative
picture of how scholarship is being treated in a field such as mine, in
which journals are not necessarily the principal medium in which
scholarship is established or exhibited?  This readily illustrates my
concern about what assumptions go into experiments or "empirical"
studies before they are run.

So, QUESTION:  What is your empirical basis for the assumption that
simply monitoring a given set of journals is sufficient for any/all
fields?  You haven't addressed this yet.
Larry

Quoting "C.Oppenheim" <C.Oppenheim at LBORO.AC.UK>:

> The question betrays a misunderstanding of how citation indexes work.
> Citation indexes scan the journal literature for citations to all media,
> not just other journals.  So it makes no difference what vehicle the
> humanities scholar disseminated his/her output in; the item will still be
> picked up by the citation index.
>
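In concrete terms, the tallying being described can be sketched as
follows; the record structure, field names and titles are invented
purely for illustration, not drawn from any real index:

from collections import Counter

# Invented example records: each indexed *source* is a journal article
# whose reference list may cite works in any medium (journal, book,
# chapter, ...).
journal_articles = [
    {"id": "article-1",
     "references": [("book", "Monograph M"), ("journal", "Article J")]},
    {"id": "article-2",
     "references": [("book", "Monograph M"), ("chapter", "Chapter C")]},
]

# The index credits the cited work whatever its medium: the monograph
# here ends up with two citations even though only journals were scanned.
counts = Counter(ref for art in journal_articles for ref in art["references"])
for (medium, work), n in counts.most_common():
    print(f"{work} [{medium}]: {n} citation(s)")
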
> The notion that a group of informed scholars could come up with a ranking
> list within 30 minutes is an appealing one, but the fact remains that the
> UK's RAE takes about a year to collect and analyse the data, together with
> many meetings of the group of scholars, before decisions are made.  One
> reason for this tedious approach is to avoid legal challenges claiming that
> the results were not robustly reached.
>
> Charles
>
> Professor Charles Oppenheim
> Head
> Department of Information Science
> Loughborough University
> Loughborough
> Leics LE11 3TU
>
> Tel 01509-223065
> Fax 01509-223053
> e mail C.Oppenheim at lboro.ac.uk
> ----- Original Message -----
> From: <l.hurtado at ED.AC.UK>
> To: <AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM at LISTSERVER.SIGMAXI.ORG>
> Sent: Tuesday, September 19, 2006 12:34 PM
> Subject: Re: Future UK RAEs to be Metrics-Based
>
>
> My scepticism (which is capable of being satisfied) is not toward the
> *idea* that there may well be a correlation between the frequency with
> which scholarly work is cited and the wider estimate of that
> scholar/dept.  I am dubious that it has been demonstrated that such an
> analysis *can be done* for all disciplines, particularly for at least
> some Humanities fields.  My scepticism rests upon the bases I've
> iterated before.  I apologize if I seem to be replaying a record, but
> it's not yet clear to me that my concerns have been effectively
> engaged.
> --Humanities scholarly publishing is more diverse in venue/genre than
> in some other fields.  Indeed, journals are not regarded as quite so
> central, but as only one among several respected and frequented genres,
> which include multi-author books and (perhaps especially) monographs.
>
> QUESTION:  Are the studies that supposedly show such meaningful
> correlations actually drawing upon the full spread of publication
> genres appropriate to the fields in view?  (I'd be surprised but
> delighted were the answer yes, because I'm not aware of any mechanism
> in place, such as ISI's monitoring of journals, for surveying and
> counting citations in such a vast body of material.)
>
> I'm not pushing at all for the labor-intensive RAE of the past.
> Indeed, if the question is not how individual scholars stack up in
> comparison to others in their field (which the RAE actually wasn't
> designed to determine), but instead how we can identify depts into
> which a disproportionate amount of govt funding should be pumped, then
> I think in almost any field a group of informed scholars could readily
> determine the top 5-10 places within 30 minutes, with time left over
> for coffee.
>
> I'm just asking for more transparency and evidence behind the
> enthusiasm for replacing RAE with "metrics".
>
> Larry
>
> Quoting "C.Oppenheim" <C.Oppenheim at LBORO.AC.UK>:
>
>> The correlation is between the total number of citations (and the average
>> number of citations per member of staff) received by a Department over the
>> RAE period (1996-2001) and the RAE score received by the Department
>> following expert peer review.  Correlation analyses are done using Pearson
>> or Spearman correlation coefficients.  The fact that so few humanities
>> scholars publish journal articles does not affect this result.
>>
>> A paper on the topic  is in preparation at the moment.
>>
>> What intrigues me is why there is so much scepticism about the notion.
>> The RAE is done by expert peer review.  Citations are also made by
>> (presumably) expert peers who choose to cite a particular work.  So one
>> would expect a correlation between the two, wouldn't one?  What it tells
>> us is that high-quality research leads to both high RAE scores AND high
>> citation counts.
>>
>>  I do these calculations (and I've covered many subject areas over the
>> years, but not biblical studies - something for the future!) in a totally
>> open-minded manner.  If I get a non-significant or zero correlation in
>> such
>> a study in the future, I will faithfully report it.  But so far, that
>> hasn't
>> happened.
>>
>> Charles
>>
>> Professor Charles Oppenheim
>> Head
>> Department of Information Science
>> Loughborough University
>> Loughborough
>> Leics LE11 3TU
>>
>> Tel 01509-223065
>> Fax 01509-223053
>> e mail C.Oppenheim at lboro.ac.uk
>> ----- Original Message -----
>> From: <l.hurtado at ED.AC.UK>
>> To: <AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM at LISTSERVER.SIGMAXI.ORG>
>> Sent: Monday, September 18, 2006 8:37 PM
>> Subject: Re: Future UK RAEs to be Metrics-Based
>>
>>
>> Well, I'm all for empirically-based views in these matters.  So, if
>> Oppenheim or others actually have soundly based studies showing what
>> Stevan and Oppenheim claim, then that's to be noted.  I'll have to see
>> the stuff when it's published.  In the meanwhile, a couple of further
>> questions:
>> --Pardon me for being out of touch, perhaps, but more precisely what is
>> being measured?  What does journal "citation counts" refer to?
>> Citation of journal articles?  Or citation of various things in journal
>> articles (and why privilege this medium?)?  Or . . . what?
>> --What does "correlation" between RAE results and "citation counts"
>> actually comprise?
>>
>> Let me lay out further reasons for some skepticism.  In my own field
>> (biblical studies/theology), I'd say most senior-level scholars
>> actually publish very infrequently in refereed journals.  We perhaps do
>> more in our earlier years, but as we reach senior levels we tend (a) to
>> get requests for papers for multi-author volumes, and (b) to devote
>> ourselves to projects that best issue in book-length publications.  So,
>> if my own productivity and impact were assessed by how many journal
>> articles I've published in the last five years, I'd look poor (even
>> though . . . well, let's say that I rather suspect that wouldn't be the
>> way I'm perceived by peers in the field).
>> Or is the metric to comprise how many times I'm *cited* in journals?
>> If so, is there some proven correlation between a scholar's impact or
>> significance of publications in the field and how many times he happens
>> to be cited in this one genre of publication?  I'm just a bit
>> suspicious of the assumptions, which I still suspect are drawn (all
>> quite innocently, but naively) from disciplines in which journal
>> publication is much more the main and significant venue for scholarly
>> publication.
>> And, as we all know, "empirical" studies depend entirely on the
>> assumptions that lie at their base.  So their value is heavily framed
>> by the validity and adequacy of the governing assumptions. No
>> accusations, just concerns.
>> Larry Hurtado
>>
>> Quoting Stevan Harnad <harnad at ECS.SOTON.AC.UK>:
>>
>>> On Mon, 18 Sep 2006, Larry Hurtado wrote:
>>>
>>>> Stevan and I have exchanged views on the *feasibility* of a metrics
>>>> approach to assessing research strength in the Humanities, and he's
>>>> impressed me that something such *might well* be feasible *when/if*
>>>> certain as-yet untested and undeveloped things fall into place. I note,
>>>> e.g., in Stevan's addendum to Oppenheim's comment that a way of
>>>> handling
>>>> book-based disciplines "has not yet been looked at", and that a number
>>>> of other matters are as yet "untested".
>>>
>>> Larry is quite right that the (rather obvious and straightforward)
>>> procedure of self-archiving books' metadata and cited references in
>>> order to derive a comprehensive book-citation index (which would
>>> of course include journal articles citing books, books citing books,
>>> and books citing journal articles) had not yet been implemented or
>>> tested.
>>>
>>> However, the way to go about it is quite clear, and awaits only OA
>>> self-archiving mandates (to which a mandate to self-archive one's book
>>> metadata and reference list should be added as a matter of course).
>>>
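As a rough sketch of what such a combined index might look like, assuming
only that each self-archived record (article or book) carries an
identifier and a machine-readable list of the works it cites (all
identifiers below are invented):

from collections import defaultdict

# Invented self-archived records: each has an id, a type and the ids of
# the works its reference list cites.
records = [
    {"id": "article-1", "type": "article", "cites": ["book-1", "article-2"]},
    {"id": "book-1",    "type": "book",    "cites": ["book-2", "article-2"]},
    {"id": "book-2",    "type": "book",    "cites": []},
    {"id": "article-2", "type": "article", "cites": ["book-2"]},
]

# One cited-by index covers every combination: articles citing books,
# books citing books, and books citing articles.
cited_by = defaultdict(list)
for rec in records:
    for target in rec["cites"]:
        cited_by[target].append(rec["id"])

for work, citers in sorted(cited_by.items()):
    print(f"{work}: cited by {len(citers)} work(s): {', '.join(citers)}")
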
>>> But please recall that I am an evangelist for OA self-archiving, because
>>> I *know* it can be done, that it works, and that it confers substantial
>>> benefits in terms of research access, usage and impact.
>>>
>>> Insofar as metrics are concerned, I am not an evangelist, but merely an
>>> enthusiast: The evidence is there, almost as clearly as it is with the
>>> OA impact-advantage, that citation counts are strongly correlated with
>>> RAE rankings in every discipline so far tested. Larry seems to pass over
>>> evidence in his remark about the as yet incomplete book citation data
>>> (ISI has some, but they are only partial). But what does he have to say
>>> about the  correlation between RAE rankings and *journal article
>>> citation
>>> counts* in the humanities (i.e., in the "book-based" disciplines)?
>>> Charles will, for example, soon be reporting strong correlations in
>>> Music. Even without having to wait for a book-impact index, it seems
>>> clear that there are as yet no reported empirical exceptions to the
>>> correlation between journal article citation metrics and RAE outcomes.
>>>
>>> (I hope Charles will reply directly, posting some references to his and
>>> others' studies.)
>>>
>>>> This being the case, it is certainly not merely a priori to say that a
>>>> metrics approach is not now really feasible for some disciplines.
>>>
>>> Nothing a priori about it: A posteriori, every discipline so far tested
>>> has shown positive correlations between its journal citation counts and
>>> its
>>> RAE rankings, including several Humanities disciplines.
>>>
>>> The advantage of having one last profligate panel-based RAE in parallel
>>> with the metric one in 2008 is that not a stone will be left unturned.
>>> If there prove to be any disciplines having small or non-existent
>>> correlations with metrics, they can and should be evaluated otherwise.
>>> But let us not assume, a priori, that there will be any such
>>> disciplines.
>>>
>>>> I emphasize that my point is not a philosophical one, but strictly
>>>> whether a worked-out scheme for handling all Humanities disciplines
>>>> rightly is yet in place, or capable of being mounted without some
>>>> significant further development, or even thought out adequately.
>>>
>>> It depends entirely on the size of the metric correlations with the
>>> present RAE rankings. Some disciplines may need some supplementary forms
>>> of (non-metric) evaluation if their correlations are too weak. That is
>>> an
>>> empirical question. Meanwhile, the metrics will also be growing in power
>>> and diversity.
>>>
>>>> That's not an antagonistic question, simply someone asking for the
>>>> basis for the evangelistic stance of Stevan and some others.
>>>
>>> I evangelize for OA self-archiving of research and merely advocate
>>> further development, testing and use of metrics in research performance
>>> assessment, in all disciplines, until/unless evidence appears that there
>>> are exceptions. So far, the objections I know of are all only in the
>>> form of a priori preconceptions and habits, not objective data.
>>>
>>> Stevan Harnad
>>>
>>>> > Charles Oppenheim has authorised me to post this on his behalf:
>>>> >
>>>> >     "Research I have done indicates that the same correlations
>>>> > between
>>>> >     RAE scores and citation counts already noted in the sciences  and
>>>> >     social sciences apply just as strongly (sometimes more strongly)
>>>> >     in the humanities!  But you are right, Richard, that metrics are
>>>> >     PERCEIVED to be inappropriate for the humanities and a lot of
>>>> >     educating is needed on this topic."
>>>
>>
>>
>>
>> L. W. Hurtado, Professor of New Testament Language, Literature & Theology
>> Director of Postgraduate Studies
>> School of Divinity, New College
>> University of Edinburgh
>> Mound Place
>> Edinburgh, UK. EH1 2LX
>> Office Phone:  (0)131 650 8920. FAX:  (0)131 650 7952
>>
>
>
>
> L. W. Hurtado, Professor of New Testament Language, Literature & Theology
> Director of Postgraduate Studies
> School of Divinity, New College
> University of Edinburgh
> Mound Place
> Edinburgh, UK. EH1 2LX
> Office Phone:  (0)131 650 8920. FAX:  (0)131 650 7952
>



L. W. Hurtado, Professor of New Testament Language, Literature & Theology
Director of Postgraduate Studies
School of Divinity, New College
University of Edinburgh
Mound Place
Edinburgh, UK. EH1 2LX
Office Phone:  (0)131 650 8920. FAX:  (0)131 650 7952


