STI conference Leiden--Quality standards for evaluation indicators

Loet Leydesdorff loet at LEYDESDORFF.NET
Thu Aug 28 14:38:53 EDT 2014


Dear Stephen, 

 

Nowadays, it is not the program but individual faculty who are thus
assessed.

It was mentioned to me by a Vice Provost who was shocked when he found out.

 

Best,

Loet

 

  _____  

Loet Leydesdorff 

University of Amsterdam
Amsterdam School of Communications Research (ASCoR)

loet at leydesdorff.net ; http://www.leydesdorff.net/
Honorary Professor, SPRU, University of Sussex;

Guest Professor, Zhejiang University, Hangzhou;
Visiting Professor, ISTIC, Beijing;

Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en

 

From: ASIS&T Special Interest Group on Metrics
[mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Stephen J Bensman
Sent: Thursday, August 28, 2014 6:49 PM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] STI conference Leiden--Quality standards for
evaluation indicators

 


Loet,

Thank you very much for the information about this site.  The fact that many
US universities are using this site is further testimony to the failure of
the last ratings done by the US National Research Council.  These ratings
used to be authoritative and were done every ten years.  Discussions were
always carried out within the parameters marked out by this agency and
predecessor ones such as the American Council on Education.  University
administrations used to announce the rankings.  But, interestingly enough,
the attention was always focused on the traditional peer rating rankings.
The bibliometric indicators were either ignored or used to dispute the peer
rating rankings.  University administrations are somewhat handicapped in
these matters, because they are statistically stupid and do not understand
things like standardized scores.  On top of that, it was discovered that
major mistakes were being made.  The biggest one was confusing departmental
organizations with disciplinary categories.  This most affected the
biological disciplines: some universities have medical schools, some do not;
some have colleges of agriculture, some do not; some have veterinary
schools, some do not.  It was then decided that the ratings could not be
based on departments but on subject categories.  This opened an entirely
different can of worms: proper subject taxonomy.  It seems to me that Lutz
and you have recently submitted a paper on this problem.  ;-)
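
(To make "standardized scores" concrete for readers outside statistics: a
standardized score expresses a raw indicator value relative to the mean and
spread of its reference set.  A minimal sketch in Python, using hypothetical
citation counts purely for illustration:

    import statistics

    def z_scores(values):
        # Standardize each value: (x - mean) / standard deviation.
        mean = statistics.mean(values)
        sd = statistics.stdev(values)
        return [(x - mean) / sd for x in values]

    # Hypothetical per-faculty citation counts within one subject category.
    citations = [12, 45, 7, 88, 23]
    print([round(z, 2) for z in z_scores(citations)])

A score near 0 is average for the reference set; comparing raw counts across
fields without some such normalization is the kind of mistake alluded to
above.)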

 

For some odd reason, I have become considered the local expert on university
rankings (me, a dumb librarian, of all things) and am frequently consulted
by the university administrators on these matters.  I was even asked by the
NRC to test the database it created for the previous ratings.  Having
studied the problem and made my own horrendous mistakes, I have come to
these conclusions:

 

1)      A university like LSU has no chance in hell of breaking into the top
general rankings.  The system is set in concrete and highly stable, and
Louisiana does not have the money to compete at this level.

2)      Therefore, LSU must carefully select the taxonomic ground on which
it will fight. LSU has certain natural advantages: coastal preservation,
wetlands, petroleum, chemistry, fisheries, Southern history, Southern
literature, etc.  It is in these categories that LSU can make its major
contributions and be at the top.

3)      LSU must reform its administrative structure to match its taxonomic
goals, integrating the medical schools, vet school, college of agriculture,
etc.  This must be done because size counts: you must enlarge, as much as
possible, the taxonomic categories in which you choose to compete.  LSU had
been ranked very low in biology, but when the medical schools, vet school,
and college of agriculture were combined, we ranked quite highly (a toy
illustration of this roll-up follows below).  This matches the history of
the state, which has been a pioneer in health care.
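
To make point 3) concrete, here is a minimal roll-up sketch in Python; the
unit names and publication counts are hypothetical, purely for illustration:

    # Hypothetical publication counts per administrative unit.
    unit_counts = {
        "Department of Biological Sciences": 310,
        "School of Medicine": 540,
        "School of Veterinary Medicine": 180,
        "College of Agriculture": 260,
    }

    # Map each administrative unit to the subject category it should feed.
    unit_to_category = {
        "Department of Biological Sciences": "biology",
        "School of Medicine": "biology",
        "School of Veterinary Medicine": "biology",
        "College of Agriculture": "biology",
    }

    # Aggregate before computing any size-dependent indicator.
    category_counts = {}
    for unit, count in unit_counts.items():
        category = unit_to_category[unit]
        category_counts[category] = category_counts.get(category, 0) + count

    print(category_counts)  # {'biology': 1290}

The point is that the ranking unit is the subject category, not the
department; which units an institution counts into a category determines its
apparent size there.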

 

I hope that I have not bored you with this, but it is my practitioner's
advice on the discussions recently taking place on this listserv.  In
military terms: the high ground is already occupied, so do not make any
frontal assaults there.  Concentrate on defining your taxonomic goals and
throw as much weight as possible, in terms of faculty, money, etc., into
those goals.  I think this is a good way for most institutions to approach
these problems: selected taxonomic category by selected taxonomic category.
Take your measures by these.

 

Respectfully,

Stephen J. Bensman, Ph.D.

LSU Libraries

Louisiana State University

Baton Rouge, LA  USA

 

From: ASIS&T Special Interest Group on Metrics
[mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Loet Leydesdorff
Sent: Thursday, August 28, 2014 10:25 AM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] STI conference Leiden--Quality standards for
evaluation indicators

 


Also very interesting is the following page:

 

http://www.academicanalytics.com/Public/WhatWeDo 

Many US universities use these services.

 

Best,

Loet

 

  _____  

Loet Leydesdorff 

University of Amsterdam
Amsterdam School of Communications Research (ASCoR)

loet at leydesdorff.net ; http://www.leydesdorff.net/
Honorary Professor, SPRU, University of Sussex;

Guest Professor, Zhejiang University, Hangzhou;
Visiting Professor, ISTIC, Beijing;

Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en

 

From: ASIS&T Special Interest Group on Metrics
[mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Yves Gingras
Sent: Thursday, August 28, 2014 4:44 PM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] STI conference Leiden--Quality standards for
evaluation indicators

 


Hello all

Here is an important document about good and bad indicators to add to the
discussion on research evaluation:

http://www.scienceadvice.ca/en/assessments/completed/science-performance.aspx

The whole report is freely available as a PDF.

Best regards


Yves Gingras





On 27/08/14 20:21, "Ismael Rafols" <ismaelrafols at gmail.com> wrote:

(With apologies for cross-posting)

Dear all,
to warm up for next week's S&T Indicators Conference in Leiden, let us share
the topic of a debate:

Quality standards for evaluation indicators: Any chance for the dream to
come true?
Special session at the STI-ENID conference in Leiden, 3 September 2014,
16-17.30h 
Organisers: Ismael Rafols (INGENIO & SPRU), Paul Wouters (CWTS, Leiden
University), Sarah de Rijcke (CWTS, Leiden University)
Location:  Aalmarkt-hall, Stadsgehoorzaal Leiden

There is a growing realization in the scientometrics community of the need
to offer clearer guidance to users and to further develop standards for the
professional use of bibliometrics in research evaluation. Indeed, the
STI-ENID Conference 2014 has the telling subtitle ‘Context Matters’. This
session continues from the 2013 ISSI and STI conferences in Vienna and
Berlin, where full plenary sessions were convened on the need for standards
in evaluative bibliometrics and on the ethical and policy implications of
individual-level bibliometrics. The need to debate these issues has come to
the forefront in light of reports that the use of certain easy-to-use
metrics for evaluative purposes has become a routine part of academic life,
despite misgivings within the profession itself about their validity. Very
recently, high-profile movements against certain indicators (e.g., the DORA
declaration on the Journal Impact Factor) have brought possible misuses of
metrics further to the center of attention. There may be a growing need for
standards, also to promote the accountability of scientometricians as
experts.

Indeed, the relationship between scientometricians and end-users has been
changing over the years due to factors such as: 1. increasing demands for
bibliometric services in research management at various levels of
aggregation; 2. new capacities and demands for performance information
through the greater availability of new research technologies and their
applications; and 3. the emergence of “citizen bibliometrics” (i.e.,
bibliometrics carried out by non-expert end-users) due to the greater
availability of data and indicators. Some of these developments may result
in new opportunities for research contributions and information use, and may
increase the effectiveness of bibliometrics through more advanced indicators
and the increased availability of data sets (including web data). Yet some
innovations also risk bypassing the quality-control mechanisms of fields
like scientometrics and the standards they promote. The implications of this
increasing scope and intensity of bibliometric practices require a concerted
response from scientometrics to produce more explicit guidelines and expert
advice on good scientometric practice for specific evaluative settings such
as recruitment, grant awards, and institutional or national benchmarking.

This special session will bring together scientometric experts,
representatives of funding agencies, policy makers, and opinion leaders on
the role of metrics in research assessment, to discuss the extent to which a
move towards clearer, standardised guidelines for usage and consultancy can
be achieved, both technically and strategically, and what such guidelines
should look like concretely.

---
Background material:
- Report on the international workshop "Guidelines and good practices on
quantitative assessments of research" (OST, Paris, 12 May 2014):
http://www.obs-ost.fr/fractivit%C3%A9s/workshop_international
- Blog posts by Paul Wouters on the previous debates at the ISSI and STI
conferences in 2013, and on the DORA declaration:
http://citationculture.wordpress.com/2013/07/29/bibliometrics-of-individual-researchers/
http://citationculture.wordpress.com/2013/10/03/bibliometrics-of-individual-researchers-the-debate-in-berlin/
http://citationculture.wordpress.com/2013/05/23/dora-a-stimulus-for-a-new-evaluation-culture-in-science/
- Information on the Higher Education Funding Council for England (HEFCE)
"Independent review of the role of metrics in research assessment" + SPRU
response:
http://citationculture.wordpress.com/2014/05/02/metrics-in-research-assessment-under-review/
http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/
https://www.sussex.ac.uk/webteam/gateway/file.php?name=spru-response-final.pdf&site=25
- Opinion article for JASIST by Sarah de Rijcke and Alex Rushforth, "To
intervene, or not to intervene; is that the question? On the role of
scientometrics in research evaluation":
https://citationculture.files.wordpress.com/2014/08/de-rijcke_rushforth_jasist_preprint2014.pdf



Yves Gingras

Professor
Department of History
Centre interuniversitaire de recherche
sur la science et la technologie (CIRST)
Canada Research Chair in the History
and Sociology of Science
Observatoire des sciences et des technologies (OST) 
UQAM
C.P. 8888, Succ. Centre-Ville
Montréal, Québec
Canada, H3C 3P8

Tel: (514)-987-3000-7053
Fax: (514)-987-7726

http://www.chss.uqam.ca
http://www.cirst.uqam.ca
http://www.ost.uqam.ca
