STI conference Leiden--Quality standards for evaluation indicators

Loet Leydesdorff loet at LEYDESDORFF.NET
Thu Aug 28 11:25:02 EDT 2014

Also very interesting is the following page:

Many US universities use these services.

Loet Leydesdorff 

University of Amsterdam
Amsterdam School of Communications Research (ASCoR)

loet at leydesdorff.net;
Honorary Professor, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou;
Visiting Professor, ISTIC;
Visiting Professor, Birkbeck, University of London


From: ASIS&T Special Interest Group on Metrics
[mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Yves Gingras
Sent: Thursday, August 28, 2014 4:44 PM
Subject: Re: [SIGMETRICS] STI conference Leiden--Quality standards for
evaluation indicators


Hello all

Here is an important document about good and bad indicators to add to the
discussion on research evaluation:

The whole report is freely available as a PDF.

Best regards

Yves Gingras

On 27/08/14 at 20:21, "Ismael Rafols" <ismaelrafols at> wrote:

(With apologies for cross-posting)

Dear all,
To warm up for the week of the STI Indicators Conference in Leiden, let us
share the topic of a debate:

Quality standards for evaluation indicators: Any chance for the dream to
come true?
Special session at the STI-ENID conference in Leiden, 3 September 2014,
Organisers: Ismael Rafols (INGENIO & SPRU), Paul Wouters (CWTS, Leiden
University), Sarah de Rijcke (CWTS, Leiden University)
Location:  Aalmarkt-hall, Stadsgehoorzaal Leiden

There is a growing realization in the scientometrics community of the need
to offer clearer guidance to users and to further develop standards for the
professional use of bibliometrics in research evaluations. Indeed, the
STI-ENID Conference 2014 has the telling subtitle ‘Context Matters’. This
session continues from the 2013 ISSI and STI conferences in Vienna and
Berlin, where full plenary sessions were convened on the need for standards
in evaluative bibliometrics and on the ethical and policy implications of
individual-level bibliometrics. The need to debate these issues has come to
the forefront in light of reports that the use of certain easy-to-use
metrics for evaluative purposes has become a routine part of academic life,
despite misgivings within the profession itself about their validity. Very
recently, high-profile movements against certain metric indicators (e.g. the
DORA declaration about the Journal Impact Factor) have brought possible
misuses of metrics further into the center of attention. There may be a
growing need for standards – also to promote the accountability of
scientometricians.

Indeed, the relationship between scientometricians and end-users has been
changing over the years due to factors such as: 1. increasing demands for
bibliometric services in research management at various levels of
aggregation; 2. new capacities and demands for performance information
through the greater availability of new research technologies and their
applications; and 3. the emergence of “citizen bibliometrics” (i.e.
bibliometrics carried out by non-expert end-users) due to the greater
availability of data and indicators. Some of these developments may result
in new opportunities for research contributions and information use, and may
increase the effectiveness of bibliometrics due to more advanced indicators
and the increased availability of data sets (including web data). Yet some
innovations also risk bypassing the quality-control mechanisms of fields
like scientometrics and the standards they promote. The implications of this
increasing scope and intensity of bibliometric practices require a
concerted response from scientometrics to produce more explicit guidelines
and expert advice on good scientometric practices for specific evaluative
practices such as recruitment, grant awards, and institutional or national
evaluations.

This special session will bring together scientometric experts,
representatives of funding agencies, policy makers and opinion leaders on
the role of metrics in research assessment to discuss the extent to which
moving towards clearer, standardised guidelines over usage and consultancy
can be achieved, both technically and strategically, and what the guidelines
should look like concretely.

Background material:
- Report on International workshop "Guidelines and good practices on
quantitative assessments of research" (OST, Paris, 12 May 2014):
- Blogposts Paul Wouters on previous debates at the ISSI and STI conferences
in 2013, and on the DORA declaration:
- Information on the Higher Education Funding Council for England (HEFCE)
"Independent review of the role of metrics in research assessment" + SPRU
- Opinion article for JASIST by Sarah de Rijcke and Alex Rushforth "To
intervene, or not to intervene; is that the question? On the role of
scientometrics in research evaluation."

Yves Gingras

Département d'histoire
Centre interuniversitaire de recherche
sur la science et la technologie (CIRST) 
Chaire de recherche du Canada en histoire
et sociologie des sciences
Observatoire des sciences et des technologies (OST) 
C.P. 8888, Succ. Centre-Ville
Montréal, Québec
Canada, H3C 3P8

Tel: (514) 987-3000, ext. 7053
Fax: (514)-987-7726


More information about the SIGMETRICS mailing list