STI conference Leiden--Quality standards for evaluation indicators

Katy Borner katy at INDIANA.EDU
Tue Sep 2 09:00:06 EDT 2014

Dear all,
I am very pleased to see that there will be a discussion about quality 
standards for evaluation indicators at STI.
The development of validated data analysis and visualization 
workflows/tools and the generation of replicable results are at the core 
of any scientific effort that aims to convert data into actionable 
insights. It is good to see the background material provided (you might 
also like to review results from the workshops below), and I hope the 
results of the special session will be shared widely.
Best regards,

*Recent Standards Workshops*

OECD-experts dialogue on scientometrics: Improving the use of 
bibliometric indicators and analysis for policy-making 
March 25, 2014 | OECD, Paris, France

Science Mapping Standards Workshop 
November 04-05, 2013 | Bloomington, Indiana

Standards for Science Mapping and Classifications 
July 15, 2013 | ISSI, Vienna, Austria

JSMF Workshop on Standards for Science Metrics, Classifications, and 
Mapping <>
August 11-12, 2011 | Bloomington, IN

On 8/28/2014 4:12 AM, Paul Wouters wrote:
> Administrative info for SIGMETRICS (for example unsubscribe): 
> Dear Loet and Lutz,
> Many thanks for this contribution. The motivation for the discussion 
> about standards, as far as I am concerned, is the need to protect 
> research groups and researchers against sloppy or damaging evaluation 
> practices. I agree with Loet that standards are often a powerful 
> competition weapon to protect industry interests. It is certainly not 
> the motivation for this panel, but it may end up like that if the 
> process of standard setting, and the sociological interpretation of 
> those standards, is not taken into account carefully. In my view the 
> STI conference is the best place to have this discussion, because it 
> is a meeting place between metrics experts and policy experts. In my 
> view, the question is not whether one should have some quality control 
> process for evaluation, but rather what kind of quality control we need 
> and what kind of standards with respect to data and indicators can play 
> a role in it.
> In other words, you have raised a crucial point for the panel 
> discussion next week.
> Regards,
> Paul Wouters
> Professor of Scientometrics
> Director Centre for Science and Technology Studies
> Leiden University
> PS: I am pleased to announce the release of our completely renewed 
> CWTS website:
> <> - all information now easily available!
> Visiting address:
> Willem Einthoven Building
> Wassenaarseweg 62A
> 2333 AL Leiden
> Mail address: P.O. Box 905
> 2300 AX Leiden
> T: +31 71 5273909 (secr.)
> F: +31 71 5273911
> E: p.f.wouters at <mailto:p.f.wouters at>
>     *From:* ASIS&T Special Interest Group on Metrics
>     <mailto:SIGMETRICS at LISTSERV.UTK.EDU> *On Behalf Of* Ismael Rafols
>     *Sent:* Thursday, August 28, 2014 2:21 AM
>     *Subject:* [SIGMETRICS] STI conference Leiden--Quality standards
>     for evaluation indicators
>     (With apologies for cross-posting)
>     Dear all,
>     To warm up for next week's S&T Indicators Conference in Leiden, let
>     us share the topic of a debate:
>     *Quality standards for evaluation indicators: Any chance for the
>     dream to come true?*
>     /Special session at the STI-ENID conference in Leiden, 3 September
>     2014, 16-17.30h /
>     /Organisers/: Ismael Rafols (INGENIO & SPRU), Paul Wouters (CWTS,
>     Leiden University), Sarah de Rijcke (CWTS, Leiden University)
>     /Location/:  Aalmarkt-hall, Stadsgehoorzaal Leiden
>     There is a growing realization in the scientometrics community of
>     the need to offer clearer guidance to users and further develop
>     standards for professional use of bibliometrics in research
>     evaluations. Indeed the STI-ENID Conference 2014 has the telling
>     sub-title ‘Context Matters’. This session continues from the 2013
>     ISSI and STI conferences in Vienna and Berlin, where full plenary
>     sessions were convened on the need for standards in evaluative
>     bibliometrics, and the ethical and policy implications of
>     individual-level bibliometrics. The need to debate these issues
>     has come to the forefront in light of reports that the use of certain
>     easy-to-use metrics for evaluative purposes has become a routine
>     part of academic life, despite misgivings within the profession
>     itself about their validity. Very recently, high-profile movements
>     against certain metric indicators (e.g. the DORA declaration about
>     the Journal Impact Factor) have brought possible misuses of
>     metrics further to the center of attention. There may be a growing
>     need for standards, not least to promote the accountability of
>     scientometricians as experts.
>     Indeed the relationship between scientometricians and end-users
>     has been changing over the years due to factors like: 1.
>     Increasing demands for bibliometric services in research
>     management at various levels of aggregation, 2. New capacities and
>     demands for performance information through the greater
>     availability of new research technologies and their applications,
>     and 3. The emergence of “citizen bibliometrics” (i.e.
>     bibliometrics carried out by non-expert end-users) due to the greater
>     availability of data and indicators. Some of these developments
>     may result in new opportunities for research contributions and
>     information-use, and may increase effectiveness of bibliometrics
>     due to more advanced indicators and increased availability of data
>     sets (including web data). Yet some innovations also risk
>     bypassing the quality control mechanisms of fields like
>     scientometrics and the standards they promote. The implications of
>     this increasing scope and intensity of bibliometric practices
>     require a concerted response from scientometrics to produce more
>     explicit guidelines and expert advice on good scientometric
>     practices for specific evaluative contexts such as recruitment,
>     grant awards, and institutional or national benchmarking.
>     This special session will bring together scientometric experts,
>     representatives of funding agencies, policy makers and opinion
>     leaders on the role of metrics in research assessment to discuss
>     the extent to which moving towards clearer, standardised
>     guidelines over usage and consultancy can be achieved, both
>     technically and strategically, and what the guidelines should look
>     like concretely.
>     ---
>     /Background material/:
>     - Report on International workshop "Guidelines and good practices
>     on quantitative assessments of research" (OST, Paris, 12 May
>     2014):
>     <>
>     - Blogposts Paul Wouters on previous debates at the ISSI and STI
>     conferences in 2013, and on the DORA declaration:
>     <>
>     <>
>     <>
>     - Information on the Higher Education Funding Council for England
>     (HEFCE) "Independent review of the role of metrics in research
>     assessment" + SPRU response
>     <>
>     <>
>     <>
>     - Opinion article for JASIST by Sarah de Rijcke and Alex Rushforth
>     "To intervene, or not to intervene; is that the question? On the
>     role of scientometrics in research evaluation."
>     <>

Katy Borner
Victor H. Yngve Professor of Information Science
Director, CI for Network Science Center,
Curator, Mapping Science exhibit,

ILS, School of Informatics and Computing, Indiana University
Wells Library 021, 1320 E. Tenth Street, Bloomington, IN 47405, USA
Phone: (812) 855-3256  Fax: -6166
