STI conference Leiden--Quality standards for evaluation indicators

Stephen J Bensman notsjb at LSU.EDU
Thu Aug 28 15:55:27 EDT 2014


Loet,
I am afraid that what is being used more and more in the US for individual evaluations is your Google Scholar Citations (GSC) page.  It is easy to use.  More and more institutions and people are recommending that you put the URL for your GSC page in your CV and on your Web site.  Even Elsevier is recommending this.  One argument is that departments and institutions can be easily evaluated if the GSC pages of their faculty are readily available.  I know all the arguments against GSC, and most of them are either invalid or apply equally to ISI citations.  One advantage of GSC is that it captures your altmetrics, a point often used against it.

That is one reason we are doing so much analysis of GSC.  Our study of economist laureates shows that GSC is highly valid.  For these laureates the h-index and the asymptote of the citation distribution are contiguous, validating both GSC and the h-index.  The extreme outliers on the right are usually the works on the topics for which the prize was awarded.  I am sorry, but the hyperlink has replaced the citation as the measure for evaluation purposes.  Time marches on, and technology changes.  The WWW lacks an authority structure, but so do ISI citations.
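For readers unfamiliar with the measure, here is a minimal Python sketch of the standard h-index computation; the h_index helper and the citation counts below are illustrative, not data from our study.

def h_index(citation_counts):
    """Return the largest h such that h publications have >= h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with these citation counts yield an h-index of 3:
# three papers have at least 3 citations each.
print(h_index([10, 6, 3, 2, 1]))  # -> 3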

Respectfully,

Stephen J. Bensman, Ph.D.
LSU Libraries
Louisiana State University
Baton Rouge, LA  USA




From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Loet Leydesdorff
Sent: Thursday, August 28, 2014 1:39 PM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] STI conference Leiden--Quality standards for evaluation indicators

Dear Stephen,

Nowadays, it is not the program, but individual faculty who are thus assessed.
It was mentioned to me by a Vice Provost who was shocked when he found out.

Best,
Loet

________________________________
Loet Leydesdorff
University of Amsterdam
Amsterdam School of Communications Research (ASCoR)
loet at leydesdorff.net ; http://www.leydesdorff.net/
Honorary Professor, SPRU (http://www.sussex.ac.uk/spru/), University of Sussex;
Guest Professor, Zhejiang Univ. (http://www.zju.edu.cn/english/), Hangzhou; Visiting Professor, ISTIC (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;
Visiting Professor, Birkbeck (http://www.bbk.ac.uk/), University of London;
http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en

From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Stephen J Bensman
Sent: Thursday, August 28, 2014 6:49 PM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] STI conference Leiden--Quality standards for evaluation indicators

Loet,
Thank you very much for the information about this site.  The fact that many US universities are utilizing it is further testimony to the failure of the last ratings done by the US National Research Council.  These used to be authoritative and were done every ten years.  Discussions were always carried out within the parameters marked out by this agency and its predecessors, such as the American Council on Education.  University administrations used to announce the rankings.  But, interestingly enough, the attention was always focused on the traditional peer-rating rankings.  The bibliometric indicators were either ignored or used to dispute the peer ratings.  University administrations are somewhat handicapped in these matters, because they are statistically stupid and do not understand things like standardized scores.

On top of that, it was discovered that major mistakes were being made.  The biggest was to confuse departmental organizations with disciplinary categories.  This most affected the biological disciplines, because some universities have medical schools and some do not; some have colleges of agriculture and some do not; and some have vet schools and some do not.  It was then decided that the ratings could not be based on departments but on subject categories.  This opened an entirely different can of worms: proper subject taxonomy.  It seems to me that Lutz and you have recently submitted a paper on this problem.  ;-)
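As an aside, a standardized score (z-score) simply expresses how far a raw indicator value lies from the group mean, in units of standard deviation, which is what makes departments comparable across indicators.  A minimal Python sketch, with hypothetical department names and numbers:

from statistics import mean, stdev

# Publications per faculty member for four hypothetical departments.
pubs_per_faculty = {"Dept A": 4.2, "Dept B": 6.8, "Dept C": 5.0, "Dept D": 3.6}

mu = mean(pubs_per_faculty.values())
sigma = stdev(pubs_per_faculty.values())

# A z-score of +1 means one standard deviation above the group mean.
for dept, value in pubs_per_faculty.items():
    print(f"{dept}: raw={value:.1f}, z={(value - mu) / sigma:+.2f}")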

For some odd reason, I have become considered the local expert on university rankings (me, a dumb librarian, of all things) and am frequently consulted by the university administrators on these matters.  I was even asked by the NRC to test the database it created for the previous ratings.  Having studied the problem and made my own horrendous mistakes, I have come to these conclusions:


1) A university like LSU has no chance in hell of breaking into the top general rankings.  The system is set in concrete and highly stable, and Louisiana does not have the money to compete at this level.

2) Therefore, LSU must carefully select the taxonomic ground on which it will fight.  LSU has certain natural advantages: coastal preservation, wetlands, petroleum, chemistry, fisheries, Southern history, Southern literature, etc.  It is in these categories that LSU can make its major contributions and be at the top.

3) LSU must reform its administrative structure to match its taxonomic goals, integrating the medical schools, vet school, college of agriculture, etc.  This must be done because size counts, and you must enlarge the taxonomic categories in which you choose to compete as much as possible.  LSU had been ranked very low in biology, but, when the medical schools, vet school, and college of agriculture were combined, we ranked quite highly.  This matches the history of the state, which has been a pioneer in health care.

I hope that I have not bored you with this, but this is my practitioner advice on the discussions recently taking place on this listserv.  In military terms, the high ground is already occupied, so do not make any frontal assaults there.  Concentrate on defining your taxonomic goals and throw as much weight as possible, in terms of faculty, money, etc., into them.  I think that this is a good way to approach these problems for most institutions: selected taxonomic category by selected taxonomic category.  Take your measures by these.

Respectfully,
Stephen J. Bensman, Ph.D.
LSU Libraries
Louisiana State University
Baton Rouge, LA  USA

From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Loet Leydesdorff
Sent: Thursday, August 28, 2014 10:25 AM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] STI conference Leiden--Quality standards for evaluation indicators

Also very interesting is the following page:

http://www.academicanalytics.com/Public/WhatWeDo
Many US universities use these services.

Best,
Loet

________________________________
Loet Leydesdorff
University of Amsterdam
Amsterdam School of Communications Research (ASCoR)
loet at leydesdorff.net ; http://www.leydesdorff.net/
Honorary Professor, SPRU (http://www.sussex.ac.uk/spru/), University of Sussex;
Guest Professor, Zhejiang Univ. (http://www.zju.edu.cn/english/), Hangzhou; Visiting Professor, ISTIC (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;
Visiting Professor, Birkbeck (http://www.bbk.ac.uk/), University of London;
http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en

From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Yves Gingras
Sent: Thursday, August 28, 2014 4:44 PM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] STI conference Leiden--Quality standards for evaluation indicators

Hello all

Here is an important document about good and bad indicators to add to the discussion on research evaluation:

http://www.scienceadvice.ca/en/assessments/completed/science-performance.aspx

The whole report is freely available as a PDF.

Best regards


Yves Gingras





On 27/08/14 20:21, "Ismael Rafols" <ismaelrafols at gmail.com> wrote:
(With apologies for cross-posting)

Dear all,
to warm up for next week's STI Conference in Leiden, let us share the topic of a debate:

Quality standards for evaluation indicators: Any chance for the dream to come true?
Special session at the STI-ENID conference in Leiden, 3 September 2014, 16-17.30h
Organisers: Ismael Rafols (INGENIO & SPRU), Paul Wouters (CWTS, Leiden University), Sarah de Rijcke (CWTS, Leiden University)
Location:  Aalmarkt-hall, Stadsgehoorzaal Leiden

There is a growing realization in the scientometrics community of the need to offer clearer guidance to users and to further develop standards for the professional use of bibliometrics in research evaluation. Indeed, the STI-ENID Conference 2014 has the telling subtitle 'Context Matters'. This session continues from the 2013 ISSI and STI conferences in Vienna and Berlin, where full plenary sessions were convened on the need for standards in evaluative bibliometrics and on the ethical and policy implications of individual-level bibliometrics. The need to debate these issues has come to the forefront in light of reports that certain easy-to-use metrics have become a routine part of academic life for evaluative purposes, despite misgivings within the profession itself about their validity. Very recently, high-profile movements against certain metric indicators (e.g. the DORA declaration on the Journal Impact Factor) have brought possible misuses of metrics further to the center of attention. There may be a growing need for standards, also in order to promote the accountability of scientometricians as experts.

Indeed, the relationship between scientometricians and end-users has been changing over the years due to factors such as: 1) increasing demands for bibliometric services in research management at various levels of aggregation; 2) new capacities and demands for performance information through the greater availability of new research technologies and their applications; and 3) the emergence of "citizen bibliometrics" (i.e. bibliometrics carried out by non-expert end-users) due to the greater availability of data and indicators. Some of these developments may result in new opportunities for research contributions and information use, and may increase the effectiveness of bibliometrics through more advanced indicators and the increased availability of data sets (including web data). Yet some innovations also risk bypassing the quality-control mechanisms of fields like scientometrics and the standards they promote. The implications of this increasing scope and intensity of bibliometric practices require a concerted response from scientometrics to produce more explicit guidelines and expert advice on good scientometric practice for specific evaluative settings such as recruitment, grant awards, and institutional or national benchmarking.

This special session will bring together scientometric experts, representatives of funding agencies, policy makers, and opinion leaders on the role of metrics in research assessment to discuss the extent to which a move towards clearer, standardised guidelines for usage and consultancy can be achieved, both technically and strategically, and what such guidelines should look like concretely.

---
Background material:
- Report on the international workshop "Guidelines and good practices on quantitative assessments of research" (OST, Paris, 12 May 2014): http://www.obs-ost.fr/fractivit%C3%A9s/workshop_international
- Blog posts by Paul Wouters on previous debates at the ISSI and STI conferences in 2013, and on the DORA declaration:
http://citationculture.wordpress.com/2013/07/29/bibliometrics-of-individual-researchers/
http://citationculture.wordpress.com/2013/10/03/bibliometrics-of-individual-researchers-the-debate-in-berlin/
http://citationculture.wordpress.com/2013/05/23/dora-a-stimulus-for-a-new-evaluation-culture-in-science/
- Information on the Higher Education Funding Council for England (HEFCE) "Independent review of the role of metrics in research assessment" + the SPRU response:
http://citationculture.wordpress.com/2014/05/02/metrics-in-research-assessment-under-review/
http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/
https://www.sussex.ac.uk/webteam/gateway/file.php?name=spru-response-final.pdf&site=25
- Opinion article for JASIST by Sarah de Rijcke and Alex Rushforth "To intervene, or not to intervene; is that the question? On the role of scientometrics in research evaluation."
https://citationculture.files.wordpress.com/2014/08/de-rijcke_rushforth_jasist_preprint2014.pdf


Yves Gingras

Professeur
Département d'histoire
Centre interuniversitaire de recherche
sur la science et la technologie (CIRST)
Chaire de recherche du Canada en histoire
et sociologie des sciences
Observatoire des sciences et des technologies (OST)
UQAM
C.P. 8888, Succ. Centre-Ville
Montréal, Québec
Canada, H3C 3P8

Tel: (514)-987-3000-7053
Fax: (514)-987-7726

http://www.chss.uqam.ca
http://www.cirst.uqam.ca
http://www.ost.uqam.ca