CWTS Leiden Ranking 2015

David Wojick dwojick at CRAIGELLACHIE.US
Thu May 21 18:56:13 EDT 2015


Thank you, Nees Jan, but the paper is behind a paywall. Perhaps you could send me a copy, to dwojick at craigellachie.us.

In any case, my thinking is this. The citation network of science is seamless, or mostly so. That is, there are few, if any, cases where there occurs a cluster such that all the authors therein cite each other and no one else. Thus any segmentation of this seamless network into clusters requires algorithmic assumptions that are more or less arbitrary, in the sense that alternative assumptions are available.

One wonders, therefore, to what extent the university rankings are sensitive to the specific assumptions made to cluster the citation network and create the fields. Have you tested this sensitivity?

I suggest a test. Change the algorithm so that it creates 2000 micro-fields instead of 4000. Then rank the universities and see what difference it makes. Mind you, this is just a crude first thought; better tests may be possible.
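A crude version of this sensitivity check can be sketched in a few lines of Python. Everything below is synthetic and hypothetical: a toy set of publications is assigned to fields at two resolutions (standing in for re-clustering the citation network with different settings), a simple mean normalized citation score per university (in the spirit of MNCS) is computed under each partition, and the two rank orders are compared. It is an illustration of the test, not the Leiden methodology.

```python
import random
import statistics

random.seed(42)

# Toy data: 500 publications, each with a citation count and a university.
# Two field partitions mimic clustering at different resolutions.
pubs = []
for i in range(500):
    fine_field = i % 40              # "fine" partition: 40 micro-fields
    coarse_field = fine_field // 2   # "coarse" partition: 20 fields
    pubs.append({
        "university": f"U{i % 10}",
        "citations": random.randint(1, 50) + 5 * (fine_field % 3),
        "fine": fine_field,
        "coarse": coarse_field,
    })

def mncs(pubs, field_key):
    """Mean normalized citation score per university: each publication's
    citations divided by the mean citations of its field."""
    field_cites = {}
    for p in pubs:
        field_cites.setdefault(p[field_key], []).append(p["citations"])
    field_means = {f: statistics.mean(cs) for f, cs in field_cites.items()}
    scores = {}
    for p in pubs:
        scores.setdefault(p["university"], []).append(
            p["citations"] / field_means[p[field_key]])
    return {u: statistics.mean(s) for u, s in scores.items()}

def ranking(scores):
    """Universities ordered from highest to lowest score."""
    return sorted(scores, key=scores.get, reverse=True)

fine_rank = ranking(mncs(pubs, "fine"))
coarse_rank = ranking(mncs(pubs, "coarse"))
print("fine partition:  ", fine_rank)
print("coarse partition:", coarse_rank)
print("positions that differ:",
      sum(a != b for a, b in zip(fine_rank, coarse_rank)))
```

With real data one would of course use the actual clustering software at both resolutions and a proper rank correlation rather than a position count, but the shape of the test is the same: hold everything fixed except the field partition and see how much the ranking moves.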

I do not question the value of the work, but, as you know, using metrics in this way is itself a sensitive issue.

My best regards,

David

On May 21, 2015, at 1:51 AM, "Eck, N.J.P. van" <ecknjpvan at CWTS.LEIDENUNIV.NL> wrote:

> Administrative info for SIGMETRICS (for example unsubscribe): http://web.utk.edu/~gwhitney/sigmetrics.html
> Dear David,
> 
>  
> 
> The 4000 fields are constructed using a clustering algorithm based on citation relations between publications. A detailed explanation is provided in the following paper: http://dx.doi.org/10.1002/asi.22748.
> 
>  
> 
> The clustering methodology for constructing the fields is fully transparent. The methodology is documented in the above-mentioned paper, and the computer software that is required to implement the methodology is freely available (open source) at www.ludowaltman.nl/slm/. It is true that the results produced by the clustering methodology are not transparent. The assignment of individual publications to the 4000 fields is not visible. As already mentioned, this is something that hopefully can be improved in the future. Please keep in mind that there is a growing consensus among bibliometricians that the use of the Web of Science subject categories for field normalization of bibliometric indicators is unsatisfactory and does not yield sufficiently accurate results. The normalization approach that is taken in the Leiden Ranking offers a more accurate alternative, but indeed the transparency of the Web of Science subject categories is lost.
> 
>  
> 
> Best regards,
> 
> Nees Jan
> 
>  
> 
>  
> 
> From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of David Wojick
> Sent: Wednesday, May 20, 2015 11:23 PM
> To: SIGMETRICS at LISTSERV.UTK.EDU
> Subject: Re: [SIGMETRICS] CWTS Leiden Ranking 2015
> 
>  
> 
> 
> Dear Nees Jan,
> 
> How do you apply 4000 field categories to individual papers? A semantic algorithm? Is this explained on the website? It sounds very difficult.
> 
> Also if the categories are not visible how is the methodology transparent?
> 
> My best wishes,
> 
> David
> http://insidepublicaccess.com/
> 
> At 04:06 PM 5/20/2015, you wrote:
> 
> 
> Dear Loet,
>  
> Yes, your understanding is correct. MNCS, TNCS, PP(top 10%), P(top 10%), and the other field-normalized impact indicators all use the 4000 fields for the purpose of normalization. The Web of Science subject categories are not used.
>  
> Unfortunately, the 4000 fields are not visible. Because these fields are defined at the level of individual publications rather than at the journal level, there is no easy way to make the fields visible. This is something that hopefully can be improved in the future.
>  
> We have decided to move from 800 to 4000 fields because our analyses indicate that with 800 fields there still is too much heterogeneity in citation density within fields. A detailed analysis of the effect of performing field normalization at different levels of aggregation is reported in the following paper by Javier Ruiz-Castillo and Ludo Waltman: http://dx.doi.org/10.1016/j.joi.2014.11.010. In this paper, it is also shown that at the level of entire universities field-normalized impact indicators are quite insensitive to the choice of an aggregation level.
>  
> Best regards,
> Nees Jan
>  
>  
> From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Loet Leydesdorff
> Sent: Wednesday, May 20, 2015 9:28 PM
> To: SIGMETRICS at LISTSERV.UTK.EDU
> Subject: Re: [SIGMETRICS] CWTS Leiden Ranking 2015
>  
> Dear Nees Jan, 
>  
> As always impressive! Thank you.
>  
> Are the approximately 4,000 fields also visible in one way or another? Do I correctly understand that MNCS is defined in relation to these 4,000 fields and not to the 251 WCs? Is there a concordance table between the fields and WCs as there is between WCs and five broad fields in the Excel sheet? 
>  
> I think that I understand from your and Ludo’s previous publications how the 4,000 fields are generated. Why are there 4,000 such fields in 2015, and 800+ in 2014? Isn’t it amazing that trends can be smooth despite the discontinuities? Or are the indicators robust across these scales?
>  
> Best wishes, 
> Loet
>  
> 
>  
> 
> Loet Leydesdorff 
> Emeritus University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
> loet at leydesdorff.net ; http://www.leydesdorff.net/ 
> Honorary Professor, SPRU, University of Sussex; 
> Guest Professor Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC, Beijing;
> Visiting Professor, Birkbeck, University of London; 
> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
>  
> From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Eck, N.J.P. van
> Sent: Wednesday, May 20, 2015 8:27 PM
> To: SIGMETRICS at LISTSERV.UTK.EDU
> Subject: [SIGMETRICS] CWTS Leiden Ranking 2015
>  
> Release of the CWTS Leiden Ranking 2015
> Today CWTS has released the 2015 edition of the Leiden Ranking. The CWTS Leiden Ranking 2015 offers key insights into the scientific performance of 750 major universities worldwide. A sophisticated set of bibliometric indicators provides statistics on the scientific impact of universities and on universities’ involvement in scientific collaboration. The CWTS Leiden Ranking 2015 is based on Web of Science indexed publications from the period 2010–2013.
>  
> Improvements and new features in the 2015 edition
> Compared with the 2014 edition of the Leiden Ranking, the 2015 edition includes a number of enhancements. First of all, the 2015 edition offers the possibility to perform trend analyses. Bibliometric statistics are available not only for the period 2010–2013 but also for earlier periods. Second, the 2015 edition of the Leiden Ranking provides new impact indicators based on counting publications that belong to the top 1% or top 50% of their field. And third, improvements have been made to the presentation of the ranking. Size-dependent indicators are presented in a more prominent way, and it is possible to obtain a convenient one-page overview of all bibliometric statistics for a particular university.
>  
> Differences with other university rankings
> Compared with other university rankings, the Leiden Ranking offers more advanced indicators of scientific impact and collaboration and uses a more transparent methodology. The Leiden Ranking does not rely on highly subjective data obtained from reputational surveys or on data provided by universities themselves. Also, the Leiden Ranking refrains from aggregating different dimensions of university performance into a single overall indicator.
>  
> Website
> The Leiden Ranking is available at www.leidenranking.com.
>  
>  
> ========================================================
> Nees Jan van Eck PhD
> Researcher
> Head of ICT
>  
> Centre for Science and Technology Studies
> Leiden University
> P.O. Box 905
> 2300 AX Leiden
> The Netherlands
>  
> Willem Einthoven Building, Room B5-35
> Tel: +31 (0)71 527 6445
> Fax: +31 (0)71 527 3911
> E-mail: ecknjpvan at cwts.leidenuniv.nl
> Homepage: www.neesjanvaneck.nl
> VOSviewer: www.vosviewer.com
> CitNetExplorer: www.citnetexplorer.nl
> ========================================================
>  
>  