CWTS Journal Indicators
Sylvan Katz
j.s.katz at SUSSEX.AC.UK
Thu Sep 26 18:40:58 EDT 2013
RIP/SNIP Indicator Question
Does anyone know if the distribution of citations in the current year
to papers in the three preceding years is a scaling distribution?
Sylvan Katz
Visiting Research Fellow
http://www.sussex.ac.uk/Users/sylvank/
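One quick way to probe such a question empirically is to check whether the log-log complementary CDF of the citation counts is approximately linear. The sketch below is purely illustrative: it uses synthetic Pareto-distributed counts as a stand-in, since no real citation data is given here.

```python
import math
import random

# Synthetic stand-in for "citations in the current year to papers from
# the three preceding years"; Pareto-distributed purely for illustration.
random.seed(42)
citations = [int(random.paretovariate(1.5)) for _ in range(5000)]

# Empirical complementary CDF: P(X >= x) for each observed count.
n = len(citations)
xs = sorted(set(citations))
ccdf = [(x, sum(1 for c in citations if c >= x) / n) for x in xs]

# For a scaling (power-law) distribution the log-log CCDF is roughly
# linear; estimate its slope by ordinary least squares.
pts = [(math.log(x), math.log(p)) for x, p in ccdf if p > 0]
mx = sum(x for x, _ in pts) / len(pts)
my = sum(y for _, y in pts) / len(pts)
slope = (sum((x - mx) * (y - my) for x, y in pts)
         / sum((x - mx) ** 2 for x, _ in pts))
print(round(slope, 2))  # roughly -alpha for Pareto(alpha) input
```

A straight-line fit alone is of course only suggestive; a rigorous answer would need goodness-of-fit testing against alternative heavy-tailed distributions.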
On 9/26/2013 12:15 PM, Yves Gingras wrote:
> Administrative info for SIGMETRICS (for example unsubscribe):
> http://web.utk.edu/~gwhitney/sigmetrics.html
>
> Re: [SIGMETRICS] CWTS Journal Indicators
> Hello all
>
> In thinking about publishing _every year_ an indicator we clearly know
> will be used for evaluation purposes and thus can (and will) have
> concrete effects on researchers, one should also think about the meaning
> of having such an indicator calculated _every year_, as if a change over
> a single year made sense and could be useful for evaluations. It would be
> much more meaningful to calculate a moving average based on 2 or 3 years
> (for example). There is a real danger in trying to interpret a
> variation over a single year, as it may reflect nothing more than
> meaningless stochastic fluctuation and yet be used by administrators as
> if it meant something real. Such an interpretation can in turn generate
> decisions that are not rational.
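As a concrete illustration of the smoothing Yves suggests, here is a minimal Python sketch of a 3-year moving average, applied to the rounded 2008–2012 SNIP values for the Journal of Engineering Education from the table quoted later in this thread:

```python
# Rounded SNIP values from the table in Eric Archambault's message below.
snip_by_year = {2008: 20.6, 2009: 15.9, 2010: 16.1, 2011: 16.1, 2012: 12.5}

def moving_average(series, window=3):
    """Return {year: mean of the `window` most recent values up to that year}."""
    years = sorted(series)
    return {
        years[i]: round(sum(series[y] for y in years[i - window + 1:i + 1]) / window, 2)
        for i in range(window - 1, len(years))
    }

print(moving_average(snip_by_year))
# {2010: 17.53, 2011: 16.03, 2012: 14.9}
```

Note how the smoothed series damps the journal's year-to-year swings: the 2012 drop from 16.1 to 12.5 becomes a much gentler decline from 16.03 to 14.9.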
>
> One can understand that a business (like THES or QS, for example) wants
> to be visible every year with a “product” ranking institutions, but as
> scholars we should make clear that annual variations in a complex system
> like science are rarely meaningful.
>
> Publishing and “announcing” a simple yearly value for an indicator can
> thus generate perverse effects and contribute to the “fever” of
> evaluation by administrators of research.
>
> Best regards
>
> Yves Gingras
>
>
> On 25/09/13 12:18, “Éric Archambault”
> <eric.archambault at SCIENCE-METRIX.COM> wrote:
>
> Dear Nees Jan
>
> Thank you for this addition to the growing list of journal
> indicators. Having a publicly accessible list of scores like this is
> really important and will play an important role in the debate on
> journal impact. Having rigorous researchers such as the ones at CWTS
> pursuing this project initiated by Michel Zitt and Henry Small and
> pursued by Moed is certainly useful.
>
> However, I feel this is still at the stage of a research project and
> we should be careful to characterize our indicators carefully before
> telling the wider community that they are ready for prime time. We
> can’t afford to have any more flaky journal impact indicators. This
> is now the fourth proposal for such an indicator, after the
> Journal Impact Factor, the Scimago indicator, and the Eigenfactor.
> In this context, allowing both practitioners and users to decide
> which one seems to have the greatest scientific merit is essential.
> This requires that the methods and all ingredients be known to users
> and practitioners. Your paper is useful to understand the recipe but
> some ingredients are missing from the public disclosure and these
> need to be made public to help the community characterize your tool.
>
> In particular, I think a few more details on the methods would be
> useful here. Firstly, having more details about the bootstrapping
> method that you use to compute the stability intervals would be
> welcome. Have you written a paper on this technique? Secondly, an
> additional column with the field of each journal would be more
> transparent and useful to users.
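For readers unfamiliar with the general idea behind such intervals, a percentile bootstrap for a journal-level score might look like the sketch below. This illustrates the generic technique only, not CWTS's actual procedure, and the per-paper citation counts are invented:

```python
import random

# Hypothetical per-paper citation counts for one journal; note the
# single highly cited outlier, which should widen the interval.
random.seed(0)
cites = [0, 0, 1, 1, 2, 3, 3, 5, 8, 40]

def bootstrap_interval(values, reps=10000, lo=2.5, hi=97.5):
    """Percentile bootstrap interval for the mean of `values`:
    resample with replacement, recompute the mean, take percentiles."""
    n = len(values)
    means = sorted(
        sum(random.choice(values) for _ in range(n)) / n
        for _ in range(reps)
    )
    return means[int(reps * lo / 100)], means[int(reps * hi / 100)]

low, high = bootstrap_interval(cites)
print(low, high)
```

Because the outlier appears in some resamples and not others, the interval comes out wide, which is exactly the behavior described for SNIP stability intervals: a score driven by one very highly cited publication is flagged as unstable.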
>
> Do you have an explanation for the following behavior? The Journal of
> Engineering Education is one of the source journals with the highest
> SNIP in 2008. Is this an artifact, or did the journal really become
> that much better in 2008 compared with 2007? I find the jump
> surprising, as it falls outside the stability intervals that you
> calculated. Of course, it is not impossible to fall outside these by
> chance; it is just that the jump is rather large.
>
> Kind regards
>
> Eric
>
> Source title                      Source type  Print ISSN  Year    P      SNIP  SNIP lower  SNIP upper  % self cit
> Journal of Engineering Education  Journal      1069-4730   2002  207  7.901979       6.625       9.355         16%
> Journal of Engineering Education  Journal      1069-4730   2003  215  6.587213       5.393       7.838         11%
> Journal of Engineering Education  Journal      1069-4730   2004  194  9.710727       7.719      11.838          9%
> Journal of Engineering Education  Journal      1069-4730   2005  133  2.498504       1.685       3.463         48%
> Journal of Engineering Education  Journal      1069-4730   2006  111  4.458215       3.042       6.121         16%
> Journal of Engineering Education  Journal      1069-4730   2007   98  6.650165       4.437       9.274         23%
> Journal of Engineering Education  Journal      1069-4730   2008   93  20.62702      14.286      28.396         10%
> Journal of Engineering Education  Journal      1069-4730   2009   75  15.92148      12.191      20.305         16%
> Journal of Engineering Education  Journal      1069-4730   2010   77  16.12523      12.181      20.454         14%
> Journal of Engineering Education  Journal      1069-4730   2011   76   16.1012      11.783       21.15         14%
> Journal of Engineering Education  Journal      1069-4730   2012   90  12.49939       9.933      15.098          7%
>
>
> From: ASIS&T Special Interest Group on Metrics
> [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Nees Jan van Eck
> Sent: September-25-13 10:09 AM
> To: SIGMETRICS at LISTSERV.UTK.EDU
> Subject: [SIGMETRICS] CWTS Journal Indicators
>
> The 2012 SNIP values have been released on CWTS Journal Indicators
> (www.journalindicators.com). SNIP (source normalized impact
> per paper) is a freely available journal impact indicator that uses
> a source normalization mechanism to correct for differences in
> citation practices between fields of science. Compared with the
> journal impact factor, SNIP allows for more accurate comparisons
> between journals active in different scientific fields. SNIP is
> calculated by CWTS based on Elsevier’s Scopus database. With the
> release of the 2012 SNIP values, stability intervals have been added
> to CWTS Journal Indicators. These intervals indicate the reliability
> of the SNIP value of a journal. For instance, if a journal’s SNIP
> value is largely due to a single very highly cited publication, this
> is indicated by a wide stability interval. SNIP is the only freely
> available journal impact indicator that is presented with stability
> intervals.
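The source-normalization idea can be shown schematically. The sketch below simplifies considerably and is not the published CWTS formula; the function name, parameters, and numbers are all invented for illustration. The point is only that dividing a journal's raw impact per paper by the relative citation density of its field stops journals in sparsely citing fields from being penalized:

```python
# Schematic source normalization (illustration only, not the exact
# CWTS formula): scale raw impact per paper by the field's citation
# potential relative to the database-wide median.
def snip(raw_impact_per_paper, field_citation_potential,
         median_citation_potential):
    relative_potential = field_citation_potential / median_citation_potential
    return raw_impact_per_paper / relative_potential

# Two journals with identical raw impact (2.0 citations per paper);
# the one in the sparsely citing field gets the higher normalized score.
math_j = snip(2.0, field_citation_potential=1.5, median_citation_potential=3.0)
bio_j = snip(2.0, field_citation_potential=6.0, median_citation_potential=3.0)
print(math_j, bio_j)  # 4.0 1.0
```

Under this toy normalization, equal raw impact in a field that cites half as densely as the median counts for four times as much as the same raw impact in a field that cites twice as densely.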
>
> Your feedback on CWTS Journal Indicators is greatly appreciated.
>
> Best regards,
> Nees Jan van Eck
>
> ========================================================
> Nees Jan van Eck PhD
> Researcher
> Head of ICT
>
> Centre for Science and Technology Studies
> Leiden University
> P.O. Box 905
> 2300 AX Leiden
> The Netherlands
>
> Willem Einthoven Building, Room B5-35
> Tel: +31 (0)71 527 6445
> Fax: +31 (0)71 527 3911
> E-mail: ecknjpvan at cwts.leidenuniv.nl
> Homepage: www.neesjanvaneck.nl
> VOSviewer: www.vosviewer.com
> ========================================================
>
>
>
>
> Yves Gingras
>
> Professeur
> Département d'histoire
> Centre interuniversitaire de recherche
> sur la science et la technologie (CIRST)
> Chaire de recherche du Canada en histoire
> et sociologie des sciences
> Observatoire des sciences et des technologies (OST)
> UQAM
> C.P. 8888, Succ. Centre-Ville
> Montréal, Québec
> Canada, H3C 3P8
>
> Tel: (514)-987-3000-7053
> Fax: (514)-987-7726
>
> http://www.chss.uqam.ca
> http://www.cirst.uqam.ca
> http://www.ost.uqam.ca