CWTS Journal Indicators

Nees Jan van Eck info at VOSVIEWER.COM
Thu Sep 26 05:03:50 EDT 2013


Dear Eric,

 

Thank you for your feedback on CWTS Journal Indicators.

 

We haven’t documented the bootstrapping technique applied to the SNIP
indicator in a paper, because it is a standard statistical technique and
there is nothing new in our application of it. As you probably know, we also
use bootstrapping in the construction of stability intervals in the CWTS
Leiden Ranking. In our paper on this ranking
(http://dx.doi.org/10.1002/asi.22708), we briefly explain the technique
(p. 2429) and offer some literature references.
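
For readers who have never computed one, here is a minimal sketch of how such
a bootstrap stability interval can be obtained. The function name, the number
of resamples, and the example scores are illustrative assumptions, not the
exact CWTS implementation, which works on the full set of a journal's
publications and their source-normalized citation scores.

import random
import statistics

def bootstrap_stability_interval(scores, n_samples=1000, level=0.95, seed=0):
    # `scores` holds the per-publication (source-normalized) citation scores
    # of one journal; the indicator is taken here to be their mean.
    # Resample the publications with replacement, recompute the mean each
    # time, and keep the central `level` share of the resampled means.
    rng = random.Random(seed)
    n = len(scores)
    means = sorted(statistics.mean(rng.choices(scores, k=n))
                   for _ in range(n_samples))
    lower = means[int((1 - level) / 2 * n_samples)]
    upper = means[int((1 + level) / 2 * n_samples) - 1]
    return lower, upper

# A journal whose score depends heavily on one very highly cited publication
# gets a wide interval (hypothetical numbers).
print(bootstrap_stability_interval([0.5, 0.8, 0.9, 1.1, 25.0]))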

 

We appreciate your idea of adding a column with the field(s) to which a
journal belongs. We will add this in the next few weeks.

 

Your remark on the Journal of Engineering Education is a valuable one. It
allows me to explain some of the difficulties of calculating journal impact
indicators based on a large multidisciplinary database (Scopus) that
includes not only major international scientific journals but also a large
number of other types of sources, such as trade journals, conference
proceedings, and more locally oriented scientific journals. Part of the SNIP
philosophy (but the same applies to the CWTS Leiden Ranking as well) is that
in the calculation of field-normalized impact indicators one’s starting
point should not simply be the entire bibliographic database one has
available (Scopus or Web of Science) but rather a selection of sources
within one’s database that, based on certain criteria, can be considered
reasonably comparable to each other. For instance, an international
scientific journal and a trade journal targeted at an industrial audience
can hardly be considered comparable. What is needed is a consistent set of
sources selected on the basis of clear criteria (rather than the not-so-clear
criteria used by database producers). In the case of SNIP, we have chosen to
work with all sources targeted at a scientific audience (i.e., no trade
journals) that give at least a certain minimum number of references to other
sources. Sources that do not satisfy these criteria are excluded from the
SNIP calculation: citations originating from them are not counted (although
a SNIP value can still be calculated for these sources).
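
To make this selection step concrete, the sketch below filters sources in the
way described above. The dictionary keys are hypothetical placeholders rather
than actual Scopus fields, and the default threshold simply anticipates the
20% criterion mentioned in the next paragraph.

def select_citing_sources(sources, min_active_ref_share=0.20):
    # `sources` is an iterable of dicts describing each source, with the
    # (assumed) keys 'is_trade_journal' and 'active_ref_share', the latter
    # being the share of its publications with at least one 'active' reference.
    # Only the returned sources contribute citations to the calculation;
    # excluded sources can still be assigned a SNIP value of their own.
    return [s for s in sources
            if not s["is_trade_journal"]
            and s["active_ref_share"] >= min_active_ref_share]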

 

Why does the Journal of Engineering Education display such a large SNIP
increase between 2007 and 2008? It turns out that the Journal of Engineering
Education receives a considerable share of its citations from the
proceedings of the ASEE Annual Conference and Exposition. Publications in
these proceedings tend to have only a small number of references (within the
three-year citation window used by SNIP), but the number of references has
increased over time. For this reason, before 2008, the proceedings of the
ASEE Annual Conference and Exposition were excluded from the SNIP
calculation, but starting from 2008 they met our inclusion criteria (at
least 20% of the publications have at least one ‘active’ reference) and were
included in the calculation. This resulted in a large increase in the SNIP
value of the Journal of Engineering Education. With the inclusion of the
ASEE proceedings in 2008, the journal not only received more citations;
because these additional citations originate from a source with relatively
few references, they also carry a high weight in the source normalization
approach taken by SNIP.
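
As a rough illustration of why this matters (the general idea of source
normalization, not the exact SNIP formula), the sketch below weights each
citation inversely to the average number of active references in the citing
source, so citations from a reference-poor source such as a proceedings
volume carry a relatively high weight. All numbers are hypothetical.

def source_normalized_citation_score(citing_sources, avg_active_refs):
    # `citing_sources` lists the citing source of each citation received;
    # `avg_active_refs` maps a citing source to its average number of
    # 'active' references. Each citation is weighted by the reciprocal of
    # that average.
    return sum(1.0 / avg_active_refs[src] for src in citing_sources)

# Ten citations from a proceedings volume averaging 2 active references per
# paper count far more than ten citations from a journal averaging 30.
avg_refs = {"ASEE proceedings": 2.0, "typical journal": 30.0}
print(source_normalized_citation_score(["ASEE proceedings"] * 10, avg_refs))  # 5.0
print(source_normalized_citation_score(["typical journal"] * 10, avg_refs))   # ~0.33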

 

I hope the above explanation clarifies the case of the Journal of
Engineering Education. Of course, there is room for further discussion on
the appropriateness of the criteria used for selecting the sources included
in the SNIP calculation, and we welcome any suggestions for alternative
criteria.

 

Best regards,

Nees Jan

 

 

From: Éric Archambault [mailto:eric.archambault at science-metrix.com] 
Sent: Wednesday, September 25, 2013 6:19 PM
To: Eck, N.J.P. van; SIGMETRICS at LISTSERV.UTK.EDU
Subject: RE: [SIGMETRICS] CWTS Journal Indicators

 

Dear Nees Jan

 

Thank you for this addition to the growing list of journal indicators.
Having a publicly accessible list of scores like this is really important,
and it will play an important role in the debate on journal impact. Having
rigorous researchers such as those at CWTS carry forward this line of work,
initiated by Michel Zitt and Henry Small and further developed by Moed, is
certainly useful.

 

However, I feel this is still at the stage of a research project, and we
should characterize our indicators carefully before telling the wider
community that they are ready for prime time. We can’t afford any more flaky
journal impact indicators. This is now the fourth proposal for such an
indicator, after the Journal Impact Factor, the SCImago indicator, and the
Eigenfactor. In this context, it is essential that both practitioners and
users can decide which one has the greatest scientific merit, and this
requires that the methods and all ingredients be known to them. Your paper
is useful for understanding the recipe, but some ingredients are missing
from the public disclosure, and these need to be made public to help the
community characterize your tool.

 

In particular, I think a few more details on the methods would be useful
here. Firstly, more details about the bootstrapping method that you use to
compute the stability intervals would be welcome. Have you written a paper
on this technique? Secondly, an additional column with the field of each
journal would make the list more transparent and useful to users.

 

Do you have an explanation for this behavior:

The Journal of Engineering Education is one of the source journals with the
highest SNIP in 2008. Is this an artifact, or did the journal really become
that much better in 2008 compared with 2007? I find the jump surprising, as
it falls outside the bounds that you calculated. Of course, falling outside
these bounds by chance is not impossible; it is just that the jump is rather
large.

 

Kind regards

 

Eric

 


Source title: Journal of Engineering Education (Source type: Journal, Print ISSN 1069-4730)

Year   P     SNIP        SNIP (lower bound)   SNIP (upper bound)   % self cit
2002   207    7.901979    6.625                 9.355               16%
2003   215    6.587213    5.393                 7.838               11%
2004   194    9.710727    7.719                11.838                9%
2005   133    2.498504    1.685                 3.463               48%
2006   111    4.458215    3.042                 6.121               16%
2007    98    6.650165    4.437                 9.274               23%
2008    93   20.62702    14.286                28.396               10%
2009    75   15.92148    12.191                20.305               16%
2010    77   16.12523    12.181                20.454               14%
2011    76   16.1012     11.783                21.15                14%
2012    90   12.49939     9.933                15.098                7%

 

 

From: ASIS&T Special Interest Group on Metrics
[mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Nees Jan van Eck
Sent: September-25-13 10:09 AM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: [SIGMETRICS] CWTS Journal Indicators

 


The 2012 SNIP values have been released on CWTS Journal Indicators
(www.journalindicators.com). SNIP (source normalized impact per paper) is a
freely available journal impact indicator that uses a source normalization
mechanism to correct for differences in citation practices between fields of
science. Compared with the journal impact factor, SNIP allows for more
accurate comparisons between journals active in different scientific fields.
SNIP is calculated by CWTS based on Elsevier’s Scopus database. With the
release of the 2012 SNIP values, stability intervals have been added to CWTS
Journal Indicators. These intervals indicate the reliability of the SNIP
value of a journal. For instance, if a journal’s SNIP value is largely due
to a single very highly cited publication, this is indicated by a wide
stability interval. SNIP is the only freely available journal impact
indicator that is presented with stability intervals.

 

Your feedback on CWTS Journal Indicators is greatly appreciated.

 

Best regards,

Nees Jan van Eck

 

========================================================

Nees Jan van Eck PhD

Researcher

Head of ICT

 

Centre for Science and Technology Studies

Leiden University

P.O. Box 905

2300 AX Leiden

The Netherlands

 

Willem Einthoven Building, Room B5-35

Tel:       +31 (0)71 527 6445

Fax:       +31 (0)71 527 3911

E-mail:     ecknjpvan at cwts.leidenuniv.nl

Homepage:   www.neesjanvaneck.nl

VOSviewer:  www.vosviewer.com

========================================================

 
