"Indicators and Judgment" of the Council of Canadian Academies
Loet Leydesdorff
loet at LEYDESDORFF.NET
Mon Sep 10 03:59:27 EDT 2012
Dear colleagues,
Last week at the Science & Technology Indicators Conference in Montreal, a
report entitled Informing Research Choices: Indicators and Judgment was
presented. The report of the Expert Panel on Science Performance and
Research Funding of the Council of Canadian Academies aims at proposing
standards for indicators. Let me focus in this commentary on the two quality
indicators proposed by this panel for publications and citations,
respectively.
1. Citations
On p. 68, the report states: "First, any citation-based indicators used to
assess and compare research fields should be field normalized." The report,
however, abstracts from how to field-normalize (e.g., using the fields and
subfields of the NSF or the WoS Subject Categories, or perhaps fractional
counting of the citations?) and ignores the problem of the effect of
field normalization on interdisciplinary research (Rafols et al., 2012).
Instead, the authors write: "The most prominent example is the average
relative citation indicator (sometimes referred to as relative citation
impact), which compares the average level of citations in a particular field
in a particular country to the world average level of citations in that
field."
The relative citation impact thus defined (Schubert & Braun, 1986; Moed et
al., 1995) is also known in the literature as the (old) crown indicator or
Rate of Averages (Gingras & Larivière, 2011). Opthof & Leydesdorff (2010)
argued that the division of two averages provides a quotient instead of a
statistic, and violates the order of operations: one should first normalize
each paper against its reference set and only thereafter average.
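The difference between the two orders of operation can be made concrete with a small sketch. The citation counts below are hypothetical; "expected" stands for the mean citation rate of each paper's reference set:

```python
# Hypothetical document set: (citations of the paper, mean citations
# of its reference set). The two normalizations disagree because the
# order of operations differs.
papers = [
    (10, 2.0),
    (0, 8.0),
    (4, 4.0),
]

# Rate of Averages (old crown indicator): divide the average observed
# citation rate by the average expected rate -- a quotient of two sums.
rate_of_averages = sum(c for c, _ in papers) / sum(e for _, e in papers)

# Mean of ratios: first normalize each paper against its own reference
# set, and only thereafter average.
mean_of_ratios = sum(c / e for c, e in papers) / len(papers)

print(rate_of_averages)  # 1.0
print(mean_of_ratios)    # 2.0
```

The same three papers score at the world average under one indicator and at twice the world average under the other, which is why the order of operations is not a technicality.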
Instead of averaging, one is advised to use non-parametric statistics,
because the distributions are extremely skewed. Using percentile rank
classes, for example, is standard practice in the Science and Engineering
Indicators of the NSF. One can then use, for example, chi-square statistics,
the Kruskal-Wallis test, or the Integrated Impact Indicator (Leydesdorff &
Bornmann, 2011).
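As a minimal sketch of the non-parametric alternative, one can assign each paper a percentile rank within its reference set instead of averaging over the skewed counts. The data and the tie-handling convention below are illustrative (conventions differ across sources):

```python
def percentile_ranks(citations):
    """Percentile rank of each paper within the set (0-100 scale).

    Uses the "fraction of papers cited less, plus half of the ties"
    convention; other counting conventions exist in the literature.
    """
    n = len(citations)
    ranks = []
    for c in citations:
        below = sum(1 for x in citations if x < c)
        ties = sum(1 for x in citations if x == c) - 1  # exclude the paper itself
        ranks.append(100.0 * (below + 0.5 * ties) / n)
    return ranks

# A skewed distribution, as citation data typically are:
cites = [0, 0, 1, 3, 8, 50]
print(percentile_ranks(cites))
```

The ranks are robust against the single highly cited outlier, whereas the arithmetic mean of these counts would be dominated by it.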
2. Publications
On p. 66, the authors of the report emphasize that publications should not
be counted straightforwardly, but weighted, for example, in accordance with
the impact factor of the journal in which they are published. This is indeed
standard practice, but one should be cautious about it.
First, the impact factor is a journal indicator and not a document
indicator. Thus, there is the risk of a category mistake (an ecological
fallacy; Robinson, 1950). Can the impact factor of a journal be considered
a quality indicator of the articles published in that journal?
The impact factor is based on averaging over an extremely skewed
distribution of citations. As in the case of comparing documents, one should
use non-parametric statistics for comparing these distributions. The
Integrated Impact Indicator allows for decomposition in terms of journals or
institutional (personal) document sets; thus, one does not have to make an
inference from the journal level to the document level (Leydesdorff, 2012).
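The decomposability follows from the indicator being a sum rather than an average. The sketch below is only meant to illustrate that property; the class boundaries and weights are hypothetical and loosely modeled on the six NSF percentile rank classes, not the published definition:

```python
from collections import Counter

# Illustrative percentile rank classes: (lower percentile bound, weight).
# These six classes loosely echo the NSF scheme (top 1%, 5%, 10%, 25%,
# 50%, bottom half); the weights here are hypothetical.
CLASSES = [(99, 6), (95, 5), (90, 4), (75, 3), (50, 2), (0, 1)]

def rank_class(percentile):
    """Weight of the highest class whose lower bound the paper reaches."""
    for bound, weight in CLASSES:
        if percentile >= bound:
            return weight
    return 1

def i3(percentiles):
    """Sum of (class weight x number of papers in the class)."""
    counts = Counter(rank_class(p) for p in percentiles)
    return sum(weight * n for weight, n in counts.items())

set_a = [98, 60, 10]   # e.g., one journal's papers (percentile ranks)
set_b = [99.5, 40]     # another journal's papers

# Because the indicator is a sum, it decomposes additively over subsets:
assert i3(set_a) + i3(set_b) == i3(set_a + set_b)
```

An average-based indicator would not decompose this way, which is precisely why one cannot safely infer from a journal-level average to its constituent documents.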
Furthermore, it seems to me that weighting publications using a citation
measure risks confounding publication and citation measures, whereas the
report pleads for considering the two as different dimensions of research
quality (using multivariate analysis). Perhaps the committee did not realize
that weighting is sometimes problematic; it is often wise to begin with raw
(unweighted) publication and citation counts and then to make explicit which
normalization is used and why. In my opinion, some of the weightings advised
by the panel were not prudent choices. Weighting may easily introduce a
(set of) normative assumption(s) (Leydesdorff et al., 2011: 1971f.).
Best wishes,
Loet
References
Council of Canadian Academies (2012). Informing Research Choices: Indicators
and Judgment: The Expert Panel on Science Performance and Research Funding.
Ottawa.
Gingras, Y., & Larivière, V. (2011). There are neither king nor crown in
scientometrics: Comments on a supposed "alternative" method of
normalization. Journal of Informetrics, 5(1), 226-227.
Leydesdorff, L. (in press). An evaluation of impacts in "Nanoscience &
nanotechnology": Steps towards standards for citation analysis.
Scientometrics, doi: 10.1007/s11192-012-0750-5; online version:
<http://www.springerlink.com/content/6082p65177r04425/?MUD=MP>.
Leydesdorff, L., & Bornmann, L. (2011). Integrated Impact Indicators (I3)
compared with Impact Factors (IFs): An alternative design with policy
implications. Journal of the American Society for Information Science and
Technology, 62(11), 2133-2146.
Leydesdorff, L., Bornmann, L., Mutz, R., & Opthof, T. (2011). Turning the
tables in citation analysis one more time: Principles for comparing sets of
documents. Journal of the American Society for Information Science and
Technology, 62(7), 1370-1381.
Moed, H. F., De Bruin, R. E., & Van Leeuwen, T. N. (1995). New bibliometric
tools for the assessment of national research performance: Database
description, overview of indicators and first applications. Scientometrics,
33(3), 381-422.
Opthof, T., & Leydesdorff, L. (2010). Caveats for the journal and field
normalizations in the CWTS (Leiden) evaluations of research performance.
Journal of Informetrics, 4(3), 423-430.
Rafols, I., Leydesdorff, L., O'Hare, A., Nightingale, P., & Stirling, A.
(2012). How journal rankings can suppress interdisciplinary research: A
comparison between innovation studies and business & management. Research
Policy, 41(7), 1262-1282.
Schubert, A., & Braun, T. (1986). Relative indicators and relational charts
for comparative assessment of publication output and citation impact.
Scientometrics, 9(5), 281-291.
_____
Loet Leydesdorff
Professor, University of Amsterdam
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam.
Tel. +31-20-525 6598; fax: +31-842239111
loet at leydesdorff.net; http://www.leydesdorff.net/
Visiting Professor, ISTIC, Beijing (http://www.istic.ac.cn/Eng/brief_en.html);
Honorary Fellow, SPRU, University of Sussex (http://www.sussex.ac.uk/spru/);
http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en