Papers

Jesper Wiborg Schneider jws at CFA.AU.DK
Wed Apr 16 14:54:40 EDT 2014


Dear Lutz,

Interesting paper, the latter one, and interesting to see how the 'debate' in our field is reflected in the references you and your coauthor give:

"In bibliometrics, it has been also recommended to go beyond statistical significance testing (Bornmann & Leydesdorff, 2013; Schneider, 2012)."

I guess you can call this quote an understatement, at least from my perspective. I do not think anyone recommended going 'beyond statistical significance testing' in scientometrics/bibliometrics before I criticized the current practice in 'Caveats for using statistical significance tests in research assessments', first published on arXiv in 2011 (http://arxiv.org/abs/1112.2516) and later, in 2013, in the Journal of Informetrics.
In 2012, at the STI conference, I extended the critique in the paper you mention in the quote, discussing one of your papers on university rankings and exemplifying the use of effect sizes in relation to such rankings, specifically the use of Cohen's h for the proportion of top 10 percent highly cited papers, which is basically the same example you bring forward in this paper.
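For concreteness, here is a minimal sketch of Cohen's h for such proportions; the numbers are made up for illustration and are not taken from either paper:

    import math

    def cohens_h(p1: float, p2: float) -> float:
        """Cohen's h: effect size for the difference between two
        proportions, on the arcsine-square-root scale."""
        return 2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2))

    # Hypothetical example: institution A has 15% of its papers in the
    # top-10% class; 10% is the share expected for an average institution.
    print(f"Cohen's h = {cohens_h(0.15, 0.10):.3f}")  # ~0.152

By Cohen's conventional benchmarks, h around 0.2 is a small effect, 0.5 medium, and 0.8 large.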
Only then - as far as I can follow the ever faster publishing chronology - did you and other colleagues react to some of my criticisms, including an endorsement of the use of effect sizes and confidence intervals that had not been visible until then.
Now, I do not hunger for more references or the like, but when we have a debate or thread in the community, I would appreciate that it be outlined thoroughly and honestly in the review section - that is the purpose of a review. This case is not the first one, and it gives one the impression that our literature is not read ... or worse ...? I am not sure whether this paper is under review, but I guess my writing this mail is the risk you run when announcing it on this list.

Kind regards Jesper

 


________________________________________
From: ASIS&T Special Interest Group on Metrics [SIGMETRICS at LISTSERV.UTK.EDU] on behalf of Bornmann, Lutz [lutz.bornmann at GV.MPG.DE]
Sent: 16 April 2014 15:53
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: [SIGMETRICS] Papers

BRICS countries and scientific excellence: A bibliometric analysis of most frequently-cited papers
Lutz Bornmann<http://arxiv.org/find/cs/1/au:+Bornmann_L/0/1/0/all/0/1>, Caroline Wagner<http://arxiv.org/find/cs/1/au:+Wagner_C/0/1/0/all/0/1>, Loet Leydesdorff<http://arxiv.org/find/cs/1/au:+Leydesdorff_L/0/1/0/all/0/1>

(Submitted on 14 Apr 2014)

The BRICS countries (Brazil, Russia, India, China, and South Africa) are noted for their increasing participation in science and technology. The governments of these countries have been boosting their investments in research and development to become part of the group of nations doing research at a world-class level. This study investigates the development of the BRICS countries in the domain of top-cited papers (the top 10% and top 1% most frequently cited papers) between 1990 and 2010. To assess the extent to which these countries have become important players at the top level, we compare the BRICS countries with the top-performing countries worldwide. As the analyses of the (annual) growth rates show, with the exception of Russia, the BRICS countries have increased their output of most frequently cited papers at a higher rate than the top-cited countries worldwide. In a further step of the analysis, we generate co-authorship networks among authors of highly cited papers at four points in time (1995, 2000, 2005, and 2010) to view changes in BRICS participation. Here, the results show that all BRICS countries succeeded in becoming part of this network, with the Chinese collaboration activities focusing on the USA.
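As a rough illustration of what such an annual growth rate comparison involves, a minimal sketch with hypothetical paper counts, not the study's data:

    def annual_growth_rate(start: float, end: float, years: int) -> float:
        """Compound annual growth rate between two publication counts."""
        return (end / start) ** (1 / years) - 1

    # Hypothetical counts of top-10% papers in 1990 and 2010 (20 years).
    print(f"{annual_growth_rate(120, 2400, 20):.1%}")  # ~16.2% per year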

Available at: http://arxiv.org/abs/1404.3721


The substantive and practical significance of citation impact differences between institutions: Guidelines for the analysis of percentiles using effect sizes and confidence intervals
Richard Williams<http://arxiv.org/find/cs/1/au:+Williams_R/0/1/0/all/0/1>, Lutz Bornmann<http://arxiv.org/find/cs/1/au:+Bornmann_L/0/1/0/all/0/1>

(Submitted on 12 Apr 2014)

In our chapter we address the statistical analysis of percentiles: how should the citation impact of institutions be compared? In educational and psychological testing, percentiles are already widely used as a standard to evaluate an individual's test scores (intelligence tests, for example) by comparing them with the percentiles of a calibrated sample. Percentiles, or percentile rank classes, are also a very suitable method in bibliometrics for normalizing the citations of publications in terms of subject category and publication year, and, unlike mean-based indicators (relative citation rates), percentiles are scarcely affected by the skewed distributions of citations. The percentile of a given publication provides information about the citation impact this publication has achieved in comparison to other similar publications in the same subject category and publication year. Analyses of percentiles, however, have not always been presented in the most effective and meaningful way. New APA guidelines (American Psychological Association, 2010) suggest a lesser emphasis on significance tests and a greater emphasis on the substantive and practical significance of findings. Drawing on work by Cumming (2012), we show how examinations of effect sizes (e.g. Cohen's d statistic) and confidence intervals can lead to a clear understanding of citation impact differences.
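To make the suggested approach concrete, here is a minimal sketch of Cohen's d and a confidence interval for the difference in mean percentile scores between two institutions; the data are hypothetical, and the normal-approximation CI is a simplification, not necessarily the exact procedure used in the chapter:

    import math
    import statistics

    def cohens_d(x, y):
        """Cohen's d with a pooled standard deviation."""
        nx, ny = len(x), len(y)
        pooled_var = ((nx - 1) * statistics.variance(x) +
                      (ny - 1) * statistics.variance(y)) / (nx + ny - 2)
        return (statistics.mean(x) - statistics.mean(y)) / math.sqrt(pooled_var)

    def diff_ci(x, y, z=1.96):
        """Normal-approximation 95% CI for the difference of means."""
        se = math.sqrt(statistics.variance(x) / len(x) +
                       statistics.variance(y) / len(y))
        diff = statistics.mean(x) - statistics.mean(y)
        return diff - z * se, diff + z * se

    # Hypothetical percentile scores (0-100) of papers from two institutions.
    inst_a = [72, 65, 88, 54, 91, 60, 77, 83, 69, 75]
    inst_b = [58, 61, 49, 70, 66, 55, 63, 52, 59, 64]

    print(f"Cohen's d: {cohens_d(inst_a, inst_b):.2f}")
    low, high = diff_ci(inst_a, inst_b)
    print(f"95% CI for the mean difference: [{low:.1f}, {high:.1f}]")

A CI that excludes zero indicates a difference unlikely to be a sampling artifact, while d expresses how large that difference is in standard-deviation units.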

Available at: http://arxiv.org/abs/1404.3720

---------------------------------------

Dr. Dr. habil. Lutz Bornmann
Division for Science and Innovation Studies
Administrative Headquarters of the Max Planck Society
Hofgartenstr. 8
80539 Munich
Tel.: +49 89 2108 1265
Mobile: +49 170 9183667
Email: bornmann at gv.mpg.de
WWW: http://www.lutz-bornmann.de/
ResearcherID: http://www.researcherid.com/rid/A-3926-2008
ResearchGate: http://www.researchgate.net/profile/Lutz_Bornmann


