Is bibliometrics in danger?

Isidro F. Aguillo isidro.aguillo at CCHS.CSIC.ES
Tue Oct 7 10:16:40 EDT 2014

Dear Wouter,

You are right, I remember the problem. It was solved in the next
edition, as you would expect once the issue had been identified, but the
situation I described had already been reported for several years.
However, my main point is the proliferation of comments like:

"The perils of ranking things based on citations"

which appeared today on Twitter. Bad bibliometrics from the main citation
database developer is affecting our discipline very negatively.


On 06/10/2014 23:24, Gerritsma, Wouter wrote:
> Isidoro
> Years ago a similar issue arose with the Leiden Ranking as well, with a German university coming out in first place on the basis of a single, young, highly cited paper.
> This might be one of the reasons CWTS put more emphasis on the top 10% most cited papers in the newer versions of the Leiden Ranking.
> Wouter
> -----Original Message-----
> From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Isidro F. Aguillo
> Sent: maandag 6 oktober 2014 10:36
> Subject: [SIGMETRICS] Is bibliometrics in danger?
> During our last conferences (Vienna, Berlin, Leiden) we discussed the problems related to the uncontrolled use of bibliometric techniques by people without enough knowledge of the quality standards needed for research assessment. In fact, with the widespread use of "bad"
> bibliometrics, the discipline is starting to be viewed as irrelevant or as seriously flawed and biased. It is important to read carefully the now famous DORA declaration, which not only discourages the use of the impact factor but also attacks citation analysis as a whole as a recommended evaluation tool.
> I already mentioned during the Vienna session that, from a practical point of view, the success of certain university rankings that use flawed citation data is also a potential danger to the prestige of the discipline.
> A few days ago the British magazine Times Higher Education (THE) published the latest edition of its very popular ranking of universities.
> Besides a reputation survey-based indicator, they also collect citation data (30% of the overall score) that in recent years has produced very striking results. Among others, you can check in the current edition that Federico Santa Maria Technical University, Chile has a higher score than Harvard or Princeton, that Tokyo Metropolitan University scores higher than Caltech or Stanford, and that Bogazici University, Turkey performs better than Oxford or Cambridge.
> My point here is that the data do not come from a THE journalist but, surprise, directly from Thomson Reuters, as stated in their methodology
> webpage: "this year, our data supplier Thomson Reuters examined more than 50 million citations to 6 million journal articles, published over five years. The data are drawn from the 12,000 academic journals indexed by Thomson Reuters' Web of Science database and include all indexed journals published between 2008 and 2012. Citations to these papers made in the six years from 2008 to 2013 are also collected".
> You can find a very good analysis with tables on the blog of Richard Holmes:
> I know that Thomson Reuters is an independent private company, but I wonder whether our community, as represented in this forum, could call for strong action regarding this unfortunate situation.


Isidro F. Aguillo, HonDr.
The Cybermetrics Lab, IPP-CSIC
Grupo Scimago
Madrid. SPAIN

isidro.aguillo at
ORCID 0000-0001-8927-4873
ResearcherID: A-7280-2008
Scholar Citations SaCSbeoAAAAJ
Twitter @isidroaguillo
Rankings Web

