Rankings of Universities and Research Centers

Stephen J Bensman notsjb at LSU.EDU
Wed Aug 27 12:22:46 EDT 2014


Isidoro,
In the US, institutional rankings have taken a different course.  As long as they were done in a simple way that everybody could understand--basically peer ratings, as Cattell did--these ratings were well known, taken into account, and important.  However, everything began to change as soon as bibliometrics/scientometrics were added to the mix.  The ratings became increasingly complex and difficult to understand.  The data became organized so that different facets could be measured and different conclusions drawn.  The last ratings were so complex that they landed with a thud, and nobody paid attention to them.  If you want a simple insight into the mess, go to the URLs below:

http://en.wikipedia.org/wiki/United_States_National_Research_Council_rankings

http://www.nap.edu/rdp/

The Wikipedia article gives you a short, sharp summary of its faults.  It is obvious that the people producing the ratings found the politics so awful that they punted and refused to take a clear position.  The Dean of the College of Humanities and Social Sciences once asked me to write a report on LSU's position in the rankings.  I took one look at the data, and my eyes crossed.  I simply told him that these are not your father's ratings, and I could not summarize them in a simple manner that he could understand and present to the faculty.  No policy could be based on them.  The data contained all the counterarguments.

However, if you go back to peer ratings and simple, understandable bibliometric/scientometric measures, you get the same results as Cattell did over one hundred years ago.  For my part, I find that all these so-called new "metrics" are so complex and contradictory that the only way I can judge them is to test how well they correlate with peer ratings.  I have gone back to Cattell, and I will not study anything unless there is some form of expert, subjective judgment.  That is why I concentrate on prize winners.  But then I am an old man and a traditionalist by nature.  I cling to Keynes' criticism of "algebraic logic" that judges by mathematical models not representative of reality.

Respectfully,

Stephen J. Bensman, Ph.D.
LSU Libraries
Louisiana State University
Baton Rouge, LA  USA







-----Original Message-----
From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Isidro F. Aguillo
Sent: Wednesday, August 27, 2014 9:59 AM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] Rankings of Universities and Research Centers


Dear Stephen,

Thank you for your input; a historical perspective is always welcome.  And sorry for the parade; I am just coming back after the summer holidays.

For the readers of this list, it may be interesting to point out a few facts I have realized after 11 years of ranking experience (together with 20 years of belonging to the metrics community: biblio-, webo-, and altmetrics).  My major surprise was how unknown bibliometrics was in many universities around the world (yes, even in Europe!).  Everybody talked about the impact factor, but citation analysis was nowhere to be seen.

Then, in 2003, came the sudden publication of the Shanghai Ranking and its shocking success, even after bibliometric 'leaders' immediately rejected it for its strong theoretical and technical limitations.  More rankings, most of them giving large weights to bibliometric indicators, were published in the following years by people from outside the metrics community.  The composite indicators they introduced were frequently built by combining highly correlated bibliometric variables.  In several cases, these models achieved enormous popularity even though they were never presented at international conferences or formally published in refereed journals.  Even today, the CWTS Leiden Ranking, a far more rigorous and orthodox approach, is not considered among the "top 3 most prestigious" rankings.

A few months ago I asked Paul Wouters whether the success of the rankings does not, in some way, mean the failure of the bibliometricians (not of bibliometrics).

Open floor ...

See all of you attending the STI2014 conference in Leiden next week,

On 27/08/2014 15:45, Stephen J Bensman wrote:
> Isidoro,
>
> I hate to rain on your little parade, but--what the hell--I will.  The first ranking of American universities was done  by James McKeen Cattell in 1906 in the following article:
>
> Cattell, James McKeen. (1906)  A statistical study of American men of science, the selection of a group of one thousand scientific men. Science, 24, 658-665, 699-707, 732-742.
>
> Your rankings are basically the same as his.  Plus ça change, plus c'est la même chose.  Needless to say, Cattell was a eugenicist, and this is where all this academic ranking hysteria originates.  To make my point, I have attached a pdf of that part of his article in which he presents his university and institutional rankings.  See in particular Tables III-IV on pp. 739-740.  They will look somewhat familiar to you.  The available evidence is that these rankings are not random but the result of some process of "preferential attachment" governing the institutions to which leading scientists adhere for study, teaching, and research, leading to high stability over time.  To beat upon another of my little drums, it seems strangely similar to how the WWW and Google PageRank operate.  That does not leave much hope for those trying to blast their way to the top.  It seems set in concrete.
>
> Respectfully,
>
> Stephen J. Bensman, Ph.D.
> LSU Libraries
> Louisiana State University
> Baton Rouge, LA  USA
>
> PS I was in the Cal Band (Berkeley), and we used to call Stanford the "Leland Stanford Junior High," and that was before the days of mooning the crowd.
>
>
> -----Original Message-----
> From: ASIS&T Special Interest Group on Metrics 
> [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Isidro F. Aguillo
> Sent: Wednesday, August 27, 2014 6:16 AM
> To: SIGMETRICS at LISTSERV.UTK.EDU
> Subject: [SIGMETRICS] Rankings of Universities and Research Centers
>
> Administrative info for SIGMETRICS (for example unsubscribe):
> http://web.utk.edu/~gwhitney/sigmetrics.html
>
> The new editions of the Rankings Web of Universities (now in its 11th
> year) and Research Centers have been published with data collected during July 2014. The rankings now consist of close to 22,000 Higher Education Institutions and 8,000 Research Centers with their own independent web presence.
>
> http://www.webometrics.info/
> http://research.webometrics.info/
>
> The ranking is built by combining the following variables:
>
> Presence: the web presence of the institution, including general information, structure and organization, governance and transparency documents, learning support items, and technology transfer or community engagement pages, among others.
> Visibility: a virtual referendum on the global impact of those web contents.
> Openness: the commitment of the university to open access initiatives through its institutional repository, its portal of academic journals, and the availability of full-text papers on the personal pages of their authors.
> Excellence: the number of highly cited papers (top 10% most cited) in 21 disciplines authored by its faculty members.
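For readers unfamiliar with how such composite indicators are assembled, the general idea of combining per-indicator ranks with weights can be sketched as follows. This is a minimal illustration only; the institutions, scores, and weights below are invented for the example and are not the actual Webometrics methodology or weights.

```python
# Illustrative sketch of a weighted composite ranking.
# Scores and weights are hypothetical, NOT Webometrics values.

def composite_rank(scores_by_indicator, weights):
    """Combine per-indicator scores into one composite ordering.

    scores_by_indicator: {indicator: {institution: raw score}}
    weights: {indicator: weight}, ideally summing to 1.0.
    Higher raw score = better; each indicator is first turned
    into a rank (1 = best), then ranks are combined as a
    weighted sum (lower composite = better).
    """
    institutions = next(iter(scores_by_indicator.values())).keys()
    # Convert raw scores to ranks per indicator (1 = best).
    ranks = {}
    for ind, scores in scores_by_indicator.items():
        ordered = sorted(scores, key=scores.get, reverse=True)
        ranks[ind] = {inst: pos + 1 for pos, inst in enumerate(ordered)}
    # Weighted sum of ranks across all indicators.
    composite = {
        inst: sum(weights[ind] * ranks[ind][inst] for ind in weights)
        for inst in institutions
    }
    return sorted(composite, key=composite.get)

scores = {
    "presence":   {"A": 90, "B": 70, "C": 50},
    "visibility": {"A": 60, "B": 80, "C": 40},
    "openness":   {"A": 30, "B": 50, "C": 70},
    "excellence": {"A": 85, "B": 65, "C": 45},
}
weights = {"presence": 0.2, "visibility": 0.5,
           "openness": 0.1, "excellence": 0.2}
print(composite_rank(scores, weights))  # → ['B', 'A', 'C']
```

Note how the weighting drives the outcome: institution B wins only because "visibility" carries half the weight in this toy example, which is exactly why the choice of weights in real composite rankings is so contested.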
>
> In the current edition the top ranked universities are:
>
> 1.    Harvard University
> 2.    MIT
> 3.    Stanford University
> 4.    Cornell University
> 5.    University of Michigan
> 6.    University of California Berkeley
> 7.    Columbia University
> 8.    University of Washington
> 9.    University of Minnesota
> 10.  University of Pennsylvania
>
> Countries with Universities in the Top Hundred
>
> USA                66
> Canada              7
> UK                  4
> Germany             3
> China               3
> Japan               2
> Switzerland         2
> Netherlands         1
> Australia           1
> Italy               1
> South Korea         1
> Taiwan              1
> Belgium             1
> Hong Kong           1
> Brazil              1
> Austria             1
> Czech Republic      1
> Singapore           1
> Mexico              1
>
> The Top Ranked in Region are:
>
> USA                            Harvard
> Canada                       Toronto
> Latin America            Sao Paulo
> Caribbean                  University of the West Indies
> Europe                        Oxford
> Russia                         Lomonosov MSU
> Africa                          University of Cape Town
> Asia                            Seoul National University
> China                          Peking
> Japan                          Tokyo
> South Asia                 IIT Bombay
> Southeast Asia          National University of Singapore
> Middle East               Hebrew University of Jerusalem
> Arab World                King Saud University
> Oceania                      Melbourne
> BRICS                          Sao Paulo
>


-- 

************************************
Isidro F. Aguillo, HonDr.
The Cybermetrics Lab, IPP-CSIC
Grupo Scimago
Madrid. SPAIN

isidro.aguillo at csic.es
ORCID 0000-0001-8927-4873
ResearcherID: A-7280-2008
Scholar Citations SaCSbeoAAAAJ
Twitter @isidroaguillo
Rankings Web webometrics.info
************************************




