PS : [SIGMETRICS] Wikiscience

Pikas, Christina K. Christina.Pikas at JHUAPL.EDU
Wed Nov 2 13:01:56 EDT 2011

Ah, but having read Michael Nielsen's book prior to publication I can say that one of his prime examples is "wikimath." Specifically, he points to the Polymath project. Articles published as a result of that work have used a sign-up sheet for contributors to add their names. Contributors include Fields Medal winners and other notable mathematicians. If I recall correctly, they did want to list the author as "Polymath" or something (like Bourbaki) but were required to list names by the rather traditional journals they submitted to.

Also, I think that if we want to see more of this type of collaboration using social media, and what impact it has on more traditional journal articles, we need to look further than Google Scholar, as it is a less transparent (if free) version of a general research database. Some of the altmetrics efforts are a start (see, for example, Euan Adie's new ScienceDirect app, among others), but the area deserves more attention.


Christina K Pikas
The Johns Hopkins University Applied Physics Laboratory
Christina.Pikas at
(240) 228 4812 (DC area)
(443) 778 4812 (Baltimore area)

From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at] On Behalf Of Stephen J Bensman
Sent: Wednesday, November 02, 2011 10:59 AM
Subject: [SIGMETRICS] PS : [SIGMETRICS] Wikiscience

In reference to your last sentence, the data is part of two companion papers being written here at LSU on Nobel prize winners in chemistry and economics as well as winners of the Fields Medal in mathematics.  The authorship patterns of these fields will be compared.  Due to the notorious inability of mathematicians to communicate with each other, I do not expect to find "wikimathematics," and I have my doubts about economics.  But you never know what you are going to find with Google Scholar.

Stephen J Bensman
LSU Libraries
Louisiana State University
Baton Rouge, LA 70803

From: Stephen J Bensman
Sent: Wednesday, November 02, 2011 9:19 AM
Subject: RE: [SIGMETRICS] Wikiscience

Thank you all for your kind responses to my latest missive.  I have printed them off and filed them with my papers, so that I may incorporate the comments in our article.  Taking all things together, I do think that we are on the verge of some sort of revolutionary breakthrough that is going to alter everything.  Partly this is due to a number of sessions given by Wikipedia Ambassadors that I attended.  As a result of these, I gained a lot more respect for Wikipedia and its methods.  It may be pioneering a new way of creating knowledge.  Using Harzing's Publish or Perish software to access Google Scholar data gives you new perspectives on old questions, and I do think that a lot of things are going to have to be rethought.  The Wall Street Journal article is one indication that this is taking place.  We all may be becoming mere cogs in knowledge machines.  I wrote a review of Harzing's software for Scientometrics, and I am attaching it in case you want to read it for my considerations on the questions it raises.

Stephen J Bensman
LSU Libraries
Louisiana State University
Baton Rouge, LA 70803

From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Gemma Derrick
Sent: Wednesday, November 02, 2011 5:39 AM
Subject: Re: [SIGMETRICS] Wikiscience

Hello from Spain, Stephen,

Thank you so much for this email and for sharing the results of your study with us.  Although I found it very interesting, I would like to point out something that may be of interest.

You say that the Nobel Prize winners were usually ranked far down the authorship list and that this reflects how authorship position is not indicative of an author's importance.  While this would be extremely interesting if it were true, I should point out that in many scientific disciplines, chemistry included, there is a specific cultural practice surrounding authorship order.  Authors are not always listed by 'importance', nor strictly in order of contribution, with the author who contributed the most ranked first and the one who contributed the least ranked last.  Instead, the last author is usually the most senior member of the team.  More often than not, the first author is the main contributor, but the last author may also be a major contributor; he or she is placed last because he or she usually heads the laboratory (the most senior author).  Since Nobel Prize winners, I can safely assume, are heads of large laboratories by the time their prize is announced, this finding does not surprise me.

I hope that this may help in the interpretation of your results.  Authorship practice and the differences between fields are something that interests me greatly, and I look forward to hearing more of your results.


Dr Gemma Derrick PhD (ANU) | JAE Postdoctoral Research Fellow
Institute of Public Goods and Policies | Centre for Human and Social Sciences | Spanish National Research Council
C/-Albasanz, 26-28 | Madrid | España (Spain) | 28037
T +34 91 602 23 89 | M +34 650 697 832  | F +34 91 602 29 71
E gemma.derrick at | W

From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Stephen J Bensman
Sent: Monday, 31 October 2011 6:50 PM
Subject: [SIGMETRICS] Wikiscience

I have been using the Publish or Perish software, which was created by Anne-Wil Harzing, to study the h-index publications of the winners of the Nobel Prize in chemistry.  These publications fulfilled the stipulation of Garfield's law of concentration by all being articles published in the few elite journals highest in total cites.  The median rank of these journals by total cites was 22.  What struck me most about these publications was the amount of co-authorship of these articles and the fact that the winners of the Nobel Prize most often were not the primary authors but ranked far down the authorship list.  It struck me that breakthrough chemical research is highly collaborative and that authorship position is not indicative of the author's importance.  One of these papers had 22 co-authors, and the prize winner was listed last.  It also struck me that attributing citations to one author or another in certain fields is archaic, as we are dealing with collectives or what I call "wikiscience."  For this reason, I found the Wall Street Journal article below of extreme interest.  It seems that, to evaluate a scientist's true importance, you must use something like Google Scholar, which can retrieve the scientist's works no matter what her/his authorship position.  Harzing's Publish or Perish software can be downloaded for free from Harzing's Web site.
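For readers unfamiliar with the metric being discussed: the h-index that Publish or Perish reports is defined as the largest number h such that an author has h papers with at least h citations each. A minimal sketch of the computation (the function name and the citation counts below are invented for illustration, not taken from the study):

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    h = 0
    # Rank papers from most to least cited; the h-index is the
    # last rank at which the citation count still meets the rank.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one author's papers
print(h_index([120, 54, 33, 20, 8, 8, 5, 2]))  # prints 6
```

Note that the index is insensitive to authorship position: a paper counts toward an author's h-index whether she is listed first or twenty-second, which is precisely why Google Scholar-based tools can surface contributions that journal-byline conventions obscure.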

Stephen J Bensman
LSU Libraries
Louisiana State University
Baton Rouge, LA 70803

OCTOBER 29, 2011
The New Einsteins Will Be Scientists Who Share
From cancer to cosmology, researchers could race ahead by working together, online and in the open

In January 2009, a mathematician at Cambridge University named Tim Gowers decided to use his blog to run an unusual social experiment. He picked out a difficult mathematical problem and tried to solve it completely in the open, using his blog to post ideas and partial progress. He issued an open invitation for others to contribute their own ideas, hoping that many minds would be more powerful than one. He dubbed the experiment the Polymath Project.
[Illustration: Alex Nabaum]
On an experimental blog, a far-flung group of mathematicians cracked a tough problem in weeks.
Several hours after Mr. Gowers opened up his blog for discussion, a Canadian-Hungarian mathematician posted a comment. Fifteen minutes later, an Arizona high-school math teacher chimed in. Three minutes after that, the UCLA mathematician Terence Tao commented. The discussion ignited, and in just six weeks, the mathematical problem had been solved.
Other challenges have followed, and though the polymaths haven't found solutions every time, they have pioneered a new approach to problem-solving. Their work is an example of the experiments in networked science that are now being done to study everything from galaxies to dinosaurs.
These projects use online tools as cognitive tools to amplify our collective intelligence. The tools are a way of connecting the right people to the right problems at the right time, activating what would otherwise be latent expertise.
Networked science has the potential to speed up dramatically the rate of discovery across all of science. We may well see the day-to-day process of scientific research change more fundamentally over the next few decades than over the past three centuries.
But there are major obstacles to realizing this goal. Though you might think that scientists would aggressively adopt new tools for discovery, they have been surprisingly inhibited. Ventures such as the Polymath Project remain the exception, not the rule.
Consider the idea of sharing scientific data online. The best-known example of this is the human genome project, whose data may be downloaded by anyone. When you read in the news that a certain gene is associated with a particular disease, you're almost certainly seeing a discovery made possible by the project's open-data policy.
Despite the value of open data, most labs make no systematic effort to share data with other scientists. As one biologist told me, he had been "sitting on [the] genome" for an entire species of life for more than a year. A whole species of life! Just imagine the vital discoveries that other scientists could have made if that genome had been uploaded to an online database.
Why don't scientists share?
If you're a scientist applying for a job or a grant, the biggest factor determining your success will be your record of scientific publications. If that record is stellar, you'll do well. If not, you'll have a problem. So you devote your working hours to tasks that will lead to papers in scientific journals.
Even if you personally think it would be far better for science as a whole if you carefully curated and shared your data online, that is time away from your "real" work of writing papers. Except in a few fields, sharing data is not something your peers will give you credit for doing.
There are other ways in which scientists are still backward in using online tools. Consider, for example, the open scientific wikis launched by a few brave pioneers in fields like quantum computing, string theory and genetics (a wiki allows the sharing and collaborative editing of an interlinked body of information, the best-known example being Wikipedia).
Specialized wikis could serve as up-to-date reference works on the latest research in a field, like rapidly evolving super-textbooks. They could include descriptions of major unsolved scientific problems and serve as a tool to find solutions.
But most such wikis have failed. They have the same problem as data sharing: Even if scientists believe in the value of contributing, they know that writing a single mediocre paper will do far more for their careers. The incentives are all wrong.
If networked science is to reach its potential, scientists will have to embrace and reward the open sharing of all forms of scientific knowledge, not just traditional journal publication. Networked science must be open science. But how to get there?
A good start would be for government grant agencies (like the National Institutes of Health and the National Science Foundation) to work with scientists to develop requirements for the open sharing of knowledge that is discovered with public support. Such policies have already helped to create open data sets like the one for the human genome. But they should be extended to require earlier and broader sharing. Grant agencies also should do more to encourage scientists to submit new kinds of evidence of their impact in their fields (not just papers!) as part of their applications for funding.
The scientific community itself needs to have an energetic, ongoing conversation about the value of these new tools. We have to overthrow the idea that it's a diversion from "real" work when scientists conduct high-quality research in the open. Publicly funded science should be open science.
Improving the way that science is done means speeding us along in curing cancer, solving the problem of climate change and launching humanity permanently into space. It means fundamental insights into the human condition, into how the universe works and what it's made of. It means discoveries not yet dreamt of.
In the years ahead, we have an astonishing opportunity to reinvent discovery itself. But to do so, we must first choose to create a scientific culture that embraces the open sharing of knowledge.
Mr. Nielsen is a pioneer in the field of quantum computing and the author of "Reinventing Discovery: The New Era of Networked Science," from which this is adapted.
Copyright 2011 Dow Jones & Company, Inc. All Rights Reserved
