Peer Review Scandals

David Wojick dwojick at CRAIGELLACHIE.US
Tue Jul 15 14:03:05 EDT 2014




There is a lot of junk in this article.

Here is the second paragraph: “Acoustics is an important field. But in 
biomedicine faulty research and a dubious peer-review process can have 
life-or-death consequences. In June, Dr. Francis Collins, director of the 
National Institutes of Health and responsible for $30 billion in annual 
government-funded research, held a meeting to discuss ways to ensure that 
more published scientific studies and results are accurate. According to a 
2011 report in the monthly journal Nature Reviews Drug Discovery, the 
results of two-thirds of 67 key studies analyzed by Bayer researchers from 
2008-2010 couldn't be reproduced.”

Of course peer review has nothing to do with replication.

My guess is that there are between 5 and 10 million peer reviews a year, yet it 
only takes four or five anecdotes, some of them way off base, to generate broad 
claims of wholesale corruption, and it is those claims that hurt science. This 
is what social movements feed on, and there is plenty to go around.

Interestingly, there is a metric angle to the JVC scandal. I think that 
with proper research an algorithm could be developed that will detect this 
sort of fraud. It would operate on the article submission tracking systems 
that all large publishers use. I discuss this in the comments to Kent 
Anderson's article here:
http://scholarlykitchen.sspnet.org/2014/07/14/trust-but-verify-identity-fraud-and-exploitation-of-the-trust-economy-in-scholarly-publishing/
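As a minimal sketch of what such a detection algorithm might look for, consider the patterns reported in the JVC case: author-suggested reviewers tied to fake identities, implausibly fast review turnaround, and the same reviewer repeatedly handling one author's papers. The record format, field names, and thresholds below are all illustrative assumptions, not any publisher's actual tracking-system schema:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class ReviewRecord:
    """One reviewer assignment pulled from a submission tracking system."""
    paper_id: str
    author: str
    reviewer: str
    author_suggested: bool   # did the author propose this reviewer?
    turnaround_hours: float  # time from review invitation to submitted review

def flag_suspicious(records, max_hours=24.0, min_pairings=3):
    """Flag (reviewer, author) pairs showing patterns associated with
    peer-review rings: author-suggested reviewers who repeatedly review
    the same author's papers with implausibly fast turnaround.
    Thresholds are illustrative, not calibrated against real data."""
    pair_counts = Counter((r.reviewer, r.author) for r in records)
    flags = set()
    for r in records:
        if (r.author_suggested
                and r.turnaround_hours <= max_hours
                and pair_counts[(r.reviewer, r.author)] >= min_pairings):
            flags.add((r.reviewer, r.author))
    return flags
```

A real system would need calibrated baselines per field and journal, since legitimate turnaround times and reviewer-author overlap vary widely; the point is only that the raw signals already sit in publishers' tracking databases.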

David Wojick
http://insidepublicaccess.com/issues.html


At 09:14 AM 7/15/2014, you wrote:
>Administrative info for SIGMETRICS (for example unsubscribe): 
>http://web.utk.edu/~gwhitney/sigmetrics.html
>So much for peer review.
>
>Stephen J Bensman, Ph.D.
>LSU Libraries
>Louisiana State University
>Baton Rouge, LA 70803
>USA
>
>
>WALL STREET JOURNAL  OPINION PIECE
>
>The Corruption of Peer Review Is Harming Scientific Credibility
>Dubious studies on the danger of hurricane names may be laughable. But bad 
>science can cause bad policy.
>By
>Hank Campbell
>July 13, 2014 6:32 p.m. ET
>Academic publishing was rocked by the news on July 8 that a company called 
>Sage Publications is retracting 60 papers from its Journal of Vibration 
>and Control, about the science of acoustics. The company said a researcher 
>in Taiwan and others had exploited peer review so that certain papers were 
>sure to get a positive review for placement in the journal. In one case, a 
>paper's author gave glowing reviews to his own work using phony names.
>Acoustics is an important field. But in biomedicine faulty research and a 
>dubious peer-review process can have life-or-death consequences. In June, 
>Dr. Francis Collins, director of the National Institutes of Health and 
>responsible for $30 billion in annual government-funded research, held a 
>meeting to discuss ways to ensure that more published scientific studies 
>and results are accurate. According to a 2011 report in the monthly 
>journal Nature Reviews Drug Discovery, the results of two-thirds of 67 key 
>studies analyzed by Bayer researchers from 2008-2010 couldn't be reproduced.
>[Image: Getty Images]
>That finding was a bombshell. Replication is a fundamental tenet of 
>science, and the hallmark of peer review is that other researchers can 
>look at data and methodology and determine the work's validity. Dr. 
>Collins and co-author Dr. Lawrence Tabak highlighted the problem in a 
>January 2014 article in Nature. "What hope is there that other scientists 
>will be able to build on such work to further biomedical progress," if no 
>one can check and replicate the research, they wrote.
>The authors pointed to several reasons for flawed studies, including "poor 
>training of researchers in experimental design," an "emphasis on making 
>provocative statements," and publications that don't "report basic 
>elements of experimental design." They also said that "some scientists 
>reputedly use a 'secret sauce' to make their experiments work, and withhold 
>details from publication or describe them only vaguely to retain a 
>competitive edge."
>Papers with such problems or omissions would never see the light of day if 
>sound peer-review practices were in place, and their absence at many 
>journals is the root of the problem. Peer review involves an anonymous 
>panel of objective experts critiquing a paper on its merits. Obviously, a 
>panel should not contain anyone who agrees in advance to give the paper 
>favorable attention and help it get published. Yet a variety of journals 
>have allowed or overlooked such practices.
>Absent rigorous peer review, we get the paper published in June in the 
>Proceedings of the National Academy of Sciences. Titled "Female hurricanes 
>are deadlier than male hurricanes," it concluded that hurricanes with 
>female names cause more deaths than male-named hurricanes, ostensibly 
>because implicit sexism makes people take the storms with a woman's name 
>less seriously. The work was debunked once its methods were examined, but 
>not before it got attention nationwide.
>Such a dubious paper made its way into national media outlets because of 
>the imprimatur of the prestigious National Academy of Sciences.
>Yet a look at the organization's own submission guidelines makes clear 
>that if you are a National Academy member today, you can edit a research 
>paper that you wrote yourself and only have to answer a few questions 
>before an editorial board; you can even arrange to be the official 
>reviewer for people you know. The result of such laxity isn't just the 
>publication of a dubious finding like the hurricane gender-bias claim. 
>Some errors can have serious consequences if bad science leads to bad policy.
>In 2002 and 2010, papers published in the Proceedings of the National 
>Academy of Sciences claimed that a pesticide called atrazine was causing 
>sex changes in frogs. As a result the Environmental Protection Agency set 
>up special panels to re-examine the product's safety. Both papers had the 
>same editor, David Wake of the University of California, Berkeley, who is 
>a colleague of the papers' lead author, Tyrone Hayes, also of Berkeley.
>In keeping with National Academy of Sciences policy, Prof. Hayes 
>preselected Prof. Wake as his editor. Both studies were published without 
>a review of the data used to reach the finding. No one has been able to 
>reproduce the results of either paper, including the EPA, which did 
>expensive, time-consuming reviews of the pesticide brought about by the 
>published claims. As the agency investigated, it couldn't even use those 
>papers about atrazine's alleged effects because the research they were 
>based on didn't meet the criteria for legitimate scientific work. The 
>authors refused to hand over data that led them to their claimed 
>results, which meant no one could run the same computer program and match 
>their results.
>Earlier this month, Nature retracted two studies it had published in 
>January in which researchers from the Riken Center for Developmental Biology 
>in Japan asserted that they had found a way to turn some cells into 
>embryonic stem cells by a simple stress process. The studies had passed 
>peer review, the magazine said, despite flaws that included misrepresented 
>information.
>Fixing peer review won't be easy, although exposing its weaknesses is a 
>good place to start. Michael Eisen, a biologist at UC Berkeley, is a 
>co-founder of the Public Library of Science, one of the world's largest 
>nonprofit science publishers. He told me in an email that, "We need to get 
>away from the notion, proven wrong on a daily basis, that peer review of 
>any kind at any journal means that a work of science is correct. What it 
>means is that a few (1-4) people read it over and didn't see any major 
>problems. That's a very low bar in even the best of circumstances."
>But even the most rigorous peer review can be effective only if authors 
>provide the data they used to reach their results, something that many 
>still won't do and that few journals require for publication. Some 
>publishers have begun to mandate open data. In March the Public Library of 
>Science began requiring that study data be publicly available. That means 
>anyone with the ability to check should be able to reproduce, validate and 
>understand the findings in a published paper. This should also ensure that 
>there is much better scrutiny of flawed claims about sexist weather events 
>and hermaphroditic frogs, before they appear on every news station in America.
>Mr. Campbell is the founder of Science 2.0 and co-author of "Science Left 
>Behind" (PublicAffairs, 2012).

