Do academic journals pose a threat to the advancement of science?
Stephen J Bensman
notsjb at LSU.EDU
Wed Aug 19 09:45:19 EDT 2009
There are some anti-metric, anti-evaluative screeds making the rounds. Their
authors may be members of this listserv. I myself am not so hot on
metric evaluations, but the ultimate argument in their favor is that
evaluations will be made consciously or unconsciously, and you might as
well attempt to quantify the biases as the first step in obtaining a
somewhat more accurate picture.
Stephen J. Bensman
LSU Libraries
Louisiana State University
Baton Rouge, LA 70803
USA
notsjb at lsu.edu
----------------------------------------------------------
From: owner-liblicense-l at lists.yale.edu
[mailto:owner-liblicense-l at lists.yale.edu] On Behalf Of Colin Steele
Sent: Monday, August 17, 2009 3:33 PM
To: liblicense-l at lists.yale.edu
Subject: Do academic journals pose a threat to the advancement of
science?
A long article by Zoe Corbyn in the British Times Higher
Education Supplement for 13 August, with the above title, offers
some extremely cogent comments on the present situation in
academic publishing and the impact of the growing trend to
measure research, both individually and institutionally, through
bibliometric and other numeric processes.
http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=407705&c=1
"But have these gatekeepers for what counts as acceptable science
become too powerful? Is the system of reward that has developed
around them the best for science - and what does the future hold?
Unpicking the power of academic and scholarly journals, with
their estimated global turnover of at least $5 billion (3 billion
UK pounds) a year, is a complex business. There are an estimated
25,000 scholarly peer-reviewed journals in existence, about
15,000 of which cover the science, technical and medical
communities....
Richard Horton, editor of The Lancet, describes the growing
importance of citations and impact factors as "divisive". "If I
could get rid of the impact factor tomorrow, I would. I hate it.
I didn't invent it and I did not ask for it. It totally distorts
decision-making and it is a very, very bad influence on science,"
he says.
Noting that the medical journal articles that get the most
citations are studies of randomised trials from rich countries,
he speculates that if The Lancet published more work from Africa,
its impact factor would go down.
"The incentive for me is to cut off completely parts of the world
that have the biggest health challenges ... citations create a
racist culture in journals' decision-making and embody a system
that is only about us (in the developed world)."
Corbyn quotes Sir John Sulston:
"(Journal metrics) are the disease of our times," says Sir John
Sulston, chairman of the Institute for Science, Ethics and
Innovation at the University of Manchester, and Nobel prizewinner
in the physiology or medicine category in 2002.
He is also a member of an International Council for Science
committee that last year drafted a statement calling for
collective action to halt the uncritical use of such metrics.
Sulston argues that the use of journal metrics is not only a
flimsy guarantee of the best work (his prize-winning discovery
was never published in a top journal), but he also believes that
the system puts pressure on scientists to act in ways that
adversely affect science - from claiming work is more novel than
it actually is to over-hyping, over-interpreting and prematurely
publishing it, splitting publications to get more credits and, in
extreme situations, even committing fraud.
The system also creates what he characterises as an "inefficient
treadmill" of resubmissions to the journal hierarchy. The whole
process ropes in many more reviewers than necessary, reduces the
time available for research, places a heavier burden on peer
review and delays the communication of important results.
The sting in the tail, he says, is the long list of names that
now appears on papers, when it is clear that few of the named
contributors can have made more than a marginal contribution.
This method provides citations for many, but does little for the
scientific enterprise.
It is not only scientists but journal editors, too, who see the
growing reliance on metrics as extremely damaging, with journals
feeling increasing pressure to publish certain work."
In this context, the publications of Professor Anne-Wil Harzing
at the University of Melbourne are relevant. See her recent
article:
http://www.harzing.com/download/wkw.pdf 'When Knowledge Wins:
Transcending the Sense and Nonsense of Academic Rankings'
"Has university scholarship gone astray? Do our academic
assessment systems reward scholarship that addresses the
questions that matter most to society? Using international
business as an example, this article highlights the problematic
nature of academic ranking systems and questions if such
assessments are drawing scholarship away from its fundamental
purpose.
The article calls for an immediate examination of existing
ratings systems, not only as a legitimate scholarly question
vis-a-vis performance (a conceptual lens with deep roots in
management research) but also because the very health and
vibrancy of the field are at stake. Indeed, in light of the data
presented here,
which suggest that current systems are dysfunctional and
potentially cause more harm than good, a temporary moratorium on
rankings may be appropriate until more valid and reliable ways to
assess scholarly contributions can be developed.
The worldwide community of scholars, along with the global
network of institutions interacting with and supporting
management scholarship (such as the Academy of Management, AACSB,
and Thomson Reuters Scientific) are invited to innovate and
design more reliable and valid ways to assess scholarly
contributions that truly promote the advancement of relevant
21st-century knowledge and likewise recognize those individuals
and institutions that best fulfill the university's fundamental
purpose."
Reading these articles and listening to David Prosser from SPARC
Europe, in a speech he gave in Canberra at the National Library
of Australia on 14 August, reaffirms the view that now is the
time to look collectively at new models of funding scholarly
communication, rather than simply following, in the digital
environment, the historical models of the print environment.
If we were to start again, would the model be the same, except
for the need for a form of peer review and appropriate
reputational branding? One suspects not. And while on this topic,
why do libraries still need to give publishers pre-publication,
interest-free 'loans' amounting to hundreds of millions of
dollars, euros and pounds for content which may not be delivered
to the libraries for up to 12 months? If a fraction of that money
were available for realistic projects to work with the academic
community and research councils/funding bodies on effective
scholarly communication advocacy and new access and distribution
models, who knows what could be achieved?

Best,
Colin
Colin Steele
Emeritus Fellow
The Australian National University