Citation Amnesia and memories of John Martyn's classic study of Unwitting Duplication of Research

Eugene Garfield eugene.garfield at THOMSONREUTERS.COM
Fri Jun 26 12:53:10 EDT 2009

The Scientist: NewsBlog:

Citation amnesia: The results

Posted by Bob Grant

[Entry posted at 25th June 2009 03:57 PM GMT]


Citing past scientific work in present-day research papers can be a
slippery business. Contributions from competing labs can be glossed
over, pertinent studies accidentally left out, or similar research not
mentioned in an attempt to give the study at hand a sheen of novelty. 

We at The Scientist often hear complaints from our readers concerning
what they regard as either honest or purposeful omissions in the
reference lists of high-profile scientific papers. So we conducted a
study of our own to try to quantify the prevalence of these types of
slights and ask our readers how the problem might be fixed.


Image: Wikimedia

Indeed, the vast majority of the survey's roughly 550 respondents --
85% -- said
that citation amnesia in the life sciences literature is an
already-serious or potentially serious problem. A full 72% of
respondents said their own work had been regularly or frequently ignored
in the citations list of subsequent publications. Respondents'
explanations of the causes range from maliciousness to laziness. 

"It certainly shows a lot of frustration out there," Geoffrey Bilder,
director of strategic initiatives at the UK-based non-profit association
CrossRef, said of the survey results and the accompanying anonymous
comments which respondents were encouraged to leave. 

The root of this frustration is likely twofold, Bilder told The
Scientist. First, there is such a vast and growing body of scientific
literature in existence that authors have an increasingly difficult job
of finding and citing all the published work that relates to their own
research. With modern indexing and search technologies, publishers and
the publishing community may be able to help scientists accomplish the
Herculean task of combing the literature. "The joke I tell is that if
you can help researchers avoid reading, you're going to make a lot of
money," Bilder said. 

But there's also a certain "perversion" at play in the citation
practices of some authors, Bilder said. "We have this naive notion that
a citation is a vote." Because so much of a scientific author's worth is
encapsulated in the raw numerical heft of his or her citation record,
some researchers purposefully avoid citing colleagues with whose work or
viewpoints they disagree. "The hidden motivation here is that [authors]
don't want to give it any more prominence or any more of a vote than
they have to." 

One of the themes to emerge in respondents' comments was simple
resentment. "Competitors willfully exclude references to my work and no
one, even other colleagues, can do anything about it," one respondent
wrote. "Papers published in lower impact factor journals are presumed to
be second rate and ignoring/disregarding them is easy," wrote another.
One commenter suggested that "several papers in prominent journals
including Cell would not have been accepted if the [past] work was
cited."

One early-career scientist described his harsh baptism in the
dog-eat-dog world of scientific publishing. "I only have 1 first author
paper, and it was recently published (Jan 2009, online)," the commenter
wrote. "It has already been passed over for citations by the most recent
articles, even though it was very much on topic." Another commenter
wrote that the main perpetrators of citation amnesia seem to be seasoned
researchers, "'big guns' who apparently feel it is safe to appropriate
the work of lesser workers because the journal editors will protect
them. Only junior people ever get nailed." 

Some commenters felt that American scientific authors tend not to
recognize the contributions of their colleagues across the pond.
"Especially in the US there is a trend to cite only papers of fellow
citizens," one wrote. "American clinical researchers tend not to read,
or at least not to cite publications in European journals," wrote
another.

Other survey respondents pointed to less sinister causes of the
problem, noting requests from editors to winnow down the citation lists
or difficulties slogging through databases. "We searched hard for papers
that were relevant to our novel finding and could not find anything,"
one respondent wrote. "We missed one that was apparently too new to be
found in PubMed when we were writing our paper." Some element of the
problem may be unavoidable, simply because of the sheer number of papers
out there. "Nobody can cite all relevant papers all the time; there are
simply too many of them," one commenter wrote. "I do cite a reasonable
selection of relevant papers and I don't try to pass off other peoples'
ideas as my own." 

At least one commenter railed against the importance of citing past work
at all. "I think there is too much emphasis on history," the respondent
wrote. "To cite the original paper can be a waste of time for the reader
who wants a recent relevant summary rather than an 'honour' for the
initial scientist." 

As for curing the problem of improper or missing citations, our survey
respondents seemed evenly split between several possibilities suggested
in the survey: raising awareness, random checks, removing editorial
restrictions on citation numbers, signing a pledge, or including
citations in online supplementary information. But several offered their
own solutions. "Referees can and should make editors and authors aware
of poor citations," a commenter wrote. "This should be cause to refuse
acceptance." Another commenter proposed early training. "The practice of
good citation etiquette should be taught in college, if not earlier," he
or she wrote. "Most students want to cite a review article and call it a
day." Still another suggested a tactic for avoiding inadvertent
citation omissions: "When there are many relevant papers, I've tried to
find a review article that cites them," he or she wrote.

Part of the solution, though, may lie in changing how citations are
formatted. "I suggest to include a list of references as PubMed IDs, so
all citations of a paper can be downloaded (at least as citations and
URLs to pdfs) in bulk," one respondent wrote. 
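
The respondent's suggestion maps onto NCBI's public E-utilities, which accept a comma-separated list of PubMed IDs for bulk retrieval. A minimal sketch that only constructs the request URL (the PMIDs are arbitrary examples, and no network request is made here):

```python
from urllib.parse import urlencode

ESUMMARY = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"

def bulk_citation_url(pmids):
    """Build an ESummary URL returning citation metadata for many PMIDs at once."""
    params = {
        "db": "pubmed",
        "id": ",".join(str(p) for p in pmids),  # comma-separated ID list
        "retmode": "json",
    }
    return f"{ESUMMARY}?{urlencode(params)}"

url = bulk_citation_url([18480896, 19420367])  # example PMIDs
```

A reference list stored as PMIDs could thus be resolved to full citations (and onward to URLs) in a single request.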

Bilder agreed that revising citation formatting could go a long way
toward easing frustration surrounding the issue in the scientific
community. "We could make it more informative and certainly more
efficient," he said. Bilder said that by displaying only the minimum
essential information -- author names, publication years, and numerical
identifiers such as DOIs or PMIDs -- publishers could make room for more
citations. "That would give you enough information to recognize what
[authors] were talking about if you're familiar with the literature, and
if not, to locate [the referenced work]," he said.

Bilder also suggested borrowing a formatting trick from citations that
appear in the legal literature -- in particular, the parts of those
citations called "signals," which indicate why a particular work is
being cited. Signals specify whether the citing author includes a
particular citation as a comparison, a contrast, or an example of the
point being made. Bilder said that scientific publications could adopt
signals so that a citation conveys its purpose rather than reading as a
simple endorsement. "Then the different kinds of citations could be
treated differently," he said.
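
Bilder's proposal can be pictured as structured citation metadata. A minimal sketch of what signal-tagged references might look like (the class names and signal labels here are hypothetical illustrations, not an existing standard):

```python
from dataclasses import dataclass
from enum import Enum

class Signal(Enum):
    """Legal-style signals indicating why a work is cited."""
    SUPPORTS = "see"        # cited work supports the point being made
    COMPARE = "compare"     # cited for comparison
    CONTRAST = "but see"    # cited work disagrees or contrasts
    EXAMPLE = "e.g."        # cited as one example among several

@dataclass
class SignedCitation:
    doi: str
    signal: Signal

refs = [
    SignedCitation("10.1000/example.1", Signal.SUPPORTS),
    SignedCitation("10.1000/example.2", Signal.CONTRAST),
]

# With signals attached, the different kinds of citations can be
# treated differently -- e.g. a contrasting citation need not be
# counted as an endorsement.
endorsements = [r for r in refs if r.signal is not Signal.CONTRAST]
```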

But University of Chicago sociologist James Evans noted that
some degree of citation amnesia may not be such a bad thing, as it may
be indicative of a healthy level of competition for funding and
recognition in the field. "At some level, people fundamentally think
that their work is important," Evans told The Scientist. "It needs to be
that way." 

Evans, who studies the relationship between markets and science, said
that scientists design research projects anticipating that their
findings will form the hub of a larger network of subsequent research,
and that this expectation makes for good science. "We want people to be
gambling and to try to pick the project that they feel will be at the
center of this network," he said. "My guess is that if you looked across
scientific areas," he added, "in really crowded research areas lots of
people are not going to get cited. And in the most crowded areas, people
would feel the most neglected." 

However the life science and/or publishing communities choose to address
problems with citation practice, researchers who publish their work
should steel themselves for some disappointment down the road. "I have
learned to have a thick skin," one survey respondent wrote. 

Related stories: 

*  Citation Violations
[May 2009] 

*  Critics rip Cell paper
[25th November 2008] 

*  Demand Citation Vigilance
[21st January 2002] 

*  The Ethics Of Citation: A Matter Of Science's Family Values
[9th June 1997]








Citations as metaphorical biomarker

by David Weinberg

[Comment posted 2009-06-26 07:41:10]

....of other underlying topics. 

There have been many excellent points made in the previous comments and
it seems that they touch on multiple issues, including, but not limited
to: citations to inform (on previous observations), citations to
acknowledge (other contributors), and citations to promote (one's career).

In turn, I think this reflects the reality that publication itself
serves multiple purposes (to inform, to drive discussion, to promote
one's career). 

And in still another layer on top of the ones cited above is the reality
that not all papers are equally significant to the field, although one
could generate a heated debate on what qualifies as significant, for as
Newton said, "If I have seen further than others, it is by standing upon
the shoulders of giants."

If there were limitless jobs and research funds for everyone, and if the
body of literature was small enough, then a lot of the emotion
surrounding this topic would probably go away, but then we would
probably all just debate whether lesser quality science was being
supported and whether it should be published, and who should judge,
and... you get the idea. 

I would be interested to see a survey of The Scientist readers designed
to show what roles of citations are considered most important to them.
In the meantime, I think the topic generates a lot of healthy debate and
sparks useful creativity.





Editorial responsibility?

by anonymous poster

[Comment posted 2009-06-25 16:03:25]

Perhaps closer and more rigorous review of the cited literature is
actually a responsibility of peer reviewers and editors? It would, for
example, be relatively easy for editors to conduct an objective (3rd
party) bibliographic search on the topic at hand...? Such lists could be
easily appended to articles accepted for publication... ("Further
reading"??? Natural History used to do that for its popular articles?) 

An author could annotate the list with "signals" as mentioned in the
legal profession -- and augment where omissions occur? 

I hear the groans -- but such efforts could be highly automated in the
digital realm and could be rapidly appended to articles... 

But more fundamentally, there is a basic question about why citations
(or footnotes more generally) are ever included? This issue is made very
concrete in considering the issue of citation stacking as data are
combined and recombined in meta-analyses 

It's obvious that for many scientists, citation -- as also has been the
case with personal requests for "reprints"-- are a part of the way that
lineages and/or communities of research are construed? One might dare to
say that they are grooming behavior? 





Yes, journals are part of the problem, as are citation systems

by Ellen Hunt

[Comment posted 2009-06-25 14:47:47]

One of the obvious cures is to have the citation counting engines
process reviews differently from original research. All that needs to
happen is to pass-through citations from chains of reviews down to the
original research paper. Since these factors matter, we need to make
them work properly. Right now, citations are absurdly weighted to the
publishers of nice reviews. I am not saying review articles aren't
worthwhile, they most definitely are. But we should modify citation
counts so that a reference to a review automatically counts as a
reference to the citations in the review. And if there is a citation in
a review that is itself a review, then the original research of those
papers in turn should be counted. 
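
The pass-through idea above can be sketched as a small graph traversal. A toy illustration (the paper IDs and citation graph are invented; a real counting engine would need cycle handling and deduplication policies beyond this):

```python
from collections import Counter

def passthrough_counts(citations, reviews):
    """citations maps each citing paper to the papers it cites; reviews is
    the set of papers that are review articles. A citation of a review is
    passed through, transitively, to every paper the review cites."""
    def expand(cited, seen):
        if cited in seen:            # guard against citation cycles
            return []
        out = [cited]
        if cited in reviews:         # chains of reviews resolve downward
            for deeper in citations.get(cited, []):
                out.extend(expand(deeper, seen | {cited}))
        return out

    counts = Counter()
    for refs in citations.values():
        for cited in refs:
            counts.update(expand(cited, set()))
    return counts

graph = {"new_paper": ["review_A"], "review_A": ["orig_1", "orig_2"]}
counts = passthrough_counts(graph, reviews={"review_A"})
# orig_1 and orig_2 each get credit both from review_A's own reference
# list and from new_paper's citation of review_A
```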

Aside from that, I fail to see why any journal in today's world should
care about number of citations. But if they must care, then what they
can do is have the author produce a subset list for print publication.
The full citation list can be provided for online publication. 

For god's sake people! Virtually unlimited space is the whole point of
online journals! There is absolutely no reason why an online journal
should not allow an unlimited number of citations. 





What about articles you don't have access to?

by anonymous poster

[Comment posted 2009-06-25 13:48:28]

I would argue that some of the problem may stem from whether or not a
given institution has access to a given journal. Who wants to reference
a research article that costs $35 or more to view? I would also argue
that we should vote the wallet and refuse to reference articles that
cost so much to view. (Hence the argument for immediate, public
dissemination of all publicly-funded research.)





Ask your librarian

by Jan W. Schoones

[Comment posted 2009-06-25 13:08:23]

A big part of the solution is easy, but apparently missed by all: ask
your academic librarian for help. She/he will get the best search
results, certainly when a search is performed with both brains working
together.

1. Schoones JW. Selective publication of antidepressant trials. N Engl J
Med. 2008 May 15;358(20):2181.





Why are you doing this?

by anonymous poster

[Comment posted 2009-06-25 13:02:54]

Science shouldn't be about getting your ego stroked daily. If you're
upset because the other kids didn't include you, perhaps you should be
in a
different field. Science is about doing something you love for the
purpose of increasing human knowledge. If I see that my work has done
that, whether cited or not, I'm happy. 

Perhaps what we should do is stop putting names on papers. Then we would
see who is in this for the glory.





Raise the stakes

by anonymous poster

[Comment posted 2009-06-25 12:53:24]

Perhaps it might be valuable to borrow some rules that courts use. If an
author intentionally omits a citation that disagrees or contradicts the
author's paper, as determined by a neutral body, the author surrenders
his/her tenure position at the university, and is put on a tenure track
to compete with younger, more ethical researchers. 





Good riddance.

by Mitchell Wachtel

[Comment posted 2009-06-25 12:28:10]

Anyone can use PubMed to arrive at over 200 relevant articles.
Scientists should use review articles or textbook chapters as much as
possible to prove assertions in the introduction or discussion,
decreasing the burden upon the reader, limiting the number of references
to twenty-five. Review articles and book chapters are increasingly cited
more often than the original research; this does not make the original
research less worthy. Researchers should not be judged by the number of
citations their article produces. 





The journals play a role too!

by John Quackenbush

[Comment posted 2009-06-25 12:17:03]

One of the problems we have faced is a limitation on the number of
citations allowed for specific article types by nearly every journal.
Authors are often forced to pick and choose lest they exceed their
allotted quota for citation. I can point to a number of instances over
the past year where papers I have submitted have been editorially
rejected prior to peer review because I have included too many
citations, forcing a pruning that some might interpret as either a
deliberate omission or a case of "amnesia."






When responding, please attach my original message
Eugene Garfield, PhD. email:  garfield at 
Tel: 215-243-2205 Fax 215-387-1266

Chairman Emeritus, Thomson Reuters Scientific (formerly ISI)
3501 Market Street, Philadelphia, PA 19104-3302
President, The Scientist LLC.
400 Market Street, Suite 1250, Philadelphia, PA 19106-2501
Past President, American Society for Information Science and Technology



More information about the SIGMETRICS mailing list