The Wisdom of Citing Scientists

Davis, Charles H davisc at INDIANA.EDU
Sat Aug 10 10:31:40 EDT 2013


Some of the contributors to this discussion may have cited the 
following work, an early contribution pertinent to some of the 
questions raised.  Regrettably, the author of this 1970 dissertation at 
the University of Illinois at Urbana-Champaign chose not to publish her 
findings, but the dissertation is readily available via WorldCat or 
elsewhere.

http://openlibrary.org/works/OL10345125W/The_relationship_between_intra-document_citation_location_and_citation_level

http://tinyurl.com/md75da2

Cordially,

Charles Davis
_______________________________
Charles H. Davis, BSc, MA, PhD
Senior Fellow, Informatics
http://soic.iu.edu/
Indiana University at Bloomington
http://mypage.iu.edu/~davisc

Quoting A Carlin <acarlin865 at STCOLUMBS.COM>:

> Administrative info for SIGMETRICS (for example unsubscribe):
> http://web.utk.edu/~gwhitney/sigmetrics.html
>
> Hi David,
> do you have a reference for this study please?
> with thanks, Andrew
>
>
> -----Original Message-----
> From: ASIS&T Special Interest Group on Metrics on behalf of David Wojick
> Sent: Sat 10/08/2013 14:40
> To: SIGMETRICS at LISTSERV.UTK.EDU
> Subject: Re: [SIGMETRICS] The Wisdom of Citing Scientists
>
>
> I did a small study that found that the majority of citations occur in
> the introductory part of most articles: on average, over 60% of the
> citations appeared in the first 25% of the text. This introductory
> section is basically a historical narrative that explains the origin
> and nature of the research problem being reported. The cited works
> need not have directly influenced the research being reported.
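[Editor's note: the positional measurement described above can be sketched in a few lines of Python. The bracketed-number citation format, the helper name, and the toy article below are illustrative assumptions, not details taken from the study.]

```python
import re

def citation_position_profile(text, quantile=0.25):
    """Locate bracketed citation markers like [12] in a text and
    report what fraction of them fall within the first `quantile`
    of the text (by character offset)."""
    markers = [m.start() for m in re.finditer(r"\[\d+\]", text)]
    if not markers:
        return 0.0
    cutoff = len(text) * quantile
    return sum(pos < cutoff for pos in markers) / len(markers)

# Toy article: three citations in the introduction, one later on.
article = ("Prior work [1] established X, refined in [2] and [3]. "
           + "Methods... " * 50
           + "Our results extend [4]. "
           + "Discussion... " * 50)
frac = citation_position_profile(article)
```

On this toy text, three of the four citations sit in the first quarter of the article, so the function returns 0.75, mirroring the kind of intro-heavy skew the study reports.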
>
> Then the article typically goes on to explain what was done and what
> was found. Here the citations often identify the sources of methods
> used or data or some such. Direct influence is much more likely but
> the percentage of citations may be low. Finally there may be a
> broader discussion section, with relatively more citations.
>
> The point is that many citations may not be indicators of direct
> influence (or impact), but rather of historical relevance. In some
> cases the citations may well be found only after the research is done.
>
> David Wojick
>
> On Aug 9, 2013, at 12:45 PM, "Smalheiser, Neil"
> <Nsmalheiser at PSYCH.UIC.EDU> wrote:
>
>> Since Katy covered one aspect of this issue, let me raise a
>> complementary aspect that I have not seen discussed yet in this
>> forum.
>>
>> When people DO cite references in a paper, they may do so for very
>> different reasons, each with a different rationale and pattern of
>> citing.
>>
>> 1.       Ideally, in my opinion, an author should accurately cite
>> the previous works that influenced them in the research that they
>> are reporting. A research paper tells a story, and it is important
>> to know what papers they read, and when, and how they were
>> influenced. So if they were unaware of some relevant research at the
>> time, it is not important (and even intellectually misleading) to
>> cite it!
>>
>> 2.       Some authors omit citations on purpose: they wish to make
>> their own contribution seem new and fresh, and even if they are
>> aware of prior relevant work, they may find some excuse not to cite
>> it [e.g., it was done in Drosophila but my study is in rats].
>>
>> 3.     More often, authors attempt to identify all relevant prior
>> research, in a prospective attempt to satisfy reviewers who are
>> likely to give them a hard time if they don't. Some authors even do
>> this out of scholarliness, though that is not a particularly valued
>> attribute in experimental science. As review articles appear on a
>> given topic, it is often acceptable to simply cite one or two
>> reviews, which hides the impact of the primary papers (except for
>> those that are most closely relevant to the present article,
>> regardless of their impact on the field at large). This also means
>> that papers will preferentially cite the most similar prior papers.
>>
>> 4.     Even more often, authors go out of their way to cite papers
>> by potential reviewers or editorial board members of the journal
>> that is considering the paper, or folks likely to be reviewing their
>> grants.
>>
>> 5.     A subtle variation of this is that an author will want to
>> cite papers that appeared in prestigious journals, and avoid papers
>> that were published in obscure or questionable places, to make their
>> own paper look more classy and more likely to be reviewed favorably.
>>
>> 6.     Some papers, particularly methods papers or famous papers,
>> are almost pop references that provide bonding between author and
>> reader. Citing the Watson-Crick double-helix paper (or the Mullis
>> PCR method paper) is not just citing that paper, but is really a nod
>> to a lot of related connotations and historical associations. These
>> papers are highly cited because they are celebrities (famous for
>> being famous), which does reflect impact but of a different sort.
>>
>> So counting citations to measure impact is like characterizing a
>> person's health by heart rate - it means something; it is important
>> for sure; but you need to know a lot more to interpret it properly.
>>
>>
>>
>> Neil
>>
>> From: ASIS&T Special Interest Group on Metrics
>> [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Katy Borner
>> Sent: Friday, August 09, 2013 8:29 AM
>> To: SIGMETRICS at LISTSERV.UTK.EDU
>> Subject: Re: [SIGMETRICS] The Wisdom of Citing Scientists
>>
>>
>>
>>
>> Good discussion. Quick comment:
>>
>> Work by Bollen et al. shows that science maps generated from
>> download (clickstream) data have a substantially enlarged medical
>> area. Medical papers, e.g., those freely available via Medline, are
>> downloaded, read, and used widely by practitioners and doctors
>> interested in improving health and saving lives. However, these
>> practitioners may not necessarily produce papers with citation
>> references of their own.
>>
>> Ideally, 'research evaluation' should aim to capture output and outcomes.
>>
>> Many of us spend a substantial amount of our time training others,
>> developing educational materials, doing administration, or improving
>> legal regulations. Research networking systems such as VIVO and
>> others, see http://nrn.cns.iu.edu, provide access to more holistic
>> data (papers, grants, courses; some systems are connected to even
>> more detailed annual faculty report data) on scholars' roles in the
>> S&T system--as researchers, mentors, and administrators.
>> k
>>
>> http://scimaps.org/maps/map/a_clickstream_map_of_83/
>> Bollen, Johan, Lyudmila Balakireva, Luís Bettencourt, Ryan Chute,
>> Aric Hagberg, Marko A. Rodriguez, and Herbert Van de Sompel. 2009.
>> "Clickstream Data Yields High-Resolution Maps of Science." PLoS One
>> 4 (3): 1-11.
>>
>>
>> On 8/9/2013 3:22 AM, Bornmann, Lutz wrote:
>>
>> The Wisdom of Citing Scientists
>>
>> Lutz Bornmann, Werner Marx
>>
>> (Submitted on 7 Aug 2013)
>>
>>
>>
>> This Brief Communication discusses the benefits of citation analysis
>> in research evaluation based on Galton's "Wisdom of Crowds" (1907).
>> Citations are based on the assessment of many, which is why they can
>> be ascribed a certain amount of accuracy. However, we show that
>> citations are incomplete assessments and that one cannot assume that
>> a high number of citations correlates with a high level of
>> usefulness. Only when one knows that a rarely cited paper has been
>> widely read is it possible to say (strictly speaking) that it was
>> obviously of little use for further research. Using a comparison
>> with 'like' data, we try to show that cited-reference analysis
>> allows a more meaningful analysis of bibliometric data than
>> times-cited analysis.
>>
>>
>>
>> URL: http://arxiv.org/abs/1308.1554
>>
>>
>>
>> ---------------------------------------
>>
>>
>>
>> Dr. Dr. habil. Lutz Bornmann
>>
>> Division for Science and Innovation Studies
>>
>> Administrative Headquarters of the Max Planck Society
>>
>> Hofgartenstr. 8
>>
>> 80539 Munich
>>
>> Tel.: +49 89 2108 1265
>>
>> Mobile: +49 170 9183667
>>
>> Email: bornmann at gv.mpg.de
>>
>> WWW: www.lutz-bornmann.de
>>
>> ResearcherID: http://www.researcherid.com/rid/A-3926-2008
>>
>>
>>
>>
>>
>>
>> --
>> Katy Borner
>> Victor H. Yngve Professor of Information Science
>> Director, CI for Network Science Center, http://cns.iu.edu
>> Curator, Mapping Science exhibit, http://scimaps.org
>>
>> ILS, School of Informatics and Computing, Indiana University
>> Wells Library 021, 1320 E. Tenth Street, Bloomington, IN 47405, USA
>> Phone: (812) 855-3256  Fax: -6166
>
>
>
>
>
>


