The Wisdom of Citing Scientists

Yves Gingras gingras.yves at UQAM.CA
Mon Aug 12 10:01:15 EDT 2013


Hello all

With all due respect for everyone here discussing the many reasons why people
cite a given paper, I cannot help but be struck by the tendency of some to
reinvent the wheel, or to think (and write) as if nobody had raised these very
questions decades before... One cannot imagine a physics discussion group where
someone would write a comment saying "I think that apples do not fall at
constant speed" or "I think classical mechanics does not apply at the atomic
level". It is taken for granted that before talking physics one must READ the
previous literature. Why should it be different in scientometrics?

Here it seems to be accepted that one can launch a "serious" discussion about
the "fact" that there "can be" negative citations...

As a reader of SIGMETRICS, I find the lack of respect for previous work a bit
depressing. I do not have the time to construct a reference list on the
question of the many kinds of reasons for citing, but a good minimum starting
point would be Blaise Cronin's 1984 (!) book, The Citation Process. A more
recent review is Loet's 1998 paper in Scientometrics on "Theories of
citations". But the literature is huge...

I guess you will now understand why I rarely write to these discussion
groups... But it is Sunday, and an exception does not change the rule...

:)

Cordially to all

Yves Gingras


On 11/08/13 11:05, "Loet Leydesdorff" <loet at LEYDESDORFF.NET> wrote:

> Dear David, 
> 
> This is precisely the approach to the "reasons" one can attribute to
> citations that was taken in the first article in Scientometrics 1989. For
> example, a citation can function as a warrant or a legitimation.
> 
> Best,
> Loet
> 
> 
> On Sun, Aug 11, 2013 at 3:40 PM, David Wojick <dwojick at craigellachie.us>
> wrote:
>> The concept of the "reason" for a citation is ambiguous because there are
>> different kinds of reasons, some of which have been alluded to in our
>> discussion. There are psychological reasons such as motivation, sociological
>> reasons such as convention, strategic reasons, etc. 
>> 
>> Being a logician, my interest is simply the role that the citation plays in
>> the reasoning presented in the article. Science is, after all, a system of
>> reasoning, often linked by citations. Every article is itself a complex
>> structure of reasoning. I just wrote about this at
>> http://scholarlykitchen.sspnet.org/2013/07/10/the-issue-tree-structure-of-expressed-thought/.
>> 
>> For example, a citation may be part of the introductory historical
>> narrative, or it may offer evidence supporting a strong claim, and this is a
>> significant difference. We might call these the epistemic reasons for
>> citations. What role does the citation play in the reasoning?
>> 
>> The point is that there are different kinds of reasons, which need to be
>> sorted out in any scientific inquiry into the reasons for citations.
>> 
>> David Wojick
>> 
>> On Aug 10, 2013, at 9:56 AM, James Hartley <j.hartley at KEELE.AC.UK> wrote:
>> 
>>> Peter Willett (p.willett at sheffield.ac.uk) published an interesting paper
>>> in the Journal of Documentation, 2012, 69, 1, pp. ??, showing that most
>>> readers found it difficult to detect why authors had cited their
>>> references.
>>>  
>>> I (James Hartley) (J.hartley at keele.ac.uk) suggested 8 reasons for citing
>>> other work (based on other scholars' views) and argued that one should
>>> count citations in the reference lists and not in the texts to avoid
>>> overcounting. (Scientometrics, 92(2), 313-317.)
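As a minimal illustration of the overcounting point (a hypothetical sketch in
Python; the reference keys and counts below are invented, not taken from the
thread), counting in-text citation mentions and counting reference-list entries
give different totals for the same article:

    from collections import Counter

    # Hypothetical in-text citation mentions extracted from a single article;
    # the same work can be mentioned several times in the running text.
    in_text_mentions = ["Cronin1984", "Cronin1984", "Cronin1984",
                        "Leydesdorff1998", "Willett2012"]

    # Counting every in-text mention overcounts repeatedly mentioned works.
    mention_counts = Counter(in_text_mentions)
    print(mention_counts)       # Counter({'Cronin1984': 3, 'Leydesdorff1998': 1, 'Willett2012': 1})

    # Counting reference-list entries credits each cited work exactly once
    # per citing article, which is the counting Hartley recommends.
    reference_list = sorted(set(in_text_mentions))
    print(len(reference_list))  # 3 distinct cited works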
>>>  
>>>  
>>> From: ASIS&T Special Interest Group on Metrics
>>> [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of David Wojick
>>> Sent: 10 August 2013 14:40
>>> To: SIGMETRICS at LISTSERV.UTK.EDU
>>> Subject: Re: [SIGMETRICS] The Wisdom of Citing Scientists
>>>  
>>> I did a small study which found that the majority of citations occur in
>>> the introductory part of most articles: over 60% of the citations occurred,
>>> on average, in the first 25% of the text. This section of the article is
>>> basically a historical narrative that explains the origin and nature of the
>>> research problem being reported on. The cited works need not have directly
>>> influenced the research being reported.
>>> 
>>>  
>>> 
>>> Then the article typically goes on to explain what was done and what was
>>> found. Here the citations often identify the sources of the methods or data
>>> used. Direct influence is much more likely here, but the percentage of
>>> citations may be low. Finally, there may be a broader discussion section,
>>> with relatively more citations.
>>> 
>>>  
>>> 
>>> The point is that many citations may not be indicators of direct influence
>>> (or impact), but rather of historical relevance. In some cases the citations
>>> may well be found only after the research is done.
>>> 
>>> David Wojick
>>> 
>>> 
>>> On Aug 9, 2013, at 12:45 PM, "Smalheiser, Neil" <Nsmalheiser at PSYCH.UIC.EDU> wrote:
>>>> 
>>>> Since Katy covered one aspect of this issue, let me raise a complementary
>>>> aspect that I have not seen discussed yet in this forum.  
>>>> When people DO cite references in a paper, they may do so for very
>>>> different reasons, each with a different rationale and pattern of citing.
>>>> 1. Ideally, in my opinion, an author should accurately cite the previous
>>>> works that influenced them in the research that they are reporting. A
>>>> research paper tells a story, and it is important to know what papers they
>>>> read, and when, and how they were influenced. So if they were unaware of
>>>> some relevant research at the time, it is not important to cite it (and
>>>> doing so would even be intellectually misleading)!
>>>> 
>>>> 2. Another reason that authors omit citations is deliberate: they wish to
>>>> make their own contribution seem new and fresh, and even if they were
>>>> aware of some prior relevant work, they may find some excuse not to cite
>>>> it [e.g., it was done in Drosophila but my study is in rats].
>>>> 
>>>> 3. More often, authors attempt to identify all relevant prior research,
>>>> in a prospective attempt to satisfy reviewers who are likely to give them
>>>> a hard time if they don't. Some authors even do this out of scholarliness,
>>>> though that is not a particularly valued attribute in experimental
>>>> science. As review articles appear on a given topic, it is often
>>>> acceptable to simply cite one or two reviews, which hides the impact of
>>>> the primary papers (except for those that are most closely relevant to the
>>>> present article, regardless of their impact on the field at large). This
>>>> also means that papers will preferentially cite the most similar prior
>>>> papers.
>>>> 
>>>> 4. Even more often, authors go out of their way to cite papers by
>>>> potential reviewers or editorial board members of the journal that is
>>>> considering the paper, or folks likely to be reviewing their grants.
>>>> 
>>>> 5. A subtle variation of this is that an author will want to cite papers
>>>> that appeared in prestigious journals, and avoid papers that were
>>>> published in obscure or questionable places, to make their own paper look
>>>> classier and more likely to be reviewed favorably.
>>>> 
>>>> 6. Some papers, particularly methods papers or famous papers, are almost
>>>> pop references that provide bonding between author and reader. Citing the
>>>> Watson-Crick double-helix paper (or the Mullis PCR method paper) is not
>>>> just citing that paper, but is really a nod to a lot of related
>>>> connotations and historical associations. These papers are highly cited
>>>> because they are celebrities (famous for being famous), which does reflect
>>>> impact but of a different sort.
>>>> 
>>>> So counting citations to measure impact is like characterizing a person's
>>>> health by heart rate: it means something; it is important for sure; but
>>>> you need to know a lot more to interpret it properly.
>>>>  
>>>> Neil
>>>> From: ASIS&T Special Interest Group on Metrics
>>>> [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Katy Borner
>>>> Sent: Friday, August 09, 2013 8:29 AM
>>>> To: SIGMETRICS at LISTSERV.UTK.EDU
>>>> Subject: Re: [SIGMETRICS] The Wisdom of Citing Scientists
>>>>  
>>>> Good discussion. Quick comment:
>>>> 
>>>> Work by Bollen et al. shows that science maps generated from download
>>>> (clickstream) data have a substantially enlarged medical area. Medical
>>>> papers, e.g., those freely available via Medline, are widely
>>>> downloaded/read/used by practitioners/doctors interested in improving
>>>> health and saving lives. However, these practitioners/doctors might not
>>>> necessarily produce papers with citation references.
>>>> 
>>>> Ideally, 'research evaluation' should aim to capture output and outcomes.
>>>> 
>>>> Many of us spend a substantial amount of our time training others,
>>>> developing educational materials, doing administration, or improving
>>>> legal regulations. Research networking systems like VIVO and others (see
>>>> http://nrn.cns.iu.edu) provide access to more holistic data (papers,
>>>> grants, courses; some systems are connected to even more detailed annual
>>>> faculty report data) on scholars' roles in the S&T system--as researchers,
>>>> mentors, and administrators.
>>>> k
>>>> * http://scimaps.org/maps/map/a_clickstream_map_of_83/
>>>> * Bollen, Johan, Lyudmila Balakireva, Luís Bettencourt, Ryan Chute, Aric
>>>> Hagberg, Marko A. Rodriguez, and Herbert Van de Sompel. 2009. "Clickstream
>>>> Data Yields High-Resolution Maps of Science." PLoS ONE 4 (3): 1-11.
>>>>  
>>>> 
>>>> On 8/9/2013 3:22 AM, Bornmann, Lutz wrote:
>>>>> The Wisdom of Citing Scientists
>>>>> Lutz Bornmann, Werner Marx
>>>>> (Submitted on 7 Aug 2013)
>>>>>  
>>>>> This Brief Communication discusses the benefits of citation analysis in
>>>>> research evaluation based on Galton's "Wisdom of Crowds" (1907). Citations
>>>>> are based on the assessment of many, which is why they can be ascribed a
>>>>> certain amount of accuracy. However, we show that citations are
>>>>> incomplete assessments and that one cannot assume that a high number of
>>>>> citations correlates with a high level of usefulness. Only when one knows
>>>>> that a
>>>>> rarely cited paper has been widely read is it possible to say (strictly
>>>>> speaking) that it was obviously of little use for further research. Using
>>>>> a comparison with 'like' data, we try to determine that cited reference
>>>>> analysis allows a more meaningful analysis of bibliometric data than
>>>>> times-cited analysis.
>>>>>  
>>>>> URL: http://arxiv.org/abs/1308.1554
>>>>>  
>>>>> ---------------------------------------
>>>>>  
>>>>> Dr. Dr. habil. Lutz Bornmann
>>>>> Division for Science and Innovation Studies
>>>>> Administrative Headquarters of the Max Planck Society
>>>>> Hofgartenstr. 8
>>>>> 80539 Munich
>>>>> Tel.: +49 89 2108 1265
>>>>> Mobile: +49 170 9183667
>>>>> Email: bornmann at gv.mpg.de
>>>>> WWW: http://www.lutz-bornmann.de
>>>>> ResearcherID: http://www.researcherid.com/rid/A-3926-2008
>>>>>  
>>>> 
>>>> 


Yves Gingras

Professeur 
Département d'histoire
Centre interuniversitaire de recherche
sur la science et la technologie (CIRST)
Chaire de recherche du Canada en histoire
et sociologie des sciences
Observatoire des sciences et des technologies (OST)
UQAM
C.P. 8888, Succ. Centre-Ville
Montréal, Québec
Canada, H3C 3P8

Tel: (514)-987-3000-7053
Fax: (514)-987-7726

http://www.chss.uqam.ca
http://www.cirst.uqam.ca
http://www.ost.uqam.ca


