From eugene.garfield at THOMSONREUTERS.COM Wed Mar 3 22:27:27 2010 From: eugene.garfield at THOMSONREUTERS.COM (Eugene Garfield) Date: Wed, 3 Mar 2010 21:27:27 -0600 Subject: Articles by Latin American Authors in Prestigious Journals Have Fewer Citations PLOS ONE Volume: 3 Issue: 11 Article Number: e3804 Published: NOV 25 2008 Message-ID: Articles by Latin American Authors in Prestigious Journals Have Fewer Citations Author(s): Meneghini R (Meneghini, Rogerio), Packer AL (Packer, Abel L.), Nassi-Calo L (Nassi-Calo, Lilian) Source: PLOS ONE Volume: 3 Issue: 11 Article Number: e3804 Published: NOV 25 2008 Times Cited: 3 References: 16 Citation Map Abstract: Background: The journal Impact Factor (IF) is generally accepted to be a good measure of the relevance/quality of the articles a journal publishes. In spite of an apparently homogeneous peer-review process for a given journal, we hypothesize that the country affiliation of authors from developing Latin American (LA) countries detrimentally affects the IF of a journal. Methodology/Principal Findings: Seven prestigious international journals, one multidisciplinary and six serving specific branches of science, were examined in terms of their IF in the Web of Science. Two subsets of each journal were then selected to evaluate the influence of authors' affiliations on the IF. They comprised contributions (i) with authorship from four Latin American (LA) countries (Argentina, Brazil, Chile and Mexico) and (ii) with authorship from five developed countries (England, France, Germany, Japan and the USA). Both subsets were further subdivided into two groups: articles with authorship from one country only, and collaborative articles with authorship from other countries. Articles from the five developed countries had IFs close to the overall IF of the journals, and the influence of collaboration on this value was minor. 
In the case of LA articles the effect of collaboration (virtually all of it with developed countries) was significant. The IFs for non-collaborative articles averaged 66% of the overall IF of the journals, whereas collaborative articles raised the IFs to values close to the overall IF. Conclusion/Significance: The study shows a significantly lower IF for the subsets of non-collaborative LA articles, and thus that the country affiliation of authors from non-developed LA countries does detrimentally affect the IF of a journal. There are no data to indicate whether the lower IFs of LA articles were due to inherently inferior quality/relevance or to a psycho-social trend towards under-citation of articles from these countries. However, further study is required, since this trend has foreseeable consequences: it may stimulate strategies by editors to turn down articles that tend to be under-cited. Document Type: Article Language: English KeyWords Plus: IMPACT FACTOR; SCIENCE; PUBLICATIONS; INDICATORS Reprint Address: Meneghini, R (reprint author), Latin Amer & Caribbean Ctr Hlth Sci Informat, BIREME PAHO WHO, Sao Paulo, Brazil E-mail Addresses: rogerio.meneghini at bireme.org Publisher: PUBLIC LIBRARY SCIENCE, 185 BERRY ST, STE 1300, SAN FRANCISCO, CA 94107 USA ------------------------------------------------------------------------ ------------ Eugene Garfield, PhD. email: garfield at codex.cis.upenn.edu home page: www.eugenegarfield.org Tel: 215-243-2205 Fax: 610-560-4749 Chairman Emeritus, Thomson Reuters Scientific (formerly ISI) 1500 Spring Garden Street, Philadelphia, PA 19130-4067 President, The Scientist LLC. www.the-scientist.com 400 Market Street, Suite 1250, Philadelphia, PA 19106-2501 Past President, American Society for Information Science and Technology (ASIS&T) www.asist.org 
From Wolfgang.Glanzel at ECON.KULEUVEN.AC.BE Mon Mar 8 10:54:40 2010 From: Wolfgang.Glanzel at ECON.KULEUVEN.AC.BE (Glanzel, Wolfgang) Date: Mon, 8 Mar 2010 16:54:40 +0100 Subject: Job announcement Bibliometrics - ECOOM Leuven (Belgium) Message-ID: NEW JUNIOR OR SENIOR POSITION IN THE AREA OF SCIENTOMETRICS/BIBLIOMETRICS The Centre for R&D Monitoring (ECOOM), constituted as an interuniversity consortium, is housed within the Faculty of Economics and Applied Economics at the Catholic University of Leuven. ECOOM develops a consistent system of R&D and Innovation (RD&I) indicators for the Flemish government. This indicator system is designed to assist the Flemish government in mapping and monitoring RD&I efforts in the Flemish region. Within this context, ECOOM is the pre-eminent Flemish site for the study and application of advanced bibliometric techniques in research evaluation, and has become a prominent research centre with international visibility. To support the project, we are currently recruiting one Senior or Junior Research Fellow. Applicants for the junior post should hold a university diploma, have the skills necessary to work with bibliographic databases and bibliometric indicators based on journal publications, and have an interest in bibliometric methods, quantitative science studies, quantitative aspects of communication in science, and research policy. The research fellow assists in the processing, development, statistical analysis and interpretation of data, and is expected to initiate, structure, process and report research results. 
He or she collaborates in the modelling of complex research problems and applies the corresponding analyses. In particular, the research fellow will process and analyse publication and citation data. Junior fellows are expected to obtain a PhD during their appointment. Applicants for the senior research fellow post should be willing and able to contribute to all bibliometrics-related work necessary to accomplish the mission of ECOOM, should have a completed PhD in an area of relevance to the work of the Centre as well as a strong publication record, and should have considerable research experience and major original contributions to the study of bibliometrics and research evaluation. Applications are sought from researchers with the following background: knowledge of quantitative data processing, bibliometric and statistical methods, and data engineering, as well as information retrieval skills. In-depth knowledge of computer science is also expected, particularly in the field of text mining and visualisation techniques. Fluency in written and spoken English is essential. We offer a full-time position in the internationally oriented research environment of the Catholic University of Leuven. The appointment will be fixed-term: four years for the junior position, or until the end of 2014 for the senior position. The salary will be assessed in accordance with the legal salary grade systems of the Catholic University of Leuven. Women are encouraged to apply. Severely disabled applicants who are equally qualified will be given preference. Deadline for applications: 31 March 2010. 
Applications must be sent to: Dani Vandepoel Administrative coordinator Centre for R&D Monitoring E-mail: Dani.vandepoel at econ.kuleuven.be From amsciforum at GMAIL.COM Tue Mar 9 12:21:42 2010 From: amsciforum at GMAIL.COM (Stevan Harnad) Date: Tue, 9 Mar 2010 12:21:42 -0500 Subject: Alma Swan: The OA citation advantage: Studies and results to date Message-ID: ** Cross-Posted ** [Note added by SH: These data are derived from Dr. Steve Hitchcock's bibliography of studies on the effect of open access and downloads ('hits') on citation impact. They are now ripe for a meta-analysis: You are encouraged to do one -- or to contact Dr. Swan and Dr. Hitchcock if you are interested in collaborating] ------------ Swan, A. (2010) The Open Access citation advantage: Studies and results to date. Technical Report, School of Electronics & Computer Science, University of Southampton. http://eprints.ecs.soton.ac.uk/18516/ ABSTRACT: This paper presents a summary of reported studies on the Open Access citation advantage. There is a brief introduction to the main issues involved in carrying out such studies, both methodological and interpretive. The study listing provides some details of the coverage, methodological approach and main conclusions of each study. From j.hartley at PSY.KEELE.AC.UK Wed Mar 10 06:29:45 2010 From: j.hartley at PSY.KEELE.AC.UK (James Hartley) Date: Wed, 10 Mar 2010 11:29:45 -0000 Subject: Help needed in a study of the perceptions of basic and applied science Message-ID: A colleague of mine in Germany has a small study running on the perceptions of basic and applied science and would welcome your views. So, if you have the time, can you please fill out the survey you can find at: http://www.int-umfragen.fraunhofer.de/index_Umfrage_en.html Thanks Jim Hartley -- Miloš Jovanović M. 
A.; Fraunhofer INT; Appelsgarten 2, 53879 Euskirchen, Germany Tel.: +49 2251 18 - 265 Mobile: +49 163 278 33 55 mailto:milos.jovanovic at int.fraunhofer.de http://www.fraunhofer.de -------------------------------------------------------------------------------- From: James Hartley [mailto:j.hartley at psy.keele.ac.uk] Sent: Tuesday, 9 March 2010 12:43 To: Jovanovic, Milos Subject: Re: Help needed in a study of the layout of abstracts Thanks for your message. I am interested in the replies of information scientists as well as medics, so if you would like to have a go I would be very grateful. I attach a copy of the relevant materials so you can take a look and see what you think! Best wishes James Hartley School of Psychology Keele University Staffordshire ST5 5BG UK j.hartley at psy.keele.ac.uk http://www.keele.ac.uk/depts/ps/people/JHartley/index.htm ----- Original Message ----- From: Jovanovic, Milos To: James Hartley Sent: Tuesday, March 09, 2010 10:04 AM Subject: Re: Help needed in a study of the layout of abstracts Dear Mr. Hartley, I would love to help you in your study, but I am a historian and information scientist; I do not work in the medical sciences. I understand your email to mean that you need medical scientists. Is this correct? Kind regards, -- Miloš Jovanović M. A.; Fraunhofer INT; Appelsgarten 2, 53879 Euskirchen, Germany Tel.: +49 2251 18 - 265 Mobile: +49 163 278 33 55 mailto:milos.jovanovic at int.fraunhofer.de http://www.fraunhofer.de ------------------------------------------------------------------------------ From: James Hartley [mailto:j.hartley at psy.keele.ac.uk] Sent: Tuesday, 9 March 2010 10:43 To: wolfgang.glanzel at econ.kuleuven.ac.be; yuan at nii.ac.jp; ivana.roche at inist.fr; patricia.laurens at obs-ost.fr; Jovanovic, Milos; sebastian.boell at unsw.edu.au; Ping.Zhou at econ.kuleuven.be; Christian Schlögl; wanjk at lib.tsinghua.edu.cn; rklavans at mapofscience.com; rcostas at cwts.leidenuniv.nl; Aparna Basu; af_asemi at yahoo.com; Isidro F. Aguillo; pvinkler at chemres.hu; edgar.schiebel at ait.ac.at Subject: Help needed in a study of the layout of abstracts Dear Colleague I am carrying out a study of different formats for the first page and the abstracts of articles in academic journals, and would be grateful if you felt able to take part. Please find attached some information about the study - which should take less than 10 mins - and an informed consent form, which I need you to sign (an e-mail will do) before I can send you the materials! If you would also like to forward this invitation to any of your colleagues in the medical sciences, I would be very grateful. I look forward to hearing from you. Many thanks and best wishes James Hartley School of Psychology Keele University Staffordshire ST5 5BG UK j.hartley at psy.keele.ac.uk http://www.keele.ac.uk/depts/ps/people/JHartley/index.htm From pmd8 at CORNELL.EDU Thu Mar 11 12:17:50 2010 From: pmd8 at CORNELL.EDU (Philip Davis) Date: Thu, 11 Mar 2010 12:17:50 -0500 Subject: Alma Swan: The OA citation advantage: Studies and results to date In-Reply-To: Message-ID: Stevan, In my critique of this review today (see: http://j.mp/d91Jk2 ), I commented on the inappropriate use of meta-analysis to the empirical OA citation studies: "Meta-analysis is a set of powerful statistical techniques for analyzing the literature. Its main function is to increase the statistical power of observation by combining separate empirical studies into one über-analysis. 
It's assumed, however, that the studies are comparable (for instance, the same drug given to a random group of patients with multiple myeloma), but conducted at different times in different locales. This is not the case with the empirical literature on open access and citations. Most of the studies to date are observational (simply observing the citation performance of two sets of articles), and most of these use no statistical controls to adjust for confounding variables. Some of the studies have focused on the effect of OA publishing, while others on OA self-archiving. To date, there is still only one published randomized controlled trial. Conducting a meta-analysis on this disparate collection of studies is like taking a Veg-O-Matic to a seven-course dinner. Not only does it homogenize the context (and limitations) of each study into a brown and unseemly mess, but it assumes that homogenization of disparate studies somehow results in a clearer picture of scientific truth." --Phil Davis Stevan Harnad wrote: > ** Cross-Posted ** > > [Note added by SH: These data are derived from Dr. Steve Hitchcock's > bibliography of studies on the effect of open access and downloads > ('hits') on citation impact. They are now ripe for a meta-analysis: > You are encouraged to do one -- or to contact Dr. Swan and Dr. > Hitchcock if you are interested in collaborating] > > ------------ > > Swan, A. (2010) The Open Access citation advantage: Studies and > results to date. Technical Report, School of Electronics & Computer > Science, University of Southampton. > http://eprints.ecs.soton.ac.uk/18516/ > > ABSTRACT: > This paper presents a summary of reported studies on the Open Access > citation advantage. There is a brief introduction to the main issues > involved in carrying out such studies, both methodological and > interpretive. The study listing provides some details of the coverage, > methodological approach and main conclusions of each study. > > -- Philip M. 
Davis PhD Student Department of Communication 301 Kennedy Hall Cornell University, Ithaca, NY 14853 email: pmd8 at cornell.edu phone: 607 255-2124 https://confluence.cornell.edu/display/~pmd8/resume http://scholarlykitchen.sspnet.org/author/pmd8/ From amsciforum at GMAIL.COM Thu Mar 11 21:56:47 2010 From: amsciforum at GMAIL.COM (Stevan Harnad) Date: Thu, 11 Mar 2010 21:56:47 -0500 Subject: Alma Swan: The OA citation advantage: Studies and results to date In-Reply-To: <4B9925BE.5060402@cornell.edu> Message-ID: On Thu, Mar 11, 2010 at 12:17 PM, Philip Davis wrote: > Stevan, > In my critique of this review today (see: http://j.mp/d91Jk2 ), I commented > on the inappropriate use of meta-analysis to the empirical OA citation > studies: > > "Meta-analysis is a set of powerful statistical techniques for analyzing the > literature. Its main function is to increase the statistical power of > observation by combining separate empirical studies into one über-analysis. > It's assumed, however, that the studies are comparable (for instance, the > same drug given to a random group of patients with multiple myeloma), but > conducted at different times in different locales. > > This is not the case with the empirical literature on open access and > citations. Most of the studies to date are observational (simply observing > the citation performance of two sets of articles), and most of these use no > statistical controls to adjust for confounding variables. Some of the > studies have focused on the effect of OA publishing, while others on OA > self-archiving. To date, there is still only one published randomized > controlled trial. > > Conducting a meta-analysis on this disparate collection of studies is like > taking a Veg-O-Matic to a seven-course dinner. Not only does it homogenize > the context (and limitations) of each study into a brown and unseemly mess, > but it assumes that homogenization of disparate studies somehow results in a > clearer picture of scientific truth." 
> > --Phil Davis Phil, Thanks for the helpful feedback. I'm afraid you're mistaken about meta-analysis. It can be a perfectly appropriate statistical technique for analyzing a large number of studies, with positive and negative outcomes, varying in methodological rigor, sample size and effect size. It is a way of estimating whether or not there is a significant underlying effect. I think you may be inadvertently mixing up the criteria for eligibility for meta-analysis with the criteria for a clinical drug trial (for which there rightly tends to be an insistence on randomized control trials in biomedical research). Now I would again like to take the opportunity of receiving this helpful feedback from you to remind you about some feedback I have given you repeatedly http://bit.ly/dkieVi on your own 2008 study -- the randomized control trial that you suggest has been the only methodologically sound test of the OA Advantage so far: You forgot to do a self-selection control condition. That would be rather like doing a randomized control trial on a drug -- to show that the nonrandom control trials that have reported a positive benefit for that drug were really just self-selection artifacts -- but neglecting to include a replication of the self-selection artifact in your own sample, as a control. For, you see, if your own sample was too small and/or too brief (e.g., you didn't administer the drug for as long an interval, or to as many patients, as the nonrandom studies reporting the positive effects had done), then your own null effect with a randomized trial would be just that: a null effect, not a demonstration that randomizing eliminates the nonrandomized drug effect. (This is the kind of methodological weakness, for example, that multiple studies can be weighted for, in a meta-analysis.) 
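[Note added: the methodological point above -- that a null result from a small or brief randomized trial is just a null result -- can be illustrated with a rough statistical power calculation. This is a sketch using a normal approximation; the effect size and sample sizes below are hypothetical and not taken from any of the studies under discussion.]

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_sample(effect_size, n_per_group, z_crit=1.959964):
    """Approximate power of a two-sided, two-sample z-test for a
    standardized mean difference, via the normal approximation.
    z_crit defaults to the two-sided 5% critical value."""
    ncp = effect_size * sqrt(n_per_group / 2.0)  # noncentrality parameter
    return (1.0 - normal_cdf(z_crit - ncp)) + normal_cdf(-z_crit - ncp)

# A small trial has little chance of detecting a modest citation effect,
# so a nonsignificant result there is weak evidence of "no effect":
print(round(power_two_sample(0.2, 50), 2))    # small trial: power ~ 0.17
print(round(power_two_sample(0.2, 1000), 2))  # large trial: power ~ 0.99
```

At 17% power, failing to find a significant difference says almost nothing either way; this is the sense in which sample size and duration are methodological weaknesses that studies can be weighted for in a meta-analysis.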
I am responding to your public feedback only here, on the SIGMETRICS list, rather than also on your SSP Blog, where you likewise publicly posted this same feedback (along with other, rather shriller remarks) http://j.mp/d91Jk2 because I am assuming that you will again decline to post my response on your blog, as you did the previous time that you publicly posted your feedback on my work both there http://bit.ly/8LK57u and here -- refusing my response on your blog on the grounds that it had already been publicly posted elsewhere (namely, here!)... -- Stevan Harnad PS The idea of doing a meta-analysis came from me, not from Dr. Swan. > Stevan Harnad wrote: >> [quoted text omitted] > -- > Philip M. 
Davis > PhD Student > Department of Communication > 301 Kennedy Hall > Cornell University, Ithaca, NY 14853 > email: pmd8 at cornell.edu > phone: 607 255-2124 > https://confluence.cornell.edu/display/~pmd8/resume > http://scholarlykitchen.sspnet.org/author/pmd8/ From amsciforum at GMAIL.COM Fri Mar 12 13:31:11 2010 From: amsciforum at GMAIL.COM (Stevan Harnad) Date: Fri, 12 Mar 2010 13:31:11 -0500 Subject: Alma Swan: The OA citation advantage: Studies and results to date In-Reply-To: Message-ID: Here, reposted, is some feedback on meta-analysis from one of its leading exponents: Gene V Glass Says: Mar 12, 2010 at 12:47 pm http://scholarlykitchen.sspnet.org/2010/03/11/rewriting-the-history-on-access/ Far more issues about OA and meta analysis have been raised in this thread for me to [be able to] comment on. But having dedicated 35 years of my efforts to meta analysis and 20 to OA, I can't resist a couple of quick observations. Holding up one set of methods (be they RCT or whatever) as the gold standard is inconsistent with decades of empirical work in meta analysis that shows that 'perfect studies' and 'less than perfect studies' seldom show important differences in results. If the question at hand concerns experimental intervention, then random assignment to groups may well be inferior as a matching technique to even an ex post facto matching of groups. Randomization is not the royal road to equivalence of groups; it's the road to probability statements about differences. Claims about the superiority of certain methods are empirical claims. They are not a priori dicta about what evidence can and can not be looked at. Glass, G.V.; McGaw, B.; & Smith, M.L. (1981). Meta-analysis in Social Research. Beverly Hills, CA: SAGE. Rudner, Lawrence, Gene V Glass, David L. Evartt, and Patrick J. Emery (2000). A user's guide to the meta-analysis of research studies. ERIC Clearinghouse on Assessment and Evaluation, University of Maryland, College Park. 
http://glass.ed.asu.edu/gene/resume2.html On Thu, Mar 11, 2010 at 9:56 PM, Stevan Harnad wrote: > [quoted text omitted] >> -- >> Philip M. 
Davis >> PhD Student >> Department of Communication >> 301 Kennedy Hall >> Cornell University, Ithaca, NY 14853 >> email: pmd8 at cornell.edu >> phone: 607 255-2124 >> https://confluence.cornell.edu/display/~pmd8/resume >> http://scholarlykitchen.sspnet.org/author/pmd8/ From pmd8 at CORNELL.EDU Fri Mar 12 15:13:59 2010 From: pmd8 at CORNELL.EDU (Philip Davis) Date: Fri, 12 Mar 2010 15:13:59 -0500 Subject: Alma Swan: The OA citation advantage: Studies and results to date In-Reply-To: Message-ID: Stevan, First, I did not state in my critique of the Swan report that meta-analysis was Alma's idea, but that it was your suggestion (as posted to sigmetrics and other listservs). Secondly, you keep trying to turn things back to critiquing my own work, as if *"the best defense is a good offense."* You've posted 5 rapid responses to the BMJ 2008 paper and another rapid response to the BMJ editorial. I've responded to your concerns and have better things to do than engage in an endless discussion with you when there is absolutely no hope of changing your mind. You can continue to plaster the Internet with your critiques and astonishment that I haven't responded if this makes you feel better. I have students to teach and a dissertation to write. --Phil Davis Stevan Harnad wrote: > [quoted text omitted] 
From kretschmer.h at T-ONLINE.DE Mon Mar 15 13:51:56 2010 From: kretschmer.h at T-ONLINE.DE (kretschmer.h@t-online.de) Date: Mon, 15 Mar 2010 18:51:56 +0100 Subject: Reminder: 6th Int Conf on Webometrics, Informetrics and Scientometrics & 11th COLLNET Meeting Message-ID: -------------- next part -------------- A non-text attachment was scrubbed... Name: Mysore_Second Announcement.pdf Type: application/pdf Size: 708459 bytes Desc: URL: From andrea.scharnhorst at VKS.KNAW.NL Tue Mar 16 08:30:32 2010 From: andrea.scharnhorst at VKS.KNAW.NL (Andrea Scharnhorst) Date: Tue, 16 Mar 2010 13:30:32 +0100 Subject: Call for chapters : Models of science dynamics - encounters between complexity theory and information sciences Message-ID: Dear colleagues, it is my pleasure to announce a call for chapters for a book in the Springer Complexity Series, "Models of science dynamics - encounters between complexity theory and information sciences", concerning the application of mathematical models to the science system. The deadline for chapter outlines is April 15, 2010. More details can be found in the attached pdf and here http://simshelf2.virtualknowledgestudio.nl/activities/models-science-dynamics-encounters-between-complexity-theory-and-information-sciences This book is an outcome of last year's workshop on modelling science in Amsterdam; see http://modelling-science.simshelf.virtualknowledgestudio.nl/ Looking forward to your submissions! Warmest regards Andrea, Katy and Peter -------------- next part -------------- A non-text attachment was scrubbed... 
Name: Models of science dynamics_call.pdf Type: application/pdf Size: 347510 bytes Desc: Models of science dynamics_call.pdf URL: From ruben at UCR.EDU Tue Mar 16 19:46:42 2010 From: ruben at UCR.EDU (Ruben Urbizagastegui) Date: Tue, 16 Mar 2010 16:46:42 -0700 Subject: Help In-Reply-To: <1266951735.2826.58.camel@computer> Message-ID: Dear all, I am looking for a copy of this article: COZZENS, Susan. Taking the measure of science. International Society for the Sociology of Knowledge Newsletter, 7(1-2):16-20, 1981. If any of you has a copy of this paper, could you please send me a copy? I would really appreciate it. Thank you for all the help. Ruben Urbizagastegui University of California, Riverside University Libraries P.O. Box 5900 Riverside, California 92517-5900 From amsciforum at GMAIL.COM Wed Mar 17 15:05:53 2010 From: amsciforum at GMAIL.COM (Stevan Harnad) Date: Wed, 17 Mar 2010 15:05:53 -0400 Subject: Alma Swan: The OA citation advantage In-Reply-To: <201003160154.o2G1sncY008786@quickgr.its.yale.edu> Message-ID: On Mon, Mar 15, 2010 at 9:53 PM, Philip Davis wrote: > Stevan, > > First of all, I did not state in my critique of the Swan report > (http://j.mp/d91Jk2) that meta-analysis was Alma's idea, but that > this was your suggestion (as posted to liblicense-l, > sigmetrics-l, and other listservs). > > Secondly, you keep trying to divert criticism of your colleague's > work by critiquing my own work, as if *"your best defense is a > good offense."* You've posted 5 rapid responses to the BMJ 2008 > paper and another rapid response to the BMJ editorial. [ http://www.bmj.com/cgi/eletters/337/jul31_1/a568#top ] > I've responded to your concerns and have better things to do than > engage in an endless discussion with you when there is > absolutely no hope of changing your mind. You can continue to > plaster the Internet with your critiques and astonishment that I > haven't responded if this makes you feel better. 
> I have students to teach and a dissertation to write. > > --Phil Davis The message below is forwarded from David Wilson, with permission [references and links added]: [See also (thanks to Peter Suber for spotting this study!): Wagner, A. Ben (2010) Open Access Citation Advantage: An Annotated Bibliography. Issues in Science and Technology Librarianship. 60. Winter 2010 http://www.istl.org/10-winter/article2.html] Date: March 17, 2010 11:17:10 AM EDT (CA) From: David Wilson dwilsonb -- gmu.edu Subject: Re: Comment on Meta-Analysis To: harnad -- ecs.soton.ac.uk Stevan, Interesting discussion. Phil Davis has a limited albeit common view of meta-analysis. [http://bit.ly/bCKzWk] Within medicine, meta-analysis is generally applied to a small set of highly homogeneous studies. As such, the focus is on the overall or pooled effect, with only a secondary focus on variability in effects. Within the social sciences, there is a strong tradition of meta-analyzing fairly heterogeneous sets of studies. The focus is clearly not on the overall effect, which would be rather meaningless, but rather on the variability in effects and on the study characteristics, both methodological and substantive, that explain that variability. I don't know enough about this area to ascertain the credibility of his criticism of the methodologies of the various studies involved. However, the one study that he claims is methodologically superior in terms of internal validity (which it might be) [http://www.bmj.com/cgi/content/full/337/jul31_1/a568] is clearly deficient in statistical power. As such, it provides only a weak test. Recall that a statistically nonsignificant finding is a weak finding -- a failure to reject the null, not an acceptance of the null. Meta-analysis could be put to good use in this area. It won't resolve the issue of whether the studies that Davis thinks are flawed are in fact flawed. 
It could explore the consistency in effect across these studies and whether the effect varies by the method used. Both would add to the debate on this issue. [Lipsey, MW & Wilson, DB (2001) Practical Meta-Analysis. Sage.] Best, Dave -- David B. Wilson, Ph.D. Associate Professor Chair, Administration of Justice Department George Mason University 10900 University Boulevard, MS 4F4 Manassas, VA 20110-2203 Phone/Fax: 703.993.4701 dwilsonb at gmu.edu http://mason.gmu.edu/~dwilsonb/home.html > Stevan Harnad wrote: >> Phil, >> Thanks for the helpful feedback. >> >> I'm afraid you're mistaken about meta-analysis. It can be a >> perfectly appropriate statistical technique for analyzing a large >> number of studies, with positive and negative outcomes, varying >> in methodological rigor, sample size and effect size. It is a way >> of estimating whether or not there is a significant underlying >> effect. >> >> I think you may be inadvertently mixing up the criteria for (1) >> eligibility and comparability for a meta-analysis with the >> criteria for (2) a clinical drug trial (for which there rightly >> tends to be an insistence on randomized control trials in >> biomedical research). >> >> [Remainder of quoted message, posted in full earlier in this digest, snipped.] >> >> -- Stevan Harnad >> >> PS The idea of doing a meta-analysis came from me, not from Dr. Swan. From loet at LEYDESDORFF.NET Fri Mar 19 02:08:41 2010 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Fri, 19 Mar 2010 07:08:41 +0100 Subject: Indicators of the Interdisciplinarity of Journals; preprint version Message-ID: Indicators of the Interdisciplinarity of Journals: Diversity, Centrality, and Citations Authors: Loet Leydesdorff, Ismael Rafols (Submitted on 18 Mar 2010) Abstract: A citation-based indicator for interdisciplinarity has been missing hitherto among the set of available journal indicators. 
In this study, we investigate betweenness centrality, entropy, the Gini coefficient, and more recently proposed measures for diversity that combine the statistics of vectors and distances in networks, in terms of their potential to fill this gap. The effects of various normalizations are specified and measured using the matrix of 8,207 journals contained in the Journal Citation Reports of the (Social) Science Citation Index. Betweenness centrality in (1-mode) affiliations networks provides an indicator outperforming betweenness in the (2-mode) citation network. Entropy as a vector-based indicator performs better than the Gini coefficient, but is sensitive to size. Science and Nature, for example, are indicated at the top of the list. The new diversity measure provides reasonable results when (1 - cosine) is assumed as a measure for the distance, but results using Euclidean distances are difficult to interpret. Subjects: Digital Libraries (cs.DL); Physics and Society (physics.soc-ph) Cite as: arXiv:1003.3613v1 [cs.DL] _____ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR) Kloveniersburgwal 48, 1012 CX Amsterdam. Tel. +31-20-525 6598; fax: +31-842239111 loet at leydesdorff.net ; http://www.leydesdorff.net/ Visiting Professor 2007-2010, ISTIC, Beijing; Honorary Fellow 2007-2010, SPRU, University of Sussex Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated, 385 pp.; US$ 18.95; The Self-Organization of the Knowledge-Based Society ; The Challenge of Scientometrics -------------- next part -------------- An HTML attachment was scrubbed... 
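[Editor's aside: as a toy illustration of the vector-based indicators named in the abstract above (Shannon entropy and the Gini coefficient), consider one journal's citations spread over four subject categories. The shares are invented; the paper itself works with the full matrix of 8,207 journals.]

```python
import math

# Toy distribution of one journal's citations over subject categories
# (invented shares, summing to 1).
shares = [0.5, 0.3, 0.1, 0.1]

# Shannon entropy: larger when citations are spread over many categories,
# but sensitive to size (more categories -> higher attainable entropy).
entropy = -sum(p * math.log(p) for p in shares if p > 0)

# Gini coefficient of the same vector: 0 = perfectly even spread,
# approaching 1 = citations concentrated in one category.
n = len(shares)
xs = sorted(shares)
gini = sum((2 * i + 1 - n) * x for i, x in enumerate(xs)) / (n * sum(xs))
```

For this vector the Gini works out to 0.35; a perfectly even vector [0.25, 0.25, 0.25, 0.25] would give 0, illustrating why both measures are read as (inverse) concentration indicators.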
From lwaltman at FEW.EUR.NL Fri Mar 19 17:34:09 2010 From: lwaltman at FEW.EUR.NL (Ludo Waltman) Date: Fri, 19 Mar 2010 17:34:09 -0400 Subject: Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance Message-ID: Dear colleagues, Recently, Tobias Opthof and Loet Leydesdorff have written a critical paper (see below) about the way in which bibliometric research performance assessment studies are conducted by the Centre for Science and Technology Studies (CWTS) of Leiden University. There are a number of important inaccuracies in the paper by Opthof and Leydesdorff. CWTS also strongly disagrees with many of their comments. In the following paper CWTS replies to the criticism of Opthof and Leydesdorff: Anthony F.J. van Raan, Thed N. van Leeuwen, Martijn S. Visser, Nees Jan van Eck, and Ludo Waltman. Rivals for the crown: Reply to Opthof and Leydesdorff. Available at http://arxiv.org/abs/1003.2113. CWTS has also prepared a related paper on the same topic: Ludo Waltman, Nees Jan van Eck, Thed N. van Leeuwen, Martijn S. Visser, and Anthony F.J. van Raan. Towards a new crown indicator: Some theoretical considerations. Available at http://arxiv.org/abs/1003.2167. Best regards, Ludo Waltman On 16/02/2010 07:46, Loet Leydesdorff wrote: > Administrative info for SIGMETRICS (for example unsubscribe): http://web.utk.edu/~gwhitney/sigmetrics.html > Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance > Journal of Informetrics (forthcoming). > http://arxiv.org/abs/1002.2769 > > Abstract: The Center for Science and Technology Studies at Leiden University advocates the use of specific normalizations for assessing research performance with reference to a world average. 
The Journal Citation Score (JCS) and Field Citation Score (FCS) are averaged for the research group or individual researcher under study, and then these values are used as denominators of the (mean) Citations per publication (CPP). Thus, this normalization is based on dividing two averages. This procedure only generates a legitimate indicator in the case of underlying normal distributions. Given the skewed distributions under study, one should average the observed versus expected values which are to be divided first for each publication. We show the effects of the Leiden normalization for a recent evaluation where we happened to have access to the underlying data. > > > Tobias Opthof [1,2], Loet Leydesdorff [3] > > > [1] Experimental Cardiology Group, Heart Failure Research Center, Academic Medical Center AMC, Meibergdreef 9, 1105 AZ Amsterdam, The Netherlands. > > [2] Department of Medical Physiology, University Medical Center Utrecht, Utrecht, The Netherlands. > > [3] Amsterdam School of Communications Research (ASCoR), University of Amsterdam, Kloveniersburgwal 48, 1012 CX Amsterdam, The Netherlands. > > > > ** apologies for cross-postings > > ======================================================== Ludo Waltman MSc Researcher Centre for Science and Technology Studies Leiden University P.O. 
Box 905 2300 AX Leiden The Netherlands Willem Einthoven Building, Room B5-35 Tel: +31 (0)71 527 5806 Fax: +31 (0)71 527 3911 E-mail: waltmanlr at cwts.leidenuniv.nl Homepage: www.ludowaltman.nl ======================================================== From loet at LEYDESDORFF.NET Fri Mar 19 18:22:36 2010 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Fri, 19 Mar 2010 23:22:36 +0100 Subject: Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance In-Reply-To: Message-ID: Dear colleagues, Please find below the CWTS indicators for four papers, of which two are cited in the same journal i or the same field i [the indicator tables were sent as image attachments, which have been scrubbed from this archive]. In our opinion, one should normalize as follows [the formula was likewise sent as an image]. The issue is more general than CWTS, because other centers also normalize using MOCR/MECR, that is, Mean Observed Citation Rates divided by Mean Expected Citation Rates. The quotient of two means is no longer a statistic, whereas the per-paper values of observed versus expected citation rates provide a variance (standard deviation, median, etc.). (For example, 3/2 plus 2/3 is very different from 5/5.) We understand from the papers indicated that in the meantime CWTS is changing its procedures. Best wishes, Loet ________________________________ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR), Kloveniersburgwal 48, 1012 CX Amsterdam. 
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ > [Quoted text of Ludo Waltman's message of 19 March, reproduced in full above, snipped.] -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Outlook.jpg Type: image/jpeg Size: 14427 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Outlook.jpg Type: image/jpeg Size: 13825 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Outlook.jpg Type: image/jpeg Size: 15107 bytes Desc: not available URL: From linda.butler at ANU.EDU.AU Sat Mar 20 01:01:17 2010 From: linda.butler at ANU.EDU.AU (Linda Butler) Date: Sat, 20 Mar 2010 16:01:17 +1100 Subject: Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance In-Reply-To: Message-ID: Loet, Ludo, There is another reason why bibliometricians are starting to do calculations at the individual level, one which doesn't seem to have been mentioned in any of the papers referred to in this discussion: it makes it easier to re-aggregate publication sets in different ways. Let me give a "live" example. The Australian Research Council (ARC) trialled a new assessment system late last year. In physics, chemistry and earth sciences (the only sciences assessed in the trial - the other disciplines were all in the humanities and creative arts), bibliometrics played a key role. Several indicators were used, of which one was Relative Citation Impact (RCI). It was calculated by determining the RCI for each individual paper and then computing an average of the RCIs to give an institutional RCI - i.e. the method Loet is championing and the one that Ludo reveals CWTS is now also working on. In addition to statistical considerations, there was a further important advantage to the ARC in using this methodology. The Australian Government was only interested in assessing the performance of fields within institutions - who had the strongest/weakest geology, astronomy, organic chemistry, etc. They were not concerned about which academic units these publications came from. But Vice-Chancellors and research managers were - VERY interested! From the very first discussions on the development of the methodology (late 2007) it became clear that an essential consideration was the ability of universities to re-aggregate data to groups, faculties, research themes, or however else they wanted. I am not necessarily supporting what they want to do - but they undoubtedly want to do it. An article-by-article methodology makes this a simple computational task. So the methodology has already been used in a national research assessment system, and will be used again when all fields that are subject to bibliometrics are assessed in the second half of this year. Can I just add that I'm a little uncomfortable that CWTS seems to have been singled out for such pointed criticism. It's not just a "CWTS" methodology. Many other groups have used or are using this method, and not just because they copied CWTS - my own unit, REPP, is a case in point. Many of us arrived at the same point through our own development work, and having arrived there, are now looking to move on and improve the calculations. It's an important issue, so let's keep the discussion going, but keep it collegial. regards Linda Butler On 20/03/2010, at 9:22 AM, Loet Leydesdorff wrote: > [Quoted text of Loet Leydesdorff's message of 19 March, reproduced in full above, snipped.] - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - Linda Butler linda.butler52 at gmail.com landline: +61 (0)2 4982 7994 mobile: 0428 598 482 url: http://members.optuszoo.com.au/linda.butler52 ABN: 83 884 783 826 From loet at LEYDESDORFF.NET Tue Mar 23 02:41:16 2010 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Tue, 23 Mar 2010 07:41:16 +0100 Subject: Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance In-Reply-To: <4182A985-D0EA-4A1E-A4B4-6E89FA83C062@anu.edu.au> Message-ID: Normalization, CWTS indicators, and the Leiden Rankings: Differences in citation behavior at the level of fields Authors: Loet Leydesdorff, Tobias Opthof (Submitted on 21 Mar 2010) Abstract: Van Raan et al. (2010; arXiv:1003.2113) have proposed a new indicator (MNCS) for field normalization. Since field normalization is also used in the Leiden Rankings of universities, we elaborate our critique of journal normalization in Opthof & Leydesdorff (2010; arXiv:1002.2769) in this rejoinder concerning field normalization. 
Fractional citation counting thoroughly solves the issue of normalization for differences in citation behavior among fields. This indicator can also be used to obtain a normalized impact factor. Subjects: Physics and Society (physics.soc-ph) Cite as: arXiv:1003.3977v1 [physics.soc-ph] _____ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR) Kloveniersburgwal 48, 1012 CX Amsterdam loet at leydesdorff.net ; http://www.leydesdorff.net/ _____ From: Linda Butler [mailto:linda.butler at anu.edu.au] Sent: Saturday, March 20, 2010 6:01 AM To: Loet Leydesdorff; Ludo Waltman Cc: SIGMETRICS at listserv.utk.edu Subject: Re: [SIGMETRICS] Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance Loet, Ludo There is another reason why bibliometricians are starting to do calculations at the individual level, which doesn't seem to have been mentioned in any of the papers referred to in this discussion. That reason is to make it easier to re-aggregate publication sets in different ways. Let me give a "live" example. The Australian Research Council (ARC) trialled a new assessment system late last year. In physics, chemistry and earth sciences (the only sciences assessed in the trial - the other disciplines were all in the humanities and creative arts) bibliometrics played a key role. Several indicators were used, of which one was Relative Citation Impact. It was calculated by determining the RCI for each individual paper, then computing an average of RCIs to give an institutional RCI i.e. the method Loet is championing and the one that Ludo reveals CWTS is now also working on. In addition to statistical considerations, there was an additional important advantage to the ARC in using this methodology. The Australian Government was only interested in assessing the performance of fields within institutions - who had the strongest/weakest geology, astronomy, organic chemistry, etc. 
They were not concerned about which academic units these publications came from. But Vice-Chancellors and research managers were! VERY interested!! From the very first discussions on the development of the methodology (late 2007) it became clear that an essential consideration was the ability of universities to re-aggregate data to groups, faculties, research themes, or however they wanted to do. I am not necessarily supporting what they want to do - but they undoubtedly want to do it. An article by article methodology makes this a simple computational task. So the methodology has already being used in a national research assessment system, and will be used again when all fields that are subject to bibliometrics are assessed in the second half of this year. Can I just add that I'm just a little uncomfortable that CWTS seems to have been singled out for such pointed criticism. It's not just a "CWTS" methodology. Many other groups have used or are using this method, and not just because they copied CWTS - my own unit, REPP, is a case in point. Many of us arrived at the same point through our own development work, and having arrived there, are now looking to move on and improve the calculations. It's an important issue, so let's keep the discussion going, but keep it collegial. regards Linda Butler On 20/03/2010, at 9:22 AM, Loet Leydesdorff wrote: Dear colleagues, Please, find below the CWTS indicators for four papers of which two are cited in the same journal i or the same field i: In our opinion, one should normalize as follows: The issue is more general than CWTS because other centers normalize using MOCR/MECR, that is: Mean Observed Citation Rates divided by Mean Expected Citation Rates. The quotient between two means is no longer a statistics, while the values of observed versus expected citation rates provide a variance (standard deviation, median, etc.). (For example, 3/2 plus 2/3 is very different from 5/5). 
We understand from the papers indicated that in the meantime CWTS is changing its procedures. Best wishes, Loet ________________________________ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR), Kloveniersburgwal 48, 1012 CX Amsterdam. Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ > -----Original Message----- > From: ASIS&T Special Interest Group on Metrics > [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Ludo Waltman > Sent: Friday, March 19, 2010 10:34 PM > To: SIGMETRICS at LISTSERV.UTK.EDU > Subject: Re: [SIGMETRICS] Caveats for the journal and field > normalizations in the CWTS ("Leiden") evaluations of research > performance > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Dear colleagues, > > Recently, Tobias Opthof and Loet Leydesdorff have written a > critical paper > (see below) about the way in which bibliometric research performance > assessment studies are conducted by the Centre for Science > and Technology > Studies (CWTS) of Leiden University. There are a number of important > inaccuracies in the paper by Opthof and Leydesdorff. CWTS > also strongly > disagrees with many of their comments. In the following paper > CWTS replies to > the criticism of Opthof and Leydesdorff: > > Anthony F.J. van Raan, Thed N. van Leeuwen, Martijn S. > Visser, Nees Jan van > Eck, and Ludo Waltman. Rivals for the crown: Reply to Opthof and > Leydesdorff. Available at http://arxiv.org/abs/1003.2113. > > CWTS has also prepared a related paper on the same topic: > > Ludo Waltman, Nees Jan van Eck, Thed N. van Leeuwen, Martijn > S. Visser, and > Anthony F.J. van Raan. Towards a new crown indicator: Some > theoretical > considerations. Available at http://arxiv.org/abs/1003.2167. 
> Best regards,
>
> Ludo Waltman
>
> On 16/02/2010 07:46, Loet Leydesdorff wrote:
>
> Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance
>
> Journal of Informetrics (forthcoming). http://arxiv.org/abs/1002.2769
>
> Abstract: The Center for Science and Technology Studies at Leiden University advocates the use of specific normalizations for assessing research performance with reference to a world average. The Journal Citation Score (JCS) and Field Citation Score (FCS) are averaged for the research group or individual researcher under study, and then these values are used as denominators of the (mean) Citations per publication (CPP). Thus, this normalization is based on dividing two averages. This procedure only generates a legitimate indicator in the case of underlying normal distributions. Given the skewed distributions under study, one should average the observed versus expected values, which are to be divided first for each publication. We show the effects of the Leiden normalization for a recent evaluation where we happened to have access to the underlying data.
>
> Tobias Opthof [1,2], Loet Leydesdorff [3]
>
> [1] Experimental Cardiology Group, Heart Failure Research Center, Academic Medical Center AMC, Meibergdreef 9, 1105 AZ Amsterdam, The Netherlands.
>
> [2] Department of Medical Physiology, University Medical Center Utrecht, Utrecht, The Netherlands.
>
> [3] Amsterdam School of Communications Research (ASCoR), University of Amsterdam, Kloveniersburgwal 48, 1012 CX Amsterdam, The Netherlands.
> > > > > > > > ** apologies for cross-postings > > > > > > > ======================================================== > Ludo Waltman MSc > Researcher > > Centre for Science and Technology Studies > Leiden University > P.O. Box 905 > 2300 AX Leiden > The Netherlands > > Willem Einthoven Building, Room B5-35 > Tel: +31 (0)71 527 5806 > Fax: +31 (0)71 527 3911 > E-mail: waltmanlr at cwts.leidenuniv.nl > Homepage: www.ludowaltman.nl > ======================================================== > - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - Linda Butler linda.butler52 at gmail.com landline: +61 (0)2 4982 7994 mobile: 0428 598 482 url: http://members.optuszoo.com.au/linda.butler52 ABN: 83 884 783 826 -------------- next part -------------- An HTML attachment was scrubbed... URL: From lutz.bornmann at GESS.ETHZ.CH Thu Mar 25 09:34:05 2010 From: lutz.bornmann at GESS.ETHZ.CH (Bornmann Lutz) Date: Thu, 25 Mar 2010 14:34:05 +0100 Subject: h index Message-ID: Dear colleague: You might be interested in two new papers on the h index which were recently accepted for publication in the Journal of Informetrics: 1) Bornmann, L. & Daniel, H.-D. (in press). The citation speed index: a useful bibliometric indicator to add to the h index: http://www.lutz-bornmann.de/icons/SpeedIndex.pdf 2) Bornmann, L., Mutz, R., & Daniel, H.-D. (in press). The h index research output measurement: two approaches to enhance its accuracy: http://www.lutz-bornmann.de/icons/haccuracy.pdf Lutz Bornmann ------------------------------------------------------------------------------------------------------------- Dr. Lutz Bornmann ETH Zurich, D-GESS Professorship for Social Psychology and Research on Higher Education Z?hringerstr. 
24 / ZAE CH-8092 Zurich Phone: +41 44 632 48 25 Fax: +41 44 632 12 83 Skype: lutz.bornmann http://www.psh.ethz.ch bornmann at gess.ethz.ch ResearcherID: http://www.researcherid.com/rid/A-3926-2008 -------------- next part -------------- An HTML attachment was scrubbed... URL: From Chris.Armbruster at EUI.EU Mon Mar 29 12:33:10 2010 From: Chris.Armbruster at EUI.EU (Armbruster, Chris) Date: Mon, 29 Mar 2010 18:33:10 +0200 Subject: Query: Data (also: historical) on Overheads relative to research budgets Message-ID: Dear colleagues, are you aware of data, time series or research on overheads relative to research budgets or research grants? Context: There have been efforts at the European level to push for full-cost accounting (also: full cost budgets). Another issue is that instituions should be able to claim their 'true' overheads when applying for EU funding. All of this also implies also, that the so-called overheads of research (or institutions) will be more transparent and comparable. I seem to remember that in the hey-day of US research funding and the Cold War university (e.g. 1960s) US universities were bolstered with an overhead allowance of up to 50% - and that since eligible overheads have been considerably less. Any data available? Thanks, Chris -------------- next part -------------- An HTML attachment was scrubbed... URL: From richardp at ACADEMIA.EDU Tue Mar 30 18:06:35 2010 From: richardp at ACADEMIA.EDU (Richard Price) Date: Tue, 30 Mar 2010 15:06:35 -0700 Subject: 35 SIGMetrics members have posted 21 papers on Academia.edu Message-ID: Dear SIGMetrics members, We just wanted to let you know about some recent activity on the SIGMetrics group on Academia.edu. In the SIGMetrics group on Academia.edu, there are now: - 35 people (4 in the last two months) - 21 papers (14 in the last two months) - 6 new status updates - 17 photos SIGMetrics members? pages have been viewed a total of 3,259 times, and their papers have been viewed a total of 41 times. 
To see these people, papers and status updates, follow the link below:

http://lists.academia.edu/See-members-of-SIGMetrics

Richard

Dr. Richard Price, post-doc, Philosophy Dept, Oxford University. Founder of Academia.edu

From bgsloan2 at YAHOO.COM Wed Mar 31 18:02:24 2010
From: bgsloan2 at YAHOO.COM (B.G. Sloan)
Date: Wed, 31 Mar 2010 15:02:24 -0700
Subject: Google Starts Grant Program for Studies of Its Digitized Books

"Even as a lawsuit over its book-digitization project remains up in the air, the search giant has quietly started reaching out to universities in search of humanities scholars who are ready to roll up their sleeves and hit the virtual stacks."

http://chronicle.com/article/Google-Starts-Grant-Program/64891/

Bernie Sloan