From harnad at ECS.SOTON.AC.UK Tue Feb 2 08:28:18 2010
From: harnad at ECS.SOTON.AC.UK (Stevan Harnad)
Date: Tue, 2 Feb 2010 08:28:18 -0500
Subject: Science, Nature, Peer Review and Open Access Metrics
In-Reply-To: <4B681D82.3090203@ecs.soton.ac.uk>
Message-ID:

On 2-Feb-10, at 7:41 AM, Christopher Gutteridge wrote:

> Thanks Stevan, but there's one thing you didn't address. Which is that the medical community, I believe, started the rules about not accepting previously self-published works, as a measure against quackery. The idea being to discourage reputable scientists from publishing outside reviewed journals.
>
> I've seen a few of the "crackpot" papers that get submitted to repositories, but bad medical information is far more immediately dangerous to society than bad physics information.
>
> OA gives a risk of having overlay repositories, for example, which could list both genuine quality work and "quackery", the genuine articles lending their respectability to the others....
>
> ps. You are welcome to quote me by name :)

Chris Gutteridge is quite right. (Let me first introduce Chris for those who don't already know who he is: Chris is the award-winning http://bit.ly/8YeXkO developer of EPrints; he has for years now been the successor of EPrints' original designer, Rob Tansley, who was then likewise at U Southampton http://bit.ly/5gynrf and has since gone on to design DSpace at MIT and is now at Google!)

Yes, there is a danger, especially in health-related research, in openly publicizing unrefereed reports that could endanger public health. This is another of the many reasons why the self-archiving of refereed final drafts can and should be mandated by institutions and funders, but the self-archiving of unrefereed preprints cannot and should not be mandated.
Open Access will help in two ways: It will raise the potential profile of published papers that have been unfairly relegated to a lower level of the peer-reviewed journal hierarchy than they deserved. It will also help to catch errors (in both published and unpublished postings), through broader peer feedback. But users can and will learn to weight the credibility of a report by the track-record for quality of the journal in which it appeared. OA metrics will help.

I've just posted this comment to http://bit.ly/di0p8q where Kent Anderson raises another valid and related worry: that online postings and their tags and comments (including journals and their citations) could be inflating the impact of biased and bogus content through "dynamic filtering" and "amplification":

SPY VS. SPY

The solution is open, multiple metrics. Citation alone has inflated power right now, but with Open Access, it will have many potential competitors and complements. Multiple joint "weights" on the metrics can also be controlled by the user. And abuses can be detected as departures from the joint pattern -- and named and shamed where needed. It's far easier to abuse one metric, like citations, than to manipulate the whole lot. (As with spamming and spam-filtering, and other online abuses, it is more like the old "Spy vs. Spy" series in Mad Magazine, where each spy was always getting one step ahead of the other.)

Harnad, S. (1979) Creative disagreement. The Sciences 19: 18 - 20. http://eprints.ecs.soton.ac.uk/3387/
Harnad, S. (ed.) (1982) Peer commentary on peer review: A case study in scientific quality control. New York: Cambridge University Press.
Harnad, S. (1984) Commentaries, opinions and the growth of scientific knowledge. American Psychologist 39: 1497 - 1498.
Harnad, S. (1985) Rational disagreement in peer review. Science, Technology and Human Values 10: 55-62. http://cogprints.org/2128/
Harnad, S. (1986) Policing the Paper Chase. (Review of S.
Lock, A difficult balance: Peer review in biomedical publication.) Nature 322: 24 - 5.
Harnad, S. (1995) Interactive Cognition: Exploring the Potential of Electronic Quote/Commenting. In: B. Gorayska & J.L. Mey (Eds.) Cognitive Technology: In Search of a Humane Interface. Elsevier. Pp. 397-414. http://cogprints.org/1599/
Harnad, S. (1996) Implementing Peer Review on the Net: Scientific Quality Control in Scholarly Electronic Journals. In: Peek, R. & Newby, G. (Eds.) Scholarly Publishing: The Electronic Frontier. Cambridge MA: MIT Press. Pp. 103-118. http://cogprints.org/1692/
Harnad, S. (1997) Learned Inquiry and the Net: The Role of Peer Review, Peer Commentary and Copyright. Learned Publishing 11(4): 283-292. http://cogprints.org/1694/
Harnad, S. (1998/2000/2004) The invisible hand of peer review. Nature [online] (5 Nov. 1998), Exploit Interactive 5 (2000); and in Shatz, B. (2004) (ed.) Peer Review: A Critical Inquiry. Rowman & Littlefield. Pp. 235-242. http://cogprints.org/1646/
Harnad, S. (2003/2004) Back to the Oral Tradition Through Skywriting at the Speed of Thought. Interdisciplines. Salaun, Jean-Michel & Vandendorpe, Christian (dir.) Les défis de la publication sur le web: hyperlectures, cybertextes et méta-éditions. Presses de l'enssib. http://eprints.ecs.soton.ac.uk/7723/
Harnad, S. (2008) Validating Research Performance Metrics Against Peer Rankings. Ethics in Science and Environmental Politics 8 (11) doi: 10.3354/esep00088. The Use And Misuse Of Bibliometric Indices In Evaluating Scholarly Performance. http://eprints.ecs.soton.ac.uk/15619/
Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise. Scientometrics 79 (1). Also in Proceedings of 11th Annual Meeting of the International Society for Scientometrics and Informetrics 11(1), pp. 27-33, Madrid, Spain. Torres-Salinas, D. and Moed, H. F., Eds.
(2007) http://eprints.ecs.soton.ac.uk/17142/

> Stevan Harnad wrote:
>>> anon: "This article demonstrates some of the obvious issues with peer review." http://news.bbc.co.uk/1/hi/sci/tech/8490291.stm
>>
>> (1) Yes, peer review is imperfect, because human judgment is imperfect. But there is no alternative system that is as good as or better for checking, improving and tagging the quality of specialized research than qualified specialist review, answerable to a qualified specialist editor or board.
>>
>> (2) If there is a weak link in peer review, it is not the peer review itself, but the editor not doing a conscientious enough job. (The solution is to make editors more answerable. It would also be good to publish the names of the accepting reviewers -- but not of the rejecting ones, if a paper is rejected, to protect the anonymity and hence the honesty of reviewers who may be criticizing the work of someone who can pay them back. Justice is the editor's responsibility.)
>>
>> (3) Nature and Science are vastly over-rated, and OA will change this. They are not just journals with high quality standards but also especially highly desired "brands" because they can only accept a tiny percentage of submissions, hence giving the impression of being the best of the best. (Most Nature/Science rejections still go on to appear in the top specialized journals in their fields. They just don't get the big extra boost in visibility and impact that the Nature and Science "brand" and publicity machine adds.) In reality, their actual choices are often extremely arbitrary or stilted.
>>
>> (4) So it is not true that peer review in general blocks good work, or favors some work over others. Referees sometimes do that, but there are always other journals to submit to. (Just about everything is published somewhere, eventually.) Again, OA will help level this playing field.
>> (5) Mark Wolpert is right that authors tend to get "paranoid" about this, when their work is rejected, especially by Nature and Science. Sometimes they are right. Mostly they are not, and Nature and Science are less biased than they are arbitrary, often going for what looks like more lustre rather than more substance.
>>
>> (6) Yes, in some sub-areas it is almost certain that Nature and Science have clique sub-editors and referees, and that their choices are hence sometimes biased and driven by competition rather than quality. These should be exposed wherever possible, and the editor in chief should frankly face up to it. But this is a flaw in Nature and Science's vastly inflated brand effect, not in peer review. And again, once OA becomes universal, it will help to counteract this. (OA will also help to catch errors, before and after peer review.)
>>
>> Stevan Harnad
>
> --
> Christopher Gutteridge -- http://www.ecs.soton.ac.uk/people/cjg
>
> Lead Developer, EPrints Project, http://eprints.org/
>
> Web Projects Manager, School of Electronics and Computer Science, University of Southampton.

From amsciforum at GMAIL.COM Wed Feb 3 13:36:15 2010
From: amsciforum at GMAIL.COM (Stevan Harnad)
Date: Wed, 3 Feb 2010 13:36:15 -0500
Subject: Email Eprint Request Button: Too good for OA's own good?
Message-ID:

Forwarding the interesting post below from Colin Smith, who is wondering whether the semi-automatic "email eprint request" button http://openaccess.eprints.org/index.php?/archives/274-guid.html might tempt authors to deliberately make their deposits Closed Access instead of Open Access so that they can get more detailed usage metrics for their papers, for research assessment purposes! The possibility was raised, when the button was first designed, that authors could get addicted to the richer vanity metrics that Closed Access plus the button provides.
http://bit.ly/9ThxNu But if you go to Colin's blog to see the responses from IR managers that have since implemented the button, you will find that that is not what tends to happen. Rather, authors tire of vanity metrics after a while and instead prefer sparing themselves those extra keystrokes by re-setting their papers as Open Access. It might, though, be useful to implement a "button" for Open Access papers too, allowing users to identify themselves and their interests to the author, if they wish... -- SH

---------- Forwarded message ----------
From: C.J.Smith
Date: Wed, Feb 3, 2010 at 11:27 AM
Subject: Is the "request copy" button good for OA?
To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM at listserver.sigmaxi.org

Members of this list may be interested in a blog post I've just written on the "request copy" button used by some repositories (including my own). I'd welcome your responses not only on this list, but also as comments to the blog post itself. http://www.open.ac.uk/blogs/ORO/?p=92

Colin Smith
Research Repository Manager
Open Research Online (ORO)
Open University Library
Walton Hall
Milton Keynes MK7 6AA
Tel: +44(0)1908 332971
Email: c.j.smith at open.ac.uk
Web: http://oro.open.ac.uk
Blog: http://www.open.ac.uk/blogs/ORO
Twitter: http://www.twitter.com/smithcolin

________________________________
The Open University is incorporated by Royal Charter (RC 000391), an exempt charity in England & Wales and a charity registered in Scotland (SC 038302).

From richardp at ACADEMIA.EDU Thu Feb 4 19:49:13 2010
From: richardp at ACADEMIA.EDU (Richard Price)
Date: Thu, 4 Feb 2010 16:49:13 -0800
Subject: 14 SIGMetrics members have posted 3 papers on Academia.edu
Message-ID:

Dear SIGMetrics members, We just wanted to let you know about some recent activity on the SIGMetrics group on Academia.edu. In the SIGMetrics group on Academia.edu, there are now:
- 14 people in the last month
- 3 papers
- 4 new status updates (3 in the last month)
- 4 photos

SIGMetrics members'
pages have been viewed a total of 1,014 times, and their papers have been viewed a total of 20 times. To see these people, papers and status updates, follow the link below: http://lists.academia.edu/See-members-of-SIGMetrics

Richard

Dr. Richard Price, post-doc, Philosophy Dept, Oxford University. Founder of Academia.edu

From harnad at ECS.SOTON.AC.UK Mon Feb 8 11:26:32 2010
From: harnad at ECS.SOTON.AC.UK (Stevan Harnad)
Date: Mon, 8 Feb 2010 11:26:32 -0500
Subject: Whether Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research
In-Reply-To: <4B4B61EF.2060504@cornell.edu>
Message-ID:

** APOLOGIES FOR CROSS-POSTING **

What follows below is -- I think readers will agree -- a conscientious and attentive series of responses to questions raised by Phil Davis about our paper testing whether the OA citation Advantage is just a side-effect of author self-selection (Gargouri et al, currently under refereeing) -- responses for which we did further analyses of our data (not included in the draft under refereeing).

Gargouri, Y., Hajjem, C., Lariviere, V., Gingras, Y., Brody, T., Carr, L. and Harnad, S. (2010) Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research. (Submitted) http://eprints.ecs.soton.ac.uk/18346/

We are happy to have performed these further analyses, and we are very much in favor of this sort of open discussion and feedback on pre-refereeing preprints of papers that have been submitted and are undergoing peer review. They can only improve the quality of the eventual published version of articles. However, having carefully responded to Phil's welcome questions, below, we will, at the end of this posting, ask Phil to respond in kind to a question that we raised about his own paper (Davis et al 2008) a year and a half ago...
RESPONSES TO DAVIS'S QUESTIONS ABOUT OUR PAPER:

On 8-Jan-10, at 10:06 AM, Philip Davis wrote:

> PD:
> Stevan,
> Granted, you may be more interested in what the referees of the paper have to say than my comments; I'm interested in whether this paper is good science, whether the methodology is sound and whether you interpret your results properly.

We are very appreciative of your concern and hope you will agree that we have not been interested only in what the referees might have to say. (We also hope you will now in turn be equally responsive to a longstanding question about your own paper on this same topic.)

> PD:
> For instance, it is not clear whether your Odds Ratios are interpreted correctly. Based on Figure 4, OA articles are MORE LIKELY to receive zero citations than 1-5 citations (or conversely, LESS LIKELY to receive 1-5 citations than zero citations).
> You write: "For example, we can say for the first model that for a one unit increase in OA, the odds of receiving 1-5 citations (versus zero citations) increased by a factor of 0.957." (Figure 4, p. 9)

You are interpreting the figure incorrectly. It is the higher citation count that is in each case more likely, as co-author Yassine Gargouri pointed out to you in a subsequent response, to which you replied:

> PD:
> Yassine, Thank you for your response. I find your odds ratio methodology unnecessarily complex and unintuitive but now understand your explanation, thank you.

Our article supports its conclusions with several different, convergent analyses. The logistic regression analysis with the odds ratio is one of them, and its results are fully corroborated by the other, simpler analyses we also reported, as well as the supplementary analyses we append here now.
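For readers unused to reading odds ratios out of a logistic regression, here is a minimal sketch of the arithmetic involved. The probabilities and the bracket labels are invented for illustration (they are not the paper's data); the point is simply that an odds ratio above 1 means the outcome is the more likely one under OA.

```python
def odds(p):
    """Convert a probability into odds."""
    return p / (1.0 - p)

def odds_ratio(p_oa, p_non_oa):
    """Odds ratio for an outcome (e.g. landing in the 1-5 citation
    bracket rather than the zero bracket) for OA vs. non-OA articles."""
    return odds(p_oa) / odds(p_non_oa)

# Invented figures: 60% of OA articles vs. 50% of non-OA articles
# fall in the higher citation bracket.
print(round(odds_ratio(0.60, 0.50), 2))  # 1.5
```

An odds ratio below 1 would mean the reverse reading: the lower citation bracket would be the more likely outcome.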
> PD:
> Similarly in Figure 4 (if I understand the axes correctly), CERN articles are more than twice as likely to be in the 20+ citation category than in the 1-5 citation category, a fact that may distort further interpretation of your data as it may be that institutional effects may explain your Mandated OA effect. See comments by Patrick Gaule and Ludo Waltman on the review http://j.mp/8LK57u

Here is the analysis underlying Figure 4, re-done without CERN, and then again re-done without either CERN or Southampton. As will be seen, the outcome pattern, as well as its statistical significance, is the same whether or not we exclude these institutions.

SUPPLEMENTARY FIGURE S1: http://eprints.ecs.soton.ac.uk/18346/7/Supp1_CERN%2DSOTON.pdf

On 11-Jan-10, at 12:37 PM, Philip Davis wrote:

> PD:
> Changing how you report your citation ratios, from the ratio of log citations to the log of citation ratios, is a very substantial change to your paper and I am surprised that you point out this reporting error at this point.

As noted in Yassine's reply to Phil, that formula was incorrectly stated in our text, once; in all the actual computations, results, figures and tables, however, the correct formula was used.

> PD:
> While it normalizes the distribution of the ratios, it is not without problems, such as:
>
> 1. Small citation differences have very large leverage in your calculations. Example, A=2 and B=1, log (A/B)=0.3

The log of the citation ratio was used only in displaying the means (Figure 2), presented for visual inspection. The paired-sample t-tests of significance (Table 2) were based on the raw citation counts, not on log ratios, and hence had no such leverage in our calculations or their interpretations. (The paired-sample t-tests were also based only on 2004-2006, because for 2002-2003 not all the institutional mandates were yet in effect.)
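To make the two points concrete -- the leverage of small counts under a log ratio, and the fact that the significance tests operated on raw counts -- here is a small, standard-library-only sketch. The matched pairs are invented numbers, and the function name is mine, not the paper's.

```python
import math
from statistics import mean, stdev

def paired_t(xs, ys):
    """Paired-sample t statistic on raw (untransformed) citation counts."""
    diffs = [x - y for x, y in zip(xs, ys)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

# Phil's leverage example: a one-citation difference at low counts
# moves the log ratio substantially...
print(round(math.log10(2 / 1), 2))  # 0.3

# ...but the t-test works on the raw counts themselves.
# Invented matched pairs (OA article vs. same-issue control mean):
oa_cites = [4, 7, 2, 9, 5]
control_mean = [2, 5, 1, 6, 4]
t = paired_t(oa_cites, control_mean)
```

The t statistic is then compared against the t distribution with n-1 degrees of freedom in the usual way; a statistics package would also report the p-value.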
Moreover, both the paired-sample t-test results (2004-2006) and the pattern of means (2002-2006) converged with the results of the (more complicated) logistic regression analyses and subdivisions into citation ranges.

> PD:
> 2. Similarly, any ratio with zero in the denominator must be thrown out of your dataset. The paper does not inform the reader on how much data was ignored in your ratio analysis and we have no information on the potential bias this may have on your results.

As noted, the log ratios were only used in presenting the means, not in the significance testing, nor in the logistic regressions. However, we are happy to provide the additional information Phil requests, in order to help readers eyeball the means. Here are the means from Figure 2, recalculated by adding 1 to all citation counts. This restores all log ratios with zeroes in the numerator (sic); the probability of a zero in the denominator is vanishingly small, as it would require that all 10 same-issue control articles have no citations! The pattern is again much the same. (And, as noted, the significance tests are based on the raw citation counts, which were not affected by the log transformations that exclude numerator citation counts of zero.)

SUPPLEMENTARY FIGURE S2: http://eprints.ecs.soton.ac.uk/18346/12/Supp2_Cites%2B1.pdf

This exercise suggested a further heuristic analysis that we had not thought of doing in the paper, even though the results had clearly suggested that the OA advantage is not evenly distributed across the full range of article quality and citeability: the higher quality, more citeable articles gain more of the citation advantage from OA. In the following supplementary figure (S3), for exploratory and illustrative purposes only, we re-calculate the means in the paper's Figure 2 separately for OA articles in the citation range 0-4 and for OA articles in the citation range 5+.
SUPPLEMENTARY FIGURE S3: http://eprints.ecs.soton.ac.uk/18346/17/Supp3_CiteRanges.pdf

The overall OA advantage is clearly concentrated on articles in the higher citation range. There is even what looks like an OA DISadvantage for articles in the lower citation range. This may be mostly an artifact (from restricting the OA articles to 0-4 citations and not restricting the non-OA articles), although it may also be partly due to the fact that when unciteable articles are made OA, only one direction of outcome is possible in the comparison with citation means for non-OA articles in the same journal and year: OA/non-OA citation ratios will always be unflattering for zero-citation OA articles. (This can be statistically controlled for, if we go on to investigate the distribution of the OA effect across citation brackets directly.)

> PD:
> Have you attempted to analyze your citation data as continuous variables rather than ratios or categories?

We will be doing this in our next study, which extends the time base to 2002-2008. Meanwhile, a preview is possible from plotting the mean number of OA and non-OA articles for each citation count. Note that zero citations is the biggest category for both OA and non-OA articles, and that the proportion of articles at each citation level decreases faster for non-OA articles than for OA articles; this is another way of visualizing the OA advantage.
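The tally behind such a plot -- the proportion of articles at each citation level, for OA and non-OA separately -- can be sketched as follows. The counts here are made up purely to show the shape of the comparison; only the qualitative pattern (a zero-citation mode, with OA proportions falling off more slowly) reflects what the text describes.

```python
from collections import Counter

def citation_distribution(cite_counts):
    """Fraction of articles at each citation level (0, 1, 2, ...)."""
    tally = Counter(cite_counts)
    n = len(cite_counts)
    return {level: tally[level] / n for level in sorted(tally)}

# Made-up citation counts for two small samples of articles:
oa = [0, 0, 0, 1, 1, 2, 3, 5, 8, 12]
non_oa = [0, 0, 0, 0, 0, 1, 1, 2, 3, 4]

# Zero citations is the modal category in both groups, but the
# proportions decrease more slowly across levels for the OA sample.
dist_oa = citation_distribution(oa)
dist_non_oa = citation_distribution(non_oa)
```

Plotting the two dictionaries as overlaid bar charts (citation level on the x-axis, proportion on the y-axis) gives the kind of preview described above.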
At citation counts of 30 or more, the difference is quite striking, although of course there are few articles with so many citations:

SUPPLEMENTARY FIGURE S4: http://eprints.ecs.soton.ac.uk/18346/22/Supp4_IndivCites.pdf

--------

REQUEST FOR RESPONSE TO QUESTION ABOUT DAVIS ET AL'S (2008) PAPER:

Davis, PM, Lewenstein, BV, Simon, DH, Booth, JG, & Connolly, MJL (2008) Open access publishing, article downloads, and citations: randomised controlled trial. British Medical Journal 337: a568 http://www.bmj.com/cgi/content/full/337/jul31_1/a568

Critique of Davis et al's paper: "Davis et al's 1-year Study of Self-Selection Bias: No Self-Archiving Control, No OA Effect, No Conclusion" http://www.bmj.com/cgi/eletters/337/jul31_1/a568#199775

Davis et al had taken a 1-year sample of biological journal articles and randomly made a subset of them OA, to control for author self-selection. (This is comparable to our mandated control for author self-selection.) They reported that after a year they found no significant OA citation Advantage for the randomized-OA articles (although they did find an OA Advantage for downloads) and concluded that this showed that the OA citation Advantage is just an artifact of author self-selection, now eliminated by the randomization.

What Davis et al failed to do, however, was to demonstrate, in the same sample and time-span, that author self-selection generates the OA citation Advantage. Without doing that, all they have shown is that in their sample and time-span they found no significant OA citation Advantage. This is no great surprise, because their sample was small and their time-span was short, whereas many of the other studies that have reported finding an OA Advantage were based on much larger samples and much longer time spans. The question raised was about controlling for self-selected OA.
If one tests for the OA Advantage, whether self-selected or randomized, there is a great deal of variability, across articles and disciplines, especially for the first year or so after publication. In order to have a statistically reliable measure of OA effects, the sample has to be big enough, both in number of articles and in the time allowed for any citation advantage to build up to become detectable and statistically reliable.

Davis et al need to do with their randomization methodology what we have done with our mandating methodology, namely, to demonstrate the presence of a self-selected OA Advantage in the same journals and years. Then they can compare that with randomized OA in those same journals and years, and if there is a significant OA Advantage for self-selected OA and no OA Advantage for randomized OA then they will have evidence that some or all of the OA Advantage is just a side-effect of self-selection. Otherwise, all they have shown is that with their journals, sample size and time-span, there is no detectable OA Advantage at all.

What Davis et al replied in their Authors' Response was instead this: http://www.bmj.com/cgi/eletters/337/jul31_1/a568#200109

> PD:
> "Professor Harnad comments that we should have implemented a self-selection control in our study. Although this is an excellent idea, it was not possible for us to do so because, at the time of our randomization, the publisher did not permit author-sponsored open access publishing in our experimental journals. Nonetheless, self-archiving, the type of open access Prof. Harnad often refers to, is accounted for in our regression model (see Tables 2 and 3)... Table 2 Linear regression output reporting independent variable effects on PDF downloads for six months after publication Self-archived: 6% of variance p = .361 (i.e., not statistically significant)...
> Table 3 Negative binomial regression output reporting independent variable effects on citations to articles aged 9 to 12 months Self-archived: Incidence Rate 0.9 p = .716 (i.e., not statistically significant)...

This is not an adequate response. If a control condition was needed in order to make an outcome meaningful, it is not sufficient to reply that "the publisher and sample allowed us to do the experimental condition but not the control condition." Nor is it an adequate response to reiterate that there was no significant self-selected self-archiving effect in the sample (as the regression analysis showed). That is in fact bad news for the hypothesis being tested. Nor is it an adequate response to say, as Phil did in a later posting, that even after another half year or more had gone by, there was still no significant OA Advantage. (That is just the sound of one hand clapping again, this time louder.)

The only way to draw meaningful conclusions from Davis et al's methodology is to demonstrate the self-selected self-archiving citation advantage, for the same journals and time-span, and then to show that randomization wipes it out. Until then, our own results, which do demonstrate the self-selected self-archiving citation advantage for the same journals and time-span, show that mandating the self-archiving does not wipe it out.

Meanwhile, Davis et al's finding that although their randomized OA did not generate a citation increase, it did generate a download increase, suggests that with a larger sample and time-span there may well be scope for a citation advantage as well: our own prior work and that of others have shown that higher earlier download counts tend to lead to higher later citation counts.

Bollen, J., Van de Sompel, H., Hagberg, A. and Chute, R. (2009) A principal component analysis of 39 scientific impact measures. arXiv.org, arXiv:0902.2183v1 [cs.CY], 12 Feb.
2009, in PLoS ONE 4(6): e6022, http://dx.doi.org/10.1371/journal.pone.0006022
Brody, T., Harnad, S. and Carr, L. (2006) Earlier Web Usage Statistics as Predictors of Later Citation Impact. Journal of the American Society for Information Science and Technology (JASIST) 57(8): 1060-1072. http://eprints.ecs.soton.ac.uk/10713/
Lokker, C., McKibbon, K. A., McKinlay, R.J., Wilczynski, N. L. and Haynes, R. B. (2008) Prediction of citation counts for clinical articles at two years using data available within three weeks of publication: retrospective cohort study. BMJ 336: 655-657. http://www.bmj.com/cgi/content/abstract/336/7645/655
Moed, H. F. (2005) Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal (abstract only). Journal of the American Society for Information Science and Technology 56(10): 1088-1097
O'Leary, D. E. (2008) The relationship between citations and number of downloads. Decision Support Systems 45(4): 972-980. http://dx.doi.org/10.1016/j.dss.2008.03.008
Watson, A. B. (2009) Comparing citations and downloads for individual articles. Journal of Vision 9(4): 1-4. http://journalofvision.org/9/4/i/

From leo.egghe at UHASSELT.BE Tue Feb 9 07:42:26 2010
From: leo.egghe at UHASSELT.BE (Leo Egghe)
Date: Tue, 9 Feb 2010 13:42:26 +0100
Subject: Journal of Informetrics 4(1): table of contents
Message-ID:

Journal of Informetrics
Table of contents of Volume 4, Issue 1, Pages 1-136 (January 2010)

1. Editorial Board
Page CO2, doi:10.1016/S1751-1577(09)00090-X

2. Citations to scientific articles: Its distribution and dependence on the article features
Author(s): E.S. Vieira, J.A.N.F. Gomes
Pages 1-13, doi:10.1016/j.joi.2009.06.002
Abstract: Citation counts are increasingly used to assess the impact on the scientific community of publications produced by a researcher, an institution or a country.
There are many institutions that use bibliometric indicators to steer research policy and for hiring or promotion decisions. Given the importance that counting citations has today, the aim of the work presented here is to show how citations are distributed within a scientific area and to determine the dependence of the citation count on the article features. All articles referenced in the Web of Science in 2004 for Biology & Biochemistry, Chemistry, Mathematics and Physics were considered. We show that the distribution of citations is well represented by a double exponential-Poisson law. There is a dependence of the mean citation rate on the number of co-authors, the number of addresses and the number of references, although this dependence is somewhat far from linear. For the relation between the mean impact and the number of pages, the dependence obtained was very low. For Biology & Biochemistry and Chemistry we found a linear relationship between the mean citations per article and the impact factor, and for Mathematics and Physics the results obtained are close to linear.

3. Characteristic scores and scales based on h-type indices
Author(s): L. Egghe
Pages 14-22, doi:10.1016/j.joi.2009.06.001
Abstract: Based on the rank-order citation distribution of e.g. a researcher, one can define certain points on this distribution, thereby summarizing the citation performance of this researcher. Previous work of Glänzel and Schubert defined these so-called "characteristic scores and scales" (CSS), based on average citation data of samples of this ranked publication-citation list. In this paper we define another version of CSS, based on diverse h-type indices such as the h-index, the g-index, Kosmulski's h(2)-index and the g-variant of it, the g(2)-index. Mathematical properties of these new CSS are proved in a Lotkaian framework.
These CSS also provide an improvement over the single h-type indices in the sense that they give h-type index values for different parts of the ranked publication-citation list.

4. q2-Index: Quantitative and qualitative evaluation based on the number and impact of papers in the Hirsch core
Author(s): F.J. Cabrerizo, S. Alonso, E. Herrera-Viedma, F. Herrera
Pages 23-28, doi:10.1016/j.joi.2009.06.005
Abstract: Bibliometric studies at the micro level are increasingly requested by science managers and policy makers to support research decisions. Different measures and indices have been developed at this level of analysis. One type of index, such as the h-index and g-index, describes the most productive core of the output of a researcher and informs about the number of papers in the core. Other indices, such as the a-index and m-index, depict the impact of the papers in the core. In this paper, we present a new index which relates two different dimensions of a researcher's productive core: a quantitative one (number of papers) and a qualitative one (impact of papers). In such a way, we obtain a more balanced and global view of the scientific production of researchers. This new index, called the q2-index, is based on the geometric mean of the h-index and the median number of citations received by papers in the h-core, i.e., the m-index, which allows us to combine the advantages of both kinds of indices.

5. Exposing multi-relational networks to single-relational network analysis algorithms
Author(s): Marko A. Rodriguez, Joshua Shinavier
Pages 29-41, doi:10.1016/j.joi.2009.06.004
Abstract: Many, if not most, network analysis algorithms have been designed specifically for single-relational networks; that is, networks in which all edges are of the same type. For example, edges may represent "friendship," "kinship," or "collaboration," but not all of them together.
In contrast, a multi-relational network is a network with a heterogeneous set of edge labels which can represent relationships of various types in a single data structure. While multi-relational networks are more expressive in terms of the variety of relationships they can capture, there is a need for a general framework for transferring the many single-relational network analysis algorithms to the multi-relational domain. It is not sufficient to execute a single-relational network analysis algorithm on a multi-relational network by simply ignoring edge labels. This article presents an algebra for mapping multi-relational networks to single-relational networks, thereby exposing them to single-relational network analysis algorithms.

6. How to modify the g-index for multi-authored manuscripts
Author(s): Michael Schreiber
Pages 42-54, doi:10.1016/j.joi.2009.06.003
Abstract: A recently suggested modification of the g-index is analysed in order to take multiple coauthorship appropriately into account. By fractionalised counting of the papers one can obtain an appropriate measure which I call the gm-index. Two fictitious examples for model cases and two empirical cases are analysed. The results are compared with two other variants of the g-index which have also recently been proposed. Only the gm-index shows the correct behaviour when datasets are aggregated. The interpolated and continuous versions of the g-index and its variants are also discussed. For an intuitive comparison of the determination of the investigated variants of the h-index and the g-index, a visualization of the citation records is utilized.

7. The difference between popularity and prestige in the sciences and in the social sciences: A bibliometric analysis
Author(s): Massimo Franceschet
Pages 55-63, doi:10.1016/j.joi.2009.08.001
Abstract: The status of a journal is commonly determined by two factors: popularity and prestige.
While the former counts citations, the latter recursively weights them with the prestige of the citing journals. We make a thorough comparison of the bibliometric concepts of popularity and prestige for journals in the sciences and in the social sciences. We find that the two notions diverge more for the hard sciences, including physics, engineering, material sciences, and computer sciences, than they do for the geosciences, for biology-medical disciplines, and for the social sciences. Moreover, we identify the science and social science journals with the highest diverging ranks in popularity and prestige compilations. 8. The Hirsch spectrum: A novel tool for analyzing scientific journals Author(s): Fiorenzo Franceschini, Domenico Maisano Pages 64-73, doi:10.1016/j.joi.2009.08.003 Abstract This paper introduces the Hirsch spectrum (h-spectrum) for analyzing the academic reputation of a scientific journal. h-Spectrum is a novel tool based on the Hirsch (h) index. It is easy to construct: considering a specific journal in a specific interval of time, h-spectrum is defined as the distribution representing the h-indexes associated to the authors of the journal articles. This tool allows defining a reference profile of the typical author of a journal, compare different journals within the same scientific field, and provide a rough indication of prestige/reputation of a journal in the scientific community. h-Spectrum can be associated to every journal. Ten specific journals in the Quality Engineering/Quality Management field are analyzed so as to preliminarily investigate the h-spectrum characteristics. 9. Can epidemic models describe the diffusion of topics across disciplines? Author(s): Istvan Z. Kiss, Mark Broom, Paul G. Craze, Ismael Rafols Pages 74-82, doi:10.1016/j.joi.2009.08.002 Abstract This paper introduces a new approach to describe the spread of research topics across disciplines using epidemic models. 
The approach is based on applying individual-based models from mathematical epidemiology to the diffusion of a research topic over a contact network that represents knowledge flows over the map of science?as obtained from citations between ISI Subject Categories. Using research publications on the protein class kinesin as a case study, we report a better fit between model and empirical data when using the citation-based contact network. Incubation periods on the order of 4?15.5 years support the view that, whilst research topics may grow very quickly, they face difficulties to overcome disciplinary boundaries. 10. Citation speed as a measure to predict the attention an article receives: An investigation of the validity of editorial decisions at Angewandte Chemie International Edition Author(s): Lutz Bornmann, Hans-Dieter Daniel Pages 83-88, doi:10.1016/j.joi.2009.09.001 Abstract The scientific quality of a publication can be determined not only based on the number of times it is cited but also based on the speed with which its content is disseminated in the scientific community. In this study we tested whether manuscripts that were accepted by Angewandte Chemie International Edition (one of the prime chemistry journals worldwide) received the first citation after publication faster than manuscripts that were rejected by the journal but published elsewhere. The results of a Cox regression model show that accepted manuscripts have a 49% higher hazard rate of citation than rejected manuscripts. 11. Analysis of cooperative research and development networks on Japanese patents Author(s): Hiroyasu Inoue, Wataru Souma, Schumpeter Tamada Pages 89-96, doi:10.1016/j.joi.2009.09.002 Abstract To sustain economic growth, countries have to manage systems in order to create technological innovation. To meet this goal, they are developing policies that organically connect companies, national laboratories, and universities into innovation networks. 
However, the whole structures of these connections have been little investigated because of the difficulty of obtaining such data. We use Japanese patent data and create a network of jointly applying organizations. This network can be considered as one representation of an innovation network because patents are seeds of innovation and joint applications are strong evidence of connections between organizations. We investigated the structure of the network, especially whether or not the degree distribution follows a power law. After that, we also propose a model that generates the actual network, not only degree distribution, but also link distance distribution. 12. The impact of small world on innovation: An empirical study of 16 countries Author(s): Zifeng Chen, Jiancheng Guan Pages 97-106, doi:10.1016/j.joi.2009.09.003 Abstract This paper investigates the impact of small world properties and the size of largest component on innovation performance at national level. Our study adds new evidence to the limited literature on this topic with an empirical investigation for the patent collaboration networks of 16 main innovative countries during 1975?2006. We combine small world network theory with statistical models to systematically explore the relationship between network structure and patent productivity. Results fail to support that the size of largest component enhances innovative productivity significantly, which is not consistent with recent concerns regarding positive effects of largest component on patent output. We do find that small-world structure benefits innovation but it is limited to a special range after which the effects inversed and shorter path length always correlates with increased innovation output. Our findings extend the current literature and they can be implicated for policy makers and relevant managers when making decisions for technology, industry and firm location. 13. 
Ranking marketing journals using the Google Scholar-based hg-index Author(s): Salim Moussa, Mourad Touzani Pages 107-117, doi:10.1016/j.joi.2009.10.001 Abstract This paper provides a ranking of 69 marketing journals using a new Hirsch-type index, the hg-index which is the geometric mean of hg. The applicability of this index is tested on data retrieved from Google Scholar on marketing journal articles published between 2003 and 2007. The authors investigate the relationship between the hg-ranking, ranking implied by Thomson Reuters? Journal Impact Factor for 2008, and rankings in previous citation-based studies of marketing journals. They also test two models of consumption of marketing journals that take into account measures of citing (based on the hg-index), prestige, and reading preference. 14. Hirsch-type characteristics of the tail of distributions. The generalised h-index Author(s): Wolfgang Gl?nzel, Andr?s Schubert Pages 118-123, doi:10.1016/j.joi.2009.10.002 Abstract In this paper a generalisation of the h-index and g-index is given on the basis of non-negative real-valued functionals defined on subspaces of the vector space generated by the ordered samples. Several Hirsch-type measures are defined and their basic properties are analysed. Empirical properties are illustrated using examples from the micro- and meso-level. Among these measures, the h-index proved the most, the arithmetic and geometric g-indices, the least robust measures. The μ-index and the harmonic g-index provide more balanced results and are still robust enough. 15. Using the Web for research evaluation: The Integrated Online Impact indicator Author(s): Kayvan Kousha, Mike Thelwall, Somayeh Rezaie Pages 124-135, doi:10.1016/j.joi.2009.10.003 Abstract Previous research has shown that citation data from different types of Web sources can potentially be used for research evaluation. Here we introduce a new combined Integrated Online Impact (IOI) indicator. 
For a case study, we selected research articles published in the Journal of the American Society for Information Science & Technology (JASIST) and Scientometrics in 2003. We compared the citation counts from Web of Science (WoS) and Scopus with five online sources of citation data including Google Scholar, Google Books, Google Blogs, PowerPoint presentations and course reading lists. The mean and median IOI was nearly twice as high as both WoS and Scopus, confirming that online citations are sufficiently numerous to be useful for the impact assessment of research. We also found significant correlations between conventional and online impact indicators, confirming that both assess something similar in scholarly communication. Further analysis showed that the overall percentage for unique Google Scholar citations outside the WoS were 73% and 60% for the articles published in JASIST and Scientometrics, respectively. An important conclusion is that in subject areas where wider types of intellectual impact indicators outside the WoS and Scopus databases are needed for research evaluation, IOI can be used to help monitor research performance. Copyright ? 2009 Published by Elsevier B.V. From leo.egghe at UHASSELT.BE Wed Feb 10 05:19:40 2010 From: leo.egghe at UHASSELT.BE (Leo Egghe) Date: Wed, 10 Feb 2010 11:19:40 +0100 Subject: Journal of Informetrics 4(1): table of contents Message-ID: Journal of Informetrics Table of contents of Volume 4, Issue 1, Pages 1-136 (January 2010) 1. Editorial Board Page CO2, doi:10.1016/S1751-1577(09)00090-X 2. Citations to scientific articles: Its distribution and dependence on the article features Author(s): E.S. Vieira, J.A.N.F. Gomes Corresponding author: J.A.N.F. 
Gomes - jfgomes at fc.up.pt - REQUIMTE/Departamento de Química, Faculdade de Ciências, Universidade do Porto, Rua do Campo Alegre, 687, 4169-007 Porto, Portugal Pages 1-13, doi:10.1016/j.joi.2009.06.002 Abstract The citation counts are increasingly used to assess the impact on the scientific community of publications produced by a researcher, an institution or a country. There are many institutions that use bibliometric indicators to steer research policy and for hiring or promotion decisions. Given the importance that counting citations has today, the aim of the work presented here is to show how citations are distributed within a scientific area and determine the dependence of the citation count on the article features. All articles referenced in the Web of Science in 2004 for Biology & Biochemistry, Chemistry, Mathematics and Physics were considered. We show that the distribution of citations is well represented by a double exponential-Poisson law. There is a dependence of the mean citation rate on the number of co-authors, the number of addresses and the number of references, although this dependence is a little far from the linear behaviour. For the relation between the mean impact and the number of pages the dependence obtained was very low. For Biology & Biochemistry and Chemistry we found a linear behaviour between the mean citation per article and impact factor and for Mathematics and Physics the results obtained are near to the linear behaviour. 3. Characteristic scores and scales based on h-type indices Author(s): L. Egghe Corresponding author: L. Egghe - leo.egghe at uhasselt.be - Universiteit Hasselt (UHasselt), Campus Diepenbeek, Agoralaan, B-3590 Diepenbeek, Belgium Pages 14-22, doi:10.1016/j.joi.2009.06.001 Abstract Based on the rank-order citation distribution of e.g. a researcher, one can define certain points on this distribution, hereby summarizing the citation performance of this researcher.
Previous work of Glänzel and Schubert defined these so-called "characteristic scores and scales" (CSS), based on average citation data of samples of this ranked publication–citation list. In this paper we will define another version of CSS, based on diverse h-type indices such as the h-index, the g-index, Kosmulski's h(2)-index and the g-variant of it, the g(2)-index. Mathematical properties of these new CSS are proved in a Lotkaian framework. These CSS also provide an improvement of the single h-type indices in the sense that they give h-type index values for different parts of the ranked publication–citation list. 4. q2-Index: Quantitative and qualitative evaluation based on the number and impact of papers in the Hirsch core Author(s): F.J. Cabrerizo, S. Alonso, E. Herrera-Viedma, F. Herrera Corresponding author: F.J. Cabrerizo - cabrerizo at issi.uned.es - Department of Software Engineering and Computer Systems, Distance Learning University of Spain, Spain Pages 23-28, doi:10.1016/j.joi.2009.06.005 Abstract Bibliometric studies at the micro level are increasingly requested by science managers and policy makers to support research decisions. Different measures and indices have been developed at this level of analysis. One type of indices, such as the h-index and g-index, describe the most productive core of the output of a researcher and inform about the number of papers in the core. Other indices, such as the a-index and m-index, depict the impact of the papers in the core. In this paper, we present a new index which relates two different dimensions in a researcher's productive core: a quantitative one (number of papers) and a qualitative one (impact of papers). In such a way, we could obtain a more balanced and global view of the scientific production of researchers.
This new index, called q2-index, is based on the geometric mean of the h-index and the median number of citations received by papers in the h-core, i.e., the m-index, which allows us to combine the advantages of both kinds of indices. 5. Exposing multi-relational networks to single-relational network analysis algorithms Author(s): Marko A. Rodriguez, Joshua Shinavier Corresponding author: Marko A. Rodriguez - marko at lanl.gov - T-5 Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, NM 87545, United States Pages 29-41, doi:10.1016/j.joi.2009.06.004 Abstract Many, if not most, network analysis algorithms have been designed specifically for single-relational networks; that is, networks in which all edges are of the same type. For example, edges may either represent "friendship," "kinship," or "collaboration," but not all of them together. In contrast, a multi-relational network is a network with a heterogeneous set of edge labels which can represent relationships of various types in a single data structure. While multi-relational networks are more expressive in terms of the variety of relationships they can capture, there is a need for a general framework for transferring the many single-relational network analysis algorithms to the multi-relational domain. It is not sufficient to execute a single-relational network analysis algorithm on a multi-relational network by simply ignoring edge labels. This article presents an algebra for mapping multi-relational networks to single-relational networks, thereby exposing them to single-relational network analysis algorithms. 6. How to modify the g-index for multi-authored manuscripts Author(s): Michael Schreiber Corresponding author: Michael Schreiber - schreiber at physik.tu-chemnitz.de - Institut für Physik, Technische Universität Chemnitz, Reichenhainer Str.
70, 09107 Chemnitz, Germany Pages 42-54, doi:10.1016/j.joi.2009.06.003 Abstract A recently suggested modification of the g-index is analysed in order to take multiple coauthorship appropriately into account. By fractionalised counting of the papers one can obtain an appropriate measure which I call gm-index. Two fictitious examples for model cases and two empirical cases are analysed. The results are compared with two other variants of the g-index which have also recently been proposed. Only the gm-index shows the correct behaviour when datasets are aggregated. The interpolated and continuous versions of the g-index and its variants are also discussed. For an intuitive comparison of the determination of the investigated variants of the h-index and the g-index, a visualization of the citation records is utilized. 7. The difference between popularity and prestige in the sciences and in the social sciences: A bibliometric analysis Author(s): Massimo Franceschet Corresponding author: Massimo Franceschet - massimo.franceschet at dimi.uniud.it - Department of Mathematics and Computer Science, University of Udine, Via delle Scienze 206, 33100 Udine, Italy Pages 55-63, doi:10.1016/j.joi.2009.08.001 Abstract The status of a journal is commonly determined by two factors: popularity and prestige. While the former counts citations, the latter recursively weights them with the prestige of the citing journals. We make a thorough comparison of the bibliometric concepts of popularity and prestige for journals in the sciences and in the social sciences. We find that the two notions diverge more for the hard sciences, including physics, engineering, material sciences, and computer sciences, than they do for the geosciences, for biology-medical disciplines, and for the social sciences. Moreover, we identify the science and social science journals with the highest diverging ranks in popularity and prestige compilations. 8. 
The Hirsch spectrum: A novel tool for analyzing scientific journals Author(s): Fiorenzo Franceschini, Domenico Maisano Corresponding author: Fiorenzo Franceschini - fiorenzo.franceschini at polito.it - Politecnico di Torino, Dipartimento di Sistemi di Produzione ed Economia dell'Azienda (DISPEA), Corso Duca degli Abruzzi 24, 10129 Torino, Italy Pages 64-73, doi:10.1016/j.joi.2009.08.003 Abstract This paper introduces the Hirsch spectrum (h-spectrum) for analyzing the academic reputation of a scientific journal. h-Spectrum is a novel tool based on the Hirsch (h) index. It is easy to construct: considering a specific journal in a specific interval of time, the h-spectrum is defined as the distribution representing the h-indexes associated with the authors of the journal articles. This tool allows defining a reference profile of the typical author of a journal, comparing different journals within the same scientific field, and providing a rough indication of the prestige/reputation of a journal in the scientific community. h-Spectrum can be associated with every journal. Ten specific journals in the Quality Engineering/Quality Management field are analyzed so as to preliminarily investigate the h-spectrum characteristics. 9. Can epidemic models describe the diffusion of topics across disciplines? Author(s): Istvan Z. Kiss, Mark Broom, Paul G. Craze, Ismael Rafols Corresponding author: Ismael Rafols - i.rafols at sussex.ac.uk - SPRU-Science and Technology Policy Research, University of Sussex, Brighton BN1 9QE, UK; Technology Policy & Assessment Center, School of Public Policy, Georgia Institute of Technology, Atlanta, GA 30332, USA Pages 74-82, doi:10.1016/j.joi.2009.08.002 Abstract This paper introduces a new approach to describe the spread of research topics across disciplines using epidemic models.
The approach is based on applying individual-based models from mathematical epidemiology to the diffusion of a research topic over a contact network that represents knowledge flows over the map of science, as obtained from citations between ISI Subject Categories. Using research publications on the protein class kinesin as a case study, we report a better fit between model and empirical data when using the citation-based contact network. Incubation periods on the order of 4–15.5 years support the view that, whilst research topics may grow very quickly, they face difficulties in overcoming disciplinary boundaries. 10. Citation speed as a measure to predict the attention an article receives: An investigation of the validity of editorial decisions at Angewandte Chemie International Edition Author(s): Lutz Bornmann, Hans-Dieter Daniel Corresponding author: Lutz Bornmann - bornmann at gess.ethz.ch - ETH Zurich, Professorship for Social Psychology and Research on Higher Education, Zähringerstr 24, CH-8092 Zurich, Switzerland Pages 83-88, doi:10.1016/j.joi.2009.09.001 Abstract The scientific quality of a publication can be determined not only based on the number of times it is cited but also based on the speed with which its content is disseminated in the scientific community. In this study we tested whether manuscripts that were accepted by Angewandte Chemie International Edition (one of the prime chemistry journals worldwide) received the first citation after publication faster than manuscripts that were rejected by the journal but published elsewhere. The results of a Cox regression model show that accepted manuscripts have a 49% higher hazard rate of citation than rejected manuscripts. 11.
Analysis of cooperative research and development networks on Japanese patents Author(s): Hiroyasu Inoue, Wataru Souma, Schumpeter Tamada Corresponding author: Hiroyasu Inoue - inoue at dis.osaka-sandai.ac.jp - Osaka Sangyo University, 3-1-1 Nakagaito, Daito city, Osaka 574-0013, Japan Pages 89-96, doi:10.1016/j.joi.2009.09.002 Abstract To sustain economic growth, countries have to manage systems in order to create technological innovation. To meet this goal, they are developing policies that organically connect companies, national laboratories, and universities into innovation networks. However, the whole structure of these connections has been little investigated because of the difficulty of obtaining such data. We use Japanese patent data and create a network of jointly applying organizations. This network can be considered as one representation of an innovation network because patents are seeds of innovation and joint applications are strong evidence of connections between organizations. We investigated the structure of the network, especially whether or not the degree distribution follows a power law. We also propose a model that generates the actual network, reproducing not only the degree distribution but also the link distance distribution. 12. The impact of small world on innovation: An empirical study of 16 countries Author(s): Zifeng Chen, Jiancheng Guan Corresponding author: Jiancheng Guan - guanjianch at 126.com - School of Management, Fudan University, 200433 Shanghai, PR China Pages 97-106, doi:10.1016/j.joi.2009.09.003 Abstract This paper investigates the impact of small world properties and the size of the largest component on innovation performance at the national level. Our study adds new evidence to the limited literature on this topic with an empirical investigation of the patent collaboration networks of 16 main innovative countries during 1975–2006.
We combine small world network theory with statistical models to systematically explore the relationship between network structure and patent productivity. Results fail to support the claim that the size of the largest component enhances innovative productivity significantly, which is not consistent with recent arguments regarding positive effects of the largest component on patent output. We do find that small-world structure benefits innovation, but only within a special range, after which the effects are inverted; shorter path length always correlates with increased innovation output. Our findings extend the current literature and have implications for policy makers and relevant managers when making decisions on technology, industry and firm location. 13. Ranking marketing journals using the Google Scholar-based hg-index Author(s): Salim Moussa, Mourad Touzani Corresponding author: Salim Moussa - salimmoussa at yahoo.fr - Institut Supérieur de Gestion, Université de Tunis, Tunisie; Institut Supérieur des Etudes Appliquées en Humanités de Gafsa, Université de Gafsa, Tunisie Pages 107-117, doi:10.1016/j.joi.2009.10.001 Abstract This paper provides a ranking of 69 marketing journals using a new Hirsch-type index, the hg-index, which is the geometric mean of the h- and g-indices. The applicability of this index is tested on data retrieved from Google Scholar for marketing journal articles published between 2003 and 2007. The authors investigate the relationship between the hg-ranking, the ranking implied by Thomson Reuters' Journal Impact Factor for 2008, and rankings in previous citation-based studies of marketing journals. They also test two models of consumption of marketing journals that take into account measures of citing (based on the hg-index), prestige, and reading preference. 14. Hirsch-type characteristics of the tail of distributions.
The generalised h-index Author(s): Wolfgang Glänzel, András Schubert Corresponding author: Wolfgang Glänzel - wolfgang.glanzel at econ.kuleuven.ac.be - Katholieke Universiteit Leuven, Centre for R&D Monitoring and Dept. MSI, Leuven, Belgium; Hungarian Academy of Sciences, Institute for Research Policy Studies, Budapest, Hungary Pages 118-123, doi:10.1016/j.joi.2009.10.002 Abstract In this paper a generalisation of the h-index and g-index is given on the basis of non-negative real-valued functionals defined on subspaces of the vector space generated by the ordered samples. Several Hirsch-type measures are defined and their basic properties are analysed. Empirical properties are illustrated using examples from the micro- and meso-level. Among these measures, the h-index proved the most robust, and the arithmetic and geometric g-indices the least robust. The μ-index and the harmonic g-index provide more balanced results and are still robust enough. 15. Using the Web for research evaluation: The Integrated Online Impact indicator Author(s): Kayvan Kousha, Mike Thelwall, Somayeh Rezaie Corresponding author: Kayvan Kousha - kkoosha at ut.ac.ir - Department of Library and Information Science, University of Tehran, Jalle-Ahmad Highway, P.O. Box 14155-6456, Tehran, Iran Pages 124-135, doi:10.1016/j.joi.2009.10.003 Abstract Previous research has shown that citation data from different types of Web sources can potentially be used for research evaluation. Here we introduce a new combined Integrated Online Impact (IOI) indicator. For a case study, we selected research articles published in the Journal of the American Society for Information Science & Technology (JASIST) and Scientometrics in 2003. We compared the citation counts from Web of Science (WoS) and Scopus with five online sources of citation data including Google Scholar, Google Books, Google Blogs, PowerPoint presentations and course reading lists.
The mean and median IOI were nearly twice as high as both WoS and Scopus, confirming that online citations are sufficiently numerous to be useful for the impact assessment of research. We also found significant correlations between conventional and online impact indicators, confirming that both assess something similar in scholarly communication. Further analysis showed that the overall percentages of unique Google Scholar citations outside the WoS were 73% and 60% for the articles published in JASIST and Scientometrics, respectively. An important conclusion is that in subject areas where wider types of intellectual impact indicators outside the WoS and Scopus databases are needed for research evaluation, IOI can be used to help monitor research performance. Copyright © 2009 Published by Elsevier B.V. From j.hartley at PSY.KEELE.AC.UK Wed Feb 10 06:58:58 2010 From: j.hartley at PSY.KEELE.AC.UK (James Hartley) Date: Wed, 10 Feb 2010 11:58:58 -0000 Subject: Help needed in a study of medical journal page layouts Message-ID: Dear Colleague I am carrying out a study of different formats for the first page of articles in scientific journals and would be grateful if you felt you were able to take part. Please find attached some information about the study - which should take less than 10 mins - and an informed consent form which I need you to sign (an e-mail will do) before I can send you the materials! If you would also like to forward this invitation on to any of your colleagues who might be able to help, I would be very grateful. I look forward to hearing from you. Many thanks and best wishes James Hartley School of Psychology Keele University Staffordshire ST5 5BG UK j.hartley at psy.keele.ac.uk http://www.keele.ac.uk/depts/ps/people/JHartley/index.htm -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: Invitation.doc Type: application/msword Size: 32768 bytes Desc: not available URL: From clement_levallois at YAHOO.FR Wed Feb 10 11:19:41 2010 From: clement_levallois at YAHOO.FR (Clement Levallois) Date: Wed, 10 Feb 2010 17:19:41 +0100 Subject: A new open-source software for SNA: GEPHI In-Reply-To: Message-ID: Dear list members, You will surely be excited by the 0.7 release of GEPHI, developed by young French computing engineers. GEPHI offers new perspectives for 3D animation, and is designed to be highly evolvable (input format in XML, platform for development of plug-ins). GEPHI is open source and freeware, and accepts a variety of input and export formats. Visit http://gephi.org/ for screenshots and downloads. Best wishes, Clement Levallois www.clementlevallois.net -------------- next part -------------- An HTML attachment was scrubbed... URL: From ahcjurquhart at BTINTERNET.COM Thu Feb 11 14:56:43 2010 From: ahcjurquhart at BTINTERNET.COM (AHCJ URQUHART) Date: Thu, 11 Feb 2010 19:56:43 +0000 Subject: Help needed in a study of medical journal page layouts Message-ID: Happy to help. I do work with medical journals, although my interests are more towards Health Services Research (and I am an associate editor, for my sins, on BMC Health Services Research, a much more onerous task than I thought it would be). But I have been an abstractor, and am interested in the process and the presentation of abstracts. Chris --- On Wed, 10/2/10, James Hartley wrote: From: James Hartley Subject: [SIGMETRICS] Help needed in a study of medical journal page layouts To: SIGMETRICS at LISTSERV.UTK.EDU Date: Wednesday, 10 February, 2010, 11:58 Dear Colleague I am carrying out a study of different formats for the first page of articles in scientific journals and would be grateful if you felt you were able to take part.
Please find attached some information about the study - which should take less than 10 mins - and an informed consent form which I need you to sign (an e-mail will do) before I can send you the materials! If you would also like to forward this invitation on to any of your colleagues who might be able to help, I would be very grateful. I look forward to hearing from you. Many thanks and best wishes James Hartley School of Psychology Keele University Staffordshire ST5 5BG UK j.hartley at psy.keele.ac.uk http://www.keele.ac.uk/depts/ps/people/JHartley/index.htm -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Invitation.doc Type: application/octet-stream Size: 32768 bytes Desc: not available URL: From garfield at CODEX.CIS.UPENN.EDU Mon Feb 15 14:19:07 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Mon, 15 Feb 2010 14:19:07 -0500 Subject: Leydesdorff, L; de Moya-Anegon, F; Guerrero-Bote, VP. 2010. Journal Maps on the Basis of Scopus Data: A Comparison with the Journal Citation Reports of the ISI. JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY 61 (2): 352-369. Message-ID: Leydesdorff, L; de Moya-Anegon, F; Guerrero-Bote, VP. 2010. Journal Maps on the Basis of Scopus Data: A Comparison with the Journal Citation Reports of the ISI. JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY 61 (2): 352-369. Author Full Name(s): Leydesdorff, Loet; de Moya-Anegon, Felix; Guerrero-Bote, Vicente P. Language: English Document Type: Article KeyWords Plus: WEB-OF-SCIENCE; GOOGLE-SCHOLAR; H-INDEX; RESEARCH PERFORMANCE; SCIENTIFIC JOURNALS; COMMUNICATION; INDICATORS; IMPACT; RESEARCHERS; SOCIOLOGY Abstract: Using the Scopus dataset (1996-2007) a grand matrix of aggregated journal-journal citations was constructed.
This matrix can be compared in terms of the network structures with the matrix contained in the Journal Citation Reports (JCR) of the Institute of Scientific Information (ISI). Because the Scopus database contains a larger number of journals and covers the humanities, one would expect richer maps. However, the matrix is in this case sparser than in the case of the ISI data. This is because of (a) the larger number of journals covered by Scopus and (b) the historical record of citations older than 10 years contained in the ISI database. When the data is highly structured, as in the case of large journals, the maps are comparable, although one may have to vary a threshold (because of the differences in densities). In the case of interdisciplinary journals and journals in the social sciences and humanities, the new database does not add a lot to what is possible with the ISI databases. Addresses: [Leydesdorff, Loet] Univ Amsterdam, ASCoR, NL-1012 CX Amsterdam, Netherlands; [de Moya-Anegon, Felix] SCImago Res Grp, CSIC, CCHS, IPP, Madrid, Spain; [Guerrero-Bote, Vicente P.] Univ Extremadura, Informat & Commun Sci Dept, E-06071 Badajoz, Spain Reprint Address: Leydesdorff, L, Univ Amsterdam, ASCoR, Kloveniersburgwal 48, NL-1012 CX Amsterdam, Netherlands. E-mail Address: loet at leydesdorff.net; felix.moya at scimago.es; guerrero at unex.es ISSN: 1532-2882 DOI: 10.1002/asi.21250 From garfield at CODEX.CIS.UPENN.EDU Mon Feb 15 14:22:45 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Mon, 15 Feb 2010 14:22:45 -0500 Subject: Campanario, JM. 2010. Distribution of Ranks of Articles and Citations in Journals. JASIST 61 (2): 419-423. Message-ID: Campanario, JM. 2010. Distribution of Ranks of Articles and Citations in Journals. JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY 61 (2): 419-423. 
Author Full Name(s): Miguel Campanario, Juan Language: English Document Type: Article KeyWords Plus: IMPACT FACTOR Abstract: I studied the distribution of articles and citations in journals between 1998 and 2007 according to an empirical function with two exponents. These variables showed good fit to a beta function with two exponents. Addresses: Univ Alcala De Henares, Dept Fis, Madrid 28871, Spain Reprint Address: Campanario, JM, Univ Alcala De Henares, Dept Fis, Madrid 28871, Spain. E-mail Address: Juan.campanario at uah.es ISSN: 1532-2882 DOI: 10.1002/asi.21238 From garfield at CODEX.CIS.UPENN.EDU Mon Feb 15 14:25:31 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Mon, 15 Feb 2010 14:25:31 -0500 Subject: Lariviere, V; Gingras, Y. 2010. The Impact Factor's Matthew Effect: A Natural Experiment in Bibliometrics. JASIST 61 (2): 424-427. Message-ID: Lariviere, V; Gingras, Y. 2010. The Impact Factor's Matthew Effect: A Natural Experiment in Bibliometrics. JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY 61 (2): 424-427. Author Full Name(s): Lariviere, Vincent; Gingras, Yves Language: English Document Type: Article KeyWords Plus: SCIENCE SYSTEM; SCALING RULES; CITATION CHARACTERISTICS; INDICATORS Abstract: Since the publication of Robert K. Merton's theory of cumulative advantage in science (Matthew Effect), several empirical studies have tried to measure its presence at the level of papers, individual researchers, institutions, or countries. However, these studies seldom control for the intrinsic "quality" of papers or of researchers-"better" (however defined) papers or researchers could receive higher citation rates because they are indeed of better quality. 
Using an original method for controlling for the intrinsic value of papers (identical duplicate papers published in different journals with different impact factors), this paper shows that the journal in which papers are published has a strong influence on their citation rates, as duplicate papers published in high-impact journals obtain, on average, twice as many citations as their identical counterparts published in journals with lower impact factors. The intrinsic value of a paper is thus not the only reason it gets cited; there is a specific Matthew Effect attached to journals, which gives papers published there an added value over and above their intrinsic quality. Addresses: [Lariviere, Vincent; Gingras, Yves] Univ Quebec Montreal, OST, CIRST, Montreal, PQ H3C 3P8, Canada Reprint Address: Lariviere, V, Univ Quebec Montreal, OST, CIRST, CP 8888,Succursale Ctr Ville, Montreal, PQ H3C 3P8, Canada. E-mail Address: lariviere.vincent at uqam.ca; gingras.yves at uqam.ca ISSN: 1532-2882 DOI: 10.1002/asi.21232 From loet at LEYDESDORFF.NET Tue Feb 16 01:46:17 2010 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Tue, 16 Feb 2010 07:46:17 +0100 Subject: Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance Message-ID: Caveats for the journal and field normalizations in the CWTS ("Leiden") evaluations of research performance Journal of Informetrics (forthcoming). http://arxiv.org/abs/1002.2769 Abstract: The Center for Science and Technology Studies at Leiden University advocates the use of specific normalizations for assessing research performance with reference to a world average. The Journal Citation Score (JCS) and Field Citation Score (FCS) are averaged for the research group or individual researcher under study, and then these values are used as denominators of the (mean) Citations per publication (CPP). Thus, this normalization is based on dividing two averages.
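The contrast the caveat turns on - dividing two averages versus averaging per-publication ratios - can be sketched in a few lines of Python. The citation counts and expected baseline rates below are invented to mimic a skewed distribution; this is an illustration of the statistical point, not the authors' code or data:

```python
import statistics

# Hypothetical per-publication data: observed citations, and the expected
# journal/field citation rates used as the baseline (invented values).
observed = [0, 0, 1, 2, 3, 40]
expected = [2.0, 4.0, 1.0, 2.0, 5.0, 8.0]

# CWTS-style normalization: a ratio of two averages (mean CPP over mean FCS).
ratio_of_means = statistics.mean(observed) / statistics.mean(expected)

# Alternative advocated in the paper: divide first for each publication,
# then average the resulting observed/expected ratios.
mean_of_ratios = statistics.mean(o / e for o, e in zip(observed, expected))
```

With skewed data the two indicators disagree noticeably (here roughly 2.09 versus 1.27), which is exactly the crux of the caveat: the single highly cited paper dominates the ratio of means but not the mean of ratios.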
This procedure generates a legitimate indicator only in the case of underlying normal distributions. Given the skewed distributions under study, one should instead first divide the observed by the expected values for each publication, and then average these ratios. We show the effects of the Leiden normalization for a recent evaluation where we happened to have access to the underlying data. Tobias Opthof [1,2], Loet Leydesdorff [3] [1] Experimental Cardiology Group, Heart Failure Research Center, Academic Medical Center AMC, Meibergdreef 9, 1105 AZ Amsterdam, The Netherlands. [2] Department of Medical Physiology, University Medical Center Utrecht, Utrecht, The Netherlands. [3] Amsterdam School of Communications Research (ASCoR), University of Amsterdam, Kloveniersburgwal 48, 1012 CX Amsterdam, The Netherlands. ** apologies for cross-postings -------------- next part -------------- An HTML attachment was scrubbed... URL: From garfield at CODEX.CIS.UPENN.EDU Tue Feb 16 10:39:49 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Tue, 16 Feb 2010 10:39:49 -0500 Subject: Asadi, M; Shekofteh, M. 2009. The relationship between the research activity of Iranian medical universities and their web impact factor. ELECTRONIC LIBRARY 27 (6): 1026-1043. Message-ID: Asadi, M; Shekofteh, M. 2009. The relationship between the research activity of Iranian medical universities and their web impact factor. ELECTRONIC LIBRARY 27 (6): 1026-1043. Author Full Name(s): Asadi, Maryam; Shekofteh, Maryam Language: English Document Type: Article Author Keywords: Research; Medical schools; Worldwide web; Students; Iran KeyWords Plus: INFORMATION; LINKS; DEPARTMENTS Abstract: Purpose - The aim of this paper is to examine the relationship between research activity in Iranian medical universities and their Web Impact Factor (WIF).
Design/methodology/approach - The Altavista search engine was chosen for its advanced search facilities for counting links and its wider coverage of academic web sites than other search engines. It was used to determine the number of pages, in-links and self-links of the web sites of 42 Iranian medical universities. The Web Impact Factor (WIF) was calculated using two formulas, and the relationship between the two rankings of universities (WIF and ISI) was assessed using Spearman's correlation coefficient. Findings - Tehran, Iran and Gilan medical universities ranked first to third in the number of web site pages. The number of in-links to Tehran, Isfahan and Tabriz medical universities was higher than that of the others. The WIF of universities' web sites was counted in four groups with regard to their number of web pages, and Kerman, Kermanshah, Fasa and Qom had the highest WIF in each group, but there was no meaningful relationship between WIF ranks and ISI ranks. Also, the WIF of university web sites was counted with regard to the number of their members in the four groups, and the results show that Hormozgan, Shiraz, Isfahan and Tehran had the highest ranks in each group. Again, there was no meaningful relation between WIF ranks and ISI ranks. Research limitations/implications - It seems that counting the number of in-links is a better measure for ranking university web sites than the WIF. The WIF is proposed only in situations where the numbers of web pages are about equal. Originality/value - The paper provides rankings, for the first time, of Iranian medical university web sites with regard to WIF, based on the number of web pages as well as on the number of academic staff. The research shows a methodology that others can follow.
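The two computations named in the abstract - a link-based Web Impact Factor and Spearman's rank correlation between two rankings - can be sketched as follows. All numbers are invented, and the simple in-links-over-pages formula is only one of the WIF variants used in webometrics; the paper's exact formulas may differ:

```python
def wif(inlinks, pages):
    """Simple Web Impact Factor: in-links divided by number of pages
    (one common webometric formula; an assumption, not the paper's exact one)."""
    return inlinks / pages

def spearman(xs, ys):
    """Spearman rank correlation, computed as Pearson's r on ranks.
    Tied values receive the average of their ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average 1-based rank for the tied block
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical universities: page counts, in-link counts, and an ISI-based rank.
pages = [1200, 300, 800, 150]
inlinks = [2400, 450, 400, 300]
wifs = [wif(l, p) for l, p in zip(inlinks, pages)]
rho = spearman(wifs, [3, 2, 4, 1])  # correlation between WIF and ISI ranking
```

A rho near zero, as in the study's finding of "no meaningful relationship," would mean the WIF ordering tells you little about the ISI ordering.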
Addresses: [Asadi, Maryam] Sharif Univ Technol, Cent Lib, Tehran, Iran; [Shekofteh, Maryam] Shahid Beheshti Univ Med Sci & Human Serv, Paramed Fac, Dept Lib & Informat Sci, Tehran, Iran Reprint Address: Shekofteh, M, Islam Azad Univ, Tehran, Iran. E-mail Address: shekofteh_m at yahoo.com ISSN: 0264-0473 DOI: 10.1108/02640470911004101 URL: http://195.92.228.61/10.1108/02640470911004101 From garfield at CODEX.CIS.UPENN.EDU Tue Feb 16 10:43:57 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Tue, 16 Feb 2010 10:43:57 -0500 Subject: Rojas-Sola et al. 2009. Bibliometric analysis of Latin American, Spanish and Portuguese Scientific Publications in the subject materials science, ceramics in JCR (SCI) database (1997-2008). Message-ID: Rojas-Sola, JI; Jorda-Albinana, B; Criado-Herrero, E. 2009. Bibliometric analysis of Latin American, Spanish and Portuguese Scientific Publications in the subject materials science, ceramics in JCR (SCI) database (1997-2008). BOLETIN DE LA SOCIEDAD ESPANOLA DE CERAMICA Y VIDRIO 48 (6): 297-310. Author Full Name(s): Rojas-Sola, J. I.; Jorda-Albinana, B.; Criado-Herrero, E. Language: Spanish Document Type: Article Author Keywords: Bibliometric analysis; Impact Factor; Latinoamerican; ceramic research institutions; Universities; Research Centres; Ceramic Materials KeyWords Plus: INDICATORS; COOPERATION Abstract: The Latin American scientific community is becoming increasingly significant in many areas, particularly in the ceramic field, because of its proximity to the processes of generation of infrastructure and housing demand in developing societies. The present study determines the specific weight that each country, research institution and author has acquired. The thirty journals included in Journal Citation Reports under the category "Materials Science, Ceramics" over the 1997-2008 period have been selected, and articles from Latin America and Portugal have been analyzed under a bibliometric approach.
Thus, within the document type "Journal Article or Review", 1423 papers have been collected and studied from an institutional perspective; different bibliometric indicators (number of documents, weighted impact factor, relative impact factor and the ratio between the number of citations and the number of documents) have been elaborated. Among the research centers, the most relevant are the University of Aveiro (Portugal) and the Universidade Federal de Sao Carlos (Brazil), followed by the Universidade Estadual Paulista and the Universidade de Sao Paulo, both in Brazil. The latter is also notable for its high weighted impact factor. Regarding publications, the Journal of Non-Crystalline Solids ranked first, accounting for 20.45% of the scientific production of Latin America and Portugal published in the JCR (1423 items). However, if data from Spain are included, the journal Boletin de la Sociedad Espanola de Ceramica y Vidrio is the most relevant, with the highest number of articles (524), representing 19.68% of the total records found (2663). The study also confirms remarkable international collaboration, mainly with Spain, France, Brazil, the USA, England and Portugal; the latter country stands out, with 49.11% of its analyzed scientific production carried out in international collaboration. Finally, the annual impact factor of scientific publications was found to show a growing trend in all countries, particularly in Brazil. Addresses: [Rojas-Sola, J. I.] Univ Jaen, Escuela Politecn Super Jaen, Jaen 23071, Spain; [Jorda-Albinana, B.] Univ Politecn Valencia, ETS Ingn Diseno, Valencia 46022, Spain; [Criado-Herrero, E.] CSIC, Inst Ceram & Vidrio, E-28049 Madrid, Spain Reprint Address: Rojas-Sola, JI, Univ Jaen, Escuela Politecn Super Jaen, Campus Lagunillas S-N, Jaen 23071, Spain.
E-mail Address: jirojas at ujaen.es; bego at mag.upv.es; ecriado at icv.csic.es ISSN: 0366-3175 URL: http://boletines.secv.es/en/index.php From garfield at CODEX.CIS.UPENN.EDU Tue Feb 16 10:46:59 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Tue, 16 Feb 2010 10:46:59 -0500 Subject: van Nierop, E. 2010. The introduction of the 5-year impact factor: does it benefit statistics journals?. STATISTICA NEERLANDICA 64 (1): 71-76. Message-ID: van Nierop, E. 2010. The introduction of the 5-year impact factor: does it benefit statistics journals?. STATISTICA NEERLANDICA 64 (1): 71-76. Author Full Name(s): van Nierop, Erjen Language: English Document Type: Article Author Keywords: impact factors; 5-year impact factor; citations Abstract: In this paper, we investigate the effects of the introduction of 5- year impact factors. We collect impact factor data for all disciplines available in the Journal Citation Reports. For all these categories, we give insights into the relationship between the traditional 2-year impact factor and the new 5-year impact factor. Our main focus is to investigate whether the traditionally low impact factors for statistics journals improve with this new measure. The results show that the statistics discipline indeed benefits from the 5-year window, that is, the impact factor increases. This appears to be true for most disciplines, although the statistics discipline ranks among the top 15 (out of 171) disciplines in this respect. Addresses: Univ Groningen, Fac Econ & Business, Dept Mkt, NL-9700 AV Groningen, Netherlands Reprint Address: van Nierop, E, Univ Groningen, Fac Econ & Business, Dept Mkt, POB 800, NL-9700 AV Groningen, Netherlands. 
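The 2-year versus 5-year window comparison that van Nierop investigates can be illustrated schematically. The journal data below are invented, and JCR's exact rules for counting citable items differ; the point is only how a longer citation window helps a slow-citing field like statistics:

```python
def impact_factor(citations_by_pub_year, items_by_pub_year, census_year, window):
    """Schematic journal impact factor with a citation window.

    citations_by_pub_year[y]: citations received in census_year by items
    published in year y. items_by_pub_year[y]: citable items published in y.
    window=2 approximates the traditional JCR impact factor, window=5 the
    newer measure. (A reconstruction, not JCR's exact procedure.)"""
    years = range(census_year - window, census_year)
    cites = sum(citations_by_pub_year.get(y, 0) for y in years)
    items = sum(items_by_pub_year.get(y, 0) for y in years)
    return cites / items

# Hypothetical statistics journal: citations arrive slowly, so older papers
# still collect many citations in the census year.
cites = {2004: 90, 2005: 100, 2006: 110, 2007: 60, 2008: 40}
items = {2004: 50, 2005: 50, 2006: 50, 2007: 50, 2008: 50}
if2 = impact_factor(cites, items, 2009, 2)   # (60 + 40) / 100 = 1.0
if5 = impact_factor(cites, items, 2009, 5)   # 400 / 250 = 1.6
```

Because most of this journal's citations accrue more than two years after publication, its 5-year impact factor exceeds its 2-year one, mirroring the benefit the paper reports for statistics journals.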
E-mail Address: j.e.m.van.nierop at rug.nl ISSN: 0039-0402 DOI: 10.1111/j.1467-9574.2009.00448.x URL: http://www3.interscience.wiley.com/journal/123247514/abstract From garfield at CODEX.CIS.UPENN.EDU Tue Feb 16 10:51:51 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Tue, 16 Feb 2010 10:51:51 -0500 Subject: Krell, FT. 2010. Should editors influence journal impact factors?. LEARNED PUBLISHING 23 (1): 59-62. Message-ID: Krell, FT. 2010. Should editors influence journal impact factors?. LEARNED PUBLISHING 23 (1): 59-62. Author Full Name(s): Krell, Frank-Thorsten Language: English Document Type: Article KeyWords Plus: SCIENTIFIC JOURNALS; FACTOR MANIPULATION; CITATION DATABASES; REFERENCES; SCIENCE; PERFORMANCE; VALIDITY; QUALITY Abstract: The journal impact factor is widely used as a performance indicator for single authors (despite its unsuitability in this respect). Hence, authors are increasingly exercised if there is any sign that impact factors are being manipulated. Editors who ask authors to cite relevant papers from their own journal are accused of acting unethically. This is surprising because, besides publishers, authors are the primary beneficiaries of an increased impact factor of the journal in which they publish, and because the citation process is biased anyway. There is growing evidence that quality and relevance are not always the reasons for choosing references. Authors' biases and personal environments, as well as strategic considerations, are major factors. As long as an editor does not force authors to cite irrelevant papers from their own journal, I consider it a matter of caretaking for the journal and its authors if an editor brings recent papers to the authors' attention. It would be unfair to authors and disloyal to the publisher if an editor did not try to increase the impact of his/her own journal.
(C) Frank-Thorsten Krell Addresses: Denver Museum Nat & Sci, Dept Zool, Denver, CO 80205 USA Reprint Address: Krell, FT, Denver Museum Nat & Sci, Dept Zool, 2001 Colorado Blvd, Denver, CO 80205 USA. E-mail Address: frank.krell at dmns.org ISSN: 0953-1513 DOI: 10.1087/20100110 URL: http://www.dmns.org/science/curators/frank-krell/editors-influence.pdf From garfield at CODEX.CIS.UPENN.EDU Tue Feb 16 10:56:09 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Tue, 16 Feb 2010 10:56:09 -0500 Subject: Hartley, J. 2010. Never mind the impact factor: colleagues know better!. LEARNED PUBLISHING 23 (1): 63-65. Message-ID: Hartley, J. 2010. Never mind the impact factor: colleagues know better!. LEARNED PUBLISHING 23 (1): 63-65. Author Full Name(s): Hartley, James Language: English Document Type: Editorial Material KeyWords Plus: JOURNALS Addresses: Univ Keele, Sch Psychol, Keele ST5 5BG, Staffs, England Reprint Address: Hartley, J, Univ Keele, Sch Psychol, Keele ST5 5BG, Staffs, England. E-mail Address: j.hartley at psy.keele.ac.uk ISSN: 0953-1513 DOI: 10.1087/20100111 URL: http://www.ingentaconnect.com/content/alpsp/lp/2010/00000023/00000001/art00011 From garfield at CODEX.CIS.UPENN.EDU Tue Feb 16 10:58:54 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Tue, 16 Feb 2010 10:58:54 -0500 Subject: Evers, JLH. 2010. 100 papers to read before you die. HUMAN REPRODUCTION 25 (1): 2-5. Message-ID: Evers, JLH. 2010. 100 papers to read before you die. HUMAN REPRODUCTION 25 (1): 2-5. Author Full Name(s): Evers, Johannes L. H.
Language: English Document Type: Editorial Material KeyWords Plus: CITATION-CLASSICS; JOURNALS; ARTICLES E-mail Address: Jlh.evers at mumc.nl ISSN: 0268-1161 DOI: 10.1093/humrep/dep336 URL: http://humrep.oxfordjournals.org/cgi/content/full/dep336 From eugene.garfield at THOMSONREUTERS.COM Thu Feb 18 10:50:19 2010 From: eugene.garfield at THOMSONREUTERS.COM (Eugene Garfield) Date: Thu, 18 Feb 2010 09:50:19 -0600 Subject: David Pendlebury talks about Predicting Nobel Prizes through citation analysis on Youtube Message-ID: http://www.youtube.com/user/ThomsonReutersIIFL#p/u/15/xHyYOcWpiyI Subscribers to the Metrics Virtual SIG will enjoy this interview with David Pendlebury. ------------------------------------------------------------------------ ------------ Eugene Garfield, PhD. email: garfield at codex.cis.upenn.edu home page: www.eugenegarfield.org -------------- next part -------------- An HTML attachment was scrubbed... URL: From jzus at ZJU.EDU.CN Mon Feb 22 04:15:50 2010 From: jzus at ZJU.EDU.CN (Helen (Yuehong) ZHANG) Date: Mon, 22 Feb 2010 04:15:50 -0500 Subject: CrossCheck: an effective tool for detecting plagiarism Message-ID: Title: CASE STUDY: CrossCheck: an effective tool for detecting plagiarism Author: Helen (Yuehong) ZHANG, Journal of Zhejiang University-SCIENCE (A/B/C) Source: Learned Publishing, VOL. 23 NO. 1, JANUARY 2010, 23: 9-14 doi:10.1087/20100103 ABSTRACT. The plagiarism detection service CrossCheck has been used since October 2008 as part of the paper reviewing process for the Journal of Zhejiang University-Science (A&B). Between October 2008 and May 2009, 662 papers were CrossChecked; 151 of these (around 22.8% of submitted papers) were found to contain apparently unreasonable levels of copying or self-plagiarism, and 25.8% of these cases (39 papers) gave rise to serious suspicions of plagiarism and copyright infringement.
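As a quick check of the figures quoted in the abstract, both percentages follow directly from the raw counts; note that 25.8% is a share of the 151 flagged papers, not of all 662 submissions:

```python
# Counts quoted in the CrossCheck case study abstract.
checked, flagged, serious = 662, 151, 39

pct_flagged = 100 * flagged / checked   # share of submissions flagged
pct_serious = 100 * serious / flagged   # share of flagged cases judged serious
```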
Four types of copying or plagiarism were identified in an attempt to reach a consensus on this type of academic misconduct. http://www.zju.edu.cn/jzus/download/CrossCheck.pdf Helen (Yuehong) ZHANG E-mail Addresses: jzus at zju.edu.cn Publisher: ALPSP From Ping.Zhou at ECON.KULEUVEN.BE Mon Feb 22 04:36:22 2010 From: Ping.Zhou at ECON.KULEUVEN.BE (Zhou, Ping) Date: Mon, 22 Feb 2010 10:36:22 +0100 Subject: A comparative study on communication structures of Chinese journals in the social sciences Message-ID: Title: A comparative study on communication structures of Chinese journals in the social sciences Source: Journal of the American Society for Information Science and Technology, forthcoming Authors: Ping Zhou, Xinning Su, Loet Leydesdorff Abstract: We argue that the communication structures in the Chinese social sciences have not yet been sufficiently reformed. Citation patterns among Chinese domestic journals in three subject areas - political science and Marxism, library and information science, and economics - are compared with their counterparts internationally. Like their colleagues in the natural and life sciences, Chinese scholars in the social sciences provide fewer references to journal publications than their international counterparts; like their international colleagues, social scientists provide fewer references than natural scientists. The resulting citation networks, therefore, are sparse. Nevertheless, the citation structures clearly suggest that the Chinese social sciences are far less specialized in terms of disciplinary delineations than their international counterparts. Marxism studies are more established than political science in China. In terms of the impact of the Chinese political system on academic fields, disciplines closely related to the political system are less specialized than those weakly related. In the discussion section, we explore reasons that may cause the current stagnation and provide policy recommendations.
From dwojick at HUGHES.NET Mon Feb 22 06:56:59 2010 From: dwojick at HUGHES.NET (David Wojick) Date: Mon, 22 Feb 2010 11:56:59 +0000 Subject: A comparative study on communication structures of Chinese journals in the social sciences Message-ID: An HTML attachment was scrubbed... URL: From Ping.Zhou at ECON.KULEUVEN.BE Mon Feb 22 09:33:14 2010 From: Ping.Zhou at ECON.KULEUVEN.BE (Zhou, Ping) Date: Mon, 22 Feb 2010 15:33:14 +0100 Subject: A comparative study on communication structures of Chinese journals in the social sciences In-Reply-To: <208104695.24875.1266839819596.JavaMail.mail@webmail10> Message-ID: Dear David, The "stagnation" is concluded based on two types of comparison: comparison with natural and life sciences domestically and the specialization comparison between Chinese and international communities. Domestic comparison shows that China's world share of publications in the social sciences lags dramatically behind that in the natural and life sciences (see Figure 1 of the paper). International comparison shows that China is less specialized than its international counterparts. We argue that specialization links to maturity of a discipline; less specialization may lead to slower progress. The full text of the paper can be retrieved at: http://arxiv.org/abs/1002.3590 With kind regards, Ping Zhou From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of David Wojick Sent: Monday, February 22, 2010 12:57 PM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] A comparative study on communication structures of Chinese journals in the social sciences Adminstrative info for SIGMETRICS (for example unsubscribe): http://web.utk.edu/~gwhitney/sigmetrics.html Dear Ping Zhou, What do you mean by "stagnation" and how are you measuring it? This seems like an odd conclusion to draw from a citation analysis. 
David On Feb 22, 2010, Zhou, Ping wrote: Adminstrative info for SIGMETRICS (for example unsubscribe): http://web.utk.edu/~gwhitney/sigmetrics.html Title: A comparative study on communication structures of Chinese journals in the social sciences Source: Journal of the American Society for Information Science and Technology, forthcoming Authors: Ping Zhou, Xinning Su, Loet Leydesdorff Abstract: We argue that the communication structures in the Chinese social sciences have not yet been sufficiently reformed. Citation patterns among Chinese domestic journals in three subject areas-political science and marxism, library and information science, and economics-are compared with their counterparts internationally. Like their colleagues in the natural and life sciences, Chinese scholars in the social sciences provide fewer references to journal publications than their international counterparts; like their international colleagues, social scientists provide fewer references than natural sciences. The resulting citation networks, therefore, are sparse. Nevertheless, the citation structures clearly suggest that the Chinese social sciences are far less specialized in terms of disciplinary delineations than their international counterparts. Marxism studies are more established than political science in China. In terms of the impact of the Chinese political system o! n academic fields, disciplines closely related to the political system are less specialized than those weakly related. In the discussion section, we explore reasons that may cause the current stagnation and provide policy recommendations. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From wilsontd at GMAIL.COM Mon Feb 22 09:47:48 2010 From: wilsontd at GMAIL.COM (Tom Wilson) Date: Mon, 22 Feb 2010 14:47:48 +0000 Subject: A comparative study on communication structures of Chinese journals in the social sciences In-Reply-To: <92C4100F72FC214CAA56E9D1A6B5A3AA01268C2C6F29@ECONSRVEX6.econ.kuleuven.ac.be> Message-ID: I suspect that the problem lies in the communist ideology - it was the same in the days of the Soviet Union - Marxist/Leninism is presumed to be the only philosophy/methodology needed to explain social life and for a social scientist to explore different avenues was positively dangerous. With the collapse of communism in Russia, social scientists there and in the former dependencies are now exploring Western social philosophy and, to a certain extent, anything Marxist is considered suspect - a complete reversal of the previous situation. Given the tight control of society in China, I imagine that looking beyond Marxism for social explanation would also be considered suspect. Tom Wilson On 22 February 2010 14:33, Zhou, Ping wrote: > Dear David, > > > > The "stagnation" is concluded based on two types of comparison: comparison > with natural and life sciences domestically and the specialization > comparison between Chinese and international communities. > > > > Domestic comparison shows that China's world share of publications in the > social sciences lags dramatically behind that in the natural and life > sciences (see Figure 1 of the paper). > > > > International comparison shows that China is less specialized than its > international counterparts. We argue that specialization links to maturity > of a discipline; less specialization may lead to slower progress. 
> > > > The full text of the paper can be retrieved at: > > http://arxiv.org/abs/1002.3590 > > > > With kind regards, > > > > Ping Zhou > > > > > > *From:* ASIS&T Special Interest Group on Metrics [mailto: > SIGMETRICS at LISTSERV.UTK.EDU] *On Behalf Of *David Wojick > *Sent:* Monday, February 22, 2010 12:57 PM > *To:* SIGMETRICS at LISTSERV.UTK.EDU > *Subject:* Re: [SIGMETRICS] A comparative study on communication > structures of Chinese journals in the social sciences > > > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html Dear Ping Zhou, > > > > What do you mean by "stagnation" and how are you measuring it? This seems > like an odd conclusion to draw from a citation analysis. > > > David > > On Feb 22, 2010, *Zhou, Ping* wrote: > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Title: A comparative study on communication structures of Chinese journals > in the social sciences > > Source: Journal of the American Society for Information Science and > Technology, forthcoming > > Authors: Ping Zhou, Xinning Su, Loet Leydesdorff > > Abstract: We argue that the communication structures in the Chinese social > sciences have not yet been sufficiently reformed. Citation patterns among > Chinese domestic journals in three subject areas-political science and > marxism, library and information science, and economics-are compared with > their counterparts internationally. Like their colleagues in the natural and > life sciences, Chinese scholars in the social sciences provide fewer > references to journal publications than their international counterparts; > like their international colleagues, social scientists provide fewer > references than natural sciences. The resulting citation networks, > therefore, are sparse. 
Nevertheless, the citation structures clearly suggest > that the Chinese social sciences are far less specialized in terms of > disciplinary delineations than their international counterparts. Marxism > studies are more established than political science in China. In terms of > the impact of the Chinese political system o! > n academic fields, disciplines closely related to the political system are > less specialized than those weakly related. In the discussion section, we > explore reasons that may cause the current stagnation and provide policy > recommendations. > > -- ---------------------------------------------------------- Professor Tom Wilson, PhD, PhD (h.c.), ----------------------------------------------------------- Publisher and Editor in Chief: Information Research: an international electronic journal Website - http://InformationR.net/ir/ Blog - http://info-research.blogspot.com/ Photoblog - http://tomwilson.shutterchance.com/ ----------------------------------------------------------- E-mail: wilsontd at gmail.com ----------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From loet at LEYDESDORFF.NET Mon Feb 22 10:33:15 2010 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Mon, 22 Feb 2010 16:33:15 +0100 Subject: A comparative study on communication structures of Chinese journals in the social sciences In-Reply-To: Message-ID: Dear Tom, I am not so sure that China now is directly comparable with the Soviet Union. Library and information science, for example, are well developed. With best wishes, Loet _____ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR), Kloveniersburgwal 48, 1012 CX Amsterdam. 
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ _____ From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Tom Wilson Sent: Monday, February 22, 2010 3:48 PM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] A comparative study on communication structures of Chinese journals in the social sciences I suspect that the problem lies in the communist ideology - it was the same in the days of the Soviet Union - Marxist/Leninism is presumed to be the only philosophy/methodology needed to explain social life and for a social scientist to explore different avenues was positively dangerous. With the collapse of communism in Russia, social scientists there and in the former dependencies are now exploring Western social philosophy and, to a certain extent, anything Marxist is considered suspect - a complete reversal of the previous situation. Given the tight control of society in China, I imagine that looking beyond Marxism for social explanation would also be considered suspect. Tom Wilson On 22 February 2010 14:33, Zhou, Ping wrote: Dear David, The "stagnation" is concluded based on two types of comparison: comparison with natural and life sciences domestically and the specialization comparison between Chinese and international communities. Domestic comparison shows that China's world share of publications in the social sciences lags dramatically behind that in the natural and life sciences (see Figure 1 of the paper). International comparison shows that China is less specialized than its international counterparts. We argue that specialization links to maturity of a discipline; less specialization may lead to slower progress.
The full text of the paper can be retrieved at: http://arxiv.org/abs/1002.3590 With kind regards, Ping Zhou From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of David Wojick Sent: Monday, February 22, 2010 12:57 PM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] A comparative study on communication structures of Chinese journals in the social sciences What do you mean by "stagnation" and how are you measuring it? This seems like an odd conclusion to draw from a citation analysis. David On Feb 22, 2010, Zhou, Ping wrote: Title: A comparative study on communication structures of Chinese journals in the social sciences Source: Journal of the American Society for Information Science and Technology, forthcoming Authors: Ping Zhou, Xinning Su, Loet Leydesdorff Abstract: We argue that the communication structures in the Chinese social sciences have not yet been sufficiently reformed. Citation patterns among Chinese domestic journals in three subject areas-political science and marxism, library and information science, and economics-are compared with their counterparts internationally. Like their colleagues in the natural and life sciences, Chinese scholars in the social sciences provide fewer references to journal publications than their international counterparts; like their international colleagues, social scientists provide fewer references than natural sciences. The resulting citation networks, therefore, are sparse. Nevertheless, the citation structures clearly suggest that the Chinese social sciences are far less specialized in terms of disciplinary delineations than their international counterparts. Marxism studies are more established than political science in China. In terms of the impact of the Chinese political system o! n academic fields, disciplines closely related to the political system are less specialized than those weakly related. 
In the discussion section, we explore reasons that may cause the current stagnation and provide policy recommendations. -- ---------------------------------------------------------- Professor Tom Wilson, PhD, PhD (h.c.), ----------------------------------------------------------- Publisher and Editor in Chief: Information Research: an international electronic journal Website - http://InformationR.net/ir/ Blog - http://info-research.blogspot.com/ Photoblog - http://tomwilson.shutterchance.com/ ----------------------------------------------------------- E-mail: wilsontd at gmail.com ----------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From Ping.Zhou at ECON.KULEUVEN.BE Mon Feb 22 11:33:15 2010 From: Ping.Zhou at ECON.KULEUVEN.BE (Zhou, Ping) Date: Mon, 22 Feb 2010 17:33:15 +0100 Subject: A comparative study on communication structures of Chinese journals in the social sciences In-Reply-To: Message-ID: Dear Tom, Marxism does play a fundamental role in China. But less specialization of Chinese social sciences cannot be attributed to Marxism. Similar situation also happened in non-communist countries, as indicated in our paper: The current state of the social sciences is comparable to that in the smaller countries of Europe during the 1970s. Reorganizations during the 1980s transformed, for example, the Dutch system from a locally oriented one to an international one. National journals are nowadays also specialized. Scandinavian countries went through similar transitions during this period, and Spain and Italy followed during the 1990s. French and German journals have become more specialized and are now sometimes multilingual. 
With best regards, Ping Zhou From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Tom Wilson Sent: Monday, February 22, 2010 3:48 PM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] A comparative study on communication structures of Chinese journals in the social sciences I suspect that the problem lies in the communist ideology - it was the same in the days of the Soviet Union - Marxist/Leninism is presumed to be the only philosophy/methodology needed to explain social life and for a social scientist to explore different avenues was positively dangerous. With the collapse of communism in Russia, social scientists there and in the former dependencies are now exploring Western social philosophy and, to a certain extent, anything Marxist is considered suspect - a complete reversal of the previous situation. Given the tight control of society in China, I imagine that looking beyond Marxism for social explanation would also be considered suspect. Tom Wilson On 22 February 2010 14:33, Zhou, Ping wrote: Dear David, The "stagnation" conclusion is based on two types of comparison: a comparison with the natural and life sciences domestically, and a specialization comparison between the Chinese and international communities. The domestic comparison shows that China's world share of publications in the social sciences lags dramatically behind that in the natural and life sciences (see Figure 1 of the paper). The international comparison shows that China is less specialized than its international counterparts. We argue that specialization is linked to the maturity of a discipline; less specialization may lead to slower progress. 
The full text of the paper can be retrieved at: http://arxiv.org/abs/1002.3590 With kind regards, Ping Zhou From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of David Wojick Sent: Monday, February 22, 2010 12:57 PM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] A comparative study on communication structures of Chinese journals in the social sciences What do you mean by "stagnation" and how are you measuring it? This seems like an odd conclusion to draw from a citation analysis. David On Feb 22, 2010, Zhou, Ping wrote: Title: A comparative study on communication structures of Chinese journals in the social sciences Source: Journal of the American Society for Information Science and Technology, forthcoming Authors: Ping Zhou, Xinning Su, Loet Leydesdorff Abstract: We argue that the communication structures in the Chinese social sciences have not yet been sufficiently reformed. Citation patterns among Chinese domestic journals in three subject areas - political science and marxism, library and information science, and economics - are compared with their counterparts internationally. Like their colleagues in the natural and life sciences, Chinese scholars in the social sciences provide fewer references to journal publications than their international counterparts; like their international colleagues, social scientists provide fewer references than natural scientists. The resulting citation networks, therefore, are sparse. Nevertheless, the citation structures clearly suggest that the Chinese social sciences are far less specialized in terms of disciplinary delineations than their international counterparts. Marxism studies are more established than political science in China. In terms of the impact of the Chinese political system on academic fields, disciplines closely related to the political system are less specialized than those weakly related. 
In the discussion section, we explore reasons that may cause the current stagnation and provide policy recommendations. -- ---------------------------------------------------------- Professor Tom Wilson, PhD, PhD (h.c.), ----------------------------------------------------------- Publisher and Editor in Chief: Information Research: an international electronic journal Website - http://InformationR.net/ir/ Blog - http://info-research.blogspot.com/ Photoblog - http://tomwilson.shutterchance.com/ ----------------------------------------------------------- E-mail: wilsontd at gmail.com ----------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From harnad at ECS.SOTON.AC.UK Mon Feb 22 19:42:30 2010 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Mon, 22 Feb 2010 19:42:30 -0500 Subject: Studies showing that review articles get more citations Message-ID: I would be grateful if someone could point me to studies showing that review articles are cited more than ordinary articles. Many thanks, Stevan Harnad From notsjb at LSU.EDU Mon Feb 22 21:11:48 2010 From: notsjb at LSU.EDU (Stephen J Bensman) Date: Mon, 22 Feb 2010 20:11:48 -0600 Subject: Studies showing that review articles get more citations Message-ID: Try those at the URLs below. They contain discussions of the importance of review articles in both Garfield's thinking and science. http://garfield.library.upenn.edu/bensman/bensman072008.pdf http://garfield.library.upenn.edu/bensman/bensmanegif2007.pdf http://garfield.library.upenn.edu/bensman/bensmanegif22007.pdf Stephen J. 
Bensman LSU Libraries Louisiana State University Baton Rouge, LA 70803 USA ________________________________ From: ASIS&T Special Interest Group on Metrics on behalf of Stevan Harnad Sent: Mon 2/22/2010 6:42 PM To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] Studies showing that review articles get more citations I would be grateful if someone could point me to studies showing that review articles are cited more than ordinary articles. Many thanks, Stevan Harnad -------------- next part -------------- An HTML attachment was scrubbed... URL: From vmarkusova at YAHOO.COM Tue Feb 23 03:47:22 2010 From: vmarkusova at YAHOO.COM (Valentina Markusova) Date: Tue, 23 Feb 2010 00:47:22 -0800 Subject: Re: [SIGMETRICS] Studies showing that review articles get more citations In-Reply-To: Message-ID: Dear Stevan, Grant Lewison published a paper in Scientometrics on this subject. Best regards, Valentina Markusova --- On Tue, 23.2.10, Stevan Harnad wrote: > From: Stevan Harnad > Subject: [SIGMETRICS] Studies showing that review articles get more citations > To: SIGMETRICS at listserv.utk.edu > Date: Tuesday, 23 February 2010, 3:42 > Adminstrative info for SIGMETRICS > (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > I would be grateful if someone could point me to studies > showing that review articles are cited more than ordinary > articles. > > Many thanks, > > Stevan Harnad From wainer at IC.UNICAMP.BR Tue Feb 23 07:23:44 2010 From: wainer at IC.UNICAMP.BR (Jacques Wainer) Date: Tue, 23 Feb 2010 09:23:44 -0300 Subject: Studies showing that review articles get more citations In-Reply-To: (Stevan Harnad's message of "Mon, 22 Feb 2010 19:42:30 -0500") Message-ID: I used: @Article{reviewpap1, author = {Aksnes, D. 
W.}, title = {Citation rates and perceptions of scientific contribution}, journal = {Journal of the American Society for Information Science and Technology}, year = 2006, key = 2, volume = 57, pages = {169-185}, doi = {10.1002/asi.20262}} @Article{reviewpap3, author = {H. P. F. Peters and A. F. J. van Raan}, title = {On determinants of citation scores: A case study in chemical engineering}, journal = {Journal of the American Society for Information Science}, year = 1994, volume = 45, number = 1, pages = {39 - 49}} as two references to the phenomenon. In this line, does anyone know of studies that point out that METHODOLOGICAL papers are also cited more than other research? Thanks Jacques Wainer From wilsontd at GMAIL.COM Tue Feb 23 08:51:00 2010 From: wilsontd at GMAIL.COM (Tom Wilson) Date: Tue, 23 Feb 2010 13:51:00 +0000 Subject: Studies showing that review articles get more citations In-Reply-To: <87fx4s87jj.fsf@ic.unicamp.br> Message-ID: Is it really worth exploring? I'd have thought it self-evident that, if you are looking for a review of the literature, as most authors are, you'll cite existing reviews; similarly with methodology - if you are using a particular theoretical perspective you'll want to cite others as confirmation that you are on the right track. One of the problems of bibliometrics appears to be a stunning facility for determining the obvious :-) Tom Wilson On 23 February 2010 12:23, Jacques Wainer wrote: > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > I used: > > @Article{reviewpap1, > author = {Aksnes, D. W.}, > title = {Citation rates and perceptions of scientific > contribution}, > journal = {Journal of the American Society for Information Science > and Technology}, > year = 2006, > key = 2, > volume = 57, > pages = {169-185}, > doi = {10.1002/asi.20262}} > > > @Article{reviewpap3, > author = {H. P. F. Peters and A. F. J. 
van Raan}, > title = {On determinants of citation scores: A case study in > chemical engineering}, > journal = {Journal of the American Society for Information Science}, > year = 1994, > volume = 45, > number = 1, > pages = {39 - 49}} > > > as two references to the phenomenon. In this line, does anyone know > of studies that point out that METHODOLOGICAL papers are also cited more > than other research? > > Thanks > > Jacques Wainer > -- ---------------------------------------------------------- Professor Tom Wilson, PhD, PhD (h.c.), ----------------------------------------------------------- Publisher and Editor in Chief: Information Research: an international electronic journal Website - http://InformationR.net/ir/ Blog - http://info-research.blogspot.com/ Photoblog - http://tomwilson.shutterchance.com/ ----------------------------------------------------------- E-mail: wilsontd at gmail.com ----------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From amsciforum at GMAIL.COM Tue Feb 23 09:06:10 2010 From: amsciforum at GMAIL.COM (Stevan Harnad) Date: Tue, 23 Feb 2010 09:06:10 -0500 Subject: Studies showing that review articles get more citations In-Reply-To: Message-ID: On Tue, Feb 23, 2010 at 8:51 AM, Tom Wilson wrote: > Is it really worth exploring? > > I'd have thought it self-evident that, if you are looking for a review of > the literature, as most authors are, you'll site existing reviews; similarly > with methodology - if you are using a particular theoretical perspective > you'll want to cite others as confirmation that you are on the right track. > One of the problems of bibliometrics appears to be a stunning facility for > determining the obvious :-) It is obvious that reviews will cite reviews, and that authors will cite supporting studies, but is it obvious that reviews are cited more than ordinary articles? Perhaps; but it would still be nice to see the evidence. 
Especially nice to see the evidence for review *articles* -- relative to ordinary articles -- separated from the evidence for review *journals* relative to ordinary journals. There has also been some evidence that articles that cite more references get more citations. Review articles usually cite more references than ordinary articles (indeed, that is one of the criteria ISI uses for classifying articles as reviews!). It would be nice to partial out the respective contributions of these factors too (along, of course, with self-citations, co-author citations, citation circles, etc.). The outcomes may continue to be confirming the obvious, but it will still be nice to have the objective data at hand... :-) Stevan Harnad > Tom Wilson > > On 23 February 2010 12:23, Jacques Wainer wrote: >> >> Adminstrative info for SIGMETRICS (for example unsubscribe): >> http://web.utk.edu/~gwhitney/sigmetrics.html >> >> I used: >> >> @Article{reviewpap1, >> author = {Aksnes, D. W.}, >> title = {Citation rates and perceptions of scientific >> contribution}, >> journal = {Journal of the American Society for Information Science >> and Technology}, >> year = 2006, >> key = 2, >> volume = 57, >> pages = {169-185}, >> doi = {10.1002/asi.20262}} >> >> >> @Article{reviewpap3, >> author = {H. P. F. Peters and A. F. J. van Raan}, >> title = {On determinants of citation scores: A case study in >> chemical engineering}, >> journal = {Journal of the American Society for Information Science}, >> year = 1994, >> volume = 45, >> number = 1, >> pages = {39 - 49}} >> >> >> as two references to the phenomenon. In this line, does anyone know >> of studies that point out that METHODOLOGICAL papers are also cited more >> than other research? 
>> >> Thanks >> >> Jacques Wainer > > > > -- > ---------------------------------------------------------- > Professor Tom Wilson, PhD, PhD (h.c.), > ----------------------------------------------------------- > Publisher and Editor in Chief: Information Research: an international > electronic journal > Website - http://InformationR.net/ir/ > Blog - http://info-research.blogspot.com/ > Photoblog - http://tomwilson.shutterchance.com/ > ----------------------------------------------------------- > E-mail: wilsontd at gmail.com > ----------------------------------------------------------- > From dwojick at HUGHES.NET Tue Feb 23 12:46:32 2010 From: dwojick at HUGHES.NET (David Wojick) Date: Tue, 23 Feb 2010 12:46:32 -0500 Subject: Studies showing that review articles get more citations In-Reply-To: Message-ID: That review articles are more highly cited than regular articles reporting specific results may support a theory of the nature of scientific communication. The typical article appears to have the following logic. There are 4 parts -- (1) here's the problem, (2) here's what we did, (3) here's what we found and (4) here's what it means. Part 4 may be brief or even absent. I recently did a quick study (just 6 articles) and found that about 60% of the citations occur in the first quarter of an article. This is the "here's the problem" part and it makes sense that review articles would play an important role in this phase of the explanation. David At 09:06 AM 2/23/2010, you wrote: >Adminstrative info for SIGMETRICS (for example unsubscribe): >http://web.utk.edu/~gwhitney/sigmetrics.html > >On Tue, Feb 23, 2010 at 8:51 AM, Tom Wilson wrote: > > > Is it really worth exploring? 
> > > > I'd have thought it self-evident that, if you are looking for a review of > > the literature, as most authors are, you'll site existing reviews; > similarly > > with methodology - if you are using a particular theoretical perspective > > you'll want to cite others as confirmation that you are on the right track. > > One of the problems of bibliometrics appears to be a stunning facility for > > determining the obvious :-) > >It is obvious that reviews will cite reviews, and that authors will >cite supporting studies, but is it obvious that reviews are cited more >than ordinary articles? Perhaps; but it would still be nice to see the >evidence. Especially nice to see the evidence for review *articles* -- >relative to ordinary articles -- separated from the evidence for >review *journals* relative to ordinary journals. > >There has also been some evidence that articles that cite more >references get more citations. Review articles usually cite more >references than ordinary articles (indeed, that is one of the criteria >ISI uses for classifying articles as reviews!). It would be nice to >partial out the respective contributions of these factors too (along, >of course, with self-citations, co-author citations, citation circles, >etc.). > >The outcomes may continue to be confirming the obvious, but it will >still be nice to have the objective data at hand... :-) > >Stevan Harnad > > > Tom Wilson > > > > On 23 February 2010 12:23, Jacques Wainer wrote: > >> > >> Adminstrative info for SIGMETRICS (for example unsubscribe): > >> http://web.utk.edu/~gwhitney/sigmetrics.html > >> > >> I used: > >> > >> @Article{reviewpap1, > >> author = {Aksnes, D. 
W.}, > >> title = {Citation rates and perceptions of scientific > >> contribution}, > >> journal = {Journal of the American Society for Information Science > >> and Technology}, > >> year = 2006, > >> key = 2, > >> volume = 57, > >> pages = {169-185}, > >> doi = {10.1002/asi.20262}} > >> > >> > >> @Article{reviewpap3, > >> author = {H. P. F. Peters and A. F. J. van Raan}, > >> title = {On determinants of citation scores: A case study in > >> chemical engineering}, > >> journal = {Journal of the American Society for Information Science}, > >> year = 1994, > >> volume = 45, > >> number = 1, > >> pages = {39 - 49}} > >> > >> > >> as two references to the phenomenon. In this line, does anyone know > >> of studies that point out that METHODOLOGICAL papers are also cited more > >> than other research? > >> > >> Thanks > >> > >> Jacques Wainer > > > > > > > > -- > > ---------------------------------------------------------- > > Professor Tom Wilson, PhD, PhD (h.c.), > > ----------------------------------------------------------- > > Publisher and Editor in Chief: Information Research: an international > > electronic journal > > Website - http://InformationR.net/ir/ > > Blog - http://info-research.blogspot.com/ > > Photoblog - http://tomwilson.shutterchance.com/ > > ----------------------------------------------------------- > > E-mail: wilsontd at gmail.com > > ----------------------------------------------------------- > > From jean.claude.guedon at UMONTREAL.CA Tue Feb 23 14:02:15 2010 From: jean.claude.guedon at UMONTREAL.CA (Jean-Claude =?ISO-8859-1?Q?Gu=E9don?=) Date: Tue, 23 Feb 2010 20:02:15 +0100 Subject: Studies showing that review articles get more citations In-Reply-To: Message-ID: It is a statement often heard among editors that if you want to raise the impact factor of a journal, you increase the number of review articles. The discussion here is about impact, and not impact factor. 
However, "Impact factors: Use and Abuse" by Amin and Mabe, published in the 1st issue of "Publishing Perspectives" in 2000 and reissued in 2007, includes an interesting curve that may explain both sides of the issue, impact as well as impact factor. 1. Fig. 3, "Impact factors and journal type", shows that reviews see their impact rise much faster and higher than letters and full papers. This means that, within the two-year window of the impact factor, review articles will collect more citations than other types of articles, and this will translate favourably into the ranking of the journal; 2. Because the curve for reviews is much higher than that for letters or full papers, the area beneath the curve is also much greater, which means that impact is also higher in this case. Alas, the authors do not give a source for their graph, and one is left with a question: is this a graph to explain a phenomenon that still needs to be documented, or does it really reflect some empirical measurement? A 4,000-journal study is alluded to, but details are few. The article in question comes from an Elsevier publication and may have to be taken with a grain of salt. Here is the URL: www.elsevier.com/framework_editors/pdfs/Perspectives1.pdf On a different, but related, topic, viz. the errors attached to impact-factor measurements, Amin and Mabe do make an interesting comment when they say that journals whose impact factors differ by less than 25% should be lumped together in the same category (p. 5). Also, the impact factors of journals go up and down by as much as 40% for smaller journals, and 15% for journals that produce greater numbers of articles (more than 150 per year). Observed and random fluctuations, furthermore, are very similar (fig. 4b). This points to rather large errors attached to these measurements. Perhaps impact factors should be expressed with only one or two significant figures... Le mardi 23 février 2010 à 
09:06 -0500, Stevan Harnad a écrit : > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > On Tue, Feb 23, 2010 at 8:51 AM, Tom Wilson wrote: > > > Is it really worth exploring? > > > > I'd have thought it self-evident that, if you are looking for a review of > > the literature, as most authors are, you'll site existing reviews; similarly > > with methodology - if you are using a particular theoretical perspective > > you'll want to cite others as confirmation that you are on the right track. > > One of the problems of bibliometrics appears to be a stunning facility for > > determining the obvious :-) > > It is obvious that reviews will cite reviews, and that authors will > cite supporting studies, but is it obvious that reviews are cited more > than ordinary articles? Perhaps; but it would still be nice to see the > evidence. Especially nice to see the evidence for review *articles* -- > relative to ordinary articles -- separated from the evidence for > review *journals* relative to ordinary journals. > > There has also been some evidence that articles that cite more > references get more citations. Review articles usually cite more > references than ordinary articles (indeed, that is one of the criteria > ISI uses for classifying articles as reviews!). It would be nice to > partial out the respective contributions of these factors too (along, > of course, with self-citations, co-author citations, citation circles, > etc.). > > The outcomes may continue to be confirming the obvious, but it will > still be nice to have the objective data at hand... :-) > > Stevan Harnad > > > Tom Wilson > > > > On 23 February 2010 12:23, Jacques Wainer wrote: > >> > >> Adminstrative info for SIGMETRICS (for example unsubscribe): > >> http://web.utk.edu/~gwhitney/sigmetrics.html > >> > >> I used: > >> > >> @Article{reviewpap1, > >> author = {Aksnes, D. 
W.}, > >> title = {Citation rates and perceptions of scientific > >> contribution}, > >> journal = {Journal of the American Society for Information Science > >> and Technology}, > >> year = 2006, > >> key = 2, > >> volume = 57, > >> pages = {169-185}, > >> doi = {10.1002/asi.20262}} > >> > >> > >> @Article{reviewpap3, > >> author = {H. P. F. Peters and A. F. J. van Raan}, > >> title = {On determinants of citation scores: A case study in > >> chemical engineering}, > >> journal = {Journal of the American Society for Information Science}, > >> year = 1994, > >> volume = 45, > >> number = 1, > >> pages = {39 - 49}} > >> > >> > >> as two references to the phenomenon. In this line, does anyone know > >> of studies that point out that METHODOLOGICAL papers are also cited more > >> than other research? > >> > >> Thanks > >> > >> Jacques Wainer > > > > > > > > -- > > ---------------------------------------------------------- > > Professor Tom Wilson, PhD, PhD (h.c.), > > ----------------------------------------------------------- > > Publisher and Editor in Chief: Information Research: an international > > electronic journal > > Website - http://InformationR.net/ir/ > > Blog - http://info-research.blogspot.com/ > > Photoblog - http://tomwilson.shutterchance.com/ > > ----------------------------------------------------------- > > E-mail: wilsontd at gmail.com > > ----------------------------------------------------------- > > From enrique.wulff at ICMAN.CSIC.ES Wed Feb 24 03:06:10 2010 From: enrique.wulff at ICMAN.CSIC.ES (Enrique Wulff) Date: Wed, 24 Feb 2010 09:06:10 +0100 Subject: A comparative study on communication structures of Chinese journals in the social sciences In-Reply-To: Message-ID: Curiously enough this article does not analyze such resources like the 'Journal of dialectics of nature. ISSN 1000-0763' or institutions as known as the 'school of marxism and philosophy' at the South China university of technology. 
Otherwise there is a lot of anti-Obama speech on the Web, and his attempt to normalize the relations between the United States and Russia, so the good learning environment in social sciences enjoyed there is obviated as a rule of thumb. Enrique Wulff At 15:47 22/02/2010, you wrote: >Adminstrative info for SIGMETRICS (for example unsubscribe): >http://web.utk.edu/~gwhitney/sigmetrics.html I suspect that the >problem lies in the communist ideology - it was the same in the days >of the Soviet Union - Marxist/Leninism is presumed to be the only >philosophy/methodology needed to explain social life and for a >social scientist to explore different avenues was positively >dangerous. With the collapse of communism in Russia, social >scientists there and in the former dependencies are now exploring >Western social philosophy and, to a certain extent, anything Marxist >is considered suspect - a complete reversal of the previous situation. > >Given the tight control of society in China, I imagine that looking >beyond Marxism for social explanation would also be considered suspect. > >Tom Wilson > >On 22 February 2010 14:33, Zhou, Ping ><Ping.Zhou at econ.kuleuven.be> wrote: > >Dear David, > > > >The "stagnation" is concluded based on two types of comparison: >comparison with natural and life sciences domestically and the >specialization comparison between Chinese and international communities. > > > >Domestic comparison shows that China's world share of publications >in the social sciences lags dramatically behind that in the natural >and life sciences (see Figure 1 of the paper). > > > >International comparison shows that China is less specialized than >its international counterparts. We argue that specialization links >to maturity of a discipline; less specialization may lead to slower progress. 
> > > >The full text of the paper can be retrieved at: > >http://arxiv.org/abs/1002.3590 > > > >With kind regards, > > > >Ping Zhou > > > > > >From: ASIS&T Special Interest Group on Metrics >[mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of David Wojick >Sent: Monday, February 22, 2010 12:57 PM >To: SIGMETRICS at LISTSERV.UTK.EDU >Subject: Re: [SIGMETRICS] A comparative study on communication >structures of Chinese journals in the social sciences > > > >Adminstrative info for SIGMETRICS (for example unsubscribe): >http://web.utk.edu/~gwhitney/sigmetrics.html >Dear Ping Zhou, > > > >What do you mean by "stagnation" and how are you measuring it? This >seems like an odd conclusion to draw from a citation analysis. > > >David > >On Feb 22, 2010, Zhou, Ping ><Ping.Zhou at ECON.KULEUVEN.BE> wrote: > >Adminstrative info for SIGMETRICS (for example unsubscribe): >http://web.utk.edu/~gwhitney/sigmetrics.html > >Title: A comparative study on communication structures of Chinese >journals in the social sciences > >Source: Journal of the American Society for Information Science and >Technology, forthcoming > >Authors: Ping Zhou, Xinning Su, Loet Leydesdorff > >Abstract: We argue that the communication structures in the Chinese >social sciences have not yet been sufficiently reformed. Citation >patterns among Chinese domestic journals in three subject >areas-political science and marxism, library and information >science, and economics-are compared with their counterparts >internationally. Like their colleagues in the natural and life >sciences, Chinese scholars in the social sciences provide fewer >references to journal publications than their international >counterparts; like their international colleagues, social scientists >provide fewer references than natural sciences. The resulting >citation networks, therefore, are sparse. 
Nevertheless, the citation >structures clearly suggest that the Chinese social sciences are far >less specialized in terms of disciplinary delineations than their >international counterparts. Marxism studies are more established >than political science in China. In terms of the impact of the >Chinese political system on academic fields, disciplines closely related to the political >system are less specialized than those weakly related. In the >discussion section, we explore reasons that may cause the current >stagnation and provide policy recommendations. > > > > >-- >---------------------------------------------------------- >Professor Tom Wilson, PhD, PhD (h.c.), >----------------------------------------------------------- >Publisher and Editor in Chief: Information Research: an >international electronic journal >Website - http://InformationR.net/ir/ >Blog - http://info-research.blogspot.com/ >Photoblog - >http://tomwilson.shutterchance.com/ >----------------------------------------------------------- >E-mail: wilsontd at gmail.com >----------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From Ping.Zhou at ECON.KULEUVEN.BE Wed Feb 24 04:40:20 2010 From: Ping.Zhou at ECON.KULEUVEN.BE (Zhou, Ping) Date: Wed, 24 Feb 2010 10:40:20 +0100 Subject: A comparative study on communication structures of Chinese journals in the social sciences In-Reply-To: <20100224092036.A6CA3B7108@merlin.uca.es> Message-ID: Dear Enrique, It is impossible, and unnecessary, to analyze every journal covered by the CSSCI; moreover, the journal you mentioned is not covered by the CSSCI. Regards, Ping Zhou Expertisecentrum O&O Monitoring, Katholieke Universiteit Leuven Institute of Scientific and Technical Information of China Dekenstraat 2, bus 3536 B 3000 LEUVEN, Belgium Tel.
+32 16/32.57.50 Fax +32 16/ 32.57.99 Ping.Zhou at econ.kuleuven.be From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Enrique Wulff Sent: Wednesday, February 24, 2010 9:06 AM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] A comparative study on communication structures of Chinese journals in the social sciences Curiously enough, this article does not analyze resources such as the 'Journal of dialectics of nature. ISSN 1000-0763' or institutions as well known as the 'school of marxism and philosophy' at the South China university of technology. Otherwise, there is a lot of anti-Obama speech on the Web regarding his attempt to normalize relations between the United States and Russia, so the good learning environment enjoyed there in the social sciences is, as a rule of thumb, overlooked. Enrique Wulff At 15:47 22/02/2010, you wrote: I suspect that the problem lies in the communist ideology - it was the same in the days of the Soviet Union - Marxism/Leninism was presumed to be the only philosophy/methodology needed to explain social life, and for a social scientist to explore different avenues was positively dangerous. With the collapse of communism in Russia, social scientists there and in the former dependencies are now exploring Western social philosophy and, to a certain extent, anything Marxist is considered suspect - a complete reversal of the previous situation. Given the tight control of society in China, I imagine that looking beyond Marxism for social explanation would also be considered suspect. Tom Wilson On 22 February 2010 14:33, Zhou, Ping < Ping.Zhou at econ.kuleuven.be> wrote: Dear David, The "stagnation" conclusion is based on two types of comparison: a domestic comparison with the natural and life sciences, and a comparison of specialization between the Chinese and international communities.
The domestic comparison shows that China's world share of publications in the social sciences lags dramatically behind that in the natural and life sciences (see Figure 1 of the paper). The international comparison shows that China is less specialized than its international counterparts. We argue that specialization is linked to the maturity of a discipline; less specialization may lead to slower progress. The full text of the paper can be retrieved at: http://arxiv.org/abs/1002.3590 With kind regards, Ping Zhou From: ASIS&T Special Interest Group on Metrics [ mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of David Wojick Sent: Monday, February 22, 2010 12:57 PM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] A comparative study on communication structures of Chinese journals in the social sciences What do you mean by "stagnation" and how are you measuring it? This seems like an odd conclusion to draw from a citation analysis. David On Feb 22, 2010, Zhou, Ping < Ping.Zhou at ECON.KULEUVEN.BE> wrote: Title: A comparative study on communication structures of Chinese journals in the social sciences Source: Journal of the American Society for Information Science and Technology, forthcoming Authors: Ping Zhou, Xinning Su, Loet Leydesdorff Abstract: We argue that the communication structures in the Chinese social sciences have not yet been sufficiently reformed. Citation patterns among Chinese domestic journals in three subject areas - political science and Marxism, library and information science, and economics - are compared with their counterparts internationally. Like their colleagues in the natural and life sciences, Chinese scholars in the social sciences provide fewer references to journal publications than their international counterparts; like their international colleagues, social scientists provide fewer references than natural scientists. The resulting citation networks, therefore, are sparse.
Nevertheless, the citation structures clearly suggest that the Chinese social sciences are far less specialized in terms of disciplinary delineations than their international counterparts. Marxism studies are more established than political science in China. In terms of the impact of the Chinese political system on academic fields, disciplines closely related to the political system are less specialized than those weakly related. In the discussion section, we explore reasons that may cause the current stagnation and provide policy recommendations. -- ---------------------------------------------------------- Professor Tom Wilson, PhD, PhD (h.c.), ----------------------------------------------------------- Publisher and Editor in Chief: Information Research: an international electronic journal Website - http://InformationR.net/ir/ Blog - http://info-research.blogspot.com/ Photoblog - http://tomwilson.shutterchance.com/ ----------------------------------------------------------- E-mail: wilsontd at gmail.com ----------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From nvaneck at ESE.EUR.NL Wed Feb 24 09:38:21 2010 From: nvaneck at ESE.EUR.NL (Nees Jan van Eck) Date: Wed, 24 Feb 2010 15:38:21 +0100 Subject: Studies showing that review articles get more citations In-Reply-To: Message-ID: Dear Stevan and others, Reviews do indeed receive, on average, more citations than ordinary articles. Some time ago I collected some data on this issue. See the attached Excel file. The data (taken from Web of Science) are based on publications published between 1999 and 2003. For each publication, citations were counted within a four-year time window. Some data can also be found in a paper by Lundberg published in the Journal of Informetrics in 2007 (http://dx.doi.org/10.1016/j.joi.2006.09.007).
Note that the distinction between ordinary articles and reviews in Web of Science is quite arbitrary and mainly determined by the number of references of a publication. Publications with more than 100 references are (almost) always regarded as reviews in Web of Science. Best regards, Nees Jan van Eck ======================================================== Nees Jan van Eck MSc Researcher Centre for Science and Technology Studies Leiden University P.O. Box 905 2300 AX Leiden The Netherlands Willem Einthoven Building, Room B5-35 Tel: +31 (0)71 527 6445 Fax: +31 (0)71 527 3911 E-mail: ecknjpvan at cwts.leidenuniv.nl Homepage: www.neesjanvaneck.nl ======================================================== -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Stevan Harnad Sent: Tuesday, February 23, 2010 3:06 PM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] Studies showing that review articles get more citations On Tue, Feb 23, 2010 at 8:51 AM, Tom Wilson wrote: > Is it really worth exploring? > > I'd have thought it self-evident that, if you are looking for a > review of the literature, as most authors are, you'll cite existing > reviews; similarly with methodology - if you are using a particular > theoretical perspective you'll want to cite others as confirmation that you are on the right track. > One of the problems of bibliometrics appears to be a stunning > facility for determining the obvious :-) It is obvious that reviews will cite reviews, and that authors will cite supporting studies, but is it obvious that reviews are cited more than ordinary articles? Perhaps; but it would still be nice to see the evidence. Especially nice to see the evidence for review *articles* -- relative to ordinary articles -- separated from the evidence for review *journals* relative to ordinary journals. There has also been some evidence that articles that cite more references get more citations.
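[Editor's note: van Eck's message above describes two operational details - citations counted within a four-year window after publication, and Web of Science's rough heuristic of treating publications with more than 100 references as reviews. A minimal sketch of both, with a hypothetical record layout (`year`, `n_refs`, `citation_years` are assumed field names, not Web of Science's actual data model):]

```python
# Sketch of the two heuristics described above.
# The record layout is hypothetical; WoS's internal rules are more involved.

def classify(record):
    """Label a publication a 'review' if it cites more than 100 references."""
    return "review" if record["n_refs"] > 100 else "article"

def citations_in_window(record, window=4):
    """Count citations received within `window` years of the publication year."""
    pub_year = record["year"]
    return sum(1 for y in record["citation_years"]
               if pub_year <= y < pub_year + window)

paper = {"year": 2000, "n_refs": 142,
         "citation_years": [2000, 2001, 2003, 2004, 2007]}
print(classify(paper))             # review
print(citations_in_window(paper))  # 3 (2000, 2001, 2003; 2004 falls outside)
```

[The four-year window means the 2004 and 2007 citations are excluded, which is exactly why windowed counts understate the long-run advantage of slowly accumulating papers.]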
Review articles usually cite more references than ordinary articles (indeed, that is one of the criteria ISI uses for classifying articles as reviews!). It would be nice to partial out the respective contributions of these factors too (along, of course, with self-citations, co-author citations, citation circles, etc.). The outcomes may continue to confirm the obvious, but it will still be nice to have the objective data at hand... :-) Stevan Harnad > Tom Wilson > > On 23 February 2010 12:23, Jacques Wainer wrote: >> >> I used: >> >> @Article{reviewpap1, >>   author  = {Aksnes, D. W.}, >>   title   = {Citation rates and perceptions of scientific contribution}, >>   journal = {Journal of the American Society for Information Science and Technology}, >>   year    = 2006, >>   key     = 2, >>   volume  = 57, >>   pages   = {169-185}, >>   doi     = {10.1002/asi.20262}} >> >> @Article{reviewpap3, >>   author  = {H. P. F. Peters and A. F. J. van Raan}, >>   title   = {On determinants of citation scores: A case study in chemical engineering}, >>   journal = {Journal of the American Society for Information Science}, >>   year    = 1994, >>   volume  = 45, >>   number  = 1, >>   pages   = {39-49}} >> >> as two references to the phenomenon. In this line, does anyone know >> of studies that point out that METHODOLOGICAL papers are also cited >> more than other research?
>> >> Thanks >> >> Jacques Wainer > > > > -- > ---------------------------------------------------------- > Professor Tom Wilson, PhD, PhD (h.c.), > ----------------------------------------------------------- > Publisher and Editor in Chief: Information Research: an international > electronic journal Website - http://InformationR.net/ir/ Blog - > http://info-research.blogspot.com/ > Photoblog - http://tomwilson.shutterchance.com/ > ----------------------------------------------------------- > E-mail: wilsontd at gmail.com > ----------------------------------------------------------- > -------------- next part -------------- A non-text attachment was scrubbed... Name: data.xls Type: application/vnd.ms-excel Size: 32768 bytes Desc: not available URL: From isidro.aguillo at CCHS.CSIC.ES Wed Feb 24 11:35:41 2010 From: isidro.aguillo at CCHS.CSIC.ES (Isidro F. Aguillo) Date: Wed, 24 Feb 2010 17:35:41 +0100 Subject: Ranking Web of Universities: 2010 July edition In-Reply-To: <4A705CE1.9070609@cchs.csic.es> Message-ID: Ranking Web of World Universities: 2010 edition The first 2010 edition of the Ranking Web of Universities is already available on the webometrics site (http://www.webometrics.info), with larger coverage and an improved methodology. The ranking, started in 2004, analyzes more than 18,000 universities worldwide, providing a classification of the top 8,000, many of them from developing countries traditionally ignored by other rankings. The basic premise is that in the 21st century the web should reflect the organization, activities, research results, knowledge transfer, prestige, and international visibility of universities. If the web performance of an institution is below the position expected from its academic excellence, university authorities should reconsider their web policy, promoting substantial increases in the volume and quality of their electronic publications.
The web indicators are easy to collect and analyze, and since the results are similar to those obtained for a few hundred universities by other, far more complex and expensive rankings, the Ranking Web provides the opportunity to obtain reliable ranks for several thousand institutions. The Ranking Web, with its large coverage, allows comparisons not only between universities but also between countries and regions, highlighting the role of nation-building institutions that are not research-intensive. Academic ranking is only one of the aims of the webometrics site; exposing bad web policies and promoting Open Access initiatives are also explicit objectives. Web presence measures the activity and visibility of institutions and is a good indicator of the impact and prestige of universities. Rank summarizes the global performance of a university, provides information for prospective students and scholars, and reflects the commitment to the dissemination of scientific knowledge. "The university must seek excellence. The recognition of institutions corresponds to the international community and to the economic, social and political agents involved in university activity. Nowadays, the best way to measure all of these acknowledgments is through the measure of Web link visibility, a true virtual referendum on university excellence", said Isidro Aguillo, editor of the Ranking (Cybermetrics Lab - CSIC). With respect to the results, as in previous editions, the first positions are occupied by US and Canadian universities, with a virtual draw for first place between Harvard and MIT, followed by Stanford, Berkeley, etc. European universities still appear in lower positions, the first being Cambridge in 27th place. By region, the leaders are: - Latin America: Sao Paulo, closely followed by UNAM - Europe: Cambridge and Oxford -
Eastern Europe: Charles University (Prague) - Asia: Tokyo University - South East Asia: National University of Singapore - South Asia: Indian Institute of Technology Bombay - Oceania: Australian National University - Africa: University of Cape Town The Ranking Web is elaborated by the Cybermetrics Lab, a research group belonging to the National Research Council (CSIC), the main public research organization in Spain. Since the mid-nineties this group has designed web indicators for describing and evaluating the higher education and R&D sectors. The Ranking Web is widely used by students, scholars and management staff of universities all over the world, receiving more than 4 million visitors per year. -- ************************************* Isidro F. Aguillo, HonPhD Cybermetrics Lab CCHS - CSIC Albasanz, 26-28, 3C1. 28037 Madrid. Spain Ph. 91-602 2890. Fax: 91-602 2971 isidro.aguillo @ cchs.csic.es www. webometrics.info ************************************* From garfield at CODEX.CIS.UPENN.EDU Wed Feb 24 14:12:25 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Wed, 24 Feb 2010 14:12:25 -0500 Subject: Beel, J; Gipp, B. 2009. Google Scholar's Ranking Algorithm: The Impact of Citation Counts (An Empirical Study). RCIS 2009 Message-ID: Beel, J; Gipp, B. 2009. Google Scholar's Ranking Algorithm: The Impact of Citation Counts (An Empirical Study). RCIS 2009: PROCEEDINGS OF THE IEEE INTERNATIONAL CONFERENCE ON RESEARCH CHALLENGES IN INFORMATION SCIENCE: 439-445. Edited by Flory, A.; Collard, M. Presented at the 3rd International Conference on Research Challenges in Information Science (RCIS 2009) in Fez, MOROCCO, APR 22-24, 2009. Author Full Name(s): Beel, Joeran; Gipp, Bela Language: English Document Type: Proceedings Paper Abstract: Google Scholar is one of the major academic search engines but its ranking algorithm for academic articles is unknown. In a recent study we partly reverse-engineered the algorithm.
This paper presents the results of our second study. While the previous study provided a broad overview, the current study focused on analyzing the correlation between an article's citation count and its ranking in Google Scholar. For this study, citation counts and rankings of 1,364,757 articles were analyzed. Some results of our first study were confirmed: Citation count is the most heavily weighted factor in Google Scholar's ranking algorithm. Highly cited articles are found significantly more often in higher positions than articles that are cited less often. Therefore, Google Scholar seems to be more suitable for searching standard literature than for gems or articles by authors advancing a view different from the mainstream. However, interesting exceptions for some search queries occurred. In some cases no correlation existed; in others bizarre patterns were recognizable, suggesting that citation counts sometimes have no impact at all on articles' rankings. Addresses: [Beel, Joeran; Gipp, Bela] Otto von Guericke Univ Magdeburg, Dept Comp Sci, ITI VLBA Lab Scienstein, D-39016 Magdeburg, Germany Reprint Address: Beel, J, Otto von Guericke Univ Magdeburg, Dept Comp Sci, ITI VLBA Lab Scienstein, D-39016 Magdeburg, Germany. ISBN: 978-1-4244-2864-9 preprint URL: http://www.sciplore.org/publications/2009-Google_Scholar's_Ranking_Algorithm_-_The_Impact_of_Citation_Counts_--_preprint%20aseotest333.pdf From garfield at CODEX.CIS.UPENN.EDU Wed Feb 24 14:15:12 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Wed, 24 Feb 2010 14:15:12 -0500 Subject: Aksnes, DW. 2006. Citation rates and perceptions of scientific contribution. JASIST. 57 (2): 169-185. Message-ID: Aksnes, DW. 2006. Citation rates and perceptions of scientific contribution. JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY 57 (2): 169-185.
Language: English Document Type: Article Abstract: In this study scientists were asked about their own publication history and their citation counts. The study shows that the citation counts of the publications correspond reasonably well with the authors' own assessments of scientific contribution. Generally, citations proved to have the highest accuracy in identifying either major or minor contributions. Nevertheless, according to these judgments, citations are not a reliable indicator of scientific contribution at the level of the individual article. In the construction of relative citation indicators, the average citation rate of the subfield appears to be slightly more appropriate as a reference standard than the journal citation rate. The study confirms that review articles are cited more frequently than other publication types. Compared to the significance authors attach to these articles they appear to be considerably "overcited." However, there were only marginal differences in the citation rates between empirical, methods, and theoretical contributions. Addresses: Norwegian Inst Studies Res & Higher Educ, NIFU, STEP, N-0167 Oslo, Norway Reprint Address: Aksnes, DW, Norwegian Inst Studies Res & Higher Educ, NIFU, STEP, Wergelandsv 7, N-0167 Oslo, Norway. E-mail Address: Dag.W.Aksnes at nifustep.no ISSN: 1532-2882 DOI: 10.1002/asi.20262 From garfield at CODEX.CIS.UPENN.EDU Wed Feb 24 14:18:31 2010 From: garfield at CODEX.CIS.UPENN.EDU (Eugene Garfield) Date: Wed, 24 Feb 2010 14:18:31 -0500 Subject: Agarwal, P; Searls, DB. 2009. Can literature analysis identify innovation drivers in drug discovery?. NATURE REVIEWS DRUG DISCOVERY 8 (11): 865-878 Message-ID: Agarwal, P; Searls, DB. 2009. Can literature analysis identify innovation drivers in drug discovery?. NATURE REVIEWS DRUG DISCOVERY 8 (11): 865-878. Author Full Name(s): Agarwal, Pankaj; Searls, David B. 
Language: English Document Type: Article Abstract: Drug discovery must be guided not only by medical need and commercial potential, but also by the areas in which new science is creating therapeutic opportunities, such as target identification and the understanding of disease mechanisms. To systematically identify such areas of high scientific activity, we use bibliometrics and related data-mining methods to analyse over half a terabyte of data, including PubMed abstracts, literature citation data and patent filings. These analyses reveal trends in scientific activity related to disease studied at varying levels, down to individual genes and pathways, and provide methods to monitor areas in which scientific advances are likely to create new therapeutic opportunities. Addresses: [Agarwal, Pankaj; Searls, David B.] GlaxoSmithKline Inc, Computat Biol Dept, King Of Prussia, PA 19406 USA Reprint Address: Agarwal, P, GlaxoSmithKline Inc, Computat Biol Dept, 709 Swedeland Rd, POB 1539, King Of Prussia, PA 19406 USA. E-mail Address: Pankaj.Agarwal at gsk.com ISSN: 1474-1776 DOI: 10.1038/nrd2973 URL: http://www.nature.com/nrd/journal/v8/n11/full/nrd2973.html From hck at LRZ.UNI-MUENCHEN.DE Thu Feb 25 04:39:10 2010 From: hck at LRZ.UNI-MUENCHEN.DE (Heinrich C. Kuhn) Date: Thu, 25 Feb 2010 10:39:10 +0100 Subject: Ranking Web of Universities: 2010 July edition In-Reply-To: <4B85555D.10107@cchs.csic.es> Message-ID: First of all: Thanks! Second: at http://www.webometrics.info/ it is written: " We have discovered several universities that are hosting large numbers of academic papers authored by scientist that do not belong to those institutions. This is not only unfair, but it clearly violates copyright of the involved papers. " Please be aware that this is not necessarily so: Such papers may be hosted where they are hosted because - The author of that paper wished the paper to be hosted at the hosting institution and the hosting institution acted on his wish.
- The license (CC or other) of the paper permits such hosting and the archive in question is an archive for a certain subject. - The author once was a member of the hosting institution. Best regards Heinrich C. Kuhn +---------------------------------------------- | Dr. Heinrich C. Kuhn | Seminar fuer Geistesgeschichte und | Philosophie der Renaissance | Ludwig-Maximilians-Universitaet Muenchen | D-80539 Muenchen / Ludwigstr. 31 | T.: +49-89-2180 2018, F.: +49-89-2180 2907 | http://www.phil-hum-ren.uni-muenchen.de/ +---------------------------------------------- From isidro.aguillo at CCHS.CSIC.ES Thu Feb 25 04:55:59 2010 From: isidro.aguillo at CCHS.CSIC.ES (Isidro F. Aguillo) Date: Thu, 25 Feb 2010 10:55:59 +0100 Subject: Ranking Web of Universities: 2010 July edition In-Reply-To: <4B86534E.5788.4DBD4E6F@hck.lrz.uni-muenchen.de> Message-ID: Dear Dr. Kuhn: Thank you for your kind message. As supporters of Open Access initiatives, we are in favor of the situations you described. However, our paragraphs refer to a completely different situation: librarians who are adding THOUSANDS of papers by authors unrelated to their organization, without any copyright agreement with them. Even worse, the reason for doing so is to increase their web ranking. Best regards, Heinrich C. Kuhn wrote: > > First of all: Thanks! > > Second: at http://www.webometrics.info/ > it is written: > " > We have discovered several universities that are hosting large > numbers of academic papers authored by scientist that do not > belong to those institutions. This is not only unfair, but it > clearly violates copyright of the involved papers.
> " > Please be aware that this is not necessarily so: Such papers > may be hosted where they are hosted because > - The author of that paper wished the paper to be > hosted at the hosting institution and the hosting > institution acted on his wish. > - The license (CC or other) of the paper permits such > hosting and the archive in question is an archive > for a certain subject. > - The author once was a member of the hosting institution. > > Best regards > > Heinrich C. Kuhn > > +---------------------------------------------- > | Dr. Heinrich C. Kuhn > | Seminar fuer Geistesgeschichte und > | Philosophie der Renaissance > | Ludwig-Maximilians-Universitaet Muenchen > | D-80539 Muenchen / Ludwigstr. 31 > | T.: +49-89-2180 2018, F.: +49-89-2180 2907 > | http://www.phil-hum-ren.uni-muenchen.de/ > +---------------------------------------------- > > -- ************************************* Isidro F. Aguillo, HonPhD Cybermetrics Lab CCHS - CSIC Albasanz, 26-28, 3C1. 28037 Madrid. Spain Ph. 91-602 2890. Fax: 91-602 2971 isidro.aguillo @ cchs.csic.es www. webometrics.info ************************************* From jean.claude.guedon at UMONTREAL.CA Thu Feb 25 05:18:54 2010 From: jean.claude.guedon at UMONTREAL.CA (Jean-Claude Guédon) Date: Thu, 25 Feb 2010 11:18:54 +0100 Subject: Ranking Web of Universities: 2010 July edition In-Reply-To: <4B86492F.6070605@cchs.csic.es> Message-ID: I find all this very amusing. As soon as you establish measurements of quality in one way or another, you are bound to see people adapt to them to their advantage. For example, because editors want to increase the impact factors of their journals, they increase the number of review articles and they ask their authors to cite articles from the journal itself. We must never forget the ancient links between our species and chimps - a point that Machiavelli had clearly understood without knowing this fact.
This said, there is an accusation here: librarians are said to store papers originating from outside their institution to improve its rankings in webometrics... While credible, is this allegation proved or even provable? I would love to see real proof of this beyond anecdotal rumours (sometimes known as urban myths). Which universities? Since when? Etc... The long and the short of this is that there does not exist a simple way to measure quality. Those who believe there is either delude themselves, or manipulate others' beliefs. Jean-Claude Guédon On Thursday 25 February 2010 at 10:55 +0100, Isidro F. Aguillo wrote: > > Dear Dr. Kuhn: > > Thank you for your kind message. As supporters of Open Access > initiatives, we are in favor of the situations you described. However, > our paragraphs refer to a completely different situation: librarians > who are adding THOUSANDS of papers by authors unrelated to their > organization, without any copyright agreement with them. Even worse, > the reason for doing so is to increase their web ranking. > > Best regards, > > Heinrich C. Kuhn wrote: > > > > First of all: Thanks! > > > > Second: at http://www.webometrics.info/ > > it is written: > > " > > We have discovered several universities that are hosting large > > numbers of academic papers authored by scientist that do not > > belong to those institutions. This is not only unfair, but it > > clearly violates copyright of the involved papers. > > " > > Please be aware that this is not necessarily so: Such papers > > may be hosted where they are hosted because > > - The author of that paper wished the paper to be > > hosted at the hosting institution and the hosting > > institution acted on his wish.
> > - The license (CC or other) of the paper permits such > > hosting and the archive in question is an archive > > for a certain subject. > > - The author once was a member of the hosting institution. > > > > Best regards > > > > Heinrich C. Kuhn > > > > +---------------------------------------------- > > | Dr. Heinrich C. Kuhn > > | Seminar fuer Geistesgeschichte und > > | Philosophie der Renaissance > > | Ludwig-Maximilians-Universitaet Muenchen > > | D-80539 Muenchen / Ludwigstr. 31 > > | T.: +49-89-2180 2018, F.: +49-89-2180 2907 > > | http://www.phil-hum-ren.uni-muenchen.de/ > > +---------------------------------------------- > > > > > > -- Jean-Claude Guédon Professeur titulaire Littérature comparée Université de Montréal -------------- next part -------------- An HTML attachment was scrubbed... URL: From isidro.aguillo at CCHS.CSIC.ES Thu Feb 25 06:17:03 2010 From: isidro.aguillo at CCHS.CSIC.ES (Isidro F. Aguillo) Date: Thu, 25 Feb 2010 12:17:03 +0100 Subject: Ranking Web of Universities: 2010 July edition In-Reply-To: <1267093134.1944.63.camel@pinky> Message-ID: Dear Jean-Claude: We have evidence from universities in two different countries. We mentioned one specific (very important) university in our previous ranking; their response was very aggressive before they later accepted that at least 3,750 papers in the repository do not belong to their own scholars. Our figures are larger, and checking is very easy (in private I can point you to some candidates). Best regards, Jean-Claude Guédon wrote: > I find all this very > amusing. As soon as you establish measurements of quality in one way > or another, you are bound to see people adapt to them to their > advantage.
For example, because editors want to increase the impact > factors of their journals, they increase the number of review articles > and they ask their authors to cite articles from the journal itself. > We must never forget the ancient links between our species and chimps > - a point that Machiavelli had clearly understood without knowing this > fact. > > This said, there is an accusation here: librarians are said to store > papers originating from outside their institution to improve its > rankings in webometrics... While credible, is this allegation proved > or even provable? I would love to see real proof of this beyond > anecdotal rumours (sometimes known as urban myths). Which universities? > Since when? Etc... > > The long and the short of this is that there does not exist a simple > way to measure quality. Those who believe there is either delude > themselves, or manipulate others' beliefs. > > Jean-Claude Guédon > > > On Thursday 25 February 2010 at 10:55 +0100, Isidro F. Aguillo wrote: >> >> Dear Dr. Kuhn: >> >> Thank you for your kind message. As supporters of Open Access >> initiatives, we are in favor of the situations you described. However, >> our paragraphs refer to a completely different situation: librarians >> who are adding THOUSANDS of papers by authors unrelated to their >> organization, without any copyright agreement with them. Even worse, >> the reason for doing so is to increase their web ranking. >> >> Best regards, >> >> Heinrich C. Kuhn wrote: >> > >> > First of all: Thanks!
>> > >> > Second: at http://www.webometrics.info/ >> > it is written: >> > " >> > We have discovered several universities that are hosting large >> > numbers of academic papers authored by scientist that do not >> > belong to those institutions. This is not only unfair, but it >> > clearly violates copyright of the involved papers. >> > " >> > Please be aware that this is not necessarily so: Such papers >> > may be hosted where they are hosted because >> > - The author of that paper wished the paper to be >> > hosted at the hosting institution and the hosting >> > institution acted on his wish. >> > - The license (CC or other) of the paper permits such >> > hosting and the archive in question is an archive >> > for a certain subject. >> > - The author once was a member of the hosting institution. >> > >> > Best regards >> > >> > Heinrich C. Kuhn >> > >> > +---------------------------------------------- >> > | Dr. Heinrich C. Kuhn >> > | Seminar fuer Geistesgeschichte und >> > | Philosophie der Renaissance >> > | Ludwig-Maximilians-Universitaet Muenchen >> > | D-80539 Muenchen / Ludwigstr. 31 >> > | T.: +49-89-2180 2018, F.: +49-89-2180 2907 >> > | http://www.phil-hum-ren.uni-muenchen.de/ >> > +---------------------------------------------- >> > >> > >> >> >> > > > -- > Jean-Claude Guédon > Professeur titulaire > Littérature comparée > Université de Montréal > > -- ************************************* Isidro F. Aguillo, HonPhD Cybermetrics Lab CCHS - CSIC Albasanz, 26-28, 3C1. 28037 Madrid. Spain Ph. 91-602 2890. Fax: 91-602 2971 isidro.aguillo @ cchs.csic.es www. webometrics.info ************************************* From Christina.Pikas at JHUAPL.EDU Thu Feb 25 16:34:05 2010 From: Christina.Pikas at JHUAPL.EDU (Pikas, Christina K.) Date: Thu, 25 Feb 2010 16:34:05 -0500 Subject: Ranking Web of Universities: 2010 July edition In-Reply-To: <4B865C2F.8050109@cchs.csic.es> Message-ID: The two most obvious are Penn State and Cornell.
Surely they receive special treatment?

Christina Pikas

-----Original Message-----
From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Isidro F. Aguillo
Sent: Thursday, February 25, 2010 6:17 AM
To: SIGMETRICS at LISTSERV.UTK.EDU
Subject: Re: [SIGMETRICS] Ranking Web of Universities: 2010 July edition

Dear Jean-Claude:

We have evidence from some universities in two different countries. We
mentioned one specific (very important) university in our previous
ranking; their response was very aggressive, before they later accepted
that at least 3750 papers in the repository do not belong to their own
scholars. Our figures are larger, and checking is very easy (in private
I can point you to some candidates).

Best regards,

Jean-Claude Guédon wrote:
> [...]
From ksc at LIBRARY.IISC.ERNET.IN Fri Feb 26 00:40:42 2010
From: ksc at LIBRARY.IISC.ERNET.IN (K S Chudamani)
Date: Fri, 26 Feb 2010 11:10:42 +0530
Subject: Agarwal, P; Searls, DB. 2009. Can literature analysis identify innovation drivers in drug discovery?. NATURE REVIEWS DRUG DISCOVERY 8 (11): 865-878
In-Reply-To:
Message-ID:

The same was observed long ago by Chudamani and Asundi, and reported in
a paper submitted to the OCLC-DRTC workshop on electronic resource
management. In that paper we said that there would be core and non-core
areas of thrust. The title of the paper is "Use of a Knowledge Base of
Subject Relationship for Enhancing Retrieval Performance in the Digital
Environment". It is available in the DRTC LDL digital library.

Chudamani

On Wed, 24 Feb 2010, Eugene Garfield wrote:
> Agarwal, P; Searls, DB. 2009. Can literature analysis identify
> innovation drivers in drug discovery?. NATURE REVIEWS DRUG DISCOVERY
> 8 (11): 865-878.
>
> Author Full Name(s): Agarwal, Pankaj; Searls, David B.
> Language: English
> Document Type: Article
>
> Abstract: Drug discovery must be guided not only by medical need and
> commercial potential, but also by the areas in which new science is
> creating therapeutic opportunities, such as target identification and
> the understanding of disease mechanisms.
> To systematically identify such areas of high scientific activity, we
> use bibliometrics and related data-mining methods to analyse over half
> a terabyte of data, including PubMed abstracts, literature citation
> data and patent filings. These analyses reveal trends in scientific
> activity related to disease studied at varying levels, down to
> individual genes and pathways, and provide methods to monitor areas in
> which scientific advances are likely to create new therapeutic
> opportunities.
>
> Addresses: [Agarwal, Pankaj; Searls, David B.] GlaxoSmithKline Inc,
> Computat Biol Dept, King Of Prussia, PA 19406 USA
> Reprint Address: Agarwal, P, GlaxoSmithKline Inc, Computat Biol Dept,
> 709 Swedeland Rd, POB 1539, King Of Prussia, PA 19406 USA.
> E-mail Address: Pankaj.Agarwal at gsk.com
> ISSN: 1474-1776
> DOI: 10.1038/nrd2973
> URL: http://www.nature.com/nrd/journal/v8/n11/full/nrd2973.html

From isidro.aguillo at CCHS.CSIC.ES Fri Feb 26 04:00:40 2010
From: isidro.aguillo at CCHS.CSIC.ES (Isidro F. Aguillo)
Date: Fri, 26 Feb 2010 10:00:40 +0100
Subject: Ranking Web of Universities: 2010 July edition
In-Reply-To: <0BBD8C9342CBA343AE2C91D32990988C3BCF0F0946@aplesstripe.dom1.jhuapl.edu>
Message-ID:

Dear Christina:

In fact these are very good examples of the contrary. The world-famous
arXiv repository, although located at Cornell, does not use the
cornell.edu domain. Regarding PSU, we decided to exclude CiteSeerX from
the university's count because it uses the domain psu.edu, but remember
that the full papers in CiteSeerX are only linked to the original file
at the publishing institution. Tennessee Fayetteville (also hosting
CiteSeerX), Chicago with its MUSE Project, and the European Trier
University (DBLP) have also been recalculated, but none of these cases
can be considered bad practice.
On the contrary, they are wonderful initiatives that perhaps should
enjoy independent web domains.

Best regards,

Pikas, Christina K. wrote:
> [...]
--
*************************************
Isidro F. Aguillo, HonPhD
Cybermetrics Lab CCHS - CSIC
Albasanz, 26-28, 3C1. 28037 Madrid. Spain
Ph. 91-602 2890. Fax: 91-602 2971
isidro.aguillo @ cchs.csic.es
www.webometrics.info
*************************************
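[Editor's sketch] The checks debated in this thread (counting a
university's hosted papers by web domain, excluding service archives
such as CiteSeerX under psu.edu, and flagging papers whose author
affiliations do not match the hosting institution) can be illustrated
in a few lines of Python. The record schema, the excluded-host list,
and the substring affiliation match are purely illustrative
assumptions; this is not how Webometrics actually computes its counts.

```python
# Illustrative sketch only: record format, domain names, and the
# matching rule are assumptions, not any real repository's API or
# the Webometrics methodology.
from urllib.parse import urlparse

# Service archives excluded from the institutional count, as Isidro
# describes for CiteSeerX (assumed host name).
EXCLUDED_HOSTS = {"citeseerx.ist.psu.edu"}

def count_and_flag(records, institution_domain, institution_name):
    """Count papers hosted under an institution's domain and flag
    records whose author affiliation does not mention the institution.

    `records` is an iterable of dicts with 'url' and 'affiliation'
    keys (an assumed schema).
    """
    counted, flagged = 0, []
    for rec in records:
        host = urlparse(rec["url"]).netloc.lower()
        if host in EXCLUDED_HOSTS:
            continue  # service archive: counted separately, per the thread
        if host == institution_domain or host.endswith("." + institution_domain):
            counted += 1
            if institution_name.lower() not in rec["affiliation"].lower():
                flagged.append(rec["url"])  # author seems unaffiliated
    return counted, flagged

records = [
    {"url": "http://repository.example.edu/1", "affiliation": "Example University"},
    {"url": "http://repository.example.edu/2", "affiliation": "Other Institute"},
    {"url": "http://citeseerx.ist.psu.edu/3", "affiliation": "Example University"},
]
total, suspect = count_and_flag(records, "example.edu", "Example University")
# total is 2 (the CiteSeerX record is excluded); suspect holds the
# second URL, whose affiliation does not match.
```

A real check would have to resolve institutional name variants and
multilingual affiliations; a bare substring match like this would
over-flag, which is one reason Jean-Claude's demand for real proof
beyond anecdote is fair.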