From loet at LEYDESDORFF.NET Mon Sep 3 12:07:38 2007 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Mon, 3 Sep 2007 18:07:38 +0200 Subject: CACM, 50(9), (2007), p. 14 Message-ID: Include Citations When Ranking Institutions and Scholars Communications of the ACM, Volume 50, Number 9 (2007), Pages 13-14 In their article "Automatic and Versatile Publications Ranking for Research Institutions and Scholars" (June 2007), Jie Ren and Richard N. Taylor showed that automatic publication ranking can yield results similar to those from manual ranking processes. Although they warned about the sensitivity of the measurement to parameter choices, they also suggested that reproducing the rankings validates the use of the instrument for quality assessment. In addition to numbers of publications, citations are also useful for ranking. Of the 17 journals mentioned in the article, 10 were also in the Science Citation Index-Expanded Version. A publication and citation count of these 10 journals for the same period (1995-2003) yields completely different rankings. I've now extended the table in the article (see users.fmg.uva.nl/leydesdorff/Table1_CACM/) to include publication and citation rates for the top 50 computing graduate programs; the (Spearman) correlation coefficients between the original rankings and the new ones are on the order of 0.5. My rankings are based on the attribution of one full point to an institution for each (co-authored) publication and its corresponding citations; however, proportional attribution does not significantly affect my results. Not only does the order change, but five of the 50 institutions did not have a single publication attributed in the Thomson ISI selection during the same eight years. I don't claim that citation-based rankings are better than those published previously. The reliability of bibliometric constructs and their validity as indicators of quality are two different issues. Loet Leydesdorff Amsterdam, The Netherlands _____ Amsterdam School of Communications Research (ASCoR) Kloveniersburgwal 48, 1012 CX Amsterdam Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$ 18.95 The Self-Organization of the Knowledge-Based Society; The Challenge of Scientometrics -------------- next part -------------- An HTML attachment was scrubbed... URL: From Nelly.MAINY at DANONE.COM Mon Sep 3 22:02:56 2007 From: Nelly.MAINY at DANONE.COM (Nelly MAINY) Date: Tue, 4 Sep 2007 04:02:56 +0200 Subject: Nelly MAINY is out of the office. Message-ID: I will be out of the office from 01/09/2007 to 05/09/2007 and will reply to your message on my return. This e-mail and any files transmitted with it are confidential and intended solely for the use of the individual to whom it is addressed. If you have received this email in error please send it back to the person that sent it to you. Any views or opinions presented are solely those of its author and do not necessarily represent those of DANONE Group or any of its subsidiary companies. Unauthorized publication, use, dissemination, forwarding, printing or copying of this email and its associated attachments is strictly prohibited.
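Picking up the methodological point in Leydesdorff's letter above: full ("whole") counting gives each contributing institution one point per paper, proportional ("fractional") counting divides that point among the contributing institutions, and the two resulting rankings can be compared with a Spearman rank correlation. The sketch below is purely illustrative; the papers, institutions and helper functions are invented and do not reproduce the actual data or procedure behind the letter.

```python
# Minimal sketch: full vs. proportional institutional counting and the
# Spearman rank correlation between the two resulting rankings.
# All data below are invented for illustration.
from collections import defaultdict

papers = [  # institutions of the (co-)authors of each paper
    ["MIT", "Stanford"],
    ["MIT"],
    ["MIT", "CMU"],
    ["CMU", "Stanford", "MIT"],
    ["CMU"],
]

full = defaultdict(float)          # one full point per contributing institution
proportional = defaultdict(float)  # 1/k per paper with k distinct institutions
for insts in papers:
    distinct = set(insts)
    for inst in distinct:
        full[inst] += 1.0
        proportional[inst] += 1.0 / len(distinct)

def to_ranks(scores):
    """Rank 1 = highest score (assumes no ties, for simplicity)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {inst: r for r, inst in enumerate(ordered, start=1)}

def spearman_rho(ranks_a, ranks_b):
    """Classic 1 - 6*sum(d^2)/(n*(n^2-1)) formula, valid in the absence of ties."""
    n = len(ranks_a)
    d2 = sum((ranks_a[i] - ranks_b[i]) ** 2 for i in ranks_a)
    return 1.0 - 6.0 * d2 / (n * (n ** 2 - 1))

print("full counts:        ", dict(full))
print("proportional counts:", dict(proportional))
print("Spearman rho:       ", spearman_rho(to_ranks(full), to_ranks(proportional)))
```

In this toy example the two rankings happen to coincide (rho = 1.0); the point of the letter is that with real data the choice of counting scheme, journal set and indicator can reorder institutions substantially.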
From harnad at ECS.SOTON.AC.UK Tue Sep 4 08:38:55 2007 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Tue, 4 Sep 2007 13:38:55 +0100 Subject: British Academy Report on Peer Review and Metrics In-Reply-To: <871B1597F6E4F04B85465F9C7E706C5021A0A9@CASEVS03.cas.anu.edu.au> Message-ID: > A pall of gloom lies over the vital system of peer review. > But the British Academy has some bright ideas. > The Guardian, Jessica Shepherd reports, Tuesday September 4, 2007 http://education.guardian.co.uk/higher/research/story/0,,2161680,00.html Jessica Shepherd's report on peer review seems to be a good one. The only thing it lacks is some conclusions (which journalists are often reluctant to take the responsibility of making): (1) Yes, peer review, like all human judgment, is fallible, and susceptible to error and abuse. (2) But, in point of fact, peer review just means the assessment of research by qualified experts. (In the case of research proposals, it is assessment for fundability, and in the case of research reports, it is assessment for publishability.) (3) Funding and publishing without any assessment is not a solution: (3a) Everything cannot be funded (there aren't enough funds), and even funded projects first need some expert advice in their design. (3b) And everything *does* get published, eventually, but there is a hierarchy of journal peer-review quality standards, serving as an essential guide for users as to what they can take the risk of trying to read, use and build upon. (There is not enough time to read everything, and it is too risky to try to build on every claimed finding; and even accepted papers first need expert advice in their revision.) (4) So far, nothing as good as or better than peer review (i.e., qualified experts vetting the work of their fellow-experts) has been found, tested and demonstrated. So peer review remains the only straw afloat, if the alternative is not to be tossing a coin for funding, and publishing everything on a par. (5) Peer review *can* be improved. The weak link is always the editor (or Board of Editors), who choose the reviewers and to whom the reviewers and authors are answerable; and the Funding Officer(s) or committee choosing the reviewers for proposals, and deciding how to act on the basis of the reviews. There are many possibilities for experimenting with ways to make this meta-review component more accurate, equitable, answerable, and efficient, especially now that we are in the online era: http://users.ecs.soton.ac.uk/harnad/Temp/peerev.pdf (6) Metrics are not a substitute for peer review; they are a *supplement* to it. In the case of the UK's dual system of (i) prospective funding of individual competitive proposals (RCUK) and (ii) retrospective top-sliced funding of entire university departments based on their recent past research performance (RAE), metrics can help inform and guide funding officers, committees, editors, Boards and reviewers.
And in the case of the RAE in particular, they can shoulder a lot of the former peer-review burden: The RAE, being a retrospective rather than a prospective exercise, can benefit from the prior publication peer review that the journals have already done for the submissions, rank the outcomes with metrics, and then only add expert judgment afterward, as a way of checking and fine-tuning the metric rankings. Funders and universities explicitly recognizing peer review performance as a metric would be a very good idea, both for the reviewers and the researchers being reviewed. Harnad, S. (2007) Open Access Scientometrics and the UK Research Assessment Exercise. In Proceedings of 11th Annual Meeting of the International Society for Scientometrics and Informetrics 11(1), pp. 27-33, Madrid, Spain. Torres-Salinas, D. and Moed, H. F., Eds. http://eprints.ecs.soton.ac.uk/13804/ Brody, T., Carr, L., Gingras, Y., Hajjem, C., Harnad, S. and Swan, A. (2007) Incentivizing the Open Access Research Web: Publication-Archiving, Data-Archiving and Scientometrics. CTWatch Quarterly 3(3). http://eprints.ecs.soton.ac.uk/14418/ Shadbolt, N., Brody, T., Carr, L. and Harnad, S. (2006) The Open Research Web: A Preview of the Optimal and the Inevitable, in Jacobs, N., Eds. Open Access: Key Strategic, Technical and Economic Aspects, chapter 21. Chandos. http://eprints.ecs.soton.ac.uk/12453/ Some more generic references on peer review follow below. Stevan Harnad American Scientist Open Access Forum http://amsci-forum.amsci.org/archives/American-Scientist-Open-Access-Forum.html Chaire de recherche du Canada Professor of Cognitive Science Institut des sciences cognitives Electronics & Computer Science Universite du Quebec a Montreal University of Southampton Montreal, Quebec Highfield, Southampton Canada H3C 3P8 SO17 1BJ United Kingdom http://www.crsc.uqam.ca/ http://users.ecs.soton.ac.uk/harnad/ Harnad, S. (ed.) (1982) Peer commentary on peer review: A case study in scientific quality control, New York: Cambridge University Press. Harnad, Stevan (1985) Rational disagreement in peer review. Science, Technology and Human Values, 10 p.55-62. http://cogprints.org/2128/ Harnad, S. (1986) Policing the Paper Chase. (Review of S. Lock, A difficult balance: Peer review in biomedical publication.) Nature 322: 24 - 5. Harnad, S. (1996) Implementing Peer Review on the Net: Scientific Quality Control in Scholarly Electronic Journals. In: Peek, R. & Newby, G. (Eds.) Scholarly Publishing: The Electronic Frontier. Cambridge MA: MIT Press. Pp 103-118. http://cogprints.org/1692/ Harnad, S. (1997) Learned Inquiry and the Net: The Role of Peer Review, Peer Commentary and Copyright. Learned Publishing 11(4) 283-292. http://cogprints.org/1694/ Harnad, S. (1998/2000/2004) The invisible hand of peer review. Nature [online] (5 Nov. 1998), Exploit Interactive 5 (2000): and in Shatz, B. (2004) (ed.) Peer Review: A Critical Inquiry. Rowland & Littlefield. Pp. 235-242. http://cogprints.org/1646/ Peer Review Reform Hypothesis-Testing (started 1999) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#480 A Note of Caution About "Reforming the System" (2001) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1170 Self-Selected Vetting vs. Peer Review: Supplement or Substitute? 
(2002) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2341 ---------------------------------------- > > No fewer than three academic journals dismissed the economist George Akerlof's paper The Market for Lemons as "trivial" and "too generic" when it was submitted in the late 1960s. Almost four decades later it was regarded as a seminal text and its author thought worthy of the Nobel prize for economics. > > Peer review, when an academic submits a scholarly work to the scrutiny of other experts in the field for publication in a journal or for a grant, for example, has always been an imperfect science. But lately it has had more, and fiercer, critics. > > They say peer review is biased against innovation and originality. They argue that it costs too much - more than ?196m a year was the estimate by Research Councils UK last year. And they say it takes up too much time now that more academics than ever are submitting papers and fewer claim they can afford the time to "peer review" them. > > Today a report published by the British Academy - an academic club of 800 scholars elected for distinction in the humanities and social sciences - speaks up for peer review. The professors quote Joan Sieber, a psychologist at California State University, who has said: "One suspects that peer review is a bit like democracy - a bad system, but the best one possible." > > Albert Weale, professor of government at the University of Essex and chair of the committee responsible for the report, describes peer review as "the essential backbone to knowledge and the crucial mechanism in maintaining its quality". > > Robert Bennett, professor of geography at Cambridge University, says it is "an essential, if imperfect, practice for the humanities and social sciences". > > The report's writers snap back at those who attack peer review. They back up their ripostes with the comments of journal editors, research councils, charities and funders, academics and postdoctoral students. To those who say peer review is biased against innovation and that journal editors "play safe" and are "friendly to their own work", the academy's response is that universities and research councils are awarding more grants for risky, avant-garde research projects than ever. > > The report admits that "there may be scope for the government to consider ways in which it can encourage endowments ... within universities to support small grants for innovative, high-risk research". > > But it warns: "It is important not to commit the fallacy of assuming that, because high quality will be innovative, the innovative is necessarily high quality ... other criteria include: accuracy, validity, replicability, reliability, substantively significant, authoritative and so on." > > Banality gets acceptance > > Marian Hobson, professor of French at Queen Mary, University of London, says: "If a journal editor gets everything right all the time, they are probably aiming for the middle, banally all-right work, which will be out of date in the blink of an eyelid. Really excellent work may sometimes take a while to be accepted." > > To those who lambast peer reviewing for being too time consuming and costly, the professors have the following suggestion: give far more recognition to the unpaid, altruistic labour of those who do it and the system will be under less strain. > > Hobson says: "If done properly, [peer review] entails bibliographical searches, checking of statements, repeated visits to the university library, not just to Google. 
Yet this kind of activity counts for nix, nothing, zilch in the research assessment exercise [in which every active researcher in every university in the UK is assessed by panels of other academics to receive grants for their research]." > > The academy stops short of demanding peer reviewers be paid. It realises this would be impossible for all but the most wealthy journal publishers. Instead, the report recommends that the importance of peer reviewing should be better reflected in the research assessment exercise. "Those responsible for the management of universities and research institutes need to ensure that they ... encourage and reward peer review activity," it says. > > This might stop some high calibre academics, already overburdened with work, from being put off peer reviewing, the professors say. It might also attract junior lecturers and even postdoctoral students. More reviewers would mean the system was under less strain. The strain is partly triggered by an increase of up to 62% in the number of academic papers submitted of in some fields in the past five years. > > Here lies another problem, says the report. "As we conducted our review, we were struck ... by the extent to which there is little attention to training in peer review," it says. "Training is important, not just in itself, but because of the privileged position that peer reviewers enjoy. > > "By virtue of reading a paper, reviewers can acquire access to original data sets, new empirical results or innovative conceptual work. In the business world, these would count as commercial secrets. In the academic world, the ethos is that reviewers are part of the gatekeeping system, the ultimate rationale of which is the fast and efficient dissemination of research findings. > > "The integrity of the peer review system is therefore of great importance. One of the ways in which that integrity is maintained is through its dependence upon professional and unselfish motivations, and this in turn suggests the importance of training in the professional and ethical conventions of the practice." > > The academy's report ends with a warning to the government: plans to overhaul the way research is assessed after next year will change peer review for the worse, especially in the humanities. > > Metrics-based approach > > After 2008, the quality of research - and hence the amount of funding that universities receive from the government - will be judged largely on the basis of statistics such as grant income and contracts. It is accepted in the sector that this "metrics-based" approach will work better for science and engineering than for arts and humanities research, which does not receive much income and where books take longer to have an impact. > > Hobson says: "Metrics is helpful in giving a kind of overview measured in terms of items. A bit like a waistline measurement. It doesn't give much of an idea of whether they are slim or fat, unless they are at the extreme ends of the spectrum. > > "Wittgenstein at his death had one book and one article published. Another book was on the way, but unfinished. "Heaven knows what would have happened to him in today's academia." > From loet at LEYDESDORFF.NET Tue Sep 4 11:31:20 2007 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Tue, 4 Sep 2007 17:31:20 +0200 Subject: No subject Message-ID: The Communication of Meaning in Anticipatory Systems: A Simulation Study of the Dynamics of Intentionality in Social Systems Vice-Presidential Address at the 8th Int. 
Conference of Computing Anticipatory Systems (CASYS07), Liège, Belgium, 6-11 August 2007 < pdf-version> Abstract Psychological and social systems provide us with a natural domain for the study of anticipations because these systems are based on and operate in terms of intentionality. Psychological systems can be expected to contain a model of themselves and their environments; social systems can be strongly anticipatory and therefore co-construct their environments, for example, in techno-economic (co-)evolutions. Using Dubois's hyper-incursive and incursive formulations of the logistic equation, these two types of systems and their couplings can be simulated. In addition to their structural coupling, psychological and social systems are also coupled by providing meaning reflexively to each other's meaning-processing. Luhmann's distinctions among (1) interactions between intentions at the micro-level, (2) organization at the meso-level, and (3) self-organization of the fluxes of meaningful communication at the global level can be modeled and simulated using three hyper-incursive equations. The global level of self-organizing interactions among fluxes of communication is retained at the meso-level of organization. In a knowledge-based economy, these two levels of anticipatory structuration can be expected to propel each other at the supra-individual level. _____ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR) Kloveniersburgwal 48, 1012 CX Amsterdam Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$ 18.95 The Self-Organization of the Knowledge-Based Society; The Challenge of Scientometrics -------------- next part -------------- An HTML attachment was scrubbed... URL: From jbollen at LANL.GOV Tue Sep 4 15:01:46 2007 From: jbollen at LANL.GOV (Johan Bollen) Date: Tue, 4 Sep 2007 13:01:46 -0600 Subject: Rowlands, I; Nicholas, D "The missing link: journal usage metrics" ASLIB PROCEEDINGS 59 (3). 2007. p.222-228 In-Reply-To: Message-ID: Hi all, I have not yet had a chance to obtain a copy of the article, but I was surprised not to find the following among its 6 cited references: Johan Bollen, Herbert Van de Sompel. Usage Impact Factor: the effects of sample characteristics on usage-based impact metrics. JASIST, in press (http://arxiv.org/abs/cs/0610154, 2006) Instead of examining the hypothetical value of easy-to-understand, usage-based metrics and the potential value of linking article publication year to full text downloads, this article actually: 1) introduces and calculates a "usage impact factor (UIF)" that links article publication year to full text downloads, 2) is based on a large-scale usage data set (one year of usage recorded by the California State University link resolvers), 3) investigates the UIF's properties for a range of scholarly communities, including their demographics, and perhaps most importantly: 4) clearly shows that link resolver log data, like many other sources of usage data, is far from "uninterpretable". With regards to the last point, another article of relevance that seems to have not been picked up by the authors: Johan Bollen, Herbert Van de Sompel, Joan Smith, and Rick Luce. Toward alternative metrics of journal impact: a comparison of download and citation data. Information Processing and Management, 41 (6):1419-1440 (http://arxiv.org/abs/cs/0503007) Kind regards, Johan.
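For readers who want to experiment with the general idea discussed above, i.e. "linking article publication year to full text downloads", here is a minimal sketch of a download-based analogue of the two-year citation impact factor. It is an illustration only, not the exact Usage Impact Factor defined by Bollen & Van de Sompel; all data structures and field names are invented.

```python
# Rough sketch of a usage-based "impact factor": downloads in year Y of a
# journal's articles published in Y-1 and Y-2, divided by the number of such
# articles. Illustration only; not the UIF of Bollen & Van de Sompel.

def usage_impact_factor(articles, downloads, journal, year):
    """articles: dicts with "id", "journal", "pub_year";
    downloads: dicts with "article_id", "year" (one record per download)."""
    window = {year - 1, year - 2}
    eligible = {a["id"] for a in articles
                if a["journal"] == journal and a["pub_year"] in window}
    if not eligible:
        return 0.0
    n_downloads = sum(1 for d in downloads
                      if d["year"] == year and d["article_id"] in eligible)
    return n_downloads / len(eligible)

# Invented example data
articles = [
    {"id": "a1", "journal": "J. Foo", "pub_year": 2005},
    {"id": "a2", "journal": "J. Foo", "pub_year": 2006},
    {"id": "a3", "journal": "J. Foo", "pub_year": 2007},  # outside the window
]
downloads = [
    {"article_id": "a1", "year": 2007},
    {"article_id": "a1", "year": 2007},
    {"article_id": "a2", "year": 2007},
    {"article_id": "a3", "year": 2007},  # not counted: published in 2007
]
print(usage_impact_factor(articles, downloads, "J. Foo", 2007))  # -> 1.5
```

The same aggregation could be run separately per discipline or per user community, which is where the sample-characteristics issues raised in the cited JASIST paper come in.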
---- Johan Bollen Los Alamos National Laboratory (P362-proto) Los Alamos, NM 87545 Ph: 505 606 0030, Fx: 505 665 6452 jbollen at lanl.gov, http://public.lanl.gov/jbollen ---- On Aug 31, 2007, at 12:29 PM, Eugene Garfield wrote: > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > E-mail: i.rowlands at ucl.ac.uk > > TITLE: The missing link: journal usage metrics (Article, > English) > > AUTHOR: Rowlands, I; Nicholas, D > > SOURCE: ASLIB PROCEEDINGS 59 (3). 2007. p.222-228 EMERALD > GROUP > PUBLISHING LIMITED, BRADFORD > > Document Type: Article > Language: English > Cited References: 6 Times Cited: 0 > > Abstract: > Purpose - The aim of this short communication is to contribute to a > growing > debate about how we can measure the "quality" of journals. More > specifically, the paper argues the need for a new range of > standardized > indicators based on reader (rather than author-facing) metrics. > Design/methodology/approach - This is a thought experiment, > outlining the > kinds of usage indicators that could be developed alongside the > traditional > ISI measures of impact, immediacy and obsolescence. > > Findings - The time is ripe to develop a set of standardised. > measures of > journal usage that are as easy to understand, and as universally > accepted, > as ISI's current citation-based indicators. By linking article > publication > year to full text downloads, this article argues that very > considerable > value could be extracted from what, in many cases, is almost > uninterpretable data. > > Practical implications - Indicators in the form proposed could find > a wide > variety of applications, from helping librarians to assess the > potential > value-for-money of bundled journal deals, to helping policy-makers and > scholarly communication researchers to better understand the > dynamics of > knowledge diffusion. > > Originality/value - The development of standardized usage factors > in the > form suggested here would radically shift the centre of gravity in > bibliometrics research from the author to the reader. This remains > largely > unexplored territory. 
> > > Addresses: Rowlands I (reprint author), Univ Coll London, Sch Lib > Archive & > Informat Studies, CIBER, London WC1E 6BT, England > Univ Coll London, Sch Lib Archive & Informat Studies, CIBER, London > WC1E > 6BT, England > > Publisher: EMERALD GROUP PUBLISHING LIMITED, 60/62 TOLLER LANE, > BRADFORD > BD8 9BY, W YORKSHIRE, ENGLAND > Subject Category: Computer Science, Information Systems; Information > Science & Library Science > IDS Number: 181AC > > ISSN: 0001-253X > > CITED REFERENCES : > AMIN M > PERSPECTIVES PUBLISH : 1 2000 > DARMONI SJ > Reading factor: a new bibliometric criterion for managing digital > libraries > JOURNAL OF THE MEDICAL LIBRARY ASSOCIATION 90 : 323 2002 > OBST O > HLTH INFORMATION LIB 20 : 22 2003 > ROWLANDS I > Scholarly communication in the digital environment - The 2005 > survey of > journal author behaviour and attitudes > ASLIB PROCEEDINGS 57 : 481 2005 > TSAY MY > Library journal use and citation age in medical science > JOURNAL OF DOCUMENTATION 55 : 543 1999 > WULFF JL > Quality markers and use of electronic journals in an academic health > sciences library > JOURNAL OF THE MEDICAL LIBRARY ASSOCIATION 92 : 315 2004 From harnad at ECS.SOTON.AC.UK Tue Sep 4 10:16:01 2007 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Tue, 4 Sep 2007 15:16:01 +0100 Subject: British Academy Report on Peer Review and Metrics In-Reply-To: <200709040811.AA110624942@stuart.iit.edu> Message-ID: On Tue, 4 Sep 2007, Eliezer Geisler wrote: > Dear Steven: > > In your latest Sigmetrics entry on the British response to Peer Review, > you left out my book on metrics and an article I wrote sometime ago in The > Scientist. It seems to me that all the discussion since then is rehashing > established concepts and analyses. We know the system is flawed, but, > as Winston Churchill once said in a different context about Democracy > and our own Mark Twain about getting old: the alternatives are worse. > > E. Geisler (2000) The Metrics of Science and Technology, Greenwood Press. > > Kind regards, > > Elie > > Elie Geisler > Distinguished Professor of Management > and Director, > IIT Center for the Management > of Medical Technology (CMMT) > Stuart School of Business > Illinois Institute of Technology > 565 West Adams Street > Chicago, Illinois 60661 > Tel:(312)906-6532 Fax:(312)906-6549 > http://www.stuart.iit.edu Dear Elie, Thanks for your note. The list I gave was not meant to be exhaustive (in fact it only referred to my own writings on this topic!), but I am happy to post the reference to your book, and your New Scientist Article too: Geisler, E. (2001) The Mires of Evaluation. The Scientist. 15(10) 39. I would also like to add Henk Moed's book on the topic: Moed, H. F. (2005) Citation Analysis in Research Evaluation. NY Springer. and his paper on download and citation metrics: Moed, H. F. (2005) Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal, Journal of the American Society for Information Science and Technology, 56(10): 1088-1097 Best wishes, Stevan Harnad From harnad at ECS.SOTON.AC.UK Fri Sep 7 18:49:49 2007 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Fri, 7 Sep 2007 23:49:49 +0100 Subject: Where There's No Access Problem There's No Open Access Advantage Message-ID: On Fri, 7 Sep 2007, Michael Kurtz wrote: > Posted today on arXiv: > http://arxiv.org/abs/0709.0896 > > Open Access does not increase citations for research articles from The > Astrophysical Journal > Authors: Michael J. 
Kurtz, > Edwin A. > Henneken > (Submitted on 6 Sep 2007) > > Abstract: We demonstrate conclusively that there is no "Open Access > Advantage" for papers from the Astrophysical Journal. The two to one > citation advantage enjoyed by papers deposited in the arXiv e-print > server is due entirely to the nature and timing of the deposited > papers. This may have implications for other disciplines. WHERE THERE'S NO ACCESS PROBLEM THERE'S NO OA ADVANTAGE Stevan Harnad Kurtz & Henneken's (2007) new result is very interesting. Earlier, Kurtz et al. (2005) had shown that the lion's share of the citation advantage of astrophysics papers self-archived as preprints in Arxiv was caused by (1) Early Advantage (EA) (earlier citations for papers self-archived earlier) and (2) Quality Bias (QB) (a self-selection bias toward self-archiving higher quality papers) and not by (3) Open Access (OA) itself (being freely accessible online to those who cannot afford subscription-toll access). Kurtz et al. explained their finding by suggesting that: "in a well funded field like astrophysics essentially everyone who is in a position to write research articles has full access to the literature." This seems like a perfectly reasonable explanation for their findings. Where there is no access problem, OA cannot be the cause of whatever higher citation count is observed for self-archived articles. Moed (2007) has recently reported a similar result in Condensed Matter Physics, and so have Davis & Fromerth in 4 mathematics journals. Kurtz & Henneken's latest study confirms and strengthens their prior finding: They compared citation counts for articles published in two successive years of the Astrophysical Journal. For one of the years, the journal was freely accessible to everyone; for the other it was only accessible to subscribers. The citation counts for the self-archived articles, as expected, were twice as high as for the non-self-archived articles. They then compared the citation counts for non-self-archived articles in the free-access year and in the toll-access year, and found no difference. They concluded, again, that OA does not cause increased citations. But of course K&H's prior explanation -- which is that there is no access problem in astrophysics -- applies here too: It means that in a field where there is no access problem, whatever citation advantage comes from making an article OA by self-archiving cannot be an OA effect. K&H conclude: "This may have implications for other disciplines." It should be evident that the degree to which this has implications for other disciplines depends largely on the degree to which it is true in other disciplines that "essentially everyone who is in a position to write research articles has full access to the literature." We (Hajjem & Harnad 2007) have conducted (and are currently replicating) a similar study, but across the full spectrum of disciplines, measuring the citation advantage for mandated and unmandated self-archiving for articles from 4 Institutional Repositories that have self-archiving mandates (three universities plus CERN), each compared to articles in the very same journal and year by authors from other institutions (on the assumption that mandated self-archiving should have less of a self-selection quality bias than unmandated self-archiving). We again confirmed the citation advantage for self-archiving, and found no difference in the size of that advantage for mandated and unmandated self-archiving.
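The within-journal, within-year comparison described in the preceding paragraph can be illustrated with a minimal sketch. The data, field names and the simple ratio-of-means summary below are invented for illustration; this is not the actual Hajjem & Harnad procedure.

```python
# Minimal sketch of a within-journal, within-year comparison of citation
# counts for self-archived (OA) versus non-self-archived articles.
# All records below are invented.
from collections import defaultdict
from statistics import mean

articles = [
    {"journal": "J. Foo", "year": 2003, "oa": True,  "cites": 12},
    {"journal": "J. Foo", "year": 2003, "oa": False, "cites": 5},
    {"journal": "J. Foo", "year": 2003, "oa": False, "cites": 7},
    {"journal": "J. Bar", "year": 2003, "oa": True,  "cites": 3},
    {"journal": "J. Bar", "year": 2003, "oa": False, "cites": 2},
]

groups = defaultdict(lambda: {"oa": [], "non_oa": []})
for a in articles:
    key = (a["journal"], a["year"])
    groups[key]["oa" if a["oa"] else "non_oa"].append(a["cites"])

for (journal, year), g in groups.items():
    if g["oa"] and g["non_oa"]:
        ratio = mean(g["oa"]) / mean(g["non_oa"])
        print(f"{journal} {year}: OA/non-OA citation ratio = {ratio:.2f}")
```

Grouping by journal and year controls, roughly, for venue and article age; the open question discussed above is whether any surplus for the self-archived group reflects an access effect, an early-advantage effect, or self-selection.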
(The finding of an equally large self-archiving advantage for mandated and unmandated self-archiving was also confirmed for CERN, whose articles are all in physics -- although one could perhaps argue that CERN articles enjoy a quality advantage over articles from other institutions.) A few closing points: (1) It is likely that the size of the access problem differs from field to field, and with it the size of the OA citation advantage. It is unlikely that most fields are as well-heeled as astrophysics. In Condensed Matter physics (and perhaps also mathematics) there is the further complication that it is mostly just preprints that are being self-archived, which reduces the scope for observing any postprint advantage, as opposed to just an Early Advantage (EA). (In fields that do have a significant access problem, the "Early Advantage" for postprints is the OA Advantage!) OA self-archiving mandates (and the OA movement in general) target refereed, accepted postprints, not unrefereed preprints. (2) The positive correlation is between citation counts and the number of self-archived articles in each citation-range (Hajjem et al. 2005). This could be caused by Quality Bias (QB) (higher quality articles being more likely to be self-archived) or Quality Advantage (QA) (higher quality articles benefit more from being self-archived) or both. (3) Citations are of course not the only potential advantage of self-archiving. Downloads can increase too (Brody et al. 2006). (4) We are now in the Institutional Repository (IR) era. Arxiv is a venerable Central Repository, one of the biggest and oldest and most successful of them all. But institutions are the sources of OA content, institutions are in the best position to mandate self-archiving, and institutions share with their authors the benefits of enhancing the accessibility, usage and impact of their research output. Moreover, the nature of the web is distributed local websites, harvested by central service providers, rather than central self-archiving. Stevan Harnad Brody, T., Harnad, S. and Carr, L. (2006) Earlier Web Usage Statistics as Predictors of Later Citation Impact. Journal of the American Society for Information Science and Technology (JASIST) 57(8) pp. 1060-1072. http://eprints.ecs.soton.ac.uk/10713/ Davis, P. M. and Fromerth, M. J. (2007) Does the arXiv lead to higher citations and reduced publisher downloads for mathematics articles? Scientometrics, Vol. 71, No. 2. http://arxiv.org/abs/cs.DL/0603056 Hajjem, C., Harnad, S. and Gingras, Y. (2005) Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How it Increases Research Citation Impact. IEEE Data Engineering Bulletin 28(4) pp. 39-47. http://eprints.ecs.soton.ac.uk/12906/ Hajjem, C. and Harnad, S. (2007) The Open Access Citation Advantage: Quality Advantage Or Quality Bias? Technical Report, Electronics and Computer Science, University of Southampton. http://eprints.ecs.soton.ac.uk/13328/ Kurtz, M. J. and Henneken, E. A. (2007) Open Access does not increase citations for research articles from The Astrophysical Journal. Preprint deposited in arXiv September 6, 2007. http://arxiv.org/abs/0709.0896 Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Demleitner, M., Murray, S. S. (2005) The Effect of Use and Access on Citations. Information Processing and Management, 41, 1395-1402. http://cfa-www.harvard.edu/~kurtz/IPM-abstract.html Moed, H. F.
(2007) The effect of 'open access' on citation impact: An analysis of ArXiv's condensed matter section, Journal of the American Society for Information Science and Technology, August 30, 2007. http://dx.doi.org/10.1002/asi.20663 From j.s.katz at SUSSEX.AC.UK Sat Sep 8 13:18:22 2007 From: j.s.katz at SUSSEX.AC.UK (Sylvan Katz) Date: Sat, 8 Sep 2007 11:18:22 -0600 Subject: Looking for Lotka 1926 Paper Message-ID: Does anyone have a scanned PDF version of Lotka's 1926 paper? Lotka, A. J., 1926: The frequency distribution of scientific productivity. Journal of the Washington Academy of Sciences, 16, 317-323. Thank you Sylvan Dr. J. Sylvan Katz, Visiting Fellow SPRU, University of Sussex http://www.sussex.ac.uk/Users/sylvank From elevel at GMAIL.COM Sat Sep 8 23:09:25 2007 From: elevel at GMAIL.COM (Sebastian Boell) Date: Sun, 9 Sep 2007 13:09:25 +1000 Subject: Looking for Lotka 1926 Paper In-Reply-To: Message-ID: Hello Sylvan, > Does anyone have a scanned PDF version of Lotka's 1926 paper? > > Lotka, A. J., 1926: The frequency distribution of scientific productivity. > Journal of the Washington Academy of Sciences, 16, 317-323. Please find attached a scanned PDF of Lotka's paper. If anyone else is interested also CC to the SIGMETRICS list. Kind Regards Sebastin Boell -------------- next part -------------- A non-text attachment was scrubbed... Name: Lotka 1929.pdf Type: application/pdf Size: 461095 bytes Desc: not available URL: From loet at LEYDESDORFF.NET Mon Sep 10 15:52:58 2007 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Mon, 10 Sep 2007 21:52:58 +0200 Subject: Current Biology, Vol. 17, No.15, R583-585 Message-ID: My Word The mismeasurement of science Peter A. Lawrence a, E-mail The Corresponding Author aDepartment of Zoology, University of Cambridge, Downing Street, Cambridge CB2 3EJ, UK, and MRC Laboratory of Molecular Biology, Hills Road, Cambridge CB2 0QH, UK Available online 6 August 2007. Article Outline Impact factors and citations Changes in behaviour Who is to blame and what to do? References _____ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR) Kloveniersburgwal 48, 1012 CX Amsterdam Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$ 18.95 The Self-Organization of the Knowledge-Based Society; The Challenge of Scientometrics -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: REemail.gif Type: image/gif Size: 112 bytes Desc: not available URL: From whitehd at DREXEL.EDU Wed Sep 12 16:26:19 2007 From: whitehd at DREXEL.EDU (Howard White) Date: Wed, 12 Sep 2007 16:26:19 -0400 Subject: New Version of Publish or Perish Message-ID: Dear Members, Anne-Wil Harzing of University of Melbourne has asked me to announce on this list that Version 2.3 of her Publish or Perish software has been released. As many of you know, PoP is an interface to Google Scholar that radically simplifies the gathering of citation data from the Web. 
For author analysis it provides: * Total number of papers * Total number of citations * Average number of citations per paper * Average number of citations per author * Average number of papers per author * Average number of citations per year * Hirsch's h-index and related parameters * Egghe's g-index * The contemporary h-index * The age-weighted citation rate * Two variations of individual h-indices * An analysis of the number of authors per paper. It also has modules for analyzing contributors to a journal and contributors to a subject literature as defined by the user. Several papers discussing its features are downloadable as well. For details, go to: http://www.harzing.com/resources.htm#/pop.htm Howard D. White -------------- next part -------------- An HTML attachment was scrubbed... URL: From harnad at ECS.SOTON.AC.UK Sat Sep 15 14:30:42 2007 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Sat, 15 Sep 2007 19:30:42 +0100 Subject: More Reasons for the Immediate Deposit Mandate and the Eprint Request Button Message-ID: ** Cross-Posted ** For hyperlinked references, see: http://openaccess.eprints.org/index.php?/archives/292-guid.html The paper reprint request era's prime innovator, Eugene Garfield, had already anticipated many of the current developments in Open Access: (1) Garfield, E. (1999) From Photostats to Home Pages on the World Wide Web: A Tutorial on How to Create Your Electronic Archive. The Scientist 13(4):14. EXCERPT: It is the utopian expectation of those who live in cyberspace that eventually most researchers will create Web sites containing the full text of all their papers... The social, economic, and scholarly impact of this development has major consequences for the future. Garfield, E. (1965) Is the 'free reprint system' free and/or obsolete? Essays of an Information Scientist 1:10-11. Garfield, E. (1972) Reprint Exchange. 1. The multimillion dollar problem ordinaire, Essays of an Information Scientist 1:359-60. (2) Drenth, JPH (2003) More reprint requests, more citations? Scientometrics 56: 283-286. ABSTRACT: Reprint requests are commonly used to obtain a copy of an article. This study aims to correlate the number of reprint requests from a 10-year-sample of articles with the number of citations. The database contained 28 articles published in over a 10-year-period (1992-2001). For each separate article the number of citations and and the number of reprint requests were retrieved. In total 303 reprint requests were analysed. Reviews (median 9, range 1 to 95) and original articles (median 8, range 1-36) attracted most reprint requests. There was an excellent correlation between the number of requests and citations to article (two-tailed non-parametric Spearman rank test r = 0.55; 95% confidence interval 0.18-0.78, P < 0.005). Articles that received most reprint requests are cited more often. (3) Swales, J. (1988), Language and scientific communication. The case of the reprint request. Scientometrics 13: 93-101. EXCERPT: This paper reports on a study of Reprint Requests (RRs). It is estimated that tens of millions of RRs are mailed each year, most being triggered by Current Contents... ------------------------------------------------------------------ In the online era, the days of reprint requests ought to be over, with Open Access taking their place. But some research funders and universities are still hesitating about mandating Open Access Self-Archiving, because they are concerned about publishers' embargoes. 
Here is the solution: Even where a publisher embargoes or does not endorse OA self-archiving, universities and research funders can and should still go ahead and mandate immediate deposit anyway, with no exceptions or delays, but allowing the deposit to be made Closed Access instead of Open Access during any publisher-imposed embargo period. The Institutional Repository's semi-automatized Email Eprint Request Button will provide almost-immediate, almost-OA to tide over all researcher usage needs webwide till the end of the embargo (or till embargoes die their natural and well-deserved deaths, under the growing pressure and increasingly apparent benefits of OA). Stevan Harnad AMERICAN SCIENTIST OPEN ACCESS FORUM: http://amsci-forum.amsci.org/archives/American-Scientist-Open-Access-Forum.html http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/ UNIVERSITIES and RESEARCH FUNDERS: If you have adopted or plan to adopt a policy of providing Open Access to your own research article output, please describe your policy at: http://www.eprints.org/signup/sign.php http://openaccess.eprints.org/index.php?/archives/71-guid.html http://openaccess.eprints.org/index.php?/archives/136-guid.html OPEN-ACCESS-PROVISION POLICY: BOAI-1 ("Green"): Publish your article in a suitable toll-access journal http://romeo.eprints.org/ OR BOAI-2 ("Gold"): Publish your article in an open-access journal if/when a suitable one exists. http://www.doaj.org/ AND in BOTH cases self-archive a supplementary version of your article in your own institutional repository. http://www.eprints.org/self-faq/ http://archives.eprints.org/ http://openaccess.eprints.org/ From A.Chiner-Arias at WARWICK.AC.UK Tue Sep 18 06:51:09 2007 From: A.Chiner-Arias at WARWICK.AC.UK (Chiner Arias, Alejandro) Date: Tue, 18 Sep 2007 11:51:09 +0100 Subject: Bibliometrics for Arts & Humanities Message-ID: Having failed to find a bibliometric tool for Arts & Humanities, I am asking this list in the hope somebody here will know something similar to Journal Citation Reports. I am aware of the Arts & Humanities Citation Index as a bibliographic database, but the JCR only uses Science and Social Sciences data from the respective Thomson ISI bibliographic databases. My second question is about ranking of cited academics. Again the ISI Highly Cited database applies only to Science and some of the Social Sciences. Is there something similar for the Humanities? http://isihighlycited.com/ Many thanks for any leads. I am aware of the software below thanks to that posting. Alec ___________________________________ Alejandro Chiner, Service Innovation Officer, University of Warwick Library Research & Innovation Unit, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: +(44/0) 24 765 23251, Fax: +(44/0) 24 765 24211, a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu ___________________________________ -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White Sent: 12 September 2007 21:26 To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] New Version of Publish or Perish Dear Members, Anne-Wil Harzing of University of Melbourne has asked me to announce on this list that Version 2.3 of her Publish or Perish software has been released. As many of you know, PoP is an interface to Google Scholar that radically simplifies the gathering of citation data from the Web.
For author analysis it provides: * Total number of papers * Total number of citations * Average number of citations per paper * Average number of citations per author * Average number of papers per author * Average number of citations per year * Hirsch's h-index and related parameters * Egghe's g-index * The contemporary h-index * The age-weighted citation rate * Two variations of individual h-indices * An analysis of the number of authors per paper. It also has modules for analyzing contributors to a journal and contributors to a subject literature as defined by the user. Several papers discussing its features are downloadable as well. For details, go to: http://www.harzing.com/resources.htm#/pop.htm Howard D. White From Christina.Pikas at JHUAPL.EDU Tue Sep 18 08:51:42 2007 From: Christina.Pikas at JHUAPL.EDU (Pikas, Christina K.) Date: Tue, 18 Sep 2007 08:51:42 -0400 Subject: Bibliometrics for Arts & Humanities In-Reply-To: A<6C724FA6DC66CF4C971F385DBDE887227BD75B@ELDER.ads.warwick.ac.uk> Message-ID: This is most definitely not in my area of expertise, but I would think that books are more important to this crowd than peer reviewed articles so you'd have to look at citations to books. Amazon and I think Google books do this -- but I don't know of a tool to extract and analyze that info. Christina K. Pikas, MLS R.E. Gibson Library & Information Center The Johns Hopkins University Applied Physics Laboratory Voice 240.228.4812 (Washington), 443.778.4812 (Baltimore) Fax 443.778.5353 -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Chiner Arias, Alejandro Sent: Tuesday, September 18, 2007 6:51 AM To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities Having failed to find a bibliometric tool for Arts & Humanities, I am asking to this list in the hope somebody here will know something similar to Journal Citation Reports. I am aware of the Arts & Humanities Citation Index as a bibliographic database, but the JCR only use Science and Social Sciences data from the respective Thompson ISI bibliographic databases. My second question is about ranking of cited academics. Again the ISI Higly Cited database applies only to Science and some of the Social Sciences. Is there something similar for the Humanities? http://isihighlycited.com/ Many thanks for any leads. I am aware of the software below thanks to that posting. Alec ___________________________________ Alejandro Chiner, Service Innovation Officer, University of Warwick Library Research & Innovation Unit, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: +(44/0) 24 765 23251, Fax: +(44/0) 24 765 24211, a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu ___________________________________ -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White Sent: 12 September 2007 21:26 To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] New Version of Publish or Perish Dear Members, Anne-Wil Harzing of University of Melbourne has asked me to announce on this list that Version 2.3 of her Publish or Perish software has been released. As many of you know, PoP is an interface to Google Scholar that radically simplifies the gathering of citation data from the Web. 
For author analysis it provides: * Total number of papers * Total number of citations * Average number of citations per paper * Average number of citations per author * Average number of papers per author * Average number of citations per year * Hirsch's h-index and related parameters * Egghe's g-index * The contemporary h-index * The age-weighted citation rate * Two variations of individual h-indices * An analysis of the number of authors per paper. It also has modules for analyzing contributors to a journal and contributors to a subject literature as defined by the user. Several papers discussing its features are downloadable as well. For details, go to: http://www.harzing.com/resources.htm#/pop.htm Howard D. White From notsjb at LSU.EDU Tue Sep 18 09:28:16 2007 From: notsjb at LSU.EDU (Stephen J Bensman) Date: Tue, 18 Sep 2007 08:28:16 -0500 Subject: Bibliometrics for Arts & Humanities Message-ID: In general, the humanities have not been found amenable to bibliometric analysis. Not only has ISI-Thomson Scientific not developed a JCR for the AHCI despite an initial intent to do so, but publication and citation counts have not been utilized for the humanities by agencies like the American Council on Education and the National Research Council charged with evaluating the quality of US research-doctorate programs. These agencies have relied upon measures such as peer ratings and number of faculty awards. In general, humanities frequency distributions do not have the same highly skewed character as those in the sciences and social sciences, indicating that the causal factors of variance are less strong. If you are looking for a quantitative humanities measure, I would suggest using the number of libraries holding a given item, which is easily available in OCLC WorldCat. It is a substitute measure for subjective judgments of librarians and faculty on the importance of a given bibliographic item. I have advised humanities faculty to use this measure for journals, and they have told me that it matches their intuitive sense of the importance of journals, and it can be used to judge the importance of monographs--more important in the humanities. You can judge the importance of books written by persons in the humanities, and it can be used to rate the faculty of given programs. Another such measure would be the number of times books are reviewed in journals widely held by libraries. One problem with WorldCat library counts is that they are dominated by US holdings, but then so are Thomson Scientific publication and citation counts as well as everything else in this world. I hope that you find this of some help. Respectfully, Stephen J. Bensman LSU Libraries Louisiana State University Baton Rouge, LA USA ________________________________ From: ASIS&T Special Interest Group on Metrics on behalf of Chiner Arias, Alejandro Sent: Tue 9/18/2007 5:51 AM To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities Having failed to find a bibliometric tool for Arts & Humanities, I am asking this list in the hope somebody here will know something similar to Journal Citation Reports. I am aware of the Arts & Humanities Citation Index as a bibliographic database, but the JCR only uses Science and Social Sciences data from the respective Thomson ISI bibliographic databases. My second question is about ranking of cited academics. Again the ISI Highly Cited database applies only to Science and some of the Social Sciences. Is there something similar for the Humanities?
http://isihighlycited.com/ Many thanks for any leads. I am aware of the software below thanks to that posting. Alec ___________________________________ Alejandro Chiner, Service Innovation Officer, University of Warwick Library Research & Innovation Unit, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: +(44/0) 24 765 23251, Fax: +(44/0) 24 765 24211, a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu ___________________________________ -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White Sent: 12 September 2007 21:26 To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] New Version of Publish or Perish Dear Members, Anne-Wil Harzing of University of Melbourne has asked me to announce on this list that Version 2.3 of her Publish or Perish software has been released. As many of you know, PoP is an interface to Google Scholar that radically simplifies the gathering of citation data from the Web. For author analysis it provides: * Total number of papers * Total number of citations * Average number of citations per paper * Average number of citations per author * Average number of papers per author * Average number of citations per year * Hirsch's h-index and related parameters * Egghe's g-index * The contemporary h-index * The age-weighted citation rate * Two variations of individual h-indices * An analysis of the number of authors per paper. It also has modules for analyzing contributors to a journal and contributors to a subject literature as defined by the user. Several papers discussing its features are downloadable as well. For details, go to: http://www.harzing.com/resources.htm#/pop.htm Howard D. White -------------- next part -------------- An HTML attachment was scrubbed... URL: From Jonathan.adams at EVIDENCE.CO.UK Tue Sep 18 11:03:04 2007 From: Jonathan.adams at EVIDENCE.CO.UK (Jonathan Adams) Date: Tue, 18 Sep 2007 16:03:04 +0100 Subject: Bibliometrics for Arts & Humanities Message-ID: Dear Alejandro Chiner The UK Arts & Humanities Research Council has been exploring this over the last couple of years (in collaboration with Humanities in the European Research Area, HERA) and should have some resources that would assist. Regards Jonathan Adams Director, Evidence Ltd + 44 113 384 5680 -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Chiner Arias, Alejandro Sent: 18 September 2007 11:51 To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities Having failed to find a bibliometric tool for Arts & Humanities, I am asking to this list in the hope somebody here will know something similar to Journal Citation Reports. I am aware of the Arts & Humanities Citation Index as a bibliographic database, but the JCR only use Science and Social Sciences data from the respective Thompson ISI bibliographic databases. My second question is about ranking of cited academics. Again the ISI Higly Cited database applies only to Science and some of the Social Sciences. Is there something similar for the Humanities? http://isihighlycited.com/ Many thanks for any leads. I am aware of the software below thanks to that posting. Alec ___________________________________ Alejandro Chiner, Service Innovation Officer, University of Warwick Library Research & Innovation Unit, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. 
Tel: +(44/0) 24 765 23251, Fax: +(44/0) 24 765 24211, a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu ___________________________________ -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White Sent: 12 September 2007 21:26 To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] New Version of Publish or Perish Dear Members, Anne-Wil Harzing of University of Melbourne has asked me to announce on this list that Version 2.3 of her Publish or Perish software has been released. As many of you know, PoP is an interface to Google Scholar that radically simplifies the gathering of citation data from the Web. For author analysis it provides: * Total number of papers * Total number of citations * Average number of citations per paper * Average number of citations per author * Average number of papers per author * Average number of citations per year * Hirsch's h-index and related parameters * Egghe's g-index * The contemporary h-index * The age-weighted citation rate * Two variations of individual h-indices * An analysis of the number of authors per paper. It also has modules for analyzing contributors to a journal and contributors to a subject literature as defined by the user. Several papers discussing its features are downloadable as well. For details, go to: http://www.harzing.com/resources.htm#/pop.htm Howard D. White From jrussell at SERVIDOR.UNAM.MX Tue Sep 18 12:45:07 2007 From: jrussell at SERVIDOR.UNAM.MX (Jane Russell) Date: Tue, 18 Sep 2007 11:45:07 -0500 Subject: Call for Papers/Convocatoria Message-ID: This message is for our Spanish speaking colleagues. Apologies for cross-posting. Jane M. Russell Centro Universitario de Investigaciones Bibliotecológicas Torre II de Humanidades piso 11 Universidad Nacional Autónoma de México Ciudad Universitaria 04510 México DF México Tel: (52) - 55- 56230363 Fax: (52) - 55- 55507461 Congreso Internacional de Información, Palacio de Convenciones de La Habana, Cuba, 21-25 April 2008 CALL FOR PAPERS IV INTERNATIONAL SEMINAR ON QUANTITATIVE AND QUALITATIVE STUDIES OF SCIENCE AND TECHNOLOGY "Prof. GILBERTO SOTOLONGO AGUILAR" This is the fourth international seminar organized within the traditional INFO congresses. The first was held on 25 April 2002; the second on 13 and 14 April 2004; the third on 19 and 20 April 2006. The seminars provide a space for identifying areas of interest, getting to know the institutions that conduct research projects in the area, and talking with the researchers and with the different actors involved in this important field of work. Some of the selected contributions to the seminars are accepted for publication in the Revista Española de Documentación Científica (REDC). Given the success achieved, and at the request of the participants, the scientific committee agreed to incorporate the seminar into the programmes of subsequent INFO congresses. The IV Seminar will be held from 21 to 25 April 2008. The Scientific Committee will accept research papers, reviews, or case-study analyses, completed or in progress, related to the quantitative and qualitative aspects of science and technology. Papers related to bibliometrics, informetrics, scientometrics, and webometrics/cybermetrics will be accepted, without excluding qualitative methods and approaches to analysis.
The topics considered relevant for the IV Seminar are the following:
* Models of scientific communication (systems approaches, mathematical models, etc.).
* Patterns of communication, collaboration, information flows in science and technology, and migration (citation analysis, journal impact factors, national and international flows, etc.).
* Scientific output (disciplines, gender studies, research departments, institutions, countries, etc.).
* Dynamics of the literature (history, growth, obsolescence, dispersion, the science-technology relationship, etc.).
* Indicators to support decision-making in science policy (economics, organization and administration, information and communication technologies, resource management, forecasting, impact, evaluation).
* Visualization and organization of information for bibliometrics, scientometrics and webometrics/cybermetrics.
* Theoretical aspects of quantitative and qualitative studies of science and technology.
* Analysis, design and application of software.
* Data- and text-mining techniques applied to the production of indicators.
Colleagues interested in participating as speakers are asked to send abstracts of up to 1,500 words, with clear sections on: (1) the background of the study; (2) the purpose of the work; (3) the methodology/approach used; and (4) the main results obtained or expected, as applicable.
Important dates: Submission of abstracts: by 7 December 2007. Notification of acceptance: by 11 January 2008. Submission of full papers: by 8 February 2008.
The official language of the Seminar is Spanish. Papers should be sent in electronic format to the General Coordinator of the Seminar, with a copy to the coordinators, at the e-mail addresses given below. For further information on manuscript specifications (length, typeface, structure, abstract, keywords, etc.), as well as information on the general programme of the congress, registration, room reservations, etc., please visit the official congress site at: www.congreso-info.cu
General Coordinator: Dr. César A. Macías Chapula, Hospital General de México, Dirección de Investigación, Dr. Balmis 148, U-301, 2º. Piso, Col. Doctores, Delegación Cuauhtémoc, México, D. F. 06726 Tel. (52)-55-5004-3842 cesarmch at liceaga.facmed.unam.mx chapula at data.net.mx
Coordinators: Dra. Jane M. Russell, Centro Universitario de Investigaciones Bibliotecológicas, UNAM, Torre II de Humanidades, Piso 11, 04510, México, D. F. Tel. (52)-55-5623-0363 Fax. (52)-55-55550-7461 jrussell at servidor.unam.mx
MSc. María V. Guzmán Sánchez, Instituto Finlay, Ave. 27 #19805, Lisa, P.O. Box 16017, Cod 11600, La Habana, Cuba Tel. 53 7 2717452 Fax. 53 7 2720809 mvguzman at finlay.edu.cu
MID. Isidro Aguillo Caño, Centro de Información y Documentación Científica, Consejo Superior de Investigaciones Científicas, Joaquín Costa 22, 28002, Madrid, España Tel. 91 563 54 82 Fax. 91 564 26 44 isidro at cindoc.csic.es
-------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: Convocatoria.doc Type: application/msword Size: 45568 bytes Desc: not available URL: From loet at LEYDESDORFF.NET Tue Sep 18 13:05:59 2007 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Tue, 18 Sep 2007 19:05:59 +0200 Subject: Bibliometrics for Arts & Humanities In-Reply-To: <81A4EE9059B88C4CBCB7F80231E065462F80EE@evidence1.Evidence.local> Message-ID: Linda Butler has been working on this for the RAE in Australia; Ton Nederhof, of course, previously in the Netherlands. Best wishes, Loet > -----Original Message----- > From: ASIS&T Special Interest Group on Metrics > [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Jonathan Adams > Sent: Tuesday, September 18, 2007 5:03 PM > To: SIGMETRICS at listserv.utk.edu > Subject: Re: [SIGMETRICS] Bibliometrics for Arts & Humanities > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Dear Alejandro Chiner > The UK Arts & Humanities Research Council has been exploring this over > the last couple of years (in collaboration with Humanities in the > European Research Area, HERA) and should have some resources > that would > assist. > Regards > Jonathan Adams > > Director, Evidence Ltd > + 44 113 384 5680 > > > -----Original Message----- > From: ASIS&T Special Interest Group on Metrics > [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Chiner Arias, > Alejandro > Sent: 18 September 2007 11:51 > To: SIGMETRICS at listserv.utk.edu > Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Having failed to find a bibliometric tool for Arts & Humanities, I am > asking to this list in the hope somebody here will know something > similar to Journal Citation Reports. > > I am aware of the Arts & Humanities Citation Index as a bibliographic > database, but the JCR only use Science and Social Sciences > data from the > respective Thompson ISI bibliographic databases. > > My second question is about ranking of cited academics. Again the ISI > Higly Cited database applies only to Science and some of the Social > Sciences. Is there something similar for the Humanities? > http://isihighlycited.com/ > > Many thanks for any leads. I am aware of the software below thanks to > that posting. > > Alec > ___________________________________ > Alejandro Chiner, Service Innovation Officer, > University of Warwick Library Research & Innovation Unit, > Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: > +(44/0) 24 765 > 23251, Fax: +(44/0) 24 765 24211, > a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu > ___________________________________ > > -----Original Message----- > From: ASIS&T Special Interest Group on Metrics > [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White > Sent: 12 September 2007 21:26 > To: SIGMETRICS at listserv.utk.edu > Subject: [SIGMETRICS] New Version of Publish or Perish > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > Dear Members, > > Anne-Wil Harzing of University of Melbourne has asked me to > announce on this list that Version 2.3 of her Publish or Perish > software has been released. As many of you know, PoP is an > interface to Google Scholar that radically simplifies the gathering > of citation data from the Web. 
For author analysis it provides: > > > > * Total number of papers > * Total number of citations > * Average number of citations per paper > * Average number of citations per author > * Average number of papers per author > * Average number of citations per year > * Hirsch's h-index and related parameters > * Egghe's g-index > * The contemporary h-index > * The age-weighted citation rate > * Two variations of individual h-indices > * An analysis of the number of authors per paper. > > It also has modules for analyzing contributors to a journal and > contributors to a subject literature as defined by the user. > > Several papers discussing its features are downloadable as well. > For details, go to: > > http://www.harzing.com/resources.htm#/pop.htm > > Howard D. White >
From whitehd at DREXEL.EDU Tue Sep 18 12:55:24 2007 From: whitehd at DREXEL.EDU (Howard White) Date: Tue, 18 Sep 2007 12:55:24 -0400 Subject: Bibliometrics for Arts & Humanities In-Reply-To: <4928689828488E458AECE7AFDCB52CFE019AB8@email003.lsu.edu> Message-ID: Dear Members, I was surprised and delighted to see Stephen Bensman's posting because in August I delivered a series of talks in Sydney and Brisbane, Australia, in which I advocated precisely the same measure--library holdings counts in WorldCat and, in their case, Libraries Australia--as a way of assessing the research contributions of academics in the arts and humanities. My talks were related to evaluating academics under the new Research Quality Framework, which is scheduled to begin in Australia in 2008. (RQF resembles the Research Assessment Exercise of the UK.) Australian academics in the arts, humanities, and softer social sciences are under a double whammy as far as the Thomson Scientific databases are concerned: not only are the journals of their fields much less well covered, the journals of their nation are also much less well covered. Linda Butler of Australian National University has been working on RQF-related measures involving citations to books (as opposed to journal articles) in the Thomson databases, but even these miss much of the whole picture, as she herself points out. I am currently teaming with some researchers from the Bibliometric and Informetric Research Group at the University of New South Wales (Connie Wilson, Mari Davis, and others) to work on the proposed holdings-count measure for "book-oriented" researchers Down Under. (It was in indirect connection with this project that I posted the notice about Anne-Wil Harzing's Publish or Perish software last week.) The larger bibliometrics community has pretty much ignored holdings counts and OCLC. In 1995 I published a whole book about them, Brief Tests of Collection Strength, and I have a long article about them coming out in College & Research Libraries in the first half of 2008. These works focus on their uses in library collection evaluation, but it has long been evident to me that they can also be used to evaluate authors--after all, OCLC's list of the top 1000 items by holdings counts reads like a Who's Who of authors in the Western intellectual tradition. So does a list of the most widely held items in LibraryThing, the Web tool for cataloging personal collections, although there the top authors are virtually all novelists, poets, and playwrights. In any case it is high time bibliometricians started paying attention to holdings-count data as a complement to citation data. Howard D.
White, Professor Emeritus College of Information Science and Technology Drexel University Philadelphia, PA 19104 On Sep 18, 2007, at 9:28 AM, Stephen J Bensman wrote: > Adminstrative info for SIGMETRICS (for example unsubscribe): http:// > web.utk.edu/~gwhitney/sigmetrics.html > In general, the humanities have not been found amenable to > bibliometric analysis. Not only has ISI-Thomson Scientific not > developed a JCR for the AHCI despite an initial intent to do so, > but publication and citation counts have not been utilized for the > humanities by agencies like the American Council on Education and > the National Research Council charged with evaluating the quality > of US research-doctorate programs. These agencies have relied upon > measures such as peer ratings and number of faculty awards. In > general, humanities frequency distributions do not have the same > highly skewed character as those in the sciences and social > sciences, indicating the causal factors of variance are less strong. > > If you are looking for a quantitative humanities measure, I would > suggest using the number of libraries holding a given item that is > easily available in OCLC WorldCat. It is a substitute measure for > subjective judgments of librarians and faculty on the importance of > a given bibliographic item. I have advised humanities faculty to > use this measure for journals, and they have told me that it > matches their intuitive sense of the importance of journals, and it > can be used to judge the importance of monographs--more important > in the humanities. You can judge the importance of books written > by persons in the humanities, and it can be used to rate the > faculty of given programs. Another such measure would the number > of times books are reviewed in journals widely held by libraries. > One problem with WorldCat library counts is that they are dominated > US holdings, but then so are Thomson Scientific publication and > citation counts as well as everything else in this world. > > I hope that you find this of some help. > > Respectfully, > Stephen J. Benman > LSU Libraries > Louisiana State University > Baton Rouge, LA > USA > > From: ASIS&T Special Interest Group on Metrics on behalf of Chiner > Arias, Alejandro > Sent: Tue 9/18/2007 5:51 AM > To: SIGMETRICS at listserv.utk.edu > Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Having failed to find a bibliometric tool for Arts & Humanities, I am > asking to this list in the hope somebody here will know something > similar to Journal Citation Reports. > > I am aware of the Arts & Humanities Citation Index as a bibliographic > database, but the JCR only use Science and Social Sciences data > from the > respective Thompson ISI bibliographic databases. > > My second question is about ranking of cited academics. Again the ISI > Higly Cited database applies only to Science and some of the Social > Sciences. Is there something similar for the Humanities? > http://isihighlycited.com/ > > Many thanks for any leads. I am aware of the software below thanks to > that posting. > > Alec > ___________________________________ > Alejandro Chiner, Service Innovation Officer, > University of Warwick Library Research & Innovation Unit, > Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. 
Tel: +(44/0) 24 > 765 > 23251, Fax: +(44/0) 24 765 24211, > a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu > ___________________________________ > > -----Original Message----- > From: ASIS&T Special Interest Group on Metrics > [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White > Sent: 12 September 2007 21:26 > To: SIGMETRICS at listserv.utk.edu > Subject: [SIGMETRICS] New Version of Publish or Perish > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > Dear Members, > > Anne-Wil Harzing of University of Melbourne has asked me to > announce on this list that Version 2.3 of her Publish or Perish > software has been released. As many of you know, PoP is an > interface to Google Scholar that radically simplifies the gathering > of citation data from the Web. For author analysis it provides: > > > > * Total number of papers > * Total number of citations > * Average number of citations per paper > * Average number of citations per author > * Average number of papers per author > * Average number of citations per year > * Hirsch's h-index and related parameters > * Egghe's g-index > * The contemporary h-index > * The age-weighted citation rate > * Two variations of individual h-indices > * An analysis of the number of authors per paper. > > It also has modules for analyzing contributors to a journal and > contributors to a subject literature as defined by the user. > > Several papers discussing its features are downloadable as well. > For details, go to: > > http://www.harzing.com/resources.htm#/pop.htm > > Howard D. White > -------------- next part -------------- An HTML attachment was scrubbed... URL: From loet at LEYDESDORFF.NET Wed Sep 19 02:48:03 2007 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Wed, 19 Sep 2007 08:48:03 +0200 Subject: Bibliometrics for Arts & Humanities In-Reply-To: Message-ID: Dear collleagues, One can measure the impact of humanities scholars on the social sciences because the non-ISI sources are processed in the Journals Citation Reports. For those of you who read these messages in html I reprint below the cosine-normalized result of "HABERMAS PUBLIC SPHE" as a source item using 2005 data: Best wishes, Loet _____ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR) Kloveniersburgwal 48, 1012 CX Amsterdam Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$ 18.95 The Self-Organization of the Knowledge-Based Society; The Challenge of Scientometrics _____ From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Howard White Sent: Tuesday, September 18, 2007 6:55 PM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] Bibliometrics for Arts & Humanities I was surprised and delighted to see Stephen Bensman's posting because in August I delivered a series of talks in Sydney and Brisbane Australia in which I advocated precisely the same measure-library holdings counts in WorldCat and, in their case, Libraries Australia-as a way of assessing the research contributions of academics in the arts and humanities. My talks were related to evaluating academics under the new Research Quality Framework, which is scheduled to begin in Australia in 2008. (RQF resembles the Research Assessment Exercise of the UK.) 
Australian academics in the arts, humanities, and softer social sciences are under a double whammy as far as the Thomson Scientific databases are concerned: not only are the journals of their fields much less well covered, the journals of their nation are also much less well covered. Linda Butler of Australian National University has been working on RQF-related measures involving citations to books (as opposed to journal articles) in the Thomson databases, but even these miss much of the whole picture, as she herself points out. I am currently teaming with some researchers from the Bibliometric and Informetric Research Group at the University of New South Wales (Connie Wilson, Mari Davis, and others) to work on the proposed holdings-count measure for "book-oriented" researchers Down Under. (It was in indirect connection with this project that I posted the notice about Anne-Wil Harzing's Publish or Perish software last week.) The larger bibliometrics community has pretty much ignored holdings counts and OCLC. In 1995 I published a whole book about them, Brief Tests of Collection Strength, and I have a long article about them coming out in College & Research Libraries in the first half of 2008. These works focus on their uses in library collection evaluation, but it has long been evident to me that they can also be used to evaluate authors- after all, OCLC's list of the top 1000 items by holdings counts reads like a Who's Who of authors in the Western intellectual tradition. So does a list of the most widely held items in LibraryThing, the Web tool for cataloging personal collections, although there the top authors are virtually all novelists, poets, and playwrights. In any case it is high time bibliometricians started paying attention to holdings-count data as a complement to citation data. Howard D. White, Professor Emeritus College of Information Science and Technology Drexel University Philadelphia, PA 19104 On Sep 18, 2007, at 9:28 AM, Stephen J Bensman wrote: In general, the humanities have not been found amenable to bibliometric analysis. Not only has ISI-Thomson Scientific not developed a JCR for the AHCI despite an initial intent to do so, but publication and citation counts have not been utilized for the humanities by agencies like the American Council on Education and the National Research Council charged with evaluating the quality of US research-doctorate programs. These agencies have relied upon measures such as peer ratings and number of faculty awards. In general, humanities frequency distributions do not have the same highly skewed character as those in the sciences and social sciences, indicating the causal factors of variance are less strong. If you are looking for a quantitative humanities measure, I would suggest using the number of libraries holding a given item that is easily available in OCLC WorldCat. It is a substitute measure for subjective judgments of librarians and faculty on the importance of a given bibliographic item. I have advised humanities faculty to use this measure for journals, and they have told me that it matches their intuitive sense of the importance of journals, and it can be used to judge the importance of monographs--more important in the humanities. You can judge the importance of books written by persons in the humanities, and it can be used to rate the faculty of given programs. Another such measure would the number of times books are reviewed in journals widely held by libraries. 
One problem with WorldCat library counts is that they are dominated US holdings, but then so are Thomson Scientific publication and citation counts as well as everything else in this world. I hope that you find this of some help. Respectfully, Stephen J. Benman LSU Libraries Louisiana State University Baton Rouge, LA USA _____ From: ASIS&T Special Interest Group on Metrics on behalf of Chiner Arias, Alejandro Sent: Tue 9/18/2007 5:51 AM To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities Having failed to find a bibliometric tool for Arts & Humanities, I am asking to this list in the hope somebody here will know something similar to Journal Citation Reports. I am aware of the Arts & Humanities Citation Index as a bibliographic database, but the JCR only use Science and Social Sciences data from the respective Thompson ISI bibliographic databases. My second question is about ranking of cited academics. Again the ISI Higly Cited database applies only to Science and some of the Social Sciences. Is there something similar for the Humanities? http://isihighlycited.com/ Many thanks for any leads. I am aware of the software below thanks to that posting. Alec ___________________________________ Alejandro Chiner, Service Innovation Officer, University of Warwick Library Research & Innovation Unit, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: +(44/0) 24 765 23251, Fax: +(44/0) 24 765 24211, a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu ___________________________________ -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White Sent: 12 September 2007 21:26 To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] New Version of Publish or Perish Dear Members, Anne-Wil Harzing of University of Melbourne has asked me to announce on this list that Version 2.3 of her Publish or Perish software has been released. As many of you know, PoP is an interface to Google Scholar that radically simplifies the gathering of citation data from the Web. For author analysis it provides: * Total number of papers * Total number of citations * Average number of citations per paper * Average number of citations per author * Average number of papers per author * Average number of citations per year * Hirsch's h-index and related parameters * Egghe's g-index * The contemporary h-index * The age-weighted citation rate * Two variations of individual h-indices * An analysis of the number of authors per paper. It also has modules for analyzing contributors to a journal and contributors to a subject literature as defined by the user. Several papers discussing its features are downloadable as well. For details, go to: http://www.harzing.com/resources.htm#/pop.htm Howard D. White -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Outlook.jpg Type: image/jpeg Size: 74402 bytes Desc: not available URL: From dgoodman at PRINCETON.EDU Wed Sep 19 03:11:59 2007 From: dgoodman at PRINCETON.EDU (David Goodman) Date: Wed, 19 Sep 2007 03:11:59 -0400 Subject: Bibliometrics for Arts & Humanities In-Reply-To: <4928689828488E458AECE7AFDCB52CFE019AB8@email003.lsu.edu> Message-ID: Another rough method is the use of Google Scholar. It includes citations to and from books as well as journals--but what it includes is erratic and unpredictable. 
The book citations are from the material in Google Books--including not just the out-of-copyright material but what later material is available, including current material for the publishers that participate. The journal material is whatever is available on the part of the web they scan. There is also miscellaneous material from various academic web sites. If they were complete--or even if they said what they cover--the information would be more usable. In its present state, I would be loth to make more than the roughest comparisons--certainly not if someone's career was dependent on it, or the fate of an academic department. But nonetheless it can prove informative if the items found are carefully examined and sorted by someone knowledgeable in the subject literature, and not just counted. There is one other more traditional measure--to examine whether there are book reviews of books in the standard bibliographic tools, such as Book Review Index and Book Review Digest. Within a subject, and for the same type of book, the number of reviews has some connection with the importance. I wouldn't want to say anything more exact than that, but at least the coverage of these bibliographic databases is known and predictable. David Goodman, Ph.D., M.L.S. previously: Bibliographer and Research Librarian Princeton University Library dgoodman at princeton.edu ----- Original Message ----- From: Stephen J Bensman Date: Tuesday, September 18, 2007 9:41 am Subject: Re: [SIGMETRICS] Bibliometrics for Arts & Humanities To: SIGMETRICS at LISTSERV.UTK.EDU > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > In general, the humanities have not been found amenable to > bibliometric analysis. Not only has ISI-Thomson Scientific not > developed a JCR for the AHCI despite an initial intent to do so, > but publication and citation counts have not been utilized for the > humanities by agencies like the American Council on Education and > the National Research Council charged with evaluating the quality > of US research-doctorate programs. These agencies have relied upon > measures such as peer ratings and number of faculty awards. In > general, humanities frequency distributions do not have the same > highly skewed character as those in the sciences and social > sciences, indicating the causal factors of variance are less strong. > > If you are looking for a quantitative humanities measure, I would > suggest using the number of libraries holding a given item that is > easily available in OCLC WorldCat. It is a substitute measure for > subjective judgments of librarians and faculty on the importance of > a given bibliographic item. I have advised humanities faculty to > use this measure for journals, and they have told me that it > matches their intuitive sense of the importance of journals, and it > can be used to judge the importance of monographs--more important > in the humanities. You can judge the importance of books written > by persons in the humanities, and it can be used to rate the > faculty of given programs. Another such measure would be the number > of times books are reviewed in journals widely held by libraries. > One problem with WorldCat library counts is that they are dominated > by US holdings, but then so are Thomson Scientific publication and > citation counts as well as everything else in this world. > > I hope that you find this of some help. > > Respectfully, > Stephen J.
Benman > LSU Libraries > Louisiana State University > Baton Rouge, LA > USA > > ________________________________ > > From: ASIS&T Special Interest Group on Metrics on behalf of Chiner > Arias, Alejandro > Sent: Tue 9/18/2007 5:51 AM > To: SIGMETRICS at listserv.utk.edu > Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities > > > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Having failed to find a bibliometric tool for Arts & Humanities, I am > asking to this list in the hope somebody here will know something > similar to Journal Citation Reports. > > I am aware of the Arts & Humanities Citation Index as a bibliographic > database, but the JCR only use Science and Social Sciences data > from the > respective Thompson ISI bibliographic databases. > > My second question is about ranking of cited academics. Again the ISI > Higly Cited database applies only to Science and some of the Social > Sciences. Is there something similar for the Humanities? > http://isihighlycited.com/ > > Many thanks for any leads. I am aware of the software below thanks to > that posting. > > Alec > ___________________________________ > Alejandro Chiner, Service Innovation Officer, > University of Warwick Library Research & Innovation Unit, > Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: +(44/0) 24 > 76523251, Fax: +(44/0) 24 765 24211, > a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu > ___________________________________ > > -----Original Message----- > From: ASIS&T Special Interest Group on Metrics > [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White > Sent: 12 September 2007 21:26 > To: SIGMETRICS at listserv.utk.edu > Subject: [SIGMETRICS] New Version of Publish or Perish > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > Dear Members, > > Anne-Wil Harzing of University of Melbourne has asked me to > announce on this list that Version 2.3 of her Publish or Perish > software has been released. As many of you know, PoP is an > interface to Google Scholar that radically simplifies the gathering > of citation data from the Web. For author analysis it provides: > > > > * Total number of papers > * Total number of citations > * Average number of citations per paper > * Average number of citations per author > * Average number of papers per author > * Average number of citations per year > * Hirsch's h-index and related parameters > * Egghe's g-index > * The contemporary h-index > * The age-weighted citation rate > * Two variations of individual h-indices > * An analysis of the number of authors per paper. > > It also has modules for analyzing contributors to a journal and > contributors to a subject literature as defined by the user. > > Several papers discussing its features are downloadable as well. > For details, go to: > > http://www.harzing.com/resources.htm#/pop.htm > > Howard D. White > > > From notsjb at LSU.EDU Wed Sep 19 09:58:28 2007 From: notsjb at LSU.EDU (Stephen J Bensman) Date: Wed, 19 Sep 2007 08:58:28 -0500 Subject: Bibliometrics for Arts & Humanities In-Reply-To: A Message-ID: Howard, I see once again that great minds tend to think alike. A bibliometric justification for using library holdings as a quality measure is provided by Urquhart's Law. This law forms the operating basis of the British Library Document Supply Centre, and I consider it the most important law of library science per se. 
Simply stated, Urquhart's Law posits that the more libraries in a given system that hold a given bibliographic item, the more this given bibliographic item is not only used internally in these libraries (intralibrary use) but the more these libraries borrow this item from each other either through interlibrary loan or document delivery (supralibrary use). Urquhart developed his law with respect to scientific journals, but he held that it should also be applicable to monographs, and in this I think that he was right. Basing myself on Urquhart's Law, I have demonstrated that library holdings are significantly correlated not only to supralibrary use but also to total citations, expert ratings, and number of times titles are indexed. If you are interested in Urquhart's Law, then go to the Web site below and read the attachment. These are my major writings on Urquhart's Law. http://www.garfield.library.upenn.edu/bensman/bensman.html As an aside, the same sociological role of publishing elite writings, which is played by the associations in science and social science journals, appears to be played in monographs by the presses of the elite universities such as Chicago, Harvard, MIT, California, Oxford, Cambridge, etc. I think that if you would check scholarly monographs published by these university presses against those published by commercial presses, you would find the former's library holdings much higher than the latter's. Stephen J. Bensman LSU Libraries Louisiana State University Baton Rouge, LA 70803 USA notsjb at lsu.edu ________________________________ From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at LISTSERV.UTK.EDU] On Behalf Of Howard White Sent: Tuesday, September 18, 2007 11:55 AM To: SIGMETRICS at LISTSERV.UTK.EDU Subject: Re: [SIGMETRICS] Bibliometrics for Arts & Humanities Dear Members, I was surprised and delighted to see Stephen Bensman's posting because in August I delivered a series of talks in Sydney and Brisbane, Australia, in which I advocated precisely the same measure--library holdings counts in WorldCat and, in their case, Libraries Australia--as a way of assessing the research contributions of academics in the arts and humanities. My talks were related to evaluating academics under the new Research Quality Framework, which is scheduled to begin in Australia in 2008. (RQF resembles the Research Assessment Exercise of the UK.) Australian academics in the arts, humanities, and softer social sciences are under a double whammy as far as the Thomson Scientific databases are concerned: not only are the journals of their fields much less well covered, the journals of their nation are also much less well covered. Linda Butler of Australian National University has been working on RQF-related measures involving citations to books (as opposed to journal articles) in the Thomson databases, but even these miss much of the whole picture, as she herself points out. I am currently teaming with some researchers from the Bibliometric and Informetric Research Group at the University of New South Wales (Connie Wilson, Mari Davis, and others) to work on the proposed holdings-count measure for "book-oriented" researchers Down Under. (It was in indirect connection with this project that I posted the notice about Anne-Wil Harzing's Publish or Perish software last week.) The larger bibliometrics community has pretty much ignored holdings counts and OCLC.
In 1995 I published a whole book about them, Brief Tests of Collection Strength, and I have a long article about them coming out in College & Research Libraries in the first half of 2008. These works focus on their uses in library collection evaluation, but it has long been evident to me that they can also be used to evaluate authors- after all, OCLC's list of the top 1000 items by holdings counts reads like a Who's Who of authors in the Western intellectual tradition. So does a list of the most widely held items in LibraryThing, the Web tool for cataloging personal collections, although there the top authors are virtually all novelists, poets, and playwrights. In any case it is high time bibliometricians started paying attention to holdings-count data as a complement to citation data. Howard D. White, Professor Emeritus College of Information Science and Technology Drexel University Philadelphia, PA 19104 On Sep 18, 2007, at 9:28 AM, Stephen J Bensman wrote: In general, the humanities have not been found amenable to bibliometric analysis. Not only has ISI-Thomson Scientific not developed a JCR for the AHCI despite an initial intent to do so, but publication and citation counts have not been utilized for the humanities by agencies like the American Council on Education and the National Research Council charged with evaluating the quality of US research-doctorate programs. These agencies have relied upon measures such as peer ratings and number of faculty awards. In general, humanities frequency distributions do not have the same highly skewed character as those in the sciences and social sciences, indicating the causal factors of variance are less strong. If you are looking for a quantitative humanities measure, I would suggest using the number of libraries holding a given item that is easily available in OCLC WorldCat. It is a substitute measure for subjective judgments of librarians and faculty on the importance of a given bibliographic item. I have advised humanities faculty to use this measure for journals, and they have told me that it matches their intuitive sense of the importance of journals, and it can be used to judge the importance of monographs--more important in the humanities. You can judge the importance of books written by persons in the humanities, and it can be used to rate the faculty of given programs. Another such measure would the number of times books are reviewed in journals widely held by libraries. One problem with WorldCat library counts is that they are dominated US holdings, but then so are Thomson Scientific publication and citation counts as well as everything else in this world. I hope that you find this of some help. Respectfully, Stephen J. Benman LSU Libraries Louisiana State University Baton Rouge, LA USA ________________________________ From: ASIS&T Special Interest Group on Metrics on behalf of Chiner Arias, Alejandro Sent: Tue 9/18/2007 5:51 AM To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities Having failed to find a bibliometric tool for Arts & Humanities, I am asking to this list in the hope somebody here will know something similar to Journal Citation Reports. I am aware of the Arts & Humanities Citation Index as a bibliographic database, but the JCR only use Science and Social Sciences data from the respective Thompson ISI bibliographic databases. My second question is about ranking of cited academics. Again the ISI Higly Cited database applies only to Science and some of the Social Sciences. 
Is there something similar for the Humanities? http://isihighlycited.com/ Many thanks for any leads. I am aware of the software below thanks to that posting. Alec ___________________________________ Alejandro Chiner, Service Innovation Officer, University of Warwick Library Research & Innovation Unit, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: +(44/0) 24 765 23251, Fax: +(44/0) 24 765 24211, a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu ___________________________________ -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White Sent: 12 September 2007 21:26 To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] New Version of Publish or Perish Dear Members, Anne-Wil Harzing of University of Melbourne has asked me to announce on this list that Version 2.3 of her Publish or Perish software has been released. As many of you know, PoP is an interface to Google Scholar that radically simplifies the gathering of citation data from the Web. For author analysis it provides: * Total number of papers * Total number of citations * Average number of citations per paper * Average number of citations per author * Average number of papers per author * Average number of citations per year * Hirsch's h-index and related parameters * Egghe's g-index * The contemporary h-index * The age-weighted citation rate * Two variations of individual h-indices * An analysis of the number of authors per paper. It also has modules for analyzing contributors to a journal and contributors to a subject literature as defined by the user. Several papers discussing its features are downloadable as well. For details, go to: http://www.harzing.com/resources.htm#/pop.htm Howard D. White -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: UrqILDSPt1.pdf Type: application/octet-stream Size: 167664 bytes Desc: UrqILDSPt1.pdf URL: From A.Chiner-Arias at WARWICK.AC.UK Thu Sep 20 07:16:16 2007 From: A.Chiner-Arias at WARWICK.AC.UK (Chiner Arias, Alejandro) Date: Thu, 20 Sep 2007 12:16:16 +0100 Subject: Bibliometrics for Arts & Humanities In-Reply-To: A<4928689828488E458AECE7AFDCB52CFE0CDE4A@email003.lsu.edu> Message-ID: Thank you very much for all the helpful replies being sent. I wonder why no-one so far has mentioned Scopus.com. I had a quick play with Scopus yesterday and it retrieved 93,934 "Arts & Humanities" references, each with their "Cited by" count. When I included the "multidisciplinary" references the number of articles retrieved was 394,058..! Scopus now has a Citation Tracker similar to the Citation Reports facility on ISI Web of Knowledge. WoK is of course the search interface for the Arts & Humanities Citation Index by Thompson ISI. It is hard to compare the usefulness of the two products as bibliometric tools. Scopus has a relatively limited coverage of Arts & Humanities, but there is always going to be a limit to the coverage of any tool. May I request from the list some opinions on how these two products compare, Scopus and WoK when trying to stretch them for bibliometrics? Alec ___________________________________ Alejandro Chiner, Service Innovation Officer, University of Warwick Library Research & Innovation Unit, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. 
Tel: +(44/0) 24 765 23251, Fax: +(44/0) 24 765 24211, a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu ___________________________________ -----Original Message----- From: ASIS&T Special Interest Group on Metrics on behalf of Chiner Arias, Alejandro Sent: Tue 9/18/2007 5:51 AM To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities Having failed to find a bibliometric tool for Arts & Humanities, I am asking to this list in the hope somebody here will know something similar to Journal Citation Reports. I am aware of the Arts & Humanities Citation Index as a bibliographic database, but the JCR only use Science and Social Sciences data from the respective Thompson ISI bibliographic databases. My second question is about ranking of cited academics. Again the ISI Higly Cited database applies only to Science and some of the Social Sciences. Is there something similar for the Humanities? http://isihighlycited.com/ Many thanks for any leads. I am aware of the software below thanks to that posting. Alec ___________________________________ Alejandro Chiner, Service Innovation Officer, University of Warwick Library Research & Innovation Unit, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: +(44/0) 24 765 23251, Fax: +(44/0) 24 765 24211, a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu ___________________________________ -----Original Message----- From: ASIS&T Special Interest Group on Metrics [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White Sent: 12 September 2007 21:26 To: SIGMETRICS at listserv.utk.edu Subject: [SIGMETRICS] New Version of Publish or Perish Dear Members, Anne-Wil Harzing of University of Melbourne has asked me to announce on this list that Version 2.3 of her Publish or Perish software has been released. As many of you know, PoP is an interface to Google Scholar that radically simplifies the gathering of citation data from the Web. For author analysis it provides: * Total number of papers * Total number of citations * Average number of citations per paper * Average number of citations per author * Average number of papers per author * Average number of citations per year * Hirsch's h-index and related parameters * Egghe's g-index * The contemporary h-index * The age-weighted citation rate * Two variations of individual h-indices * An analysis of the number of authors per paper. It also has modules for analyzing contributors to a journal and contributors to a subject literature as defined by the user. Several papers discussing its features are downloadable as well. For details, go to: http://www.harzing.com/resources.htm#/pop.htm Howard D. White From meho at INDIANA.EDU Thu Sep 20 12:00:29 2007 From: meho at INDIANA.EDU (Lokman I. Meho) Date: Thu, 20 Sep 2007 12:00:29 -0400 Subject: Bibliometrics for Arts & Humanities In-Reply-To: <6C724FA6DC66CF4C971F385DBDE887227BDA8A@ELDER.ads.warwick.ac.uk> Message-ID: Alec, The following article will be published in the November 2007 issue of JASIST. Meho, L. I., & Yang, K. (in press). "Impact of Data Sources on Citation Counts and Rankings of LIS Faculty: Web of Science vs. Scopus and Google Scholar." Journal of the American Society for Information Science and Technology. Lokman Lokman I. Meho, Ph.D. 
Assistant Professor School of Library and Information Science Indiana University 1320 East 10th Street, LI 011 Bloomington, IN 47405-3907 Tel: (812) 856-2323 Fax: (812) 855-6166 E-mail: meho at indiana.edu http://www.slis.indiana.edu/faculty/meho/ Quoting "Chiner Arias, Alejandro" : > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Thank you very much for all the helpful replies being sent. I wonder > why no-one so far has mentioned Scopus.com. > > I had a quick play with Scopus yesterday and it retrieved 93,934 "Arts & > Humanities" references, each with their "Cited by" count. When I > included the "multidisciplinary" references the number of articles > retrieved was 394,058..! > > Scopus now has a Citation Tracker similar to the Citation Reports > facility on ISI Web of Knowledge. WoK is of course the search interface > for the Arts & Humanities Citation Index by Thompson ISI. > > It is hard to compare the usefulness of the two products as bibliometric > tools. Scopus has a relatively limited coverage of Arts & Humanities, > but there is always going to be a limit to the coverage of any tool. > > May I request from the list some opinions on how these two products > compare, Scopus and WoK when trying to stretch them for bibliometrics? > > Alec > ___________________________________ > Alejandro Chiner, Service Innovation Officer, > University of Warwick Library Research & Innovation Unit, > Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. Tel: +(44/0) 24 765 > 23251, Fax: +(44/0) 24 765 24211, > a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu > ___________________________________ > > -----Original Message----- > From: ASIS&T Special Interest Group on Metrics on behalf of Chiner > Arias, Alejandro > Sent: Tue 9/18/2007 5:51 AM > To: SIGMETRICS at listserv.utk.edu > Subject: [SIGMETRICS] Bibliometrics for Arts & Humanities > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Having failed to find a bibliometric tool for Arts & Humanities, I am > asking to this list in the hope somebody here will know something > similar to Journal Citation Reports. > > I am aware of the Arts & Humanities Citation Index as a bibliographic > database, but the JCR only use Science and Social Sciences data from the > respective Thompson ISI bibliographic databases. > > My second question is about ranking of cited academics. Again the ISI > Higly Cited database applies only to Science and some of the Social > Sciences. Is there something similar for the Humanities? > http://isihighlycited.com/ > > Many thanks for any leads. I am aware of the software below thanks to > that posting. > > Alec > ___________________________________ > Alejandro Chiner, Service Innovation Officer, > University of Warwick Library Research & Innovation Unit, > Gibbet Hill Road, Coventry CV4 7AL, United Kingdom. 
Tel: +(44/0) 24 765 > 23251, Fax: +(44/0) 24 765 24211, > a.chiner-arias at warwick.ac.uk http://www.warwick.ac.uk/go/riu > ___________________________________ > > -----Original Message----- > From: ASIS&T Special Interest Group on Metrics > [mailto:SIGMETRICS at listserv.utk.edu] On Behalf Of Howard White > Sent: 12 September 2007 21:26 > To: SIGMETRICS at listserv.utk.edu > Subject: [SIGMETRICS] New Version of Publish or Perish > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > Dear Members, > > Anne-Wil Harzing of University of Melbourne has asked me to > announce on this list that Version 2.3 of her Publish or Perish > software has been released. As many of you know, PoP is an > interface to Google Scholar that radically simplifies the gathering > of citation data from the Web. For author analysis it provides: > > > > * Total number of papers > * Total number of citations > * Average number of citations per paper > * Average number of citations per author > * Average number of papers per author > * Average number of citations per year > * Hirsch's h-index and related parameters > * Egghe's g-index > * The contemporary h-index > * The age-weighted citation rate > * Two variations of individual h-indices > * An analysis of the number of authors per paper. > > It also has modules for analyzing contributors to a journal and > contributors to a subject literature as defined by the user. > > Several papers discussing its features are downloadable as well. > For details, go to: > > http://www.harzing.com/resources.htm#/pop.htm > > Howard D. White > > > From notsjb at LSU.EDU Fri Sep 21 10:32:24 2007 From: notsjb at LSU.EDU (=?windows-1252?Q?Stephen_J._Bensman?=) Date: Fri, 21 Sep 2007 10:32:24 -0400 Subject: Bibliometrics for Arts & Humanities Message-ID: Alec, Before you get involved with using tools like Scopus, etc., to evaluate the humanities, I think that you have to investigate the basic question of why agencies like the US National Research Council rejected using publication/citation counts, etc., to evaluate research-doctorate programs in the humanities. You may start by looking at the following Web site: http://www7.nationalacademies.org/resdoc/index.html I would also advise you to investigate why the Institute for Scientific Information did not develop a JCR for the Arts & Humanities Citation Index. Perhaps you should go to the Web site below to read what Eugene Garfield wrote about citations in the humanities: http://www.garfield.library.upenn.edu/ There appear to be major problems with using such counts for the humanities. One problem that I can think of is that the humanities are spread over time, geography, and culture to a much greater extent than the sciences and social sciences. Therefore, humanists tend to work in temporal/cultural pockets isolated from each other, making comparisons much more difficult due to the differing variables affecting these pockets. Stephen J. Bensman LSU Libraries Lousiana State University Baton Rouge, LA 70803 USA From pmd8 at CORNELL.EDU Mon Sep 24 11:02:13 2007 From: pmd8 at CORNELL.EDU (Phil Davis) Date: Mon, 24 Sep 2007 11:02:13 -0400 Subject: Downloads as predictors of future citations Message-ID: Several articles have been published that look at article "hits" as a predictor of future citations. Some researchers use fulltext downloads (HTML) as their predictor, some combine HTML and PDF. Does anyone know of a paper that discriminates between the type of view and future citations? 
In other words, which type of view is the best predictor of future citations? My sense is that it is PDF, but I'm open to being convinced. Thanks, Phil -- Philip M. Davis PhD Student Department of Communication 336 Kennedy Hall Cornell University, Ithaca, NY 14853 email: pmd8 at cornell.edu https://confluence.cornell.edu/display/~pmd8/resume
From dgoodman at PRINCETON.EDU Thu Sep 27 02:49:14 2007 From: dgoodman at PRINCETON.EDU (David Goodman) Date: Thu, 27 Sep 2007 02:49:14 -0400 Subject: Vidyanidhi Digital Library In-Reply-To: <4928689828488E458AECE7AFDCB52CFE0CDE4A@email003.lsu.edu> Message-ID: This is a very interesting project to provide an electronic library of Indian doctoral theses: Unfortunately, their web site no longer seems to be working--does anyone know if the project is still live? David Goodman, Ph.D., M.L.S. previously: Bibliographer and Research Librarian Princeton University Library dgoodman at princeton.edu
From ricardo.arencibia at CNIC.EDU.CU Fri Sep 28 10:48:49 2007 From: ricardo.arencibia at CNIC.EDU.CU (Lic. Ricardo Arencibia Jorge) Date: Fri, 28 Sep 2007 10:48:49 -0400 Subject: Successive H indices: a new proposal based on H index Message-ID: Dear colleagues, This is an application of a proposal by Schubert based on the widely discussed H index (András Schubert, Scientometrics 2007;70(1):201-205). I think it can be a good option for evaluating institutions, taking into account the performance of the research staff. Title: Applying successive H indices in the institutional evaluation: a case study (Brief Communication) Authors: Ricardo Arencibia-Jorge*, Ismaray Barrios-Almaguer, Sandra Fernández-Hernández, Rachel Carvajal-Espino. * Network of Scientometric Studies for Higher Education, National Scientific Research Center, Havana, Cuba. Abstract: The present work shows the application of successive H indices in the evaluation of a scientific institution, using the researcher-department-institution hierarchy as the level of aggregation. The scientific production of the research staff of the Cuban National Scientific Research Center covered by the Web of Science during the period 2001-2005 was studied. The Hirsch index (h-index; J.E. Hirsch, 2005) was employed to calculate the individual performance of the staff, using the g-index created by Leo Egghe (2006) and the A-index developed by Jin Bi-Hui (2006) as complementary indicators. The successive H indices proposed by András Schubert (2007) were used to determine the scientific performance of each department as well as the general performance of the institution. The possible advantages of the method for institutional evaluation processes are discussed. Index Terms: author productivity, impact factor, scientometrics, scientists, H index. http://www3.interscience.wiley.com/cgi-bin/abstract/116324237/ABSTRACT Best Regards, BSc. Ricardo Arencibia-Jorge Network of Scientometric Studies for Higher Education National Scientific Research Center, Havana, Cuba. ricardo.arencibia at cnic.edu.cu http://www.directorioexit.info/consulta.php?campo=ID&directorio=exit&texto=478 __________________________________________ Participate in Universidad 2008, 11-15 February 2008, Palacio de las Convenciones, Ciudad de la Habana, Cuba http://www.universidad2008.cu/
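For readers who want to experiment with the indicators discussed in this thread (Hirsch's h-index, Egghe's g-index, and Schubert's successive H indices), here is a minimal sketch of how they are computed. It is only an illustration and not part of any posting above: the department names and citation counts are invented, and the functions follow the commonly cited definitions (h is the largest number of papers with at least that many citations each; g is the largest number of top papers that together have at least g squared citations; a successive H index applies the h rule to the indices of the units one level down, researchers within a department and departments within an institution).

def h_index(citations):
    # Largest h such that h papers have at least h citations each.
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    # Egghe's g: largest g such that the top g papers together have >= g*g citations.
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def successive_h(indices):
    # Schubert-style successive rule: largest h such that at least h of the
    # lower-level units have an index of at least h (formally the same
    # computation as the h-index, applied to indices instead of citations).
    return h_index(indices)

# Invented example: an institution with two departments, each with a few
# researchers whose per-paper citation counts are listed.
institution = {
    "Dept A": {"r1": [10, 8, 5, 4, 3], "r2": [6, 6, 2, 1], "r3": [15, 3, 3, 0]},
    "Dept B": {"r4": [2, 2, 1], "r5": [9, 7, 4, 4, 1]},
}

department_indices = []
for dept, researchers in institution.items():
    individual_h = [h_index(c) for c in researchers.values()]
    dept_h = successive_h(individual_h)
    department_indices.append(dept_h)
    print(dept, "individual h-indices:", individual_h, "-> department H:", dept_h)

print("g-index of r1 (for comparison):", g_index(institution["Dept A"]["r1"]))
print("Institutional H from the department indices:", successive_h(department_indices))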