From harnad at ECS.SOTON.AC.UK Fri Nov 3 07:12:32 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Fri, 3 Nov 2006 12:12:32 +0000 Subject: University of Southampton and MIT launch WWW research collaboration (fwd) Message-ID: ** Apologies for Cross-Posting ** ---------- Forwarded message ---------- Date: Thu, 2 Nov 2006 15:40:02 -0000 From: Joyce Lewis To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM at LISTSERVER.SIGMAXI.ORG Subject: University of Southampton and MIT launch WWW research collaboration University of Southampton and MIT launch World Wide Web research collaboration Joint initiative will analyse and shape Web's evolution CAMBRIDGE, Mass., Nov. 2-The University of Southampton and the Massachusetts Institute of Technology today announced the launch of a long-term research collaboration that aims to produce the fundamental scientific advances necessary to guide the future design and use of the World Wide Web. The Web Science Research Initiative (WSRI) will generate a research agenda for understanding the scientific, technical and social challenges underlying the growth of the Web. Of particular interest is the volume of information on the Web that documents more and more aspects of human activity and knowledge. WSRI research projects will weigh such questions as, how do we access information and assess its reliability? By what means may we assure its use complies with social and legal rules? How will we preserve the Web over time? Commenting on the new initiative, Tim Berners-Lee, inventor of the World Wide Web and a founding director of WSRI, said, "As the Web celebrates its first decade of widespread use, we still know surprisingly little about how it evolved, and we have only scratched the surface of what could be realized with deeper scientific investigation into its design, operation and impact on society. "The Web Science Research Initiative will allow researchers to take the Web seriously as an object of scientific inquiry, with the goal of helping to foster the Web's growth and fulfill its great potential as a powerful tool for humanity." The joint MIT-Southampton initiative will provide a global forum for scientists and scholars to collaborate on the first multidisciplinary scientific research effort specifically designed to study the Web at all scales of size and complexity, and to develop a new discipline of Web science for future generations of researchers. Professor Wendy Hall, head of school at Southampton University School of Electronics and Computer Science and also, along with Professor Nigel Shadbolt of ECS, a founding director of WSRI, said: "As the Web continues to evolve, it is becoming increasingly clear that a new type of graduate will be required to meet the needs of science and industry. Already we are seeing evidence of this, with major Internet companies and research institutions lamenting the fact that there are simply not enough people with the right mix of skills to meet current and future employment demands. In launching WSRI, one of our ultimate aims is to address this issue." WSRI will be headquartered at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT and at the School of Electronics and Computer Science (ECS) at the University of Southampton. Initial plans call for joint research projects, workshops and student/faculty exchanges between the two institutions. 
The initiative will have four founding directors: Tim Berners-Lee, director of the World Wide Web Consortium, senior research scientist at MIT and professor at the University of Southampton; Wendy Hall, professor of computer science and head of the School of Electronics and Computer Science at the University of Southampton; Nigel Shadbolt, professor of artificial intelligence at the University of Southampton and director of the Advanced Knowledge Technologies Interdisciplinary Research Collaboration; and Daniel J. Weitzner, Technology and Society Domain leader of the World Wide Web Consortium and principal research scientist at MIT. About MIT The Massachusetts Institute of Technology is dedicated to advancing knowledge and educating students in science, technology, and other areas of scholarship that will best serve the nation and the world in the 21st century. The Institute has more than 900 faculty and 10,000 undergraduate and graduate students. It is organized into five Schools -- Architecture and Urban Planning; Engineering; Humanities, Arts, and Social Sciences; Sloan School of Management; and Science. Current areas of research and education include neuroscience and the study of the brain and mind, bioengineering, the environment and sustainable development, information sciences and technology, new media, financial technology, and entrepreneurship. About the University of Southampton The University of Southampton is a leading UK teaching and research institution with a global reputation for leading-edge research and scholarship. It is one of the UK's top 10 research universities, offering first-rate opportunities and facilities for study and research across a wide range of subjects in humanities, health, science and engineering. The University has around 20,000 students and over 5000 staff. Its annual turnover is in the region of £310 million. With around 500 researchers, and 900 undergraduate students, the School of Electronics and Computer Science at Southampton is one of the world's largest and most successful integrated research groupings, covering Computer Science, Software Engineering, Electronics, and Electrical Engineering. ECS has unrivalled depth and breadth of expertise in world-leading research, new developments and their applications. _________________________________________________ Joyce Lewis Marketing and Communications Manager School of Electronics and Computer Science University of Southampton Southampton SO17 1BJ, UK T +44(0)23 8059 5453 E j.k.lewis at ecs.soton.ac.uk Steele, Colin and Butler, Linda and Kingsley, Danny (2006) The Publishing Imperative: the pervasive influence of publication metrics. Learned Publishing 19(4):277-290. http://eprints.anu.edu.au/archive/00003523/ ABSTRACT: This article summarises the effects of the increasing global trend towards measuring research quality and effectiveness through, in particular, publication-based metrics, and its effects on scholarly communication. Such metrics are increasingly influencing the behaviour patterns of administrators, publishers, librarians and researchers. Impact and citation measures, which often rely solely on Thomson Scientific data, are examined in the context of university league tables and research assessment exercises. The need to establish alternate metrics, particularly for the social sciences and humanities, is emphasised, as is an holistic approach to scholarly communication agendas. 
Pertinent Prior American Scientist Open Access Forum Topic Threads: UK "RAE" Evaluations (began Nov 2000) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1018 Scientometric OAI Search Engines (began Aug 2002) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2238 Australia stirs on metrics (June 2006) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5417.html Big Brother and Digitometrics (began May 2001) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1298 UK Research Assessment Exercise (RAE) review (began Oct 2002) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2326 Need for systematic scientometric analyses of open-access data (began Dec 2002) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2522 Potential Metric Abuses (and their Potential Metric Antidotes) (began Jan 2003) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2643 Future UK RAEs to be Metrics-Based (began Mar 2006) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#5251 Let 1000 RAE Metric Flowers Bloom: Avoid Matthew Effect as Self-Fulfilling Prophecy (Jun 2006) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5418.html From harnad at ECS.SOTON.AC.UK Sun Nov 5 08:41:08 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Sun, 5 Nov 2006 13:41:08 +0000 Subject: What Can and Should Be Mandated In-Reply-To: Message-ID: Many thanks to Kimmo Kuusela for the prompt provision of data on Finland's research output, by discipline! On the question of whether the proportion of national research output published in subsidised national journals is closer to 5% or 50%, the answer for Finland overall is closer to 5%; but looked at by discipline, for arts, humanities and social sciences it is closer to 50%. (The overall average is presumably 16% because of the lower relative proportion of articles in the arts, humanities and social sciences.) On the basis of these data, if I were a Finnish researcher, institution or funder, I would hope that (1) all Finnish researchers would be required by their funders and institutions to self-archive their own journal articles and that (2) all subsidised Finnish journals would be required by their subsidisers to make their online editions open access. I don't think trying to combine (1) and (2) into a single mandate would make much sense, since not only would the *requirees* -- researchers in (1), publishers in (2) -- not be the same in the two cases, but it is not even clear that the *requirers* -- research institutions and funders in (1), journal subsidisers in (2) -- would be the same either. Hence it would be best if the two were pursued separately, in parallel. It is also worth noting that (1) would already moot (2), since 100% OA self-archiving would include the OA self-archiving of the subsidised 16% too! But I agree with Jean-Claude Guedon that this is no reason not to pursue the subsidised option (2) in parallel: just don't wrap (2) into (1) (at least not until (1) is adopted!). It would be splendid if we could see data from other countries (along with their discipline data) along the lines Kimmo Kuusela has provided for Finland. (Arthur Sale has already made a stab for Australia, though I'll bet there are a few subsidised journals still lurking in the Aussie outback somewhere, possibly in the arts?) Stevan Harnad On Sun, 5 Nov 2006, Kimmo Kuusela wrote: > wrote: > > >No need to compare Finnish journals to Dutch journals. 
Just > >Finnish research output in subsidised journals to total Finnish > >research output. (If there is a way to estimate relative > >quality, that would be helpful too, as would separate tallies > >by discipline.) > > I have compared Finnish research output in *Finnish* subsidised journals > (that is, practically all Finnish journals) to total Finnish research output > in 2005. > > The results: > > 16 % (2100) of the total number of articles (12839) in refereed journals > were published in Finnish journals. > > By field of education: > > Theology 56 % > The Humanities 49 % > Art and Design 64 % > Music 44 % > Theatre and Dance 43 % > Education 45 % > Sport Sciences 0 % > Social Sciences 47 % > Psychology 16 % > Health Sciences 31 % > Law 77 % > Economics 21 % > Natural sciences 6 % > Agriculture and Forestry 14 % > Engineering 8 % > Medicine 11 % > Dentistry 18 % > Veterinary Medicine 2 % > Pharmacy 13 % > Fine Arts 100 % > Field of education unspecified 15 % > > Here's a handy tool for even more calculations: > > http://kotaplus.csc.fi:7777/online/Etusivu.do?lng=en > > -- Kimmo Kuusela > From Chaomei.Chen at CIS.DREXEL.EDU Sun Nov 5 10:02:43 2006 From: Chaomei.Chen at CIS.DREXEL.EDU (Chaomei Chen) Date: Sun, 5 Nov 2006 10:02:43 -0500 Subject: Chaomei Chen/Drexel_IST is out of the office. Message-ID: I will be out of the office starting Sun 11/05/2006 and will not return until Thu 11/09/2006. I will respond to your message as soon as I can. -------------- next part -------------- An HTML attachment was scrubbed... URL: From isidro at CINDOC.CSIC.ES Mon Nov 6 02:25:21 2006 From: isidro at CINDOC.CSIC.ES (Isidro F. Aguillo) Date: Mon, 6 Nov 2006 08:25:21 +0100 Subject: Reminder: ISSI 2007 Madrid In-Reply-To: Message-ID: Sorry for duplication of messages sent to different lists: The 11th International Conference of the International Society for Scientometrics and Informetrics will be held on June 25-27, 2007 in Madrid (Spain) http://issi2007.cindoc.csic.es/ Full papers, research-in-progress papers and posters are accepted for this conference. The category "research-in-progress papers" is meant to enable participants to present ongoing research, and particularly their most recent work that is not yet fully completed at the time papers are to be submitted. A research in progress paper is shorter than a full paper (see below). The authors should clearly outline the central objective or hypothesis of the research, and present preliminary or intermediary results. If they intend to present their most recent findings (not yet available at the submission date) at the conference, they should clearly indicate their potential significance. Authors are requested to submit their contributions using the ISSI 2007 electronic submission form: http://www.softconf.com/start/ISSI2007/ Valid document formats are: Microsoft Word (doc) and Rich Text Format (rtf). No Adobe Acrobat (pdf), please! * Full papers - max. 4500 words. * Research-in-progress papers - max. 2000 words * Poster presentations - abstract of max. 2 pages. Important dates: * Full paper and research-in-progress paper submission, deadline: 30 November 2006 * Notification of acceptance of paper submissions: 31 January 2007 * Poster submission, deadline: 1 February 2007 * Doctoral Forum application, deadline: 1 February 2007 * Notification of acceptance of posters: 28 February 2007 * Camera ready papers due in MS Word/RTF format: 28 February 2007 -- *************************************** Isidro F. 
Aguillo isidro at cindoc.csic.es Ph:(+34) 91-5635482 ext. 313 Cybermetrics Research Group CINDOC-CSIC Joaquin Costa, 22 28002 Madrid. SPAIN http://www.webometrics.info http://www.cindoc.csic.es/cybermetrics http://internetlab.cindoc.csic.es **************************************** From jbollen at LANL.GOV Mon Nov 6 16:26:18 2006 From: jbollen at LANL.GOV (Johan Bollen) Date: Mon, 6 Nov 2006 14:26:18 -0700 Subject: MESUR project at LANL Message-ID: -- My apologies for multiple postings -- Andrew W. Mellon Foundation funds two-year LANL project for the development of metrics derived from scholarly usage data. Los Alamos, New Mexico, November 6th 2006 - The Andrew W. Mellon Foundation has awarded funding to Los Alamos National Laboratory (LANL) in support of the two-year MESUR project that will investigate metrics derived from the network-based usage of scholarly information. The Digital Library Research & Prototyping Team of the LANL Research Library will carry out the project. Johan Bollen is the Principal Investigator, Herbert Van de Sompel serves as an architectural consultant, and Aric Hagberg of the LANL Mathematical Modeling and Analysis group serves as modeling consultant. Marko A. Rodriguez, PhD student at the University of California Santa Cruz and LANL Graduate Research Assistant, supports the project's research and development. The project's major objective is enriching the toolkit used for the assessment of the impact of scholarly communication items, and hence of scholars, with metrics that derive from usage data. The project will start with the creation of a semantic model of scholarly communication, and an associated large-scale semantic store that relates a range of scholarly bibliographic, citation and usage data obtained from a variety of sources. Next, an investigation into the definition and validation of usage-based metrics will be conducted on the basis of this comprehensive collection. Finally, the defined metrics will be cross-validated, resulting in the formulation of guidelines and recommendations for future applications of metrics derived from scholarly usage data. Project results will be made public on the project's web site . The MESUR project currently has an open position for a software developer; a job description is available at ---- Johan Bollen Los Alamos National Laboratory (P362-proto) Los Alamos, NM 87545 Ph: 505 606 0030, Fx: 505 665 6452 jbollen at lanl.gov, http://public.lanl.gov/jbollen ---- From harnad at ECS.SOTON.AC.UK Thu Nov 9 14:00:04 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Thu, 9 Nov 2006 19:00:04 +0000 Subject: Proportion Open Access in Biomedical Sciences Message-ID: Comments on: Matsubayashi, Mamiko and Kurata, Keiko and Sakai, Yukiko and Morioka, Tomoko and Kato, Shinya and Mine, Shinji and Ueda, Shuichi (2006) Current Status of Open Access in Biomedical Field - the Comparison of Countries Related to the Impact of National Policies. 2006 Annual Meeting of the American Society for Information Science and Technology, Austin, Texas. http://dlist.sir.arizona.edu/1624/ This study randomly sampled 4756 biomedical articles published between January and September in 2005 and indexed in PubMed, hand-checking how many of them were OA, and if so how: via OA journal (gold) or self-archiving (green, via IRs or websites). Its findings: 75% of the sampled 4756 articles were available online. 25% of the sampled 4756 articles were OA. 
Over 70% of the 1189 (25%) OA articles were OA via OA or Hybrid OA journals. 10.9% of the 1189 (25%) OA articles were OA via IRs or websites (6.0% and 4.9% respectively). 20.6% of the articles in journals with an Impact Factor were OA. 30.8% of the articles in journals with no Impact Factor were OA. Countries were compared, but the variation is more likely in national practices than in national policies. The authors note that their 25% OA estimate in biomedical sciences in 2005 is higher than Hajjem et al's estimate of 15% OA in biology and 6% OA in health (but Hajjem et al's sample was for 1992-2003, based only on articles indexed by Thomson ISI, and explicitly excluded articles published in OA journals, hence the relevant comparison figure is the present study's 10.9% for self-archiving). The authors also note that their estimate of 10.9% self-archiving is lower than Swan's estimate of 49% (but Swan's sample was for all disciplines, and the 49% referred only to the proportion of respondents who had self-archived at least one article). Presumably "articles in journals that had an Impact Factor" means articles in journals indexed by Thomson ISI. If so, then the finding that fewer ISI articles are OA means that fewer ISI journals are OA and/or fewer authors of articles in non-ISI journals self-archive. There is considerable scope for variability here (by year, by field, by quality, and by country), but it is certainly true that fewer ISI journals than non-ISI journals are OA (though "Hybrid OA"/Open-Choice may change that). Several studies -- from Lawrence 2001 to Hajjem et al 2005 -- have reported that there is a positive correlation between citation-bracket and OA (the higher the citations, the more likely the article is OA), and there is disagreement over how much of this effect is a causal Quality Advantage (OA causing higher citations for higher quality articles) or a self-selection Quality Bias (authors of higher quality articles being more likely to make them OA, one way or the other). The present results don't resolve this, as they go both ways. Clearly, more studies are needed. But even more than that, more OA is needed! Stevan Harnad From loet at LEYDESDORFF.NET Fri Nov 10 01:58:45 2006 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Fri, 10 Nov 2006 07:58:45 +0100 Subject: the relative decline of US and EU science Message-ID: Dear colleagues, Since the NSF has held a workshop on November 7 on "the decline of US science," I was asked whether this "decline" can be explained in terms of the increases of China. Indeed, it can! The figure below shows that the US loses 0.23 percentage points/year and the EU-25 0.34%, while China gains 0.71%/year when using linear fits. Consequently, the decline of the (US + EU) does not even fully account for the gains of China; the rest of the world also contributes 0.14%. In other words, it seems that the shifts are approximately pro rata. These figures are based on: Loet Leydesdorff & Caroline Wagner, Is the United States losing ground in science? A global perspective on the world science system in 2005. < pdf-version> With best wishes, Loet _____ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR) Kloveniersburgwal 48, 1012 CX Amsterdam Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated. 
385 pp.; US$ 18.95 The Self-Organization of the Knowledge-Based Society; The Challenge of Scientometrics -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Outlook.jpg Type: image/jpeg Size: 67172 bytes Desc: not available URL: From toshev at CHEM.UNI-SOFIA.BG Sat Nov 11 10:19:54 2006 From: toshev at CHEM.UNI-SOFIA.BG (=?windows-1251?Q?B.V._Toshev?=) Date: Sat, 11 Nov 2006 10:19:54 -0500 Subject: BULGARIAN JOURNAL OF SCIENCE AND EDUCATION POLICY: CALL FOR PAPERS Message-ID: BULGARIAN JOURNAL OF SCIENCE AND EDUCATION POLICY (BJSEP) CALL FOR PAPERS The Bulgarian Journal of Science and Education Policy (BJSEP) is a peer-reviewed scholarly journal of the Council of Rectors (Bulgaria) with an international audience. The journal presents scholarly, rigorously argued analyses of higher educational and scientific institutions. The general problems of other levels of education are not excluded either. BJSEP deals with these institutions and their traditions historically and in their interaction with political, economic, moral and administrative problems. The reform of educational systems and the efficiency of research activities are both in focus. BJSEP presents information, interpretations, and criticism in regard to current developments in the field, in order to improve articulation on research and higher education policy and practice. In addition, BJSEP seeks descriptions and evaluations of current innovations and provocative new ideas with relevance to the journal's scope. We want to emphasize the effect of these innovations and ideas on teaching, students and researchers. Among the journal topics are: Educational Policy and Management; University Policy and Management; Research Policy and Research Efficiency; History and Philosophy of Education; History and Philosophy of Science; People in Science and Education; Science and Society; Book Reviews. Contributors, both faculty and administrators, from all over the world are encouraged to send manuscripts that should be written in a readable and scholarly manner. Manuscripts (in English or in Bulgarian) should not exceed 15 standard pages in length including illustrations, tables, figures and references. Articles must be accompanied by a summary not exceeding 15 lines. Style should conform to that of the Publication Manual of the American Psychological Association, widely used for this type of publication. Manuscripts should be sent to the Editor of BJSEP: Professor B.V. Toshev, University of Sofia, 1 James Bourchier Blvd., 1164 Sofia, BULGARIA E-Mail: toshev at chem.uni-sofia.bg The electronic submission of manuscripts (in Word format) is preferable. The deadline for Issue 1 of BJSEP is 18 December 2006. From yangly at MAIL.LAS.AC.CN Mon Nov 13 00:28:56 2006 From: yangly at MAIL.LAS.AC.CN (=?gb2312?B?eWFuZ2x5?=) Date: Mon, 13 Nov 2006 13:28:56 +0800 Subject: collaboration institutions Message-ID: Dear friends, I am a PHD student at the Chinese National Academy in Beijing. I am trying to find papers that discuss bibliometric studies of collaboration among institutions. There are many papers that discuss collaboration of authors, but I can't find any that talk about institutions. Does anyone know of any papers about that topic? wish to your reply, Thanks, Best regards, LiYing Yang, Beijing. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From christian.schloegl at UNI-GRAZ.AT Mon Nov 13 04:10:28 2006 From: christian.schloegl at UNI-GRAZ.AT (Christian Schloegl) Date: Mon, 13 Nov 2006 10:10:28 +0100 Subject: collaboration institutions In-Reply-To: <45580298.000208.20256@app-03> Message-ID: Dear colleague, maybe the COLLNET research network can be useful for you. Please have a look at http://www.collnet.de/ The main coordinator is Ms. Hildrun Kretschmer. Best regards, Christian > Dear friends, > I am a PHD student at the Chinese National Academy in Beijing. I am > trying to find papers that discuss bibliometric studies of > collaboration among institutions. There are many papers that discuss > collaboration of authors, but I can't find any that talk about > institutions. Does anyone know of any papers about that topic?wish to > your reply,Thanks, > Best regards, > LiYing Yang, Beijing. > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From krichel at OPENLIB.ORG Mon Nov 13 04:28:46 2006 From: krichel at OPENLIB.ORG (Thomas Krichel) Date: Mon, 13 Nov 2006 03:28:46 -0600 Subject: collaboration institutions In-Reply-To: <45583684.7030002@uni-graz.at> Message-ID: Christian Schloegl writes > maybe the COLLNET research network can be useful for you. Please have a > look at http://www.collnet.de/ Yes. There was a paper by Yuan Sun on institutional collaboration at the last Collnet meeting. The proceedings of that meeting are archived at E-LIS http://eprints.rclis.org/view/conftitle/International_Workshop_on_Webometrics,_Informetrics_and_Scientometrics_=_Seventh_COLLNET_Meeting.html RePEc have collected data for both identified authors and identified institutions. Such data could be easily used to look at institutional collaboration in economics. Email me for details. Cheers, Thomas Krichel mailto:krichel at openlib.org http://openlib.org/home/krichel RePEc:per:1965-06-05:thomas_krichel skype id: thomaskrichel From j.s.katz at SUSSEX.AC.UK Mon Nov 13 07:34:47 2006 From: j.s.katz at SUSSEX.AC.UK (Sylvan Katz) Date: Mon, 13 Nov 2006 06:34:47 -0600 Subject: collaboration institutions In-Reply-To: <45580298.000208.20256@app-03> Message-ID: I have published a number of papers on institutional collaboration. Many others have been published in Scientometrics. Here are some of my publications. The references in these papers and reports will give you others. Katz, J. S. and B. R. Martin, 1997: What is Research Collaboration. Research Policy, 26, 1-18. Katz, J. S. and D. Hicks, 1998: Indicators for Systems of Innovation - a bibliometrics-based approach Project No. PL951005 under the Targeted Socio-Economic Research Programme. Katz, J. S., 1994: Geographical Proximity and Research Collaboration. Scientometrics, 1, 31-41. Katz, J. S., 1992: Bibliometric assessment of university-university collaboration, SPRU, University of Sussex, 231. Smith, D. and J. S. Katz, 2000: Collaborative Approaches to Research, HEFCE Fundamental Review of Research Policy and Funding, Final Report. Dr. J. 
Sylvan Katz, Visiting Fellow SPRU, University of Sussex http://www.sussex.ac.uk/Users/sylvank Adjunct Professor Mathematics & Statistics, University of Saskatchewan Associate Researcher Institut national de la recherche scientifique, University of Quebec From kretschmer.h at T-ONLINE.DE Mon Nov 13 12:06:48 2006 From: kretschmer.h at T-ONLINE.DE (kretschmer.h@t-online.de) Date: Mon, 13 Nov 2006 18:06:48 +0100 Subject: collaboration institutions In-Reply-To: <45580298.000208.20256@app-03> Message-ID: An HTML attachment was scrubbed... URL: From andrea.scharnhorst at VKS.KNAW.NL Mon Nov 13 16:12:50 2006 From: andrea.scharnhorst at VKS.KNAW.NL (Andrea Scharnhorst) Date: Mon, 13 Nov 2006 22:12:50 +0100 Subject: collaboration institutions In-Reply-To: <45580298.000208.20256@app-03> Message-ID: For institutional analysis you could also look into webometrics (as the e-mail of Hildrun already indicated). A lot of hyperlink studies are on a institutional level. See the web site of Mike Thelwall http://www.scit.wlv.ac.uk/~cm1993/ and his book Link Analysis, and the web project of the CINDOC group http://www.webometrics.info Warmest regards Andrea Scharnhorst www.virtualknowledgestudio.nl > Dear friends, > I am a PHD student at the Chinese National Academy in Beijing. I am > trying to find papers that discuss bibliometric studies of collaboration > among institutions. There are many papers that discuss collaboration of > authors, but I can't find any that talk about institutions. Does anyone > know of any papers about that topic? wish to your reply, Thanks, > Best regards, > > LiYing Yang, Beijing. > > > From harnad at ECS.SOTON.AC.UK Tue Nov 14 06:06:21 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Tue, 14 Nov 2006 11:06:21 +0000 Subject: Draft report from Australian government recommends OA mandate (fwd) Message-ID: From Peter Suber's Open Access News http://www.earlham.edu/~peters/fos/2006_11_12_fosblogarchive.html#116344467230316048 Draft report from Australian government recommends OA mandate The Australian Government Productivity Commission has released an important study, Public Support for Science and Innovation: Draft Research Report (November 2, 2006). (Thanks to Colin Steele.) http://www.pc.gov.au/study/science/draftreport/index.html Excerpt: Impediments to the functioning of the innovation system [:]....There is scope for the ARC and the NHMRC to play a more active role than they currently do in promoting access to the results of research they fund. They could require as a condition of funding that research papers, data and other information produced as a result of their funding are made publicly available such as in an "open access" repository. The Australian Government has sought to enhance access to the results of publicly funded research through the: - development of an Accessibility Framework for Publicly Funded Research; and - allocation of funding under the Systemic Infrastructure Initiative to build technical information infrastructure that supports the creation, dissemination of and access to knowledge, and the use of digital assets and their management (box 5.10).... In a recent report to DEST, Houghton et al. (2006) estimated net gains from improving access to publicly-funded research across the board and in particular research sectors (table 5.2). 
http://www.dest.gov.au/NR/rdonlyres/0ACB271F-EA7D-4FAF-B3F7-0381F441B175/13935/DEST_Research_Communications_Cost_Report_Sept2006.pdf - The estimated benefits from an assumed 5 per cent increase in access and efficiency and level of social rate of return were between $2 million (ARC competitively-funded research) and $628 million (gross expenditure on R&D). - Assuming a move from this level of improved access and efficiency to a national system of institutional repositories in Australia over twenty years, the estimated benefit/cost ratios were between 3.1 (NHMRC-funded research) to 214 (gross expenditure on R&D).... Of interest, is whether funding agencies themselves could become more actively involved in enhancing access to the results of the research they fund.... In their recent report to DEST, Houghton et al. (2006) http://openaccess.eprints.org/index.php?/archives/120-guid.html made a number of suggestions to improve access to and dissemination of research including: - developing a national system of institutional or enterprise-based repositories to support new modes of enquiry and research; ... - ensuring that the Research Quality Framework supports and encourages the development of new, more open scholarly communication mechanisms, rather than encouraging "a retreat" by researchers to conventional publication forms and media, and a reliance by evaluators upon traditional publication metrics (for example, by ensuring dissemination and impact are an integral part of evaluation); - encouraging funding agencies (for example, ARC and NHMRC) to mandate that the results of their supported research be made available in open access archives and repositories; - encouraging universities and research institutions to support the development of new, more open scholarly communication mechanisms, through, for example, the development of "hard or soft open access" mandates for their supported research; and - providing support for a structured advocacy program to raise awareness and inform all stakeholders about the potential benefits of more open scholarly communication alternatives, and provide leadership in such areas as copyright (for example, by encouraging use of "creative commons" licensing) (pp. xii-xiii).... Several impediments to innovation should be addressed: ... - published papers and data from ARC and NHMRC-funded projects should be freely and publicly available.... --- Comment [from Peter Suber]: It's important that this report was written by a government commission and important that it recommends an OA mandate. From the file of preliminaries: You are invited to examine this draft research study and to provide written submissions to the Commission. Submissions should reach the Commission by Thursday, 21 December 2006. In addition, the Commission intends to hold a limited number of consultations to obtain feedback on this draft. The Commission intends to present its final report to the Government in early March 2007. 
The Productivity Commission gives no address (and worse, no email address) specifically for comments, but it does give this contact info for its Media and Publications division: Locked Bag, 2 Collins Street East Melbourne VIC 8003 Fax: (03) 9653 2303 Email: maps at pc.gov.au [Peter Suber, Open Access News] http://www.earlham.edu/~peters/fos/2006_11_12_fosblogarchive.html#116344467230316048 From juan.gorraiz at UNIVIE.AC.AT Tue Nov 14 10:49:25 2006 From: juan.gorraiz at UNIVIE.AC.AT (Juan Gorraiz) Date: Tue, 14 Nov 2006 16:49:25 +0100 Subject: collaboration institutions In-Reply-To: <45583684.7030002@uni-graz.at> Message-ID: Did you receive the data? Hugs from San Diego, Juan On Mon, 13.11.2006, 10:10, Christian Schloegl wrote: > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Dear colleague, > > maybe the COLLNET research network can be useful for you. Please have a > look at http://www.collnet.de/ > The main coordinator is Ms. Hildrun Kretschmer. > > Best regards, > Christian > >> Dear friends, >> I am a PHD student at the Chinese National Academy in Beijing. I am >> trying to find papers that discuss bibliometric studies of >> collaboration among institutions. There are many papers that discuss >> collaboration of authors, but I can't find any that talk about >> institutions. Does anyone know of any papers about that topic?wish to >> your reply,Thanks, >> Best regards, >> LiYing Yang, Beijing. >> >> >> >> > > From havemanf at CMS.HU-BERLIN.DE Tue Nov 14 11:03:42 2006 From: havemanf at CMS.HU-BERLIN.DE (Frank Havemann) Date: Tue, 14 Nov 2006 17:03:42 +0100 Subject: collaboration institutions In-Reply-To: <1GjfGi-0Aca8G0@fwd29.aul.t-online.de> Message-ID: Dear LiYing Yang, the second publication Hildrun mentioned is in an Open Access journal: http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1533863 Yours sincerely, Frank Havemann *************************** Dr. Frank Havemann Department of Library and Information Science Humboldt University Dorotheenstr. 26 D-10099 Berlin Germany tel.: (0049) (030) 2093 4228 http://www.ib.hu-berlin.de/inf/havemann.html 13. November 2006 18:06 kretschmer.h at t-online.de: > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Dear LiYing Yang, > > We have 3 publications: > > Kretschmer, H., U. Hoffmann & T. Kretschmer: Collaboration structures > between German immunology institutions, and gender visibility, as reflected > in the Web. Research Evaluation. Vol. 15, No 2 (2006), 117-126 > > Havemann, F., M. Heinz & H. Kretschmer: Collaboration and > distances between German immunological institutes. Journal of > Biomedical Discovery and Collaboration 2006, 1:6 (14 June 2006) > > Kretschmer, H., U. Kretschmer, T. Kretschmer: Reflection of Co-authorship > Networks in the Web: Web Hyperlinks versus Web Visibility Rates. (accepted > for publication in Scientometrics) > > With best wishes, > > Hildrun Kretschmer > > -----Original Message----- Date: Mon, 13 Nov 2006 06:28:56 +0100 Subject: [SIGMETRICS] collaboration institutions From: yangly To: SIGMETRICS at LISTSERV.UTK.EDU > > Dear friends, > I am a PHD student at the Chinese National Academy in Beijing. I am trying
There are many papers that discuss collaboration of authors, > but I can't find any that talk about institutions. Does anyone know of any > papers about that topic? wish to your reply, Thanks, Best regards, > LiYing Yang, Beijing. From havemanf at CMS.HU-BERLIN.DE Tue Nov 14 13:48:55 2006 From: havemanf at CMS.HU-BERLIN.DE (Frank Havemann) Date: Tue, 14 Nov 2006 19:48:55 +0100 Subject: network visualization linux Message-ID: I recommend the sna-package of R: http://www.r-project.org/ 26. September 2006 16:40 michele.catanzaro at tin.it: > Hello, > > I would be most grateful if somebody could tell me whether > there is any program for network visualization I can install in a Linux > environment (something like a Pajek for Linux). Indeed, I need a quite > simple program, for small to medium networks. I tried Osprey, but it > asks for an input with a format I can't reproduce. > > Thank you in advance, > > Michele From krichel at OPENLIB.ORG Tue Nov 14 14:36:27 2006 From: krichel at OPENLIB.ORG (Thomas Krichel) Date: Tue, 14 Nov 2006 13:36:27 -0600 Subject: network visualization linux In-Reply-To: <200611141948.55606.havemanf@cms.hu-berlin.de> Message-ID: Frank Havemann writes > I recommend the sna-package of R: > http://www.r-project.org/ I have had problems with sna and had to abandon it to calculate my 4k+ author network paths. I kept running into memory problems even on my HP Proliant 8 processor box with close to 4G of memory. I ended up having to write my own software to do the calculations. I have not used its visualization features. But that was last year. Maybe this is fixed now. Cheers, Thomas Krichel mailto:krichel at openlib.org http://openlib.org/home/krichel RePEc:per:1965-06-05:thomas_krichel skype id: thomaskrichel From elw at STDERR.ORG Tue Nov 14 15:39:02 2006 From: elw at STDERR.ORG (Elijah Wright) Date: Tue, 14 Nov 2006 14:39:02 -0600 Subject: network visualization linux In-Reply-To: <20061114193627.GA26377@openlib.org> Message-ID: We've used R's sna package locally to generate lots (and lots, and lots, and lots...) of graph plots. It works well, particularly for work with data that may not conform to people's initial expectations of what relations may be... However... it is sometimes a bit memory-hungry. :) You are correct in asserting that you will need quite a bit of memory to plot a large network - I've used numerous 16GB and 32GB machines running Sun's Solaris to plot absurdly-sized networks with what I would call a good degree of success. All of the standard tricks for working with large matrices work quite well with R-sna. [In effect, there's a real payoff for properly trimming or partitioning your data as early in the analysis process as possible...] Getting to know the other R packages that deal with matrix-oriented data is also highly recommended, for those who want to make serious use of 'sna'. Several of the more useful tools are not actually *in* the sna package, but found inside other toolkits that are packaged for R. A few of Katy Borner's students have simply run Pajek under Wine; that seems to work well enough. I don't think I've seen anyone run UCINET under Wine, but I suppose that should also be possible. 
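
For anyone hitting the same wall on large graphs, a minimal sketch of the kind of sampling workaround that gets mentioned for betweenness, here in Python with networkx rather than R's sna; the random graph and the pivot count below are hypothetical stand-ins, not anyone's actual data or code:

# Hedged sketch: estimate betweenness centrality on a large co-authorship-like
# graph by sampling k pivot nodes instead of running the exact algorithm
# (exact Brandes betweenness is O(n*m) and can exhaust memory on big graphs).
# The graph here is a random stand-in for a real 4k+ author network.
import networkx as nx

G = nx.barabasi_albert_graph(4000, 3, seed=42)  # hypothetical network

# k=200 pivots keeps memory and run time manageable; raise k (up to n)
# to trade speed for accuracy.
bc = nx.betweenness_centrality(G, k=200, seed=42, normalized=True)

for node, score in sorted(bc.items(), key=lambda kv: kv[1], reverse=True)[:10]:
    print(node, round(score, 4))

The sampled estimate is mainly useful for exploration; an exact run in R, Pajek or similar can still be done once the data have been trimmed or partitioned.
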
--elijah On Tue, 14 Nov 2006, Thomas Krichel wrote: > Date: Tue, 14 Nov 2006 13:36:27 -0600 > From: Thomas Krichel > Reply-To: ASIS&T Special Interest Group on Metrics > > To: SIGMETRICS at LISTSERV.UTK.EDU > Subject: Re: [SIGMETRICS] network visualization linux > > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > Frank Havemann writes > >> I recommend the sna-package of R: >> http://www.r-project.org/ > > I have had problems with sna and had to abandon it to > calculate my 4k+ author network paths. I kept running into > memory problems even on my HP Proliant 8 processor box > with close to 4G of memory. I ended up having to write > my own software to do the calculations. I have not used its > vizualization features. > > But that was last year. Maybe this is fixed now. > > > > Cheers, > > Thomas Krichel mailto:krichel at openlib.org > http://openlib.org/home/krichel > RePEc:per:1965-06-05:thomas_krichel > skype id: thomaskrichel > From Chaomei.Chen at CIS.DREXEL.EDU Tue Nov 14 16:00:50 2006 From: Chaomei.Chen at CIS.DREXEL.EDU (Chaomei Chen) Date: Tue, 14 Nov 2006 16:00:50 -0500 Subject: Chaomei Chen/Drexel_IST is out of the office. Message-ID: I will be out of the office starting 11/14/2006 and will not return until 11/26/2006. I will respond to your message as soon as I can. -------------- next part -------------- An HTML attachment was scrubbed... URL: From nisa.bakkalbasi at YALE.EDU Tue Nov 14 16:19:33 2006 From: nisa.bakkalbasi at YALE.EDU (Nisa Bakkalbasi) Date: Tue, 14 Nov 2006 16:19:33 -0500 Subject: network visualization linux In-Reply-To: Message-ID: I'd like to make a comment on R's sna package and memory problems for clarification. The memory problems I encountered are not related to plotting a large network or visualizing network data, but it was related to computing one metric only; that is "betweenness." This issue has been discussed at the recent ASIST meeting and colleagues offered alternative approaches. Nisa ------------------------ Nisa Bakkalbasi Electronic Collections Librarian Yale University Library 130 Wall Street, P.O. Box 208240 New Haven, CT 06520-8240 Phone: (203) 436-2851 | Fax: (203) 432-8527 At 03:39 PM 11/14/2006, Elijah Wright wrote: >Adminstrative info for SIGMETRICS (for example unsubscribe): >http://web.utk.edu/~gwhitney/sigmetrics.html > >We've used R's sna package locally to generate lots (and lots, and >lots, and lots...) of graph plots. It works well, particularly for >work with data that may not conform to people's initial expectations >of what relations may be... > >However... it is sometimes a bit memory-hungry. :) You are correct >in asserting that you will need quite a bit of memory to plot a >large network - I've used numerous 16GB and 32GB machines running >Sun's Solaris to plot absurdly-sized networks with what I would call >a good degree of success. > >All of the standard tricks for working with large matrices work >quite well with R-sna. [In effect, there's a real payoff for >properly trimming or partitioning your data as early in the analysis >process as possible...] Getting to know the other R packages that >deal with matrix-oriented data is also highly recommended, for those >who want to make serious use of 'sna'. Several of the more useful >tools are not actually *in* the sna package, but found inside other >toolkits that are packaged for R. > > >A few of Katy Borner's students have simply run Pajek under Wine; >that seems to work well enough. 
I don't think I've seen anyone run >UCINET under Wine, but I suppose that should also be possible. > >--elijah > > >On Tue, 14 Nov 2006, Thomas Krichel wrote: > >>Date: Tue, 14 Nov 2006 13:36:27 -0600 >>From: Thomas Krichel >>Reply-To: ASIS&T Special Interest Group on Metrics >> >>To: SIGMETRICS at LISTSERV.UTK.EDU >>Subject: Re: [SIGMETRICS] network visualization linux >>Adminstrative info for SIGMETRICS (for example unsubscribe): >>http://web.utk.edu/~gwhitney/sigmetrics.html >> >> Frank Havemann writes >> >>>I recommend the sna-package of R: >>>http://www.r-project.org/ >> >> I have had problems with sna and had to abandon it to >> calculate my 4k+ author network paths. I kept running into >> memory problems even on my HP Proliant 8 processor box >> with close to 4G of memory. I ended up having to write >> my own software to do the calculations. I have not used its >> vizualization features. >> >> But that was last year. Maybe this is fixed now. >> >> >> >> Cheers, >> >> Thomas Krichel mailto:krichel at openlib.org >> http://openlib.org/home/krichel >> RePEc:per:1965-06-05:thomas_krichel >> skype id: thomaskrichel From liberman at SERVIDOR.UNAM.MX Tue Nov 14 17:28:45 2006 From: liberman at SERVIDOR.UNAM.MX (Sofia Liberman) Date: Tue, 14 Nov 2006 17:28:45 -0500 Subject: International Society for the Psychology of Science & Technology Message-ID: We are quite pleased and excited to announce the formation of the first society for the psychology of science, the "International Society for the Psychology of Science & Technology"! It has been years in the making and fills a gap in the studies of science. Philosophy of science, History of Science, and Sociology of Science have had their own organizations for decades or longer, and now it's psychology's turn to have a unqiue voice in the study of scientific thought and behavior. We hope you will explore our new website http://www.psychologyofscience.org/index.php We also hope you will not only join the society, but become an active founding member. The Society's vitality will come directly from the vitality of its members. Because this is all so new, we welcome any feedback you might have concerning the Society and/or the webpage. Welcome and please join us in this exciting new venture. ~Gregory Feist, Michael Gorman, & Sofia Liberman PS: Also, visit our new posting on Wikipedia and feel free to edit and add missing details and information you would like to see From jessica.shepherd at THES.CO.UK Thu Nov 16 12:33:11 2006 From: jessica.shepherd at THES.CO.UK (Shepherd, Jessica) Date: Thu, 16 Nov 2006 17:33:11 -0000 Subject: No subject Message-ID: Dear Sigmetrics subscribers, I am a journalist at The Times Higher Education Supplement who is very interested in citation trends. I have been told that some academic journals have a policy of encouraging or even requiring their authors to cite papers in the same journal, in order to raise its citation impact. I am very keen to hear any examples of this. I have also been told that citation clubs exist in which colleagues agree to cite each other a certain number of times per year. Any information on this would be extremely gratefully received. Many thanks for your help. Best wishes, Jessica Jessica Shepherd News Reporter The Times Higher Education Supplement 020 7782 3303 07957147308 The Newspaper Marketing Agency: Opening Up Newspapers: www.nmauk.co.uk This e-mail and all attachments are confidential and may be privileged. If you have received this e-mail in error, notify the sender immediately. 
Do not use, disseminate, store or copy it in any way. Statements or opinions in this e-mail or any attachment are those of the author and are not necessarily agreed or authorised by News International (NI). NI Group may monitor emails sent or received for operational or business reasons as permitted by law. NI Group accepts no liability for viruses introduced by this e-mail or attachments. You should employ virus checking software. News International Limited, 1 Virginia St, London E98 1XY, is the holding company for the News International group and is registered in England No 81701 -------------- next part -------------- An HTML attachment was scrubbed... URL: From garfield at CODEX.CIS.UPENN.EDU Thu Nov 16 15:59:41 2006 From: garfield at CODEX.CIS.UPENN.EDU (=?windows-1252?Q?Eugene_Garfield?=) Date: Thu, 16 Nov 2006 15:59:41 -0500 Subject: Lange T. "The imprecise science of evaluating scholarly performance - Utilizing broad quality categories for an assessment of business and management journals" EVALUATION REVIEW 30 (4): 505-532 AUG 2006 Message-ID: Thomas Lange : thomas.lange at aut.ac.nz Title: The imprecise science of evaluating scholarly performance - Utilizing broad quality categories for an assessment of business and management journals Author(s): Lange T (Lange, Thomas) Source: EVALUATION REVIEW 30 (4): 505-532 AUG 2006 Document Type: Article Language: English Cited References: 60 Times Cited: 0 Abstract: In a growing number of countries, government-appointed assessment panels develop ranks on the basis of the quality of scholarly outputs to apportion budgets in recognition of evaluated performance and to justify public funds for future R&D activities. When business and management journals are being grouped in broad quality categories, a recent study has noted that this procedure was placing the same journals in essentially the same categories. Drawing on journal quality categorizations by several German- and English-speaking business departments and academic associations, the author performs nonparametric tests and correlations to analyze whether this claim can be substantiated. In particular, he examines the ability of broad quality categorizations to add value to governmental, administrative, and academic decision making by withstanding the criticism traditionally levied at research quality assessments. 
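
For readers who want to see what such a nonparametric comparison looks like in practice, here is a minimal, hypothetical illustration in Python with scipy; the ranks are invented for the purpose and are not Lange's data or procedure:

# Hypothetical example: two departments' quality ranks for the same ten journals.
# The numbers are invented for illustration; they are not taken from the article.
from scipy.stats import spearmanr, kendalltau

ranking_a = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]   # department A's ranking
ranking_b = [2, 1, 3, 5, 4, 7, 6, 10, 8, 9]   # department B's ranking of the same journals

rho, rho_p = spearmanr(ranking_a, ranking_b)
tau, tau_p = kendalltau(ranking_a, ranking_b)

print("Spearman rho = %.3f (p = %.3f)" % (rho, rho_p))
print("Kendall tau  = %.3f (p = %.3f)" % (tau, tau_p))

A high, significant rank correlation would be consistent with the claim that different categorizations place the same journals in essentially the same categories; the article itself applies a broader battery of nonparametric tests.
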
Addresses: Lange T (reprint author), Auckland Univ Technol, Fac Business, Auckland, New Zealand Auckland Univ Technol, Fac Business, Auckland, New Zealand Publisher: SAGE PUBLICATIONS INC, 2455 TELLER RD, THOUSAND OAKS, CA 91320 USA Subject Category: SOCIAL SCIENCES, INTERDISCIPLINARY IDS Number: 065IB ISSN: 0193-841X CITED REFERENCES: *ALL CONS GROUP INT COMP RES QUAL NZ : 2004 *COMM AUSTR RES QUAL FRAM ASS QU : 2005 *HIGH ED AUTH PROGR RES 3 LEV INS : 2003 *ORG EC COOP DEV FRASC MAN PROP STAND : 2002 *TERT ED COMM PERF BAS RES FUND EV : 2004 ALEXANDER JC RELATIVE SIGNIFICANCE OF JOURNALS, AUTHORS, AND ARTICLES CITED IN FINANCIAL RESEARCH JOURNAL OF FINANCE 49 : 697 1994 BALLAS A Exploring diversity in accounting through faculty journal perceptions CONTEMPORARY ACCOUNTING RESEARCH 20 : 619 2003 BAVELAS JB SOCIAL-PSYCHOLOGY OF CITATIONS CANADIAN PSYCHOLOGICAL REVIEW-PSYCHOLOGIE CANADIENNE 19 : 158 1978 BEED C Measuring the quality of academic journals: The case of economics JOURNAL OF POST KEYNESIAN ECONOMICS 18 : 369 1996 BLACKBURN RS ORGANIZATIONAL-BEHAVIOR - WHOM DO WE TALK TO AND WHO TALKS TO US JOURNAL OF MANAGEMENT 16 : 279 1990 BOURKE P EVALUATING U RES BRI : 1997 BROWN LD CONTEMP ACCOUNT RES 11 : 223 1994 BROWN LD REV QUANTITATIVE FIN 20 : 291 2003 CHAN KC FINANC MANAGE : 131 2002 COE R AACSB B 1 : 23 1969 COE R EVALUATING THE MANAGEMENT JOURNALS - A 2ND LOOK ACADEMY OF MANAGEMENT JOURNAL 27 : 660 1984 COLLIN SO BRIT J MANAGE 7 : 141 1996 CONOVER WJ PRACTICAL NONPARAMET : 2003 CROOM DL DANGERS IN USE OF SCIENCE CITATION INDEX NATURE 227 : 1173 1970 CUDD M FINANCIAL REV 23 : 117 1988 DUNDEN G ATLANTIC ECON J 21 : 1 1994 DUNN OJ MULTIPLE COMPARISONS USING RANK SUMS TECHNOMETRICS 6 : 241 1964 EXTEJT MM THE BEHAVIORAL-SCIENCES AND MANAGEMENT - AN EVALUATION OF RELEVANT JOURNALS JOURNAL OF MANAGEMENT 16 : 539 1990 GARFIELD E SOCIAL SCI CITATION 17 : 1991 GOMEZMEJIA LR DETERMINANTS OF FACULTY PAY - AN AGENCY THEORY PERSPECTIVE ACADEMY OF MANAGEMENT JOURNAL 35 : 921 1992 HARZING A J QUALITY LIST : 2000 HASSELBACK J MCGRAW HILL DIRECTOR : 1996 HASSELBACK JR ISSUES ACCOUNTING ED 10 : 269 1995 HENDERSON G ACROSS-DISCIPLINE JOURNAL AWARENESS AND EVALUATION - IMPLICATIONS FOR THE PROMOTION AND TENURE PROCESS JOURNAL OF ECONOMICS AND BUSINESS 42 : 325 1990 HOLMSTROM B MORAL HAZARD AND OBSERVABILITY BELL JOURNAL OF ECONOMICS 10 : 74 1979 HOLSAPPLE CW A CITATION ANALYSIS OF BUSINESS COMPUTING RESEARCH JOURNALS INFORMATION & MANAGEMENT 25 : 231 1993 HULT G J MARKETING ED 19 : 37 1997 HUSTAD TP Untitled JOURNAL OF PRODUCT INNOVATION MANAGEMENT 14 : 157 1997 JOBBER D INT J RES MARK 5 : 137 1988 JOHNSON JL JOURNAL INFLUENCE IN THE FIELD OF MANAGEMENT - AN ANALYSIS USING SALANCIK INDEX IN A DEPENDENCY NETWORK ACADEMY OF MANAGEMENT JOURNAL 37 : 1392 1994 KENDALL MG The problem of m rankings ANNALS OF MATHEMATICAL STATISTICS 10 : 275 1939 KRUSKAL WH USE OF RANKS IN ONE-CRITERION VARIANCE ANALYSIS JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION 47 : 583 1952 LANE DM HYPERSTAT ONLINE INT : 2003 LOFT A EUROPEAN ACCOUNTING 11 : 43 2002 LUKKA K Is accounting a global or a local discipline? 
Evidence from major research journals ACCOUNTING ORGANIZATIONS AND SOCIETY 21 : 755 1996 LYALL C Assessing end-use relevance of public sector research organisations RESEARCH POLICY 33 : 73 2004 MACROBERTS MH PROBLEMS OF CITATION ANALYSIS - A CRITICAL-REVIEW JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE 40 : 342 1989 MAY KO SCIENCE MAY : 890 1967 PANAZZO F ACCOUNT ORG SOC 22 : 447 1997 PARNELL JA MIDATLANTIC J BUSINE 33 : 69 1997 POLONSKY MJ J MARKETING ED 21 : 181 1999 ROBERTS G REV RES ASSESSMENT : 2003 SCHIERMEIER Q NATURE 18 : 260 2004 SCHMIDT RC Managing Delphi surveys using nonparametric statistical techniques DECISION SCIENCES 28 : 763 1997 SCHOFIELD PJ SIIMPLE STAT APPROAC : 2005 SEGLEN PO CAUSAL RELATIONSHIP BETWEEN ARTICLE CITEDNESS AND JOURNAL IMPACT JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE 45 : 1 1994 SMITH J J ACCOUNTING ED 16 : 497 1998 SOTERIOU AC J OPER MANAG 17 : 225 1999 THEOHARAKIS V Perceptual differences of marketing journals: A worldwide perspective MARKETING LETTERS 13 : 389 2002 TUKEY JW COLLECTED WORKS JW T : 1994 VANFLEET DD A theoretical and empirical analysis of journal rankings: The case of formal lists JOURNAL OF MANAGEMENT 26 : 839 2000 VASTAG G Journal characteristics, rankings and social acculturation in operations management OMEGA-INTERNATIONAL JOURNAL OF MANAGEMENT SCIENCE 30 : 109 2002 WALLIS WA The correlation ratio for ranked data JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION 34 : 533 1939 WOODWARD AM J DOCUMENTATION DEC : 290 1976 ZAR JH BIOSTATISTICAL ANAL : 1996 From garfield at CODEX.CIS.UPENN.EDU Thu Nov 16 16:23:53 2006 From: garfield at CODEX.CIS.UPENN.EDU (=?windows-1252?Q?Eugene_Garfield?=) Date: Thu, 16 Nov 2006 16:23:53 -0500 Subject: Bonner SE (Bonner, Sarah E.), Hesford JW (Hesford, James W.), Van der Stede WA (Van der Stede, Wim A.), Young SM (Young, S. Mark) "The most influential journals in academic accounting " ACCOUNTING ORGANIZATIONS AND SOCIETY 31 (7): 663-685 OCT 2006 Message-ID: E-mail Addresses: mark.young at marshall.usc.edu Title: The most influential journals in academic accounting Author(s): Bonner SE (Bonner, Sarah E.), Hesford JW (Hesford, James W.), Van der Stede WA (Van der Stede, Wim A.), Young SM (Young, S. Mark) Source: ACCOUNTING ORGANIZATIONS AND SOCIETY 31 (7): 663-685 OCT 2006 Document Type: Article Language: English Cited References: 65 Times Cited: 0 Abstract: In this article we summarize the findings of articles that have ranked academic accounting journals, as well as articles that provide other bases for considering journal quality. Results indicate that five journals- Accounting, Organizations and Society, Contemporary Accounting Research, Journal of Accounting and Economics, Journal of Accounting Research, and The Accounting Review-rank consistently as the top journals in the field. However, these five journals differ substantially as to the numbers of articles they publish overall as well as the proportions of articles that are related to the various specialty areas of accounting. Further, the relative proportions of articles by area do not correspond to the numbers of individuals working in the specialty areas. Financial accounting articles appear in disproportionately high numbers for all journals except Accounting, Organizations and Society, whereas management accounting articles appear in disproportionately low numbers for all journals except Accounting, Organizations and Society. 
In all journals, systems and tax articles also appear to be disproportionately low vis-a-vis the numbers of individuals working in these areas. Auditing receives fairly even exposure across journals and vis-a-vis individuals in the area, except in the Journal of Accounting and Economics. (c) 2005 Published by Elsevier Ltd. Addresses: Young SM (reprint author), Univ So Calif, Marshall Sch Business, Leventhal Sch Accounting, Los Angeles, CA 90089 USA Univ So Calif, Marshall Sch Business, Leventhal Sch Accounting, Los Angeles, CA 90089 USA Cornell Univ, Sch Hotel Adm, Dept Finance Accounting & Real Estate, Ithaca, NY 14853 USA E-mail Addresses: mark.young at marshall.usc.edu Publisher: PERGAMON-ELSEVIER SCIENCE LTD, THE BOULEVARD, LANGFORD LANE, KIDLINGTON, OXFORD OX5 1GB, ENGLAND Subject Category: BUSINESS, FINANCE IDS Number: 080YP ISSN: 0361-3682 CITED REFERENCES: ANDREWS WT LEADING ACCOUNTING DEPARTMENTS REVISITED ACCOUNTING REVIEW 53 : 135 1978 BALLAS A Exploring diversity in accounting through faculty journal perceptions CONTEMPORARY ACCOUNTING RESEARCH 20 : 619 2003 BAZLEY JD COMPARISON OF PUBLISHED ACCOUNTING RESEARCH AND QUALITIES OF ACCOUNTING FACULTY AND DOCTORAL PROGRAMS ACCOUNTING REVIEW 50 : 605 1975 BEATTIE VA BR ACCOUNT REV 21 : 267 1989 BELL TB ACCOUNTING HORIZONS 7 : 33 1993 BENJAMIN JJ PERCEPTIONS OF JOURNAL QUALITY ACCOUNTING REVIEW 49 : 360 1974 BRINN T ACCOUNT BUSINESS RES 26 : 265 1996 BROWN LD Influential accounting articles, individuals, PhD granting institutions and faculties: A citational analysis ACCOUNTING ORGANIZATIONS AND SOCIETY 21 : 723 1996 BROWN LD AN ANALYSIS OF THE RESEARCH CONTRIBUTIONS OF ACCOUNTING, ORGANIZATIONS AND SOCIETY, 1976-1984 ACCOUNTING ORGANIZATIONS AND SOCIETY 12 : 193 1987 BROWN LD APPLYING CITATION ANALYSIS TO EVALUATE THE RESEARCH CONTRIBUTIONS OF ACCOUNTING FACULTY AND DOCTORAL PROGRAMS ACCOUNTING REVIEW 60 : 262 1985 BROWN LD CONTEMP ACCOUNT RES 11 : 223 1994 BROWN LD USING CITATION ANALYSIS TO ASSESS THE IMPACT OF JOURNALS AND ARTICLES ON CONTEMPORARY ACCOUNTING RESEARCH (CAR) JOURNAL OF ACCOUNTING RESEARCH 23 : 84 1985 BROWN LD REV QUANTITATIVE FIN 20 : 291 2003 BUBLITZ B ISSUES ACCOUNTING ED : 39 1984 CAMPBELL DR ISSUES ACCOUNTING ED 2 : 28 1987 DWYER P ISSUES ACCOUNTING ED 9 : 231 1994 DYCKMAN TR 2 DECADES OF THE JOURNAL-OF-ACCOUNTING-RESEARCH JOURNAL OF ACCOUNTING RESEARCH 22 : 225 1984 DYL EA A NOTE ON INSTITUTIONAL CONTRIBUTIONS TO THE ACCOUNTING LITERATURE ACCOUNTING ORGANIZATIONS AND SOCIETY 10 : 171 1985 ENGLEBRECHT TD ACCOUNTING HORIZONS 8 : 45 1994 ERKUT E Measuring Canadian business school research output and impact CANADIAN JOURNAL OF ADMINISTRATIVE SCIENCES-REVUE CANADIENNE DES SCIENCES DE L ADMINISTRATION 19 : 97 2002 GARFIELD E SCIENCE 178 : 4060 1972 GASS SI Model world: When is a number a number? 
INTERFACES 31 : 93 2001 HAGERMAN RL ISSUES ACCOUNTING ED 4 : 265 1989 HALL T ADV ACCOUNTING 9 : 161 1991 HASSELBACK JR ACCOUNTING FACULTY D : 2002 HASSELBACK JR ADV ACCOUNTING 20 : 95 2003 HASSELBACK JR ADV ACCOUNTING 13 : 61 1995 HASSELBACK JR ISSUES ACCOUNTING ED 10 : 269 1995 HECK JL 6 DECADES OF THE ACCOUNTING REVIEW - A SUMMARY OF AUTHOR AND INSTITUTIONAL CONTRIBUTORS ACCOUNTING REVIEW 61 : 735 1986 HOWARD TP ATTITUDE MEASUREMENT AND PERCEPTIONS OF ACCOUNTING FACULTY PUBLICATION OUTLETS ACCOUNTING REVIEW 58 : 765 1983 HULL RP ACCOUNTING HORIZONS 4 : 77 1990 JACOBS FA PUBLICATION PRODUCTIVITY OF DOCTORAL ALUMNI - A TIME-ADJUSTED MODEL ACCOUNTING REVIEW 61 : 179 1986 JOHNSON PM ADV ACCOUNTING 19 : 235 2002 JOLLY SA ACCOUNTING ED J 7 : 47 1995 KINNEY WR ACCOUNTING HORIZONS 13 : 69 1999 LOWE A ACCOUNT ORG SOC 30 : 81 2004 LOWE A ACCOUNTING FORUM 26 : 45 2002 MCRAE TW CITATIONAL ANALYSIS OF ACCOUNTING INFORMATION NETWORK JOURNAL OF ACCOUNTING RESEARCH 12 : 80 1974 MILNE RA ISSUES ACCOUNTING ED 2 : 94 1987 MORRIS JL ACCOUNTING ED J 2 : 46 1990 NOBES CW INTERNATIONAL VARIATIONS IN PERCEPTIONS OF ACCOUNTING JOURNALS ACCOUNTING REVIEW 60 : 702 1985 NOBES CW BR ACCOUNT REV 18 : 7 1986 PRATHER J ACCOUNTING HORIZONS 10 : 1 1996 RAABE WA J ACCOUNTING ED 5 : 45 1987 REEVE RC ABACUS 24 : 190 1988 REINSTEIN KA J ACCOUNTING ED 15 : 425 1997 RICHARDSON AJ CONTEMP ACCOUNT RES 7 : 278 1990 ROUSE RW J ACCOUNTING ED FAL : 43 1984 SCHROEDER RG ACCOUNTING ED J : 1 1988 SMITH G SOURCES AND USES OF AUDITING - A JOURNAL OF PRACTICE AND THEORY LITERATURE - THE 1ST DECADE AUDITING-A JOURNAL OF PRACTICE & THEORY 10 : 84 1991 SMITH G A TAXONOMY OF CONTENT AND CITATIONS IN AUDITING - A-JOURNAL-OF-PRACTICE-AND- THEORY AUDITING-A JOURNAL OF PRACTICE & THEORY 8 : 108 1988 SMITH G IMPACT OF SOURCES AND AUTHORS ON AUDITING-A-JOURNAL-OF-PRACTICE-AND-THEORY - A CITATION ANALYSIS AUDITING-A JOURNAL OF PRACTICE & THEORY 4 : 107 1984 SMITH SW HEALTH COMMUN 6 : 1 1994 SNOWBALL D ACCOUNTING LABORATORY EXPERIMENTS ON HUMAN JUDGMENT - SOME CHARACTERISTICS AND INFLUENCES ACCOUNTING ORGANIZATIONS AND SOCIETY 11 : 47 1986 SPICELAND JD INT GUIDE ACCOUNTING : 1993 SRIRAM RS ACCOUNTING ED J 6 : 32 1994 STREULY CA ISSUES ACCOUNTING ED 9 : 247 1994 TAHAI A Information processing using citations to investigate journal influence in accounting INFORMATION PROCESSING & MANAGEMENT 34 : 341 1998 TRIESCHMANN JS Serving multiple constituencies in business schools: MBA program versus research performance ACADEMY OF MANAGEMENT JOURNAL 43 : 1130 2000 WATTS RL Commemorating the 25th volume of the Journal of Accounting and Economics JOURNAL OF ACCOUNTING & ECONOMICS 25 : 217 1998 WEBER RP EVALUATIONS OF ACCOUNTING JOURNAL AND DEPARTMENT QUALITY ACCOUNTING REVIEW 56 : 596 1981 WILLIAMS PF A DESCRIPTIVE ANALYSIS OF AUTHORSHIP IN THE ACCOUNTING REVIEW ACCOUNTING REVIEW 60 : 300 1985 WINDAL FW PUBLISHING FOR A VARIED PUBLIC - AN EMPIRICAL-STUDY ACCOUNTING REVIEW 56 : 653 1981 ZEFF SA ACCOUNTING HORIZONS 10 : 158 1996 ZIVNEY TL ISSUES ACCOUNTING ED 10 : 1 1995 From garfield at CODEX.CIS.UPENN.EDU Thu Nov 16 16:44:13 2006 From: garfield at CODEX.CIS.UPENN.EDU (=?windows-1252?Q?Eugene_Garfield?=) Date: Thu, 16 Nov 2006 16:44:13 -0500 Subject: Sorenson O (Sorenson, Olav), Rivkin JW (Rivkin, Jan W.), Fleming L (Fleming, Lee) "Complexity, networks and knowledge flow" Research Policy 35(7):994-1017, September 2006. 
Message-ID: E-mail Addresses: osorenson at london.edu, jrivkin at hbs.edu, lfleming at hbs.edu Title: Complexity, networks and knowledge flow Author(s): Sorenson O (Sorenson, Olav), Rivkin JW (Rivkin, Jan W.), Fleming L (Fleming, Lee) Source: RESEARCH POLICY 35 (7): 994-1017 SEP 2006 Document Type: Review Language: English Cited References: 102 Times Cited: 0 Abstract: Because knowledge plays an important role in the creation of wealth, economic actors often wish to skew the flow of knowledge in their favor. We ask, when will an actor socially close to the source of some knowledge have the greatest advantage over distant actors in receiving and building on the knowledge? Marrying a social network perspective with a view of knowledge transfer as a search process, we argue that the value of social proximity to the knowledge source depends crucially on the nature of the knowledge at hand. Simple knowledge diffuses equally to close and distant actors because distant recipients with poor connections to the source of the knowledge can compensate for their limited access by means of unaided local search. Complex knowledge resists diffusion even within the social circles in which it originated. With knowledge of moderate complexity, however, high- fidelity transmission along social networks combined with local search allows socially proximate recipients to receive and extend knowledge generated elsewhere, while interdependencies stymie more distant recipients who rely heavily on unaided search. To test this hypothesis, we examine patent data and compare citation rates across proximate and distant actors on three dimensions: (1) the inventor collaboration network; (2) firm membership; and (3) geography. We find robust support for the proposition that socially proximate actors have the greatest advantage over distant actors for knowledge of moderate complexity. We discuss the implications of our findings for the distribution of intra-industry profits, the geographic agglomeration of industries, the design of social networks within firms, and the modularization of technologies. (c) 2006 Elsevier B.V. All rights reserved. 
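The comparison at the heart of this abstract can be pictured with a toy calculation. The sketch below is purely illustrative: the numbers are invented and the column names are hypothetical, and it stands in for, rather than reproduces, the authors' patent-citation and case-control analysis. It only shows the shape of the claim, namely that the citation-rate gap between socially proximate and distant actors should peak at moderate knowledge complexity.

```python
# Illustrative sketch only: the paper's actual analysis uses patent citation
# data and case-control methods; this toy example just shows the shape of the
# comparison described in the abstract, on made-up data.
import pandas as pd

# Hypothetical records: each row is a (source patent, potential citer) pair.
# 'complexity' bins the source invention's interdependence, 'proximate' flags
# social closeness (collaboration network, same firm, or same region), and
# 'cited' records whether a citation actually occurred.
pairs = pd.DataFrame({
    "complexity": ["low", "low", "moderate", "moderate", "high", "high"] * 50,
    "proximate":  [True, False, True, False, True, False] * 50,
    "cited":      ([1, 1, 1, 0, 0, 0] * 25) + ([1, 0, 1, 0, 1, 0] * 25),
})

# Citation rate by complexity bin and proximity.
rates = pairs.groupby(["complexity", "proximate"])["cited"].mean().unstack()

# The quantity of interest: the proximity advantage (difference in citation
# rates), which under the paper's argument should peak at moderate complexity.
rates["proximity_advantage"] = rates[True] - rates[False]
print(rates)
```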
Addresses: Sorenson O (reprint author), London Business Sch, Sussex Pl,Regents Pk, London NW1 4SA, England London Business Sch, London NW1 4SA, England Harvard Univ, Sch Business, Boston, MA 02163 USA E-mail Addresses: osorenson at london.edu, jrivkin at hbs.edu, lfleming at hbs.edu Publisher: ELSEVIER SCIENCE BV, PO BOX 211, 1000 AE AMSTERDAM, NETHERLANDS Subject Category: MANAGEMENT; PLANNING & DEVELOPMENT IDS Number: 082MS ISSN: 0048-7333 CITED REFERENCES : ALCACER J IN PRESS EC STAT ALLEN TJ MANAGING FLOW TECHNO : 1977 ARGOTE L ORG LEARNING CREATIN : 1999 ARROW KJ LIMITS ORG : 1974 ARROW KJ RATE DIRECTION INVEN : 609 1962 AUDRETSCH DB Innovative clusters and the industry life cycle REVIEW OF INDUSTRIAL ORGANIZATION 11 : 253 1996 BAKER WE Social networks and loss of capital SOCIAL NETWORKS 26 : 91 2004 BALDWIN CY DESIGN RULES POWER : 2000 BASALLA G EVOLUTION TECHNOLOGY : 1988 BOSSARD JHS AM J SOCIOL 38 : 219 1932 BRESCHI S MOBILITY SOCIAL NETW : 2002 BURT RS SOCIAL CONTAGION AND INNOVATION - COHESION VERSUS STRUCTURAL EQUIVALENCE AMERICAN JOURNAL OF SOCIOLOGY 92 : 1287 1987 BURT RS STRUCTURAL HOLES SOC : 1992 CHEW WB MEASURES MANUFACTURI : 129 1990 CHRISTENSEN CM Disruption, disintegration and the dissipation of differentiability INDUSTRIAL AND CORPORATE CHANGE 11 : 955 2002 COHEN WM ABSORPTIVE-CAPACITY - A NEW PERSPECTIVE ON LEARNING AND INNOVATION ADMINISTRATIVE SCIENCE QUARTERLY 35 : 128 1990 COHEN WM PROTECTING THEIR INT : 2000 COLEMAN J THE DIFFUSION OF AN INNOVATION AMONG PHYSICIANS SOCIOMETRY 20 : 253 1957 COLEMAN JS MED INNOVATION DIFFU : 1966 DAVIS GF AM J SOCIOL 103 : 1 1980 DUGUET E EC INNOVATION NEW TE 14 : 375 2005 DURKHEIM E ELEMENTARY FORMS REL : 1912 FELD SL THE FOCUSED ORGANIZATION OF SOCIAL TIES AMERICAN JOURNAL OF SOCIOLOGY 86 : 1015 1981 FLEMING L Technology as a complex adaptive system: evidence from patent data RESEARCH POLICY 30 : 1019 2001 FLEMING L Science as a map in technological search STRATEGIC MANAGEMENT JOURNAL 25 : 909 2004 FRIEDRICH RJ IN DEFENSE OF MULTIPLICATIVE TERMS IN MULTIPLE-REGRESSION EQUATIONS AMERICAN JOURNAL OF POLITICAL SCIENCE 26 : 797 1982 GILFILLAN S INVENTING SHIP : 1935 GRILICHES Z HYBRID CORN - AN EXPLORATION IN THE ECONOMICS OF TECHNOLOGICAL-CHANGE ECONOMETRICA 25 : 501 1957 GULATI R Social structure and alliance formation patterns: A longitudinal analysis ADMINISTRATIVE SCIENCE QUARTERLY 40 : 619 1995 HAGERSTRAND T INNOVATION DIFFUSION : 1953 HALL BH 8498 NAT BUR EC RES : 2001 HANSEN MT The search-transfer problem: The role of weak ties in sharing knowledge across organization subunits ADMINISTRATIVE SCIENCE QUARTERLY 44 : 82 1999 HARGADON A TECHNOLOGY MANAGEMEN : 1998 HEDSTROM P CONTAGIOUS COLLECTIVITIES - ON THE SPATIAL DIFFUSION OF SWEDISH TRADE- UNIONS, 1890-1940 AMERICAN JOURNAL OF SOCIOLOGY 99 : 1157 1994 HEDSTROM P CONTAGIOUS COLLECTIVITIES - ON THE SPATIAL DIFFUSION OF SWEDISH TRADE- UNIONS, 1890-1940 AMERICAN JOURNAL OF SOCIOLOGY 99 : 1157 1994 HENDERSON R MEASURING COMPETENCE - EXPLORING FIRM EFFECTS IN PHARMACEUTICAL RESEARCH STRATEGIC MANAGEMENT JOURNAL 15 : 63 1994 IRWIN DA LEARNING-BY-DOING SPILLOVERS IN THE SEMICONDUCTOR INDUSTRY JOURNAL OF POLITICAL ECONOMY 102 : 1200 1994 JAFFE AB GEOGRAPHIC LOCALIZATION OF KNOWLEDGE SPILLOVERS AS EVIDENCED BY PATENT CITATIONS QUARTERLY JOURNAL OF ECONOMICS 108 : 577 1993 JORDAN K RIGHT TOOLS JOB WORK : 77 1992 KAUFFMAN SA ORIGINS ORDER : 1993 KING G POLIT ANAL 9 : 137 2001 KOGUT B KNOWLEDGE OF THE FIRM, COMBINATIVE CAPABILITIES, AND THE REPLICATION OF TECHNOLOGY ORGANIZATION SCIENCE 
3 : 383 1992 KRUGMAN PR GEOGRAPHY TRADE : 1991 LANJOUW JO Patent quality and research productivity: Measuring innovation with multiple indicators ECONOMIC JOURNAL 114 : 441 2004 LAZARSFELD PF PEOPLES CHOICE VOTER : 1944 LEVIN RC BROOKINGS PAPERS EC 3 : 783 1987 LEVINTHAL DA Adaptation on rugged landscapes MANAGEMENT SCIENCE 43 : 934 1997 LIM K MANY FACES ABSORPTIV : 2001 LIPPMAN SA UNCERTAIN IMITABILITY - AN ANALYSIS OF INTERFIRM DIFFERENCES IN EFFICIENCY UNDER COMPETITION BELL JOURNAL OF ECONOMICS 13 : 418 1982 MAHAJAN V NEW PRODUCT DIFFUSION-MODELS IN MARKETING - A REVIEW AND DIRECTIONS FOR RESEARCH JOURNAL OF MARKETING 54 : 1 1990 MANSFIELD E IND RES TECHNOLOGICA : 1968 MANSKI CF ESTIMATION OF CHOICE PROBABILITIES FROM CHOICE BASED SAMPLES ECONOMETRICA 45 : 1977 1977 MARCH JG ORG : 1958 MARSDEN PV NETWORK STUDIES OF SOCIAL-INFLUENCE SOCIOLOGICAL METHODS & RESEARCH 22 : 127 1993 MARSHALL A PRINCIPLES EC MCEVILY SK The persistence of knowledge-based advantage: An empirical test for product performance and technological knowledge STRATEGIC MANAGEMENT JOURNAL 23 : 285 2002 MCPHERSON JM SOCIAL NETWORKS AND ORGANIZATIONAL DYNAMICS AMERICAN SOCIOLOGICAL REVIEW 57 : 153 1992 MCPHERSON M Birds of a feather: Homophily in social networks ANNUAL REVIEW OF SOCIOLOGY 27 : 415 2001 MEAD C INTRO VSLI SYSTEMS : 1980 NELSON RR EVOLUTIONARY THEORY : 1982 NOMANS GC HUMAN GROUP : 1950 OSULLIVAN A AEROSPACE DESIGN BUI : 2001 OWENSMITH J Knowledge networks as channels and conduits: The effects of spillovers in the Boston biotechnology community ORGANIZATION SCIENCE 15 : 5 2004 PARK RE URBAN COMMUNITY : 3 1926 PERROW C NORMAL ACCIDENTS LIV : 1984 PODOLNY JM MARKET UNCERTAINTY AND THE SOCIAL CHARACTER OF ECONOMIC EXCHANGE ADMINISTRATIVE SCIENCE QUARTERLY 39 : 458 1994 POLANYI M TACIT DIMENSION : 1966 PORTER ME MATCHING DELL : 158 1999 PRENTICE RL LOGISTIC DISEASE INCIDENCE MODELS AND CASE-CONTROL STUDIES BIOMETRIKA 66 : 403 1979 REED R CAUSAL AMBIGUITY, BARRIERS TO IMITATION, AND SUSTAINABLE COMPETITIVE ADVANTAGE ACADEMY OF MANAGEMENT REVIEW 15 : 88 1990 REINGANUM JF MARKET-STRUCTURE AND THE DIFFUSION OF NEW TECHNOLOGY BELL JOURNAL OF ECONOMICS 12 : 618 1981 RIVKIN JW Imitation of complex strategies MANAGEMENT SCIENCE 46 : 824 2000 RIVKIN JW Reproducing knowledge: Replication without imitation at moderate complexity ORGANIZATION SCIENCE 12 : 274 2001 ROGERS EM DIFFUSION INNOVATION : 1995 ROMER PM GROWTH BASED ON INCREASING RETURNS DUE TO SPECIALIZATION AMERICAN ECONOMIC REVIEW 77 : 56 1987 RYAN B RURAL SOCIOL 8 : 15 1943 SAMPAT BN EXAMINING PATENT EXA : 2004 SCHERER FM INNOVATION GROWTH SC : 1984 SCHUMPETER J BUSINESS CYCLES : 1939 SCOTT AJ Fitting regression models to case-control data by maximum likelihood BIOMETRIKA 84 : 57 1997 SIMON HA P AM PHILOS SOC 106 : 467 1962 SINGH J Collaborative networks as determinants of knowledge diffusion patterns MANAGEMENT SCIENCE 51 : 756 2005 SMITH JK CAROTHERS,WALLACE,H. 
AND FUNDAMENTAL RESEARCH AT DUPONT SCIENCE 229 : 436 1985 SORENSON O Syndication networks and the spatial distribution of venture capital investments AMERICAN JOURNAL OF SOCIOLOGY 106 : 1546 2001 SORENSON O Science and the diffusion of knowledge RESEARCH POLICY 33 : 1615 2004 SORENSON O ROLE LABOUR MOBILITY : 79 2004 STERN S COMMUNICATION : 2001 STRANG D Diffusion in organizations and social movements: From hybrid corn to poison pills ANNUAL REVIEW OF SOCIOLOGY 24 : 265 1998 STUART T The geography of opportunity: spatial heterogeneity in founding rates and the performance of biotechnology firms RESEARCH POLICY 32 : 229 2003 SZULANSKI G Exploring internal stickiness: Impediments to the transfer of best practice within the firm STRATEGIC MANAGEMENT JOURNAL 17 : 27 1996 TEECE DJ TECHNOLOGY-TRANSFER BY MULTINATIONAL FIRMS - RESOURCE COST OF TRANSFERRING TECHNOLOGICAL KNOW-HOW ECONOMIC JOURNAL 87 : 242 1977 TOMZ M RELOGIT STAT FIL : 1999 TUSHMAN ML TECHNOLOGICAL DISCONTINUITIES AND ORGANIZATIONAL ENVIRONMENTS ADMINISTRATIVE SCIENCE QUARTERLY 31 : 439 1986 ULRICH K THE ROLE OF PRODUCT ARCHITECTURE IN THE MANUFACTURING FIRM RESEARCH POLICY 24 : 419 1995 USHER A HIST MECH INVENTION : 1954 VONHIPPEL E SOURCES INNOVATION : 1988 WEICK KE EDUCATIONAL ORGANIZATIONS AS LOOSELY COUPLED SYSTEMS ADMINISTRATIVE SCIENCE QUARTERLY 21 : 1 1976 WINTER SG RESOURCE BASED EVOLU : 1995 WOMACK JP MACHINE CHANGED WORL : 1990 ZANDER U KNOWLEDGE AND THE SPEED OF THE TRANSFER AND IMITATION OF ORGANIZATIONAL CAPABILITIES - AN EMPIRICAL-TEST ORGANIZATION SCIENCE 6 : 76 1995 ZIMMERMAN MB LEARNING EFFECTS AND THE COMMERCIALIZATION OF NEW ENERGY TECHNOLOGIES - THE CASE OF NUCLEAR-POWER BELL JOURNAL OF ECONOMICS 13 : 297 1982 ZUCKER LG Intellectual human capital and the birth of US biotechnology enterprises AMERICAN ECONOMIC REVIEW 88 : 290 1998 From harnad at ECS.SOTON.AC.UK Thu Nov 16 23:34:08 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Fri, 17 Nov 2006 04:34:08 +0000 Subject: Australia's RQF (fwd) Message-ID: ---------- Forwarded message ---------- Date: Fri, 17 Nov 2006 14:44:39 +1100 From: Arthur Sale To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM at LISTSERVER.SIGMAXI.ORG The Australian Government has released a definitive, if incomplete, description of Australia's Research Quality Framework (RQF) which is our equivalent of the UK's RAE. If familiar with the RAE, you will recognize the family resemblance. I extract the essentials of the RQF for an international readership, and analyze some of the consequences likely to flow from it. To see the documentation, see http://www.dest.gov.au/sectors/research_sector/policies_issues_reviews/key_i ssues/research_quality_framework/rqf_development_2006.htm. ESSENTIAL POINTS 1. The first RQF assessment will be based on submissions by the 38 Australian universities by 30 April 2008. Funding based on the assessment will flow in calendar year 2009. Six years will elapse before the next assessment (ie 2014), but there is provision to shorten this. 2. The Unit of Assessment is the Research Group. Research Groups will be defined by up to three RFCD four-digit codes (to allow for multi-disciplinary groups). The RFCD classification is uniquely Australian, and for example there are six four-digit codes in the field of ICT. Engineering has more but for example Civil Engineering is one. If you are interested in the codes see http://www.research.utas.edu.au/publications/docs/14_rfcd.doc, the four digit codes are the sub-headings. 3. 
Each Research Group will be allocated to and assessed by one of 13 Panels. The Panel is determined by the primary RFCD code. Thus Mathematics, Computing and Information Technology is Panel 4. 4. Each University will submit an Evidence Portfolio (EP) for each identified Research Group. There is provision for cross-university Research Groups. 5. The ratings will be based on Quality and Impact separately. These words have peculiar (ie not common-usage) meanings. Approximately, Quality is a bag of quantifiable metrics, and Impact is all the soft things like Fellowships of Academies, Honors, journal associate editorships, etc. The relative importance of Quality and Impact will vary by Panel and is similarly not yet resolved. Quality is based on the best four publications (Research Output) of each researcher in the group over the six years 2002-2007, on a full list of all Research Output from the group including honorary and emeritus professors, and on competitive grants received over the period. Impact is covered in the Context Statement of the EP. 6. Quality for each Research Group will be assessed on a scale of 1 (not important) to 5 (prestigious). 7. Impact is rated A (outstanding) to E (poor). 8. Research Groups which rate below 2 for Quality, or below D for Impact, will attract no funding to their university, though the two factors are separately aggregated for the University. The weighting of funding is stated to be linear with rating, but the gradient will be determined during 2007. 9. The Panels require access to the electronic versions of any of the Research Output within four working days. The Panels will (a) rank the outputs by things like journal impact factors, journal standing, etc, (b) assess citation counts, both in aggregate and by the percentage that fall in the top decile for the discipline, and (c) assess competitive grant income. 10. The RQF is based on a semi-centralized IT model (or semi-decentralized). In other words, the full-texts of the research outputs (publications) will be held in IRs in each university, while the RQF secretariat will run a repository with all the EPs and develop the citation counts independently of the universities (in conjunction with Thomson Scientific and possibly EndNote Web). The Australian Government will be approached for funds to universities to establish these IRs. ANALYSIS FOR OPEN ACCESS * The RQF will actually use citation metrics in the assessment, not just test them as a "shadow exercise" as in the next RAE. This will mean that the OA citation advantage will suddenly look very attractive to Australian universities, though it is a bit late to do anything about it five years into a six-year window. However, with 2014 in mind, there will be pressure to increase citations. * Every university will have to have an IR to hold the full-text of Research Outputs. About half already do, with EPrints and DSpace being the most popular software, with a few Fedora-based repositories and outsourced ProQuest hosts. There will be funding to establish repositories. * I expect a mad scramble in the smaller universities, with outsourcing and hosting solutions being very attractive. Money fixes everything. The ones that have been dithering will regret it. * All Research Output generated by all Research Groups will have to be in the IRs for the RQF. This may amount to 50% of the university research production over six years, or more or less depending on how research intensive it is.
There are two corollaries: (a) this is Mandate by Money, and (b) there will be frantic activity over 2007 to put in the backlog of 2002-2006 publications. * Since one does not know what Research Output will be needed in 2014, and only a general clue in 2007, 100% institutional mandates are likely to spring up all over the place, in the form of Mandate by Administration. What I mean by this is that the deposit of the paper will be integrated with the already present administrative annual requirement to report the publication to the Australian Government. * Although it is nowhere stated explicitly that I can see, I read between the lines that the RQF may be expecting to get access to the publisher's pdf. This means that it will have to be in the repository as "restricted access" in most cases or as a link to an OA source. There is no reason why the OA postprint cannot be there as "open access" as well, of course, and if a citation advantage is to be got, it will need to be. Please feel free to blog this or forward this to anyone you think may be interested. My apologies for cross-posting. Arthur Sale Professor of Computing (Research) University of Tasmania From linda.butler at ANU.EDU.AU Fri Nov 17 01:35:09 2006 From: linda.butler at ANU.EDU.AU (Linda Butler) Date: Fri, 17 Nov 2006 17:35:09 +1100 Subject: Australia's RQF (fwd) In-Reply-To: Message-ID: An HTML attachment was scrubbed... URL: From subbiah_a at YAHOO.COM Fri Nov 17 01:49:51 2006 From: subbiah_a at YAHOO.COM (Subbiah Arunachalam) Date: Fri, 17 Nov 2006 06:49:51 +0000 Subject: Australia's RQF (fwd) In-Reply-To: <6.2.1.2.2.20061117172516.04bba980@anumail.anu.edu.au> Message-ID: Thanks very much, Linda, for your clarification. In India the office of the Principal Scientific Adviser to the Cabinet is looking at ways to evaluate quality and impact of research which go far beyond mere publication and citation counts of research papers. The first phase of three independent studies is expected to be completed by 31 December 2006. Regards. Arun [Subbiah Arunachalam] --- Linda Butler wrote: --------------------------------- Many of Arthur Sale's points about the Australian RQF, particularly in relation to IRs and the way in which panels will access submitted publications, are accurate. However, his "definition" of quality and impact in the RQF context is seriously misleading. Yes, the terms are used in an unusual way, but his attempt to paraphrase the meaning is way off. The definitions contained in the official document are: - the quality of original research including its intrinsic merit and academic impact. Academic impact relates to the recognition of the originality of research by peers and its impact on the development of the same or related discipline areas within the community of peers; and - the impact or use of original research outside the peer community that will typically not be reported in traditional peer reviewed literature (that is, the extent to which research is successfully applied during the assessment period for the RQF). Broader impact relates to the recognition by qualified end users that methodologically sound and rigorous research has been successfully applied to achieve social, economic, environmental and/or cultural outcomes. Quality is NOT a solely metrics-based exercise. It is the peer assessment of 4 outputs per active researcher (as in the RAE), informed by quantitative indicators supplied to the panel (citations, competitive grants, ranked outputs - details of proposed measures are on the DEST website in the background papers).
Impact, the most difficult to assess, is judged from an "evidence-based statement of claims". Obviously, there is a lot of detail behind that statement - again, background papers are available on the DEST website. It will definitely not be judged in the way outlined below.
Linda Butler Research Evaluation and Policy Project Research School of Social Sciences The Australian National University ACT 0200 Australia Tel: 61 2 61252154 Fax: 61 2 61259767 http://repp.anu.edu.au From harnad at ECS.SOTON.AC.UK Fri Nov 17 06:45:16 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Fri, 17 Nov 2006 11:45:16 +0000 Subject: Australia's RQF In-Reply-To: <6.2.1.2.2.20061117172516.04bba980@anumail.anu.edu.au> Message-ID: The UK RAE is planning to scrap the time-consuming and costly panel re-review of already-peer-reviewed articles in favour of metrics because metrics have been shown to correlate highly with the RAE panel rankings anyway (although it is not yet decided what combination of metrics will be appropriate to each discipline). Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated online RAE CVs Linked to University Eprint Archives: Improving the UK Research Assessment Exercise whilst making it cheaper and easier. Ariadne 35 (April 2003). http://www.ariadne.ac.uk/issue35/harnad/ Shadbolt, N., Brody, T., Carr, L. and Harnad, S. (2006) The Open Research Web: A Preview of the Optimal and the Inevitable, in Jacobs, N., Eds. Open Access: Key Strategic, Technical and Economic Aspects, chapter 20. Chandos. http://eprints.ecs.soton.ac.uk/12453/ I trust that Australia's RQF is not going to mechanically recapitulate the many years that the RAE wasted of its researchers' time submitting for and performing panel re-review. RQF plans are probably just a bit out of phase right now, and Australia will catch up in time for its first RQF exercise or soon thereafter. By then the arbitrary constraint of submitting only 4 papers will also be mooted by Open Access submission of all research output self-archived in each institution's Institutional Repository. See remarks below. On Fri, 17 Nov 2006, Linda Butler wrote: > Many of Arthur Sale's points about > the Australian RQF, particularly in relation to IRs and the way in which panels > will access submitted publications, are accurate. However, his "definition" of > quality and impact in the RQF context is seriously misleading. Yes, the terms > are used in an unusual way, but his attempt to paraphrase the meaning is way > off. The definitions contained in the official document are: > > the quality of original research including its intrinsic merit and academic > impact. Academic impact relates to the recognition of the originality of > research by peers and its impact on the development of the same or related > discipline areas within the community of peers; That, presumably, is what journal peer review has already done for a researcher's published papers. Journals differ in their peer-review quality standards, but that too can be triangulated via metrics. The best of journals will have refereed their content by consulting the top experts in each subspecialty, worldwide, not an assembled panel of national representatives to the discipline from the UK or AUSTRALIA, re-reviewing all content in their discipline.
The fact that the RAE panels (having wasted the researchers' time and their own in re-reviewing already peer-reviewed publications) nevertheless come up with rankings that agree substantially with metrics -- with prior funding counts, regrettably, because those probably explicitly influenced their rankings, but also with citation counts, which they are explicitly forbidden to consult, hence showing that human judgment in skimming and ranking the 4 papers per researcher averages out to the same outcome as the human judgment involved in deciding what to cite -- is another indication that the panel review is superfluous in most of the disciplines tested so far. For some disciplines new combinations of metrics will no doubt have to be tested and validated, and that is partly the reason the next RAE will be a parallel panel/metric exercise, to cross-validate the metric rankings with the panel rankings. > the impact or use of original research outside the peer community that will > typically not be reported in traditional peer reviewed literature (that is, > the extent to which research is successfully applied during the assessment > period for the RQF). Sounds like another candidate metric... > Broader impact relates to the recognition by qualified end users that > methodologically sound and rigorous research has been successfully applied > to achieve social, economic, environmental and/or cultural outcomes. Sounds again like peer review, in the case of peer-reviewed publication. For unpublished research, other metrics (patents, downloads) are possible. There may be a few specialties in which human evaluation is the only option, but that tail should certainly not be allowed to wag the RQF dog: specific exceptions can simply be made for those specialties, until and unless a valid combination of metrics is found. > Quality is NOT a solely metrics-based exercise. It is the peer assessment of 4 > outputs per active researcher (as in the RAE), informed by quantitative > indicators supplied to the panel (citations, competitive grants, ranked outputs > - details of proposed measures are on the DEST website in the background > papers). Quality has already been peer-reviewed for peer-reviewed publications, hence panel re-review is not a recourse to metrics instead of human judgment; it is an exercise in (blunt) redundancy (in most cases). > Impact, the most difficult to assess, is judged from an "evidence-based > statement of claims". Obviously, there is a lot of detail behind that > statement - again, background papers are available on the DEST website. It > will definitely not be judged in the way outlined below. "Evidence-based statement of claims": Sounds like a tall order for a small panel of national peers hand-re-reviewing a set of mostly already peer-reviewed papers, and a time-consuming one. Let's hope the RQF will learn from the RAE's long, costly and wasteful history, rather than just repeating it. The growing body of Open Access scientometrics that will become available in coming years will make it possible for enterprising data-miners (possibly PGs of the same researchers that are wasting their research time submitting to and performing the panel reviews) to demonstrate prominently just how redundant the panel rankings really are.
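The cross-validation described above comes down to rank correlations between panel outcomes and candidate metrics. The sketch below is a minimal illustration of how such a correlation might be computed, assuming entirely hypothetical panel scores and metric values for eight departments in one discipline; it is not part of any actual RAE or RQF procedure.

```python
# A minimal sketch of the metrics-vs-panel cross-validation idea discussed
# above. All numbers are hypothetical; a real exercise would use actual
# panel scores and per-discipline metric data.
from scipy.stats import spearmanr

panel_score     = [5.0, 4.5, 4.0, 3.5, 3.0, 2.5, 2.0, 1.0]  # hypothetical panel ratings
citation_counts = [920, 750, 610, 580, 300, 260, 150, 40]   # hypothetical citations
prior_funding   = [8.1, 7.9, 5.2, 4.8, 3.9, 2.0, 1.5, 0.3]  # hypothetical GBP millions

# Spearman rank correlation between the panel ranking and each candidate metric.
for name, metric in [("citations", citation_counts), ("prior funding", prior_funding)]:
    rho, p = spearmanr(panel_score, metric)
    print(f"Spearman rho, panel vs {name}: {rho:.2f} (p = {p:.3f})")
```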
Pertinent Prior American Scientist Open Access Forum Topic Threads: UK "RAE" Evaluations (began Nov 2000) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1018 Scientometric OAI Search Engines (began Aug 2002) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2238 Australia stirs on metrics (June 2006) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5417.html Big Brother and Digitometrics (began May 2001) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1298 UK Research Assessment Exercise (RAE) review (began Oct 2002) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2326 Need for systematic scientometric analyses of open-access data (began Dec 2002) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2522 Potential Metric Abuses (and their Potential Metric Antidotes) (began Jan 2003) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2643 Future UK RAEs to be Metrics-Based (began Mar 2006) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#5251 Let 1000 RAE Metric Flowers Bloom: Avoid Matthew Effect as Self-Fulfilling Prophecy (Jun 2006) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5418.html Stevan Harnad
From harnad at ECS.SOTON.AC.UK Fri Nov 17 07:34:29 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Fri, 17 Nov 2006 12:34:29 +0000 Subject: Australia's RQF In-Reply-To: <00af01c70a3e$9f832290$c85d7d9e@lboro.ac.uk> Message-ID: On Fri, 17 Nov 2006, C.Oppenheim wrote: > Whilst it is true the UK is dropping peer-assessed RAE in favour of metrics, > I doubt the reasoning was that it was convinced by the correlation between > RAE scores and metrics. I think the reason was to reduce the costs and > burden of the exercise. I am certain Charles is right. The panel re-reviewing was costly and burdensome, and it was not scientometric sophistication and prescience that drove the very sensible decision to scrap the panels for metrics, but economics and ergonomics. Evidence that it was not scientometric sagesse is that the panel-scrappers were ready to jump headlong into the use of prior-funding metrics alone (which in some fields correlate almost 100% with the panel rankings). That would have been foolish in the extreme, generating a whopping Matthew Effect (prior funding can be and is explicitly counted by the panels, whereas citation-counting has been forbidden!), and reducing the UK Dual Funding System -- (1) RCUK-based competitive proposals plus (2) RAE-based top-sliced performance-based funding -- to just the one form of funding (1). And it certainly would not have even been possible in all disciplines. Fortunately, UUK (and others) objected, and it will not be uni-metric uni-funding: Open Access will allow 1000 metric flowers to bloom, and rich discipline-specific bouquets will be picked through objective testing and validation. "Metrics" are Plural, Not Singular: Valid Objections From UUK About RAE http://openaccess.eprints.org/index.php?/archives/137-guid.html I have no doubt that (with the help of quick-thinkers like Arthur Sale), Australia too will get into phase with these present and future developments.
Stevan Harnad > Charles > > Professor Charles Oppenheim > Head > Department of Information Science > Loughborough University > Loughborough > Leics LE11 3TU > > Tel 01509-223065 > Fax 01509-223053 > e mail C.Oppenheim at lboro.ac.uk
This may amount to 50% of the university > > research > > production over six years, or more or less depending on how > > research > > intensive it is. There are two corollaries: (a) this is Mandate by > > Money, > > and (b) there will be frantic activity over 2007 to put in the > > backlog of > > 2002-2006 publications. > > > > * Since one does not know what Research Output will be > > needed in > > 2014, and only a general clue in 2007, 100% institutional mandates > > are > > likely to spring up all over the place, in the form of Mandate by > > Administration. What I mean by this is that the deposition of the > > paper will > > be integrated with the already present administrative annual > > requirement to > > report the publication to the Australian Government. > > > > * Although it is nowhere stated explicitly that I can see, I > > read > > between the lines that the RQF may be expecting to get access to > > the > > publisher's pdf. This means that it will have to be in the > > repository as > > "restricted access" in most cases or as a link to an OA source. > > There is no > > reason why the OA postprint cannot be there as "open access" as > > well, of > > course, and if a citation advantage is to be got, it will need to > > be. > > > > Please feel free to blog this or forward this to anyone you think > > may be > > interested. My apologies for cross-posting. > > > > Arthur Sale > > Professor of Computing (Research) > > University of Tasmania > > > > Linda Butler > > Research Evaluation and Policy Project > > Research School of Social Sciences > > The Australian National University > > ACT 0200 Australia > > Tel: 61 2 61252154 Fax: 61 2 61259767 > > http://repp.anu.edu.au > > > > > > > > > From isidro at CINDOC.CSIC.ES Fri Nov 17 09:29:30 2006 From: isidro at CINDOC.CSIC.ES (Isidro F. Aguillo) Date: Fri, 17 Nov 2006 15:29:30 +0100 Subject: New paper on OA advantage/ ISSI 2007 Conference Message-ID: The effect of 'Open access' and 'preprints' upon citation impact is an interesting issue that received much attention in messages submitted to the Sigmetrics list server. Recently I finished a paper on this issue. I decided to deposit it in arxiv.org. Abstract and full text (in PDF) are now available at:: http://arxiv.org/abs/cs.DL/0611060 Title and abstract are included at the bottom of this message. Please allow me to address a second point. The organisers of the 11th International Conference on Scientometrics and Informetrics (ISSI 2007, to be held in Madrid, 25-27 June 2007), Isabel Gomez, Maria Bordons and Isidro Aguillo (at CINDOC, CSIC, Madrid) and myself, as program chair, are considering to dedicate in this conference a session to information-scientific aspects of 'Open Access' and 'Usage/downloads' from digital libraries. We would certainly welcome good papers on these issues. If you are interested in contributing to such a session, please let me know, and submit a full paper or research-in-progress paper to the conference. See the Submission of papers page at the conference website (http://issi2007.cindoc.csic.es/ ) for details about how to submit a paper. Of course, papers on other topics covered by the conference are most welcome too! Deadline for submission is November 30, 2006. With kind regards, Henk F. Moed Centre for Science and Technology Studies (CWTS), Leiden University, The Netherlands Email: moed at cwts.leidenuniv.nl *cs.DL/0611060* *The effect of 'Open Access' upon citation impact: An analysis of ArXiv's Condensed Matter Section* Authors: *Henk F. 
Moed* Comments: Version 13 November 2006. 16 pages, 6 figures, 2 tables Subj-class: Digital Libraries; Information Retrieval

This article statistically analyses how the citation impact of articles deposited in the Condensed Matter section of the preprint server ArXiv (hosted by Cornell University), and subsequently published in a scientific journal, compares to that of articles in the same journal that were not deposited in that archive. Its principal aim is to further illustrate and roughly estimate the effect of two factors, 'early view' and 'quality bias', upon differences in citation impact between these two sets of papers, using citation data from Thomson Scientific's Web of Science. It presents estimates for a number of journals in the field of condensed matter physics. In order to discriminate between an 'open access' effect and an early view effect, longitudinal citation data was analysed covering a time period as long as 7 years. Quality bias was measured by calculating ArXiv citation impact differentials at the level of individual authors publishing in a journal, taking into account co-authorship. The analysis provided evidence of a strong quality bias and early view effect. Correcting for these effects, there is, in a sample of 6 condensed matter physics journals studied in detail, no sign of a general 'open access advantage' of papers deposited in ArXiv. The study does provide evidence that ArXiv accelerates citation, due to the fact that ArXiv makes papers available earlier rather than that it makes papers freely available.

--
***************************************
Isidro F. Aguillo
isidro at cindoc.csic.es
Ph:(+34) 91-5635482 ext. 313
Cybermetrics Research Group
CINDOC-CSIC
Joaquin Costa, 22
28002 Madrid. SPAIN
http://www.webometrics.info
http://www.cindoc.csic.es/cybermetrics
http://internetlab.cindoc.csic.es
****************************************

From linda.butler at ANU.EDU.AU Fri Nov 17 18:11:31 2006
From: linda.butler at ANU.EDU.AU (Linda Butler)
Date: Sat, 18 Nov 2006 10:11:31 +1100
Subject: Australia's RQF
In-Reply-To:
Message-ID:

I think it is important to take account of what "phase", to use Stevan's term, each country is at.

The UK is coming off several cycles of a traditional peer-review RAE, and is rightly questioning the cost of continuing with such an intensive process. It is hard to see the justification for continuing with the same system.

HOWEVER, the UK does have the experience of several iterations of the RAE, and if the move to metrics throws up any anomalies, they will be relatively easy to detect.

In contrast, Australia has had around 12 years of funding on the basis of a blunt formula that has little (I'm being generous here) to do with the quality of research. We don't have the UK's extensive knowledge set on the relative strengths and weaknesses of our departments.

Some metrics may correlate well in some disciplines (maybe even many), but the correlation is rarely perfect. 'Quality' is a complex notion, and I don't share others' confidence that we can do away with the peer review element entirely.
However, that does not mean the time-consuming assessment of a huge bundle of research outputs. What it does mean is that you still need a panel of experts to examine and interpret the metrics. For some panels, it might only take 5 minutes - for example, it's hard to see a bibliometric analysis of astronomy throwing up any surprises. But for others there might need to be considerably more effort. I'm currently working on an analysis of the 2002 RAE results in Political Science. If anyone can come up with any metric that would give War Studies at Kings College London a 5* rating I would be genuinely interested to hear. Australia is not moving "holus-bolus" to a UK RAE-clone. We can't afford it. We need metrics to help lighten the load on panels. But, at least for the first exercise, we need more than that. Add to that the fact that Australia is the first to systematically attempt to assess the "impact" (i.e. outside academia) of its research, and what is being proposed is hugely innovative. Lots of research councils, funding bodies, and others around the world are keen to see how this pans out - that is what they are increasingly seeking to do. And in Australia it is essential. There is a strong belief that merely demonstrating the quality of our research will not bring extra funds from government, as it has done in the UK. Rather what our government is looking for is a demonstration of the impact of that research in the wider community. It happens in spades - it's just that researchers/universities/etc have not been particularly successful at demonstrating the extent of it. Hopefully the RQF may go some way to addressing that and we may eventually get more research funding - we desperately need it. In the meantime, much work is being done in Australia to develop new metrics in those disciplines where the standard ones are not appropriate, and we have the strong support of researchers in these disciplines. So watch this space - Australia is not behind the game - it is ahead of it. Doesn't harm to be a bit provocative!!! Linda Butler Research Evaluation and Policy Project ANU On 17/11/2006, at 11:34 PM, Stevan Harnad wrote: > Adminstrative info for SIGMETRICS (for example unsubscribe): > http://web.utk.edu/~gwhitney/sigmetrics.html > > On Fri, 17 Nov 2006, C.Oppenheim wrote: > >> Whilst it is true the UK is dropping peer-assessed RAE in favour >> of metrics, >> I doubt the reasoning was that it was convinced by the correlation >> between >> RAE scores and metrics. I think the reason was to reduce the >> costs and >> burden of the exercise. > > I am certain Charles is right. The panel re-reviewing was costly and > burdensome, and it was not scientometric sophistication and prescience > that drove the very sensible decision to scrap the panels for metrics, > but economics and ergonomics. Evidence that it was not scientometric > sagesse is that the panel-scrappers were ready to jump headlong into > the use of prior-funding metrics alone (which in some fields correlate > almost 100% with the panel rankings). > > That would have been foolish in the extreme, generating a whopping > Matthew > Effect (prior funding can be and is explicitly counted by the panels, > whereas citation-counting has been forbidden!), and reducing the UK > Dual Funding System -- (1) RCUK-based competitive proposals plus (2) > RAE-based top-sliced performance-based funding -- to just the one form > of funding (1). And it certainly would not have even been possible > in all disciplines. 
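(The "correlate almost 100%" claim is the kind of thing that can be checked directly once panel grades and candidate metrics sit side by side: a rank correlation, Spearman's rho, computed across the departments in a discipline. The sketch below is only an illustration; the eight departments, funding figures and citation totals are invented, and numpy/scipy are assumed to be available.)

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical data for eight departments in one discipline.
    # panel_rank uses 1 for the department the panel ranked highest.
    panel_rank    = np.array([1, 2, 3, 4, 5, 6, 7, 8])
    prior_funding = np.array([9.2, 7.8, 8.1, 5.0, 4.4, 3.9, 2.1, 1.0])  # invented, in millions
    citations     = np.array([410, 380, 290, 300, 210, 150, 160, 90])   # invented totals

    # Spearman's rho compares rank orders, so the units of the metrics do not matter.
    # Because rank 1 is "best", a metric that tracks the panel correlates negatively.
    rho_funding, p_funding = spearmanr(panel_rank, prior_funding)
    rho_cites, p_cites = spearmanr(panel_rank, citations)

    print(f"panel vs prior funding: rho = {rho_funding:.2f}, p = {p_funding:.3f}")
    print(f"panel vs citations:     rho = {rho_cites:.2f}, p = {p_cites:.3f}")

(Run per discipline, a check of this sort shows which metrics do and do not reproduce a panel ranking; it is not a description of the procedure HEFCE or DEST actually uses.)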
> > Fortunately, UUK (and others) objected, and it will not be uni-metric > uni-funding: Open Access will allow 1000 metric flowers to bloom, > and rich discipline-specific bouquets will be picked through objective > testing and validation. > > "Metrics" are Plural, Not Singular: Valid Objections From UUK > About RAE" > http://openaccess.eprints.org/index.php?/archives/137-guid.html > > I have no doubt that (with the help of quick-thinkers like Arthur > Sale), Australia too will get into phase with these present and future > developments. > > Stevan Harnad > >> Charles >> >> Professor Charles Oppenheim >> Head >> Department of Information Science >> Loughborough University >> Loughborough >> Leics LE11 3TU >> >> Tel 01509-223065 >> Fax 01509-223053 >> e mail C.Oppenheim at lboro.ac.uk >> ----- Original Message ----- >> From: "Stevan Harnad" >> To: "ASIS&T Special Interest Group on Metrics" >> >> Cc: "AmSci Forum" >> Sent: Friday, November 17, 2006 11:45 AM >> Subject: Re: Australia's RQF >> >> >> The UK RAE is planning to scrap the time-consuming and costly panel >> re-review of >> already-peer-reviewed articles in favour of metrics because >> metrics have >> been >> shown to correlate highly with the RAE panel rankings anyway >> (although it is >> not >> yet decided what combination of metrics will be appropriate to each >> discipline). >> >> Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated >> online RAE CVs Linked to University Eprint Archives: Improving >> the UK Research Assessment Exercise whilst making it cheaper and >> easier. Ariadne 35 (April 2003). >> http://www.ariadne.ac.uk/issue35/harnad/ >> >> Shadbolt, N., Brody, T., Carr, L. and Harnad, S. (2006) The Open >> Research Web: A Preview of the Optimal and the Inevitable, in >> Jacobs, >> N., Eds. Open Access: Key Strategic, Technical and Economic >> Aspects, >> chapter 20. Chandos. http://eprints.ecs.soton.ac.uk/12453/ >> >> I trust that Australia's RQF is not going to mechanically >> recapitulate >> the many years that the RAE wasted of its researchers' time >> submitting for >> and performing panel re-review. RQF plans are probably just a bit >> out of >> phase >> right now, and Australia will catch up in time for its first RQF >> exercise >> or soon thereater. By then the arbitrary constraint of submitting >> only >> 4 papers will also be mooted by Open Access submission of all >> research >> output self-archived in each instution's Institutional Repository. >> >> See remarks below. >> >> On Fri, 17 Nov 2006, Linda Butler wrote: >> >>> Many of Arthur Sale's points about >>> the Australian RQF, particularly in relation to IRs and the way >>> in which >>> panels >>> will access submitted publications, are accurate. However, his >>> "definition" of >>> quality and impact in the RQF context is seriously misleading. >>> Yes, the >>> terms >>> are used in an unusual way, but his attempt to paraphrase the >>> meaning is >>> way >>> off. The definitions contained in the official document are: >>> >>> the quality of original research including its intrinsic merit and >>> academic >>> impact. Academic impact relates to the recognition of the >>> originality of >>> research by peers and its impact on the development of the same >>> or related >>> discipline areas within the community of peers; >> >> That, presumably, is what journal peer review has already done for a >> researcher's published papers. Journals differ in their peer-review >> quality standards, but that too can be triangulated via metrics. 
The >> best of journals will have refereed their content by consulting the >> top experts in each subspecialty, worldwide, not an assembled >> panel of >> national representatives to the discipline from the UK or AUSTRALIA, >> re-reviewing all content in their discipline. >> >> The fact that the RAE panels (having wasted the researchers' time >> and their own in re-reviewing already peer-reviewed publications) >> nevertheless come up rankings that agree substantially with >> metrics -- >> with prior funding counts, regrettably, because those probably >> explicitly >> influenced their rankings, but also with citation counts, which they >> are explicitly forbidden to consult, hence showing that human >> judgment >> in skimming and ranking the 4 papers per researcher averages out >> to the >> same outcome as the human judgment involved in deciding what to >> cite -- >> is another indication that the panel review is superfluous in most of >> the disciplines tested so far. For some disciplines new combinations >> of metrics will no doubt have to be tested and validated, and that is >> partly the reason the next RAE will be a parallel panel/metric >> exercise, >> to cross-validate the metric rankings with the panel rankings. >> >>> the impact or use of original research outside the peer community >>> that >>> will >>> typically not be reported in traditional peer reviewed literature >>> (that >>> is, >>> the extent to which research is successfully applied during the >>> assessment >>> period for the RQF). >> >> Sounds like another candidate metric... >> >>> Broader impact relates to the recognition by qualified end users >>> that >>> methodologically sound and rigorous research has been >>> successfully applied >>> to achieve social, economic, environmental and/or cultural outcomes. >> >> Sounds again like peer review, in the case of peer-reviewed >> publication. For >> unpublished research, other metrics (patents, downloads) are >> possible. There >> may be a few specialties in which human evaluation is the only >> option, but >> that tail should certainly not be allowed to wag the RQF dog: >> Specific >> exceptions can simply be made for those specialities, until and >> unless a >> valid combination of metrics is found. >> >>> Quality is NOT a solely metrics-based exercise.It is the peer >>> assessment >>> of 4 >>> outputs per active researcher (as in the RAE), informed by >>> quantitative >>> indicators supplied to the panel (citations, competitive grants, >>> ranked >>> outputs >>> - details of proposed measures are on the DEST website in the >>> background >>> papers). >> >> Quality has already been peer-reviewed for peer-reviewed >> publications, hence >> panel re-review is not recourse to metrics instead of human >> judgment, it is >> an >> exercise in (blunt) redundancy (in most cases). >> >>> Impact, the most difficult to assess, is judged from an "evidence- >>> based >>> statement of claims". Obviously, there is a lot of detail behind >>> that >>> statement - again, background papers are available on the DEST >>> website. It >>> will definitely not be judged in the way outlined below. >> >> "Evidence-based statement of claims": Sounds like a tall order for a >> small panel of national peers hand-re-reviewing a set of mostly >> already >> peer-reviewed papers, and a time-consuming one. Let's hope the RQF >> will >> learn from the RAE's long, costly and wasteful history, rather >> than just >> repeating it. 
The growing body of Open Access scientometrics that >> will >> become available in coming years will make it possible for >> enterprising >> data-miners (possibly PGs of the same researchers that are wasting >> their research time submitting to and performing the panel >> reviews) to >> demonstrate prominently just how redundant the panel rankings >> really are. >> >> Pertinent Prior American Scientist Open Access Forum Topic Threads: >> >> UK "RAE" Evaluations (began Nov 2000) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1018 >> >> Scientometric OAI Search Engines (began Aug 2002) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2238 >> >> Australia stirs on metrics (June 2006) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5417.html >> >> Big Brother and Digitometrics (began May 2001) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1298 >> >> UK Research Assessment Exercise (RAE) review (began Oct 2002) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2326 >> >> Need for systematic scientometric analyses of open-access >> data (began Dec 2002) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2522 >> >> Potential Metric Abuses (and their Potential Metric >> Antidotes) (began Jan 2003) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2643 >> >> Future UK RAEs to be Metrics-Based (began Mar 2006) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#5251 >> >> Let 1000 RAE Metric Flowers Bloom: Avoid Matthew Effect as >> Self-Fulfilling Prophecy (Jun 2006) >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5418.html >> >> Stevan Harnad >> >>> Linda Butler >>> Research Evaluation and Policy Project >>> The Australian National University >>> >>> At 03:34 PM 17/11/2006, you wrote: >>> Adminstrative info for SIGMETRICS (for example unsubscribe): >>> http://web.utk.edu/~gwhitney/sigmetrics.html >>> >>> ---------- Forwarded message ---------- >>> Date: Fri, 17 Nov 2006 14:44:39 +1100 >>> From: Arthur Sale >>> To: AMERICAN-SCIENTIST-OPEN-ACCESS- >>> FORUM at LISTSERVER.SIGMAXI.ORG >>> >>> The Australian Government has released a definitive, if >>> incomplete, >>> description of Australia's Research Quality Framework (RQF) >>> which >>> is our >>> equivalent of the UK's RAE. If familiar with the RAE, you will >>> recognize the >>> family resemblance. I extract the essentials of the RQF for an >>> international >>> readership, and analyze some of the consequences likely to >>> flow >>> from it. To >>> see the documentation, see >>> >>> http://www.dest.gov.au/sectors/research_sector/ >>> policies_issues_reviews/key_i >>> ssues/research_quality_framework/rqf_development_2006.htm. >>> >>> ESSENTIAL POINTS >>> >>> 1. The first RQF assessment will be based on submissions by >>> the 38 >>> Australian universities by 30 April 2008. Funding based on the >>> assessment >>> will flow in calendar year 2009. Six years will elapse >>> before the >>> next >>> assessment (ie 2014), but there is provision to shorten this. >>> >>> 2. The Unit of Assessment is the Research Group. Research >>> Groups will >>> be defined by up to three RFCD four-digit codes (to allow for >>> multi-disciplinary groups). The RFCD classification is >>> uniquely >>> Australian, >>> and for example there are six four-digit codes in the field >>> of ICT. >>> Engineering has more but for example Civil Engineering is >>> one. 
If >>> you are >>> interested in the codes see >>> http://www.research.utas.edu.au/publications/docs/ >>> 14_rfcd.doc, the >>> four >>> digit codes are the sub-headings. >>> >>> 3. Each Research Group will be allocated to and assessed by >>> one of 13 >>> Panels. The Panel is determined by the primary RFCD code. Thus >>> Mathematics, >>> Computing and Information Technology is Panel 4. >>> >>> 4. Each University will submit an Evidence Portfolio (EP) for >>> each >>> identified Research Group. There is provision for cross- >>> university >>> Research >>> Groups. >>> >>> 5. The ratings will be based on Quality and Impact separately. >>> These >>> words have peculiar (ie not common-usage) meanings. >>> Approximately, >>> Quality >>> is a bag of quantifiable metrics, and Impact is all the >>> soft things >>> like >>> Fellowships of Academies, Honors, journal associate >>> editorships, >>> etc. The >>> relative importance of Quality and Impact will vary by >>> Panel and is >>> similarly not yet resolved. Quality is based on the best four >>> publications >>> (Research Output) of each researcher in the group over the six >>> years >>> 2002-2007, on a full list of all Research Output from the >>> group >>> including >>> honorary and emeritus professors, and on competitive grants >>> received over >>> the period. Impact is covered in the Context Statement of >>> the EP >>> >>> 6. Impact for each Research Group will be assessed on a scale >>> of 1 (not >>> important) to 5 (prestigious).. >>> >>> 7. Impact is rated A (outstanding) to E (poor). >>> >>> 8. Research Groups which rate below 2 for Quality, or below D >>> for >>> Impact, will attract no funding to their university, though >>> the two >>> factors >>> are separately aggregated for the University. The weighting of >>> funding is >>> stated to be linear with rating, but the gradient will be >>> determined during >>> 2007. >>> >>> 9. The Panels require access to the electronic versions of any >>> of the >>> Research Output within four working days. The Panels will >>> (a) rank >>> the >>> outputs by things like journal impact factors, journal >>> standing, >>> etc, (b) >>> assess citation counts, both in aggregate and by the >>> percentage >>> that fall in >>> the top decile for the discipline, and (c) competitive grant >>> income. >>> >>> 10. The RQF is based on a semi-centralized IT model (or >>> semi-decentralized). In other words, the full-texts of the >>> research >>> outputs >>> (publications) will be held in IRs in each university, >>> while the >>> RQF >>> secretariat will run a repository with all the EPs and >>> develop the >>> citation >>> counts independent of the universities (in conjunction with >>> Thomson >>> Scientific and possibly EndNote Web). The Australian >>> Government >>> will be >>> approached for funds to universities to establish these IRs. >>> >>> ANALYSIS FOR OPEN ACCESS >>> >>> * The RQF will actually use citation metrics in the >>> assessment, not >>> just test them as a "shadow exercise" as in the next RAE. >>> This will >>> mean >>> that the OA citation advantage will suddenly look very >>> attractive >>> to >>> Australian universities, though it is a bit late to do >>> anything >>> about it >>> five years into a six-year window. However, with 2014 in mind, >>> there will be >>> pressure to increase citations. >>> >>> * Every university will have to have an IR to hold the >>> full-text of >>> Research Outputs. 
About half already do, with EPrints and >>> DSpace >>> being the >>> most popular software with a few Fedora-based repositories and >>> outsourced >>> ProQuest hosts. There will be funding to establish >>> repositories. >>> >>> * I expect a mad scramble in the smaller universities, with >>> outsourcing and hosting solutions being very attractive. Money >>> fixes >>> everything. The ones that have been dithering will regret it. >>> >>> * All Research Output generated by all Research Groups will >>> have to >>> be in the IRs for the RQF. This may amount to 50% of the >>> university >>> research >>> production over six years, or more or less depending on how >>> research >>> intensive it is. There are two corollaries: (a) this is >>> Mandate by >>> Money, >>> and (b) there will be frantic activity over 2007 to put in the >>> backlog of >>> 2002-2006 publications. >>> >>> * Since one does not know what Research Output will be >>> needed in >>> 2014, and only a general clue in 2007, 100% institutional >>> mandates >>> are >>> likely to spring up all over the place, in the form of >>> Mandate by >>> Administration. What I mean by this is that the deposition >>> of the >>> paper will >>> be integrated with the already present administrative annual >>> requirement to >>> report the publication to the Australian Government. >>> >>> * Although it is nowhere stated explicitly that I can see, I >>> read >>> between the lines that the RQF may be expecting to get >>> access to >>> the >>> publisher's pdf. This means that it will have to be in the >>> repository as >>> "restricted access" in most cases or as a link to an OA >>> source. >>> There is no >>> reason why the OA postprint cannot be there as "open >>> access" as >>> well, of >>> course, and if a citation advantage is to be got, it will >>> need to >>> be. >>> >>> Please feel free to blog this or forward this to anyone you >>> think >>> may be >>> interested. My apologies for cross-posting. >>> >>> Arthur Sale >>> Professor of Computing (Research) >>> University of Tasmania >>> >>> Linda Butler >>> Research Evaluation and Policy Project >>> Research School of Social Sciences >>> The Australian National University >>> ACT 0200 Australia >>> Tel: 61 2 61252154 Fax: 61 2 61259767 >>> http://repp.anu.edu.au >>> >>> >>> >> >> >> From harnad at ECS.SOTON.AC.UK Fri Nov 17 20:32:17 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Sat, 18 Nov 2006 01:32:17 +0000 Subject: Australia's RQF In-Reply-To: <061D7162-6E2B-4ED8-95A3-2619A2C90EB4@anu.edu.au> Message-ID: Linda Butler's follow-up posting is very sensible: metrics, but under the scrutiny, for the time being, of panels. Australia is indeed being both independent and innovative here. (And of course everything Linda describes is perfectly congruent with what Arthur Sale described. The niggles were about the weasel-words "quality" and "impact" -- both vague, and slippery, with nothing much hanging on the distinction and examples, as formulated!) Stevan Harnad On Sat, 18 Nov 2006, Linda Butler wrote: > I think it is important to take account of what "phase", to use > Stevan's term, each country is at. > > The UK is coming off several cycles of a traditional peer-review RAE, > and is rightly questioning the cost of continuing with such an > intensive process. It is hard to see the justification for > continuing with the same system. 
> > HOWEVER, the UK does have the experience of several iterations of the > RAE, and if the move to metrics throws up any anomalies, they will be > relatively easy to detect. > > In contrast, Australia has had around 12 years of funding on the > basis of a blunt formula that has little (I'm being generous here) to > do with the quality of research. We don't have the UK's extensive > knowledge set on the relative strengths and weaknesses of our > departments. > > Some metrics may correlate well in some disciplines (maybe even > many), but the correlation is rarely perfect. 'Quality' is a complex > notion, and I don't share others' confidence that we can do away with > the peer review element entirely. However, that does not mean the > time-consuming assessment of a huge bundle of research outputs. What > it does mean is that you still need a panel of experts to examine and > interpret the metrics. > > For some panels, it might only take 5 minutes - for example, it's > hard to see a bibliometric analysis of astronomy throwing up any > surprises. > > But for others there might need to be considerably more effort. I'm > currently working on an analysis of the 2002 RAE results in Political > Science. If anyone can come up with any metric that would give War > Studies at Kings College London a 5* rating I would be genuinely > interested to hear. > > Australia is not moving "holus-bolus" to a UK RAE-clone. We can't > afford it. We need metrics to help lighten the load on panels. But, > at least for the first exercise, we need more than that. > > Add to that the fact that Australia is the first to systematically > attempt to assess the "impact" (i.e. outside academia) of its > research, and what is being proposed is hugely innovative. Lots of > research councils, funding bodies, and others around the world are > keen to see how this pans out - that is what they are increasingly > seeking to do. And in Australia it is essential. There is a strong > belief that merely demonstrating the quality of our research will not > bring extra funds from government, as it has done in the UK. Rather > what our government is looking for is a demonstration of the impact > of that research in the wider community. It happens in spades - it's > just that researchers/universities/etc have not been particularly > successful at demonstrating the extent of it. Hopefully the RQF may > go some way to addressing that and we may eventually get more > research funding - we desperately need it. > > In the meantime, much work is being done in Australia to develop new > metrics in those disciplines where the standard ones are not > appropriate, and we have the strong support of researchers in these > disciplines. So watch this space - Australia is not behind the game > - it is ahead of it. Doesn't harm to be a bit provocative!!! > > Linda Butler > Research Evaluation and Policy Project > ANU > > > On 17/11/2006, at 11:34 PM, Stevan Harnad wrote: > > > Adminstrative info for SIGMETRICS (for example unsubscribe): > > http://web.utk.edu/~gwhitney/sigmetrics.html > > > > On Fri, 17 Nov 2006, C.Oppenheim wrote: > > > >> Whilst it is true the UK is dropping peer-assessed RAE in favour > >> of metrics, > >> I doubt the reasoning was that it was convinced by the correlation > >> between > >> RAE scores and metrics. I think the reason was to reduce the > >> costs and > >> burden of the exercise. > > > > I am certain Charles is right. 
The panel re-reviewing was costly and > > burdensome, and it was not scientometric sophistication and prescience > > that drove the very sensible decision to scrap the panels for metrics, > > but economics and ergonomics. Evidence that it was not scientometric > > sagesse is that the panel-scrappers were ready to jump headlong into > > the use of prior-funding metrics alone (which in some fields correlate > > almost 100% with the panel rankings). > > > > That would have been foolish in the extreme, generating a whopping > > Matthew > > Effect (prior funding can be and is explicitly counted by the panels, > > whereas citation-counting has been forbidden!), and reducing the UK > > Dual Funding System -- (1) RCUK-based competitive proposals plus (2) > > RAE-based top-sliced performance-based funding -- to just the one form > > of funding (1). And it certainly would not have even been possible > > in all disciplines. > > > > Fortunately, UUK (and others) objected, and it will not be uni-metric > > uni-funding: Open Access will allow 1000 metric flowers to bloom, > > and rich discipline-specific bouquets will be picked through objective > > testing and validation. > > > > "Metrics" are Plural, Not Singular: Valid Objections From UUK > > About RAE" > > http://openaccess.eprints.org/index.php?/archives/137-guid.html > > > > I have no doubt that (with the help of quick-thinkers like Arthur > > Sale), Australia too will get into phase with these present and future > > developments. > > > > Stevan Harnad > > > >> Charles > >> > >> Professor Charles Oppenheim > >> Head > >> Department of Information Science > >> Loughborough University > >> Loughborough > >> Leics LE11 3TU > >> > >> Tel 01509-223065 > >> Fax 01509-223053 > >> e mail C.Oppenheim at lboro.ac.uk > >> ----- Original Message ----- > >> From: "Stevan Harnad" > >> To: "ASIS&T Special Interest Group on Metrics" > >> > >> Cc: "AmSci Forum" > >> Sent: Friday, November 17, 2006 11:45 AM > >> Subject: Re: Australia's RQF > >> > >> > >> The UK RAE is planning to scrap the time-consuming and costly panel > >> re-review of > >> already-peer-reviewed articles in favour of metrics because > >> metrics have > >> been > >> shown to correlate highly with the RAE panel rankings anyway > >> (although it is > >> not > >> yet decided what combination of metrics will be appropriate to each > >> discipline). > >> > >> Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated > >> online RAE CVs Linked to University Eprint Archives: Improving > >> the UK Research Assessment Exercise whilst making it cheaper and > >> easier. Ariadne 35 (April 2003). > >> http://www.ariadne.ac.uk/issue35/harnad/ > >> > >> Shadbolt, N., Brody, T., Carr, L. and Harnad, S. (2006) The Open > >> Research Web: A Preview of the Optimal and the Inevitable, in > >> Jacobs, > >> N., Eds. Open Access: Key Strategic, Technical and Economic > >> Aspects, > >> chapter 20. Chandos. http://eprints.ecs.soton.ac.uk/12453/ > >> > >> I trust that Australia's RQF is not going to mechanically > >> recapitulate > >> the many years that the RAE wasted of its researchers' time > >> submitting for > >> and performing panel re-review. RQF plans are probably just a bit > >> out of > >> phase > >> right now, and Australia will catch up in time for its first RQF > >> exercise > >> or soon thereater. 
By then the arbitrary constraint of submitting > >> only > >> 4 papers will also be mooted by Open Access submission of all > >> research > >> output self-archived in each instution's Institutional Repository. > >> > >> See remarks below. > >> > >> On Fri, 17 Nov 2006, Linda Butler wrote: > >> > >>> Many of Arthur Sale's points about > >>> the Australian RQF, particularly in relation to IRs and the way > >>> in which > >>> panels > >>> will access submitted publications, are accurate. However, his > >>> "definition" of > >>> quality and impact in the RQF context is seriously misleading. > >>> Yes, the > >>> terms > >>> are used in an unusual way, but his attempt to paraphrase the > >>> meaning is > >>> way > >>> off. The definitions contained in the official document are: > >>> > >>> the quality of original research including its intrinsic merit and > >>> academic > >>> impact. Academic impact relates to the recognition of the > >>> originality of > >>> research by peers and its impact on the development of the same > >>> or related > >>> discipline areas within the community of peers; > >> > >> That, presumably, is what journal peer review has already done for a > >> researcher's published papers. Journals differ in their peer-review > >> quality standards, but that too can be triangulated via metrics. The > >> best of journals will have refereed their content by consulting the > >> top experts in each subspecialty, worldwide, not an assembled > >> panel of > >> national representatives to the discipline from the UK or AUSTRALIA, > >> re-reviewing all content in their discipline. > >> > >> The fact that the RAE panels (having wasted the researchers' time > >> and their own in re-reviewing already peer-reviewed publications) > >> nevertheless come up rankings that agree substantially with > >> metrics -- > >> with prior funding counts, regrettably, because those probably > >> explicitly > >> influenced their rankings, but also with citation counts, which they > >> are explicitly forbidden to consult, hence showing that human > >> judgment > >> in skimming and ranking the 4 papers per researcher averages out > >> to the > >> same outcome as the human judgment involved in deciding what to > >> cite -- > >> is another indication that the panel review is superfluous in most of > >> the disciplines tested so far. For some disciplines new combinations > >> of metrics will no doubt have to be tested and validated, and that is > >> partly the reason the next RAE will be a parallel panel/metric > >> exercise, > >> to cross-validate the metric rankings with the panel rankings. > >> > >>> the impact or use of original research outside the peer community > >>> that > >>> will > >>> typically not be reported in traditional peer reviewed literature > >>> (that > >>> is, > >>> the extent to which research is successfully applied during the > >>> assessment > >>> period for the RQF). > >> > >> Sounds like another candidate metric... > >> > >>> Broader impact relates to the recognition by qualified end users > >>> that > >>> methodologically sound and rigorous research has been > >>> successfully applied > >>> to achieve social, economic, environmental and/or cultural outcomes. > >> > >> Sounds again like peer review, in the case of peer-reviewed > >> publication. For > >> unpublished research, other metrics (patents, downloads) are > >> possible. 
There > >> may be a few specialties in which human evaluation is the only > >> option, but > >> that tail should certainly not be allowed to wag the RQF dog: > >> Specific > >> exceptions can simply be made for those specialities, until and > >> unless a > >> valid combination of metrics is found. > >> > >>> Quality is NOT a solely metrics-based exercise.It is the peer > >>> assessment > >>> of 4 > >>> outputs per active researcher (as in the RAE), informed by > >>> quantitative > >>> indicators supplied to the panel (citations, competitive grants, > >>> ranked > >>> outputs > >>> - details of proposed measures are on the DEST website in the > >>> background > >>> papers). > >> > >> Quality has already been peer-reviewed for peer-reviewed > >> publications, hence > >> panel re-review is not recourse to metrics instead of human > >> judgment, it is > >> an > >> exercise in (blunt) redundancy (in most cases). > >> > >>> Impact, the most difficult to assess, is judged from an "evidence- > >>> based > >>> statement of claims". Obviously, there is a lot of detail behind > >>> that > >>> statement - again, background papers are available on the DEST > >>> website. It > >>> will definitely not be judged in the way outlined below. > >> > >> "Evidence-based statement of claims": Sounds like a tall order for a > >> small panel of national peers hand-re-reviewing a set of mostly > >> already > >> peer-reviewed papers, and a time-consuming one. Let's hope the RQF > >> will > >> learn from the RAE's long, costly and wasteful history, rather > >> than just > >> repeating it. The growing body of Open Access scientometrics that > >> will > >> become available in coming years will make it possible for > >> enterprising > >> data-miners (possibly PGs of the same researchers that are wasting > >> their research time submitting to and performing the panel > >> reviews) to > >> demonstrate prominently just how redundant the panel rankings > >> really are. 
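(The sort of figure such Open Access data-mining would produce is not exotic. For example, the RQF points quoted elsewhere in this thread mention citation counts "in aggregate and by the percentage that fall in the top decile for the discipline"; given a group's citation counts and a discipline-wide baseline, that is a few lines of code. The sketch below uses invented numbers and is not the RQF secretariat's actual procedure.)

    import numpy as np

    # Invented citation counts: a discipline-wide baseline and one Research Group.
    discipline_citations = np.array([0, 1, 1, 2, 3, 3, 4, 5, 6, 8,
                                     9, 12, 15, 20, 22, 30, 41, 55, 80, 130])
    group_citations = np.array([2, 5, 9, 22, 41, 80])

    # Citation count needed to sit in the discipline's top decile (90th percentile).
    decile_cutoff = np.percentile(discipline_citations, 90)

    # Aggregate citations and top-decile share for the group.
    total_citations = int(group_citations.sum())
    top_decile_share = float((group_citations >= decile_cutoff).mean()) * 100

    print(f"discipline 90th-percentile cutoff: {decile_cutoff:.1f} citations")
    print(f"group aggregate citations: {total_citations}")
    print(f"share of group outputs in top decile: {top_decile_share:.0f}%")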
> >> > >> Pertinent Prior American Scientist Open Access Forum Topic Threads: > >> > >> UK "RAE" Evaluations (began Nov 2000) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1018 > >> > >> Scientometric OAI Search Engines (began Aug 2002) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2238 > >> > >> Australia stirs on metrics (June 2006) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5417.html > >> > >> Big Brother and Digitometrics (began May 2001) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1298 > >> > >> UK Research Assessment Exercise (RAE) review (began Oct 2002) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2326 > >> > >> Need for systematic scientometric analyses of open-access > >> data (began Dec 2002) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2522 > >> > >> Potential Metric Abuses (and their Potential Metric > >> Antidotes) (began Jan 2003) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2643 > >> > >> Future UK RAEs to be Metrics-Based (began Mar 2006) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#5251 > >> > >> Let 1000 RAE Metric Flowers Bloom: Avoid Matthew Effect as > >> Self-Fulfilling Prophecy (Jun 2006) > >> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5418.html > >> > >> Stevan Harnad > >> > >>> Linda Butler > >>> Research Evaluation and Policy Project > >>> The Australian National University > >>> > >>> At 03:34 PM 17/11/2006, you wrote: > >>> Adminstrative info for SIGMETRICS (for example unsubscribe): > >>> http://web.utk.edu/~gwhitney/sigmetrics.html > >>> > >>> ---------- Forwarded message ---------- > >>> Date: Fri, 17 Nov 2006 14:44:39 +1100 > >>> From: Arthur Sale > >>> To: AMERICAN-SCIENTIST-OPEN-ACCESS- > >>> FORUM at LISTSERVER.SIGMAXI.ORG > >>> > >>> The Australian Government has released a definitive, if > >>> incomplete, > >>> description of Australia's Research Quality Framework (RQF) > >>> which > >>> is our > >>> equivalent of the UK's RAE. If familiar with the RAE, you will > >>> recognize the > >>> family resemblance. I extract the essentials of the RQF for an > >>> international > >>> readership, and analyze some of the consequences likely to > >>> flow > >>> from it. To > >>> see the documentation, see > >>> > >>> http://www.dest.gov.au/sectors/research_sector/ > >>> policies_issues_reviews/key_i > >>> ssues/research_quality_framework/rqf_development_2006.htm. > >>> > >>> ESSENTIAL POINTS > >>> > >>> 1. The first RQF assessment will be based on submissions by > >>> the 38 > >>> Australian universities by 30 April 2008. Funding based on the > >>> assessment > >>> will flow in calendar year 2009. Six years will elapse > >>> before the > >>> next > >>> assessment (ie 2014), but there is provision to shorten this. > >>> > >>> 2. The Unit of Assessment is the Research Group. Research > >>> Groups will > >>> be defined by up to three RFCD four-digit codes (to allow for > >>> multi-disciplinary groups). The RFCD classification is > >>> uniquely > >>> Australian, > >>> and for example there are six four-digit codes in the field > >>> of ICT. > >>> Engineering has more but for example Civil Engineering is > >>> one. If > >>> you are > >>> interested in the codes see > >>> http://www.research.utas.edu.au/publications/docs/ > >>> 14_rfcd.doc, the > >>> four > >>> digit codes are the sub-headings. > >>> > >>> 3. 
Each Research Group will be allocated to and assessed by > >>> one of 13 > >>> Panels. The Panel is determined by the primary RFCD code. Thus > >>> Mathematics, > >>> Computing and Information Technology is Panel 4. > >>> > >>> 4. Each University will submit an Evidence Portfolio (EP) for > >>> each > >>> identified Research Group. There is provision for cross- > >>> university > >>> Research > >>> Groups. > >>> > >>> 5. The ratings will be based on Quality and Impact separately. > >>> These > >>> words have peculiar (ie not common-usage) meanings. > >>> Approximately, > >>> Quality > >>> is a bag of quantifiable metrics, and Impact is all the > >>> soft things > >>> like > >>> Fellowships of Academies, Honors, journal associate > >>> editorships, > >>> etc. The > >>> relative importance of Quality and Impact will vary by > >>> Panel and is > >>> similarly not yet resolved. Quality is based on the best four > >>> publications > >>> (Research Output) of each researcher in the group over the six > >>> years > >>> 2002-2007, on a full list of all Research Output from the > >>> group > >>> including > >>> honorary and emeritus professors, and on competitive grants > >>> received over > >>> the period. Impact is covered in the Context Statement of > >>> the EP > >>> > >>> 6. Impact for each Research Group will be assessed on a scale > >>> of 1 (not > >>> important) to 5 (prestigious).. > >>> > >>> 7. Impact is rated A (outstanding) to E (poor). > >>> > >>> 8. Research Groups which rate below 2 for Quality, or below D > >>> for > >>> Impact, will attract no funding to their university, though > >>> the two > >>> factors > >>> are separately aggregated for the University. The weighting of > >>> funding is > >>> stated to be linear with rating, but the gradient will be > >>> determined during > >>> 2007. > >>> > >>> 9. The Panels require access to the electronic versions of any > >>> of the > >>> Research Output within four working days. The Panels will > >>> (a) rank > >>> the > >>> outputs by things like journal impact factors, journal > >>> standing, > >>> etc, (b) > >>> assess citation counts, both in aggregate and by the > >>> percentage > >>> that fall in > >>> the top decile for the discipline, and (c) competitive grant > >>> income. > >>> > >>> 10. The RQF is based on a semi-centralized IT model (or > >>> semi-decentralized). In other words, the full-texts of the > >>> research > >>> outputs > >>> (publications) will be held in IRs in each university, > >>> while the > >>> RQF > >>> secretariat will run a repository with all the EPs and > >>> develop the > >>> citation > >>> counts independent of the universities (in conjunction with > >>> Thomson > >>> Scientific and possibly EndNote Web). The Australian > >>> Government > >>> will be > >>> approached for funds to universities to establish these IRs. > >>> > >>> ANALYSIS FOR OPEN ACCESS > >>> > >>> * The RQF will actually use citation metrics in the > >>> assessment, not > >>> just test them as a "shadow exercise" as in the next RAE. > >>> This will > >>> mean > >>> that the OA citation advantage will suddenly look very > >>> attractive > >>> to > >>> Australian universities, though it is a bit late to do > >>> anything > >>> about it > >>> five years into a six-year window. However, with 2014 in mind, > >>> there will be > >>> pressure to increase citations. > >>> > >>> * Every university will have to have an IR to hold the > >>> full-text of > >>> Research Outputs. 
About half already do, with EPrints and > >>> DSpace > >>> being the > >>> most popular software with a few Fedora-based repositories and > >>> outsourced > >>> ProQuest hosts. There will be funding to establish > >>> repositories. > >>> > >>> * I expect a mad scramble in the smaller universities, with > >>> outsourcing and hosting solutions being very attractive. Money > >>> fixes > >>> everything. The ones that have been dithering will regret it. > >>> > >>> * All Research Output generated by all Research Groups will > >>> have to > >>> be in the IRs for the RQF. This may amount to 50% of the > >>> university > >>> research > >>> production over six years, or more or less depending on how > >>> research > >>> intensive it is. There are two corollaries: (a) this is > >>> Mandate by > >>> Money, > >>> and (b) there will be frantic activity over 2007 to put in the > >>> backlog of > >>> 2002-2006 publications. > >>> > >>> * Since one does not know what Research Output will be > >>> needed in > >>> 2014, and only a general clue in 2007, 100% institutional > >>> mandates > >>> are > >>> likely to spring up all over the place, in the form of > >>> Mandate by > >>> Administration. What I mean by this is that the deposition > >>> of the > >>> paper will > >>> be integrated with the already present administrative annual > >>> requirement to > >>> report the publication to the Australian Government. > >>> > >>> * Although it is nowhere stated explicitly that I can see, I > >>> read > >>> between the lines that the RQF may be expecting to get > >>> access to > >>> the > >>> publisher's pdf. This means that it will have to be in the > >>> repository as > >>> "restricted access" in most cases or as a link to an OA > >>> source. > >>> There is no > >>> reason why the OA postprint cannot be there as "open > >>> access" as > >>> well, of > >>> course, and if a citation advantage is to be got, it will > >>> need to > >>> be. > >>> > >>> Please feel free to blog this or forward this to anyone you > >>> think > >>> may be > >>> interested. My apologies for cross-posting. > >>> > >>> Arthur Sale > >>> Professor of Computing (Research) > >>> University of Tasmania > >>> > >>> Linda Butler > >>> Research Evaluation and Policy Project > >>> Research School of Social Sciences > >>> The Australian National University > >>> ACT 0200 Australia > >>> Tel: 61 2 61252154 Fax: 61 2 61259767 > >>> http://repp.anu.edu.au > >>> > >>> > >>> > >> > >> > >> > From loet at LEYDESDORFF.NET Sun Nov 19 03:16:19 2006 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Sun, 19 Nov 2006 09:16:19 +0100 Subject: No subject Message-ID: Now available at http://www.leydesdorff.net/jcr05/cited/index.htm and http://www.leydesdorff.net/jcr05/citing/index.htm : Local Citation Impact Environments of 7,525 Scientific Journals in 2005 ("cited") Local Citation Activity Environments of 7,525 Scientific Journals in 2005 ("citing") One can click on any of the journal names and obtain the Pajek file corresponding to the citation impact environment of the journal ("cited" or "citing," respectively). See for further explanation: "Visualization of the Citation Impact Environment of Scientific Journals: An online mapping exercise," Journal of the Amererican Society for Information Science and Technology (forthcoming). The 2005 matrices are based on taking the one-percent threshold of "total citations" after correction for within-journal citations. 
This main-diagonal value is sometimes so large that it overshadows the environment and therefore it is no longer included in setting the threshold for the delineation of the set.

* apologies for cross-postings

_____
Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
loet at leydesdorff.net ; http://www.leydesdorff.net/

Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$ 18.95
The Self-Organization of the Knowledge-Based Society; The Challenge of Scientometrics

From harnad at ECS.SOTON.AC.UK Mon Nov 20 08:07:06 2006
From: harnad at ECS.SOTON.AC.UK (Stevan Harnad)
Date: Mon, 20 Nov 2006 13:07:06 +0000
Subject: Two Happy Accidents Demonstrate Power of "Eprint Request" Button
In-Reply-To: <443A528E.8030905@ecs.soton.ac.uk>
Message-ID:

** Apologies for Cross-Posting **

Increasing Institutional Repository Content with "email eprint" Button
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5293.html

New Request Copy feature in DSpace
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5297.html

Here are two rather remarkable anecdotes about the recently created "EMAIL EPRINT" button that allows any would-be user webwide to email a semi-automatic "eprint request" to the author of any eprint in an IR that has been deposited as "Closed Access" rather than "Open Access" to request an individual copy for personal use. (The author need merely click on an "approval" URL in his email message in order to fulfil the request.) Two recent "accidents," occurring independently at two different institutions, provide dramatic evidence of the potential power of this feature:

The button is intended to tide over researcher usage needs during any embargo interval. As such, it is expected to apply only to a minority of deposits (as the majority of journals already endorse immediate Open Access-setting: http://romeo.eprints.org/stats.php). The two accident-anecdotes come from University of Southampton and Université du Québec à Montréal:

Southampton has many IRs: A departmental IR (Department of Electronics and Computer Science) http://eprints.ecs.soton.ac.uk/ already has an immediate full-text deposit mandate, but the university-wide IR http://eprints.soton.ac.uk/ does not yet have a mandate, so it has many deposits for which only the metadata are accessible, many of them deposited via library mediation rather than by the authors themselves. This will soon change to direct author deposit, but meanwhile, "The Button" was implemented, and the result was such a huge flood of eprint requests that the proxy depositors were overwhelmed and the feature quickly had to be turned off! The Button will of course be restored -- with the LDAP feature used to redirect the eprint requests to the authors rather than the library mediators -- but the accident was instructive in revealing the nuclear power of the button!

Authors, we expect, will be gratified by the countable measures of interest in their work, and we will make a countable metric out of the number of eprint requests. Authors will be able to opt out of receiving eprint requests -- but we confidently expect that few will choose to do so! (Our confidence is based on many factors, take your pick: (1) Authors' known habit of looking first at the bibliography of any article or book in their field, to see "Do they cite me?"
(2) Authors' known habit of googling themselves as well as looking up their own citation-counts in Web of Science and now in Google Scholar. (3) Employers' and funders' growing use of research performance metrics to supplement publication counts in employment, promotion and funding decisions...) Much the same thing happened at UQaM http://www.acceslibre.uqam.ca/ but this time it was while a new IR was still under construction, and its designers were still just testing out its features with dummy demo papers (some of them real!). "The Button" again unleashed an immediate torrent of eprint requests for the bona fide papers, so the feature had to be (tremulously, but temporarily) disabled! Caveat Emptor! Stevan Harnad From harnad at ECS.SOTON.AC.UK Mon Nov 20 16:49:42 2006 From: harnad at ECS.SOTON.AC.UK (Stevan Harnad) Date: Mon, 20 Nov 2006 21:49:42 +0000 Subject: Self-Archiving Impact Advantage: Quality Advantage or Quality Bias? Message-ID: Self-Archiving Impact Advantage: Quality Advantage or Quality Bias? Stevan Harnad SUMMARY: In astrophysics, Kurtz found that articles that were self-archived by their authors in Arxiv were downloaded and cited twice as much as those that were not. He traced this enhanced citation impact to two factors: (1) Early Access (EA): The self-archived preprint was accessible earlier than the publisher's version (which is accessible to all research-active astrophysicists as soon as it is published, thanks to Kurtz's ADS system). (Hajjem, however, found that in other fields, which self-archive only published postprints and do have accessibility/affordability problems with the publisher's version, self-archived articles still have enhanced citation impact.) Kurtz's second factor was: (2) Quality Bias (QB), a selective tendency for higher quality articles to be preferentially self-archived by their authors, as inferred from the fact that the proportion of self-archived articles turns out to be higher among the more highly cited articles. (The very same finding is of course equally interpretable as (3) Quality Advantage (QA), a tendency for higher quality articles to benefit more than lower quality articles from being self-archived.) In condensed-matter physics, Moed has confirmed that the impact advantage occurs early (within 1-3 years of publication). After article-age is adjusted to reflect the date of deposit rather than the date of publication, the enhanced impact of self-archived articles is again interpretable as QB, with articles by more highly cited authors (based only on their non-archived articles) tending to be self-archived more. (But since the citation counts for authors and for their articles are correlated, one would expect much the same outcome from QA too.) The only way to test QA vs. QB is to compare the impact of self-selected self-archiving with mandated self-archiving (and no self-archiving). (The outcome is likely to be that both QA and QB contribute, along with EA, to the impact advantage.) Michael Kurtz's papers have confirmed that in astronomy/astrophysics (astro), articles that have been self-archived -- let's call this "Arxived" to mark it as the special case of depositing in the central Physics Arxiv -- are cited (and downloaded) twice as much as non-Arxived articles. Let's call this the "Arxiv Advantage" (AA). http://arxiv.org/ Henneken, E. A., Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C., Thompson, D., and Murray, S. S. (2006) Effect of E-printing on Citation Rates in Astronomy and Physics. 
Journal of Electronic Publishing, Vol. 9, No. 2 http://arxiv.org/abs/cs/0604061 Henneken, E. A., Kurtz, M. J., Warner, S., Ginsparg, P., Eichhorn, G., Accomazzi, A., Grant, C. S., Thompson, D., Bohlen, E. and Murray, S. S. (2006) E-prints and Journal Articles in Astronomy: a Productive Co-existence (submitted to Learned Publishing) http://arxiv.org/abs/cs/0609126 Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Demleitner, M., Murray, S. S. (2005) The Effect of Use and Access on Citations. Information Processing and Management, 41 (6): 1395-1402 http://cfa-www.harvard.edu/~kurtz/kurtz-effect.pdf Kurtz analyzed AA and found that it consisted of at least 2 components: (1) EARLY ACCESS (EA): There is no detectable AA for old articles in astro: AA occurs while an article is young (1-3 years). Hence astro articles that were made accessible as preprints before publication show more AA: This is the Early Access effect (EA). But EA alone does not explain why AA effects (i.e., enhanced citation counts) persist cumulatively and even keep growing, rather than simply being a phase-advancing of otherwise un-enhanced citation counts, in which case simply re-calculating an article's age so as to begin at preprint deposit time instead of publication time should eliminate all AA effects -- which it does not. (2) QUALITY BIAS (QB): (Kurtz called the second component "Self-Selection Bias" for quality, but I call it self-selection Quality Bias, QB): If we compare articles within roughly the same citation/quality bracket (i.e., articles having the same number of citations), the proportion of Arxived articles becomes higher in the higher citation brackets, especially the top 200 papers. Kurtz interprets this as resulting from authors preferentially Arxiving their higher-quality preprints (Quality Bias). Of course the very same outcome is just as readily interpretable as resulting from Quality Advantage (QA) (rather than Quality Bias (QB)): i.e., that Arxiving benefits better papers more. (Making a low-quality paper more accessible by Arxiving it does not guarantee more citations, whereas making a high-quality paper more accessible is more likely to do so, perhaps roughly in proportion to its higher quality, allowing it to be used and cited more according to its merit, unconstrained by its accessibility/affordability.) There is no way, on the basis of existing data, to decide between QA and QB. The only way to measure their relative contributions would be to control the self-selection factor: randomly imposing Arxiving on half of an equivalent sample of articles of the same age (from preprinting age to 2-3 years postpublication, reckoning age from deposit date, to control also for age/EA effects), and comparing also with self-selected Arxiving. We are trying an approximation to this method, using articles deposited in Institutional Repositories of institutions that mandate self-archiving (and comparing their citation counts with those of articles from the same journal/issue that have not been self-archived), but the sample is still small and possibly unrepresentative, with many gaps and other potential liabilities. So a reliable estimate of the relative size of QA and QB still awaits future research, when self-archiving mandates will have become more widely adopted. Henk Moed's data on Arxiving in Condensed Matter physics (cond-mat) replicates Kurtz's findings in astro (and Davis/Fromerth's, in math): Moed, H. F.
(2006, preprint) The effect of 'Open Access' upon citation impact: An analysis of ArXiv's Condensed Matter Section http://arxiv.org/abs/cs.DL/0611060 Davis, P. M. and Fromerth, M. J. (2007) Does the arXiv lead to higher citations and reduced publisher downloads for mathematics articles? Scientometrics, accepted for publication. http://arxiv.org/abs/cs.DL/0603056 See critiques: http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#5221 http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5440.html Moed too has shown that in cond-mat the AA effect (which he calls CID "Citation Impact Differential") occurs early (1-3 years) rather than late (4-6 years), and that there is more Arxiving by higher-quality authors (based on higher citation counts for their non-Arxived articles) than by lower-quality authors. But this too is just as readily interpretable as the result of QB or QA (or both): We would of course expect a high correlation between an author's individual articles' citation counts and the author's average citation count, whether the author's citation count is based on Arxived or non-Arxived articles. These are not independent variables. (Less easily interpretable -- but compatible with either QA or QB interpretations -- is Moed's finding of a smaller AA for the "more productive" authors. Moed's explanations in terms of co-authorships between more productive and less productive authors, senior and junior, seem a little complicated.) The basic question is this: Once the AA has been adjusted for the "head-start" component of the EA (by comparing articles of equal age -- the age of Arxived articles being based on the date of deposit of the preprint rather than the date of publication of the postprint), how big is that adjusted AA, at each article age? For that is the AA without any head-start. Kurtz never thought the EA component was merely a head start, however, for the AA persists and keeps growing, and is present in cumulative citation counts for articles at every age since Arxiving began. This non-EA AA is either QB or QA or both. (It also has an element of Competitive Advantage, CA, which would disappear once everything was self-archived, but let's ignore that for now.) Harnad, S. (2005) OA Impact Advantage = EA + (AA) + (QB) + QA + (CA) + UA. Preprint. http://eprints.ecs.soton.ac.uk/12085/ Moed's analysis, like Kurtz's, cannot decide between QB and QA. The fact that most of the AA comes in an article's first 3 years rather than its second 3 years simply shows that both astro and cond-mat are fast-developing fields. The fact that highly-cited articles (Kurtz) and articles by highly-cited authors (Moed) are more likely to be Arxived certainly does not settle the question of cause and effect: It is just as likely that better articles benefit more from Arxiving (QA) as that better authors/articles tend to Arxive/be-Arxived more (QB). Nor is Arxiv the only test of the self-archiving Open Access Advantage. (Let's call this OAA, generalizing from the mere Arxiving Advantage, AA): We have found an OAA with much the same profile as the AA in 10 further fields, for articles of all ages (from 1 year old to 10 years old), and as far as we know, with the exception of Economics, these are not fields with a preprinting culture (i.e., they don't self-archive prepublication preprints but only postpublication postprints). Hence the consistent pattern of OAA across all fields and across articles of all ages is very unlikely to have been just a head-start (EA) effect. Hajjem, C., Harnad, S. and Gingras, Y. (2005) Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How it Increases Research Citation Impact. IEEE Data Engineering Bulletin 28(4) pp. 39-47. http://eprints.ecs.soton.ac.uk/11688/
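The age-adjustment described above -- reckoning the age of a self-archived article from its deposit date rather than its publication date, and then comparing citation counts of archived and non-archived articles of equal age -- can be made concrete with a small sketch. The following Python fragment is purely illustrative (it is not the code used in any of the studies cited here); the record fields and the toy numbers are hypothetical, and a real analysis would also match articles within the same journal and issue, as described above.

from collections import defaultdict

def mean(xs):
    return sum(xs) / len(xs)

def advantage_by_age(articles, census_year=2006):
    # articles: dicts with hypothetical keys 'deposited' (bool),
    # 'deposit_year', 'pub_year' and 'cites'.
    by_age = defaultdict(lambda: {"oa": [], "non_oa": []})
    for a in articles:
        # Age adjustment: start the clock at deposit time for self-archived
        # articles, so that any remaining advantage is not just an Early
        # Access (EA) head start.
        start = a["deposit_year"] if a["deposited"] else a["pub_year"]
        age = census_year - start
        by_age[age]["oa" if a["deposited"] else "non_oa"].append(a["cites"])
    ratios = {}
    for age, groups in sorted(by_age.items()):
        if groups["oa"] and groups["non_oa"]:
            # A ratio above 1 at every age, cumulatively, is what is meant
            # above by an advantage that is not merely a head-start effect.
            ratios[age] = mean(groups["oa"]) / mean(groups["non_oa"])
    return ratios

# Toy records, fabricated purely for illustration:
sample = [
    {"deposited": True,  "deposit_year": 2003, "pub_year": 2004, "cites": 12},
    {"deposited": False, "deposit_year": None, "pub_year": 2003, "cites": 5},
    {"deposited": True,  "deposit_year": 2004, "pub_year": 2005, "cites": 7},
    {"deposited": False, "deposit_year": None, "pub_year": 2004, "cites": 4},
]
print(advantage_by_age(sample))  # e.g. {2: 1.75, 3: 2.4}

Even with the age adjustment, an observational comparison of this kind cannot by itself separate QA from QB; as argued below, that requires comparing self-selected with mandated (and no) self-archiving.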
Is the OAA, then, QB or QA (or both)? There is no way to determine this unless the causality is controlled by randomly imposing the self-archiving on a subset of a sufficiently large and representative random sample of articles of all ages (but especially newborn ones) and comparing the effect across time. In the meantime, here are some factors worth taking into account: (1) Both astro and cond-mat are fields where it has been repeatedly claimed that the accessibility/affordability problem for published postprints is either nonexistent (astro) or less pronounced than in other fields. Hence the only scope for an OAA in astro and cond-mat is at the prepublication preprint stage. (2) In many other fields, however, not only is there no prepublication preprint self-archiving at all, but there is a much larger accessibility/affordability barrier for potential users of the published article. Hence there is far more scope for OAA and especially QA (and CA): Access is a necessary (though not a sufficient) causal precondition for impact (usage and citation). It is hence a mistake to overgeneralize the phys/math AA findings to OAA in general. We need to wait till we have actual data before we can draw confident conclusions about the degree to which the AA or the OAA are a result of QB or QA or both (and/or other factors, such as CA). For the time being, I find the hypothesis of a causal QA (plus CA) effect, successfully sought by authors because they are desirous of reaching more users, far more plausible and likely than the hypothesis of an a-causal QB effect in which the best authors are self-archiving merely out of superstition or vanity! (And I suspect the truth is a combination of both QA/CA and QB.) Stevan Harnad From garfield at CODEX.CIS.UPENN.EDU Tue Nov 21 14:22:29 2006 From: garfield at CODEX.CIS.UPENN.EDU (=?windows-1252?Q?henry_Small?=) Date: Tue, 21 Nov 2006 14:22:29 -0500 Subject: EUGENE GARFIELD DOCTORAL DISSERTATION SCHOLARSHIP - 2007 Message-ID: EUGENE GARFIELD DOCTORAL DISSERTATION SCHOLARSHIP - 2007 Call for Nominations 1. Nature of the Award The scholarship will consist of an award of $3,000 (donated by the Eugene Garfield Foundation) to be used for travel or other research-related expenses of the grant recipient, contingent upon the recipient's attending the ISSI biennial conference. The next meeting of ISSI will be held on June 25-27, 2007 in Madrid, Spain (www.issi2007.cindoc.csic.es/). 2. Purpose of the Award The purpose of this scholarship is to foster research in bibliometrics, scientometrics, informetrics and webometrics by encouraging and assisting doctoral students in the field with their dissertation research. 3. Eligibility 3.1 The scholarship recipient must meet the following qualifications: (a) Be an active doctoral candidate pursuing research using bibliometric, scientometric, informetric or webometric methodology in a degree-granting institution; (b) Have a doctoral dissertation proposal accepted by the institution. 3.2 The applicant need not be a Student Member or a Regular Member of ISSI to be considered for this scholarship. 4.
Administration The award is sponsored by the Eugene Garfield Foundation with the cooperation of the Chemical Heritage Foundation, and is administered by the Board of the International Society for Scientometrics and Informetrics (ISSI, www.issi-society.info/). 5. Nominations Submissions should include the following: (a) The doctoral research proposal, including a description of the research, methodology, and significance, 10 pages or less in length, double-spaced, and in English; (b) An abstract of the paper to be presented at the ISSI Conference; (c) A cover letter from the dissertation advisor endorsing the proposal; and (d) An up-to-date curriculum vitae. 6. Submission instructions and deadline Deadline for submissions is January 31, 2007. All proposals should be submitted on-line to http://www.Softconf.com/start/ISSI2007/Garfield/. From garfield at CODEX.CIS.UPENN.EDU Tue Nov 21 16:07:30 2006 From: garfield at CODEX.CIS.UPENN.EDU (=?windows-1252?Q?henry_Small?=) Date: Tue, 21 Nov 2006 16:07:30 -0500 Subject: Figueredo-Gaspari E "Curricular evaluation of scientific publications " MEDICINA CLINICA 125 (17): 661-665 NOV 12 2005 Message-ID: E-mail Addresses: eduardofigueredo at hotmail.com The author has kindly provided an extended summary in English: Title: Curricular evaluation of scientific publications Author(s): Figueredo-Gaspari E Source: MEDICINA CLINICA 125 (17): 661-665 NOV 12 2005 Document Type: Editorial Material Language: Spanish Cited References: 30 Times Cited: 2 Addresses: Figueredo-Gaspari E (reprint author), P Del Palmeral,4,Edificio Capri,6, Almeria 04720, Spain Hosp Torrecardenas, Serv Anestesiol & Reanimac, Almeria, Spain E-mail Addresses: eduardofigueredo at hotmail.com Publisher: EDICIONES DOYMA S/L, TRAV DE GRACIA 17-21, 08021 BARCELONA, SPAIN Subject Category: MEDICINE, GENERAL & INTERNAL IDS Number: 995KH ISSN: 0025-7753 ENGLISH SUMMARY: Curricular evaluation of scientific publications The aim of this survey was to ascertain the opinions investigators hold with regard to the evaluation of scientific publications. Two members from the Editorial Boards of the 42 Spanish journals included in Medline were interviewed. Each of the 84 investigators had to fulfil specific requirements as an author of scientific articles (a minimum of ten published articles). Usually one of those interviewed was the Editor of the journal. Results were as follows: a) Which authors should obtain credit for their participation in the articles? -The first author only: 2.5% of those interviewed -The first two authors: 5% of those interviewed -The first three authors: 21% of those interviewed -The first four authors: 5% of those interviewed -The first five authors: 4% of those interviewed -The first six authors: 5% of those interviewed -All the authors: 67% of those interviewed b) How should the credit be distributed among the authors? -All the authors should receive the same credit for the article: 35% of respondents -The credit should be distributed in decreasing form according to the order of authorship listed in the by-line: 65% of respondents c) See Figure 1: "Evaluation of the journals included in MedLine (but not in Science Citation Index). As a reference value, 10 points for journals included in Science Citation Index." Answers are shown in percentages of respondents. d) See Figure 2: "Evaluation of the journals included in the Spanish Index Medicus (but not in MedLine nor in SCI). As a reference value, 10 points is considered for journals included in the Science Citation Index."
Answers are shown in percentages of respondents. e) Should those articles cited by other authors/articles receive additional credit? -Yes: 84.5% -No: 15.5% f) See Figure 3: "Number of citations an article should receive to obtain additional credit." Answers are shown in percentages of respondents. g) See Figure 4: "Evaluation of 'Case reports'. As a reference value, 10 points is considered for original articles." Answers are shown in percentages of respondents. h) Evaluation of 'review articles'. As a reference value, 10 points for original articles. -More than 10 points: 9.5% of those interviewed -10 points: 22.5% of those interviewed -8-9.99 points: 12% of those interviewed -6-7.99 points: 24% of those interviewed -4-5.99 points: 20% of those interviewed -2-3.99 points: 8% of those interviewed -0-1.99 points: 3.5% of those interviewed i) See Figure 5: "Evaluation of 'Letters to the Editor'. As a reference value, 10 points is considered for original articles." Answers are shown in percentages of respondents. j) See Figure 6: "More appropriate methods for the evaluation of scientific publications." Answers are shown in percentages of respondents. a) Critical reading and direct evaluation by experts in the field b) Critical reading + bibliometric indicators c) Exclusively bibliometric indicators Discussion: This article discusses the need to design an integral model for the evaluation of scientific publications. It should be carried out with a combined set of fair, reliable and clear bibliometric indicators. ________________________________________________________ cited references: B OFICIAL COMUNIDAD : 2004 B OFICIAL ESTADO 47 : 2004 OBSERVATOIRE SCI TEC : 1998 *COUNC SCI TECHN A SCI TECHN EXC PUBL S : 2001 *NBEET 21 NBEET : 1993 BORDONS M Evaluation of scientific activity through bibliometric indicators REVISTA ESPANOLA DE CARDIOLOGIA 52 : 790 1999 BROAD WJ SCIENCE 211 : 137 1981 BURMAN KD HANGING FROM THE MASTHEAD - REFLECTIONS ON AUTHORSHIP ANNALS OF INTERNAL MEDICINE 97 : 602 1982 BUTLER L A list of published papers is no measure of value - The present system rewards quantity, not quality - but hasty changes could be as bad.
NATURE 419 : 877 2002 BUTLER L STRATEGIC ASSESSMENT CAMI J Evaluation of biomedical research MEDICINA CLINICA 117 : 510 2001 CAMI J Impactolatry: diagnosis and treatment MEDICINA CLINICA 109 : 515 1997 CONN DA IMPORTANCE OF COMPONENTS OF THE CURRICULUM VITAE IN DETERMINING APPOINTMENT TO SENIOR REGISTRAR POSTS ANAESTHESIA 49 : 623 1994 FIGUEREDO E REV ESP ANESTESIOL R 51 : 429 2004 FIGUEREDO E REV ESP ANESTESIOL R 46 : 378 1999 GARFIELD E CURR CONTENTS 30 : 52 1982 GLANZEL W Journal impact measures in bibliometric research SCIENTOMETRICS 53 : 171 2002 IRVINE J EVALUATION SCI RES : 1989 LINDSEY D PRODUCTION AND CITATION MEASURES IN THE SOCIOLOGY OF SCIENCE - THE PROBLEM OF MULTIPLE AUTHORSHIP SOCIAL STUDIES OF SCIENCE 10 : 145 1980 LONG JS ON ADJUSTING PRODUCTIVITY MEASURES FOR MULTIPLE AUTHORSHIP SCIENTOMETRICS 4 : 379 1982 MOED HF THE APPLICATION OF BIBLIOMETRIC INDICATORS - IMPORTANT FIELD-DEPENDENT AND TIME-DEPENDENT FACTORS TO BE CONSIDERED SCIENTOMETRICS 8 : 177 1985 NARIN F EVALUATIVE BIBLIOMET : 1976 OJASOO T The impact factor of medical journals, a bibliometrical indicator to be handled with care PRESSE MEDICALE 31 : 775 2002 PETROIANU A REV ASSOC MED BRAS 49 : 173 2003 RENNIE D When authorship fails - A proposal to make contributors accountable JAMA-JOURNAL OF THE AMERICAN MEDICAL ASSOCIATION 278 : 579 1997 RINIA EJ Comparative analysis of a set of bibliometric indicators and central peer review criteria - Evaluation of condensed matter physics in the Netherlands RESEARCH POLICY 27 : 95 1998 SEGLEN PO FROM BAD TO WORSE - EVALUATION BY JOURNAL IMPACT TRENDS IN BIOCHEMICAL SCIENCES 14 : 326 1989 TRENCHARD PM HIERARCHICAL BIBLIOMETRY - A NEW OBJECTIVE-MEASURE OF INDIVIDUAL SCIENTIFIC PERFORMANCE TO REPLACE PUBLICATION COUNTS AND TO COMPLEMENT CITATION MEASURES JOURNAL OF INFORMATION SCIENCE 18 : 69 1992 TRUEBA FJ A robust formula to credit authors for their publications SCIENTOMETRICS 60 : 181 2004 VANLEEUWEN T HDB QUANTITATIVE SCI : 373 2004 From katy at INDIANA.EDU Wed Nov 22 15:57:30 2006 From: katy at INDIANA.EDU (Katy Borner) Date: Wed, 22 Nov 2006 15:57:30 -0500 Subject: "Places & Spaces: Mapping Science " exhibit on view at the New York Hall of Science, Dec. 9, 2006 – Feb. 25, 2007 Message-ID: The "Places & Spaces: Mapping Science" exhibit will be on display -- with science puzzle maps for kids -- at the New York Hall of Science, Dec. 9, 2006 – Feb. 25, 2007. Enjoy, Julie Smith & Katy Borner ------------------------------------------------------------------------------------------------------------ CONTACTS: Mary Record 718.699.0005 ext. 323 Carol Nordin 718.699.0005 ext. 342 For Immediate Release: Tuesday, November 21, 2006 For a New Take on Maps Chart Your Way to the New York Hall of Science Places & Spaces: Mapping Science on view December 9, 2006 – February 25, 2007 Want to see science from above? Curious to see what impact one single person or invention can have? Keen to find pockets of innovation? Desperate for better tools to manage the flood of information? Or are you simply fascinated by maps? Then visit the Places & Spaces: Mapping Science exhibition, which aims to demonstrate the power of maps to navigate and make sense of physical places and abstract topic spaces. The display at the New York Hall of Science features the first two out of 10 iterations of the Places & Spaces exhibition entitled The Power of Maps and The Power of Reference Systems. Also shown are an Illuminated Diagram display by W.
Bradford Paley, Kevin Boyack, John Burgoon, Peter Kennard and Richard Klavans, and Worldprocessor globes by Ingo Günther. On display for the first time are hands-on science maps for kids with paintings by Fileve Palmer. Come explore 20 large-format, high-resolution maps that demonstrate the power of maps for navigating and managing mankind's collective scholarly knowledge and aim to inspire a discussion about a spatial reference system for science. See where science gets done, how the different areas of science interrelate, and how knowledge diffuses in geospatial and topic space by playing with the interactive Illuminated Diagram display. Discover zones of inventions and patenting activity while spinning the beautiful and informative Worldprocessor globes. Solve the hands-on science map puzzle by placing major scientists, inventors and inventions at their proper places on a world map and on a map of science. Look for the many hints hidden in the beautiful paintings to find the perfect place for each puzzle piece. Pick up one of the handouts and make your very own map of science. What science experiments do you like best? Where would your favorite science teachers go? What area of science do you want to explore next? Visit http://www.scimaps.org/exhibit/nyscience and http://www.nyscience.org to learn more. Places & Spaces: Mapping Science is curated by Dr. Katy Börner and Julie Smith, Indiana University. The exhibition advisors for The New York Hall of Science display are Marcia Rudy & Stephen Uzzo. The exhibit is sponsored by National Science Foundation awards IIS-0238261 and CHE-0524661; Cyberinfrastructure for Network Science Center, University Information Technology Services, and School of Library and Information Science, Indiana University; Thomson Scientific; and The New York Hall of Science. -- Katy Borner, Associate Professor Information Science & Cognitive Science Indiana University, SLIS 10th Street & Jordan Avenue Phone: (812) 855-3256 Fax: -6166 Main Library 021 E-mail: katy at indiana.edu Bloomington, IN 47405, USA WWW: ella.slis.indiana.edu/~katy InfoVis Lab/CNS Center Open House is on Oct 30th, 2006 http://ella.slis.indiana.edu/~katy/gallery/06-openhouse/ -------------- next part -------------- A non-text attachment was scrubbed... Name: ForaNewTakeonMapsChartYourWaytotheHall.PDF Type: application/pdf Size: 492428 bytes Desc: not available URL: From garfield at CODEX.CIS.UPENN.EDU Wed Nov 22 16:38:21 2006 From: garfield at CODEX.CIS.UPENN.EDU (=?windows-1252?Q?Eugene_Garfield?=) Date: Wed, 22 Nov 2006 16:38:21 -0500 Subject: Young AP "Library quarterly, 1956-2004: An exploratory bibliometric analysis " Library Quarterly 76(1):10-18 January 2006 Message-ID: E-mail Addresses: ayoung at niu.edu The publisher of Library Quarterly - The University of Chicago Press - has kindly lifted access control to the article for readers of this list. Full text of the article is available at : Title: Library quarterly, 1956-2004: An exploratory bibliometric analysis Author(s): Young AP (Young, Arthur P.) Source: LIBRARY QUARTERLY 76 (1): 10-18 JAN 2006 Document Type: Article Language: English Cited References: 7 Times Cited: 0 Abstract: Library Quarterly's seventy-fifth anniversary invites an analysis of the journal's bibliometric dimension, including contributor attributes, various author rankings, and citation impact.
Eugene Garfield's HistCite software, linked to Thomson Scientific's Web of Science, as made available by Garfield, for the period 1956-2004, was used as the core database for analysis in this essay. A brief comparison of Library Quarterly contributor citation impact and that of College & Research Libraries is also provided. Library Quarterly continues to attract a roster of highly productive, international scholars. Addresses: Young AP (reprint author), No Illinois Univ, De Kalb, IL 60115 USA No Illinois Univ, De Kalb, IL 60115 USA E-mail Addresses: ayoung at niu.edu Publisher: UNIV CHICAGO PRESS, 1427 E 60TH ST, CHICAGO, IL 60637-2954 USA Subject Category: INFORMATION SCIENCE & LIBRARY SCIENCE IDS Number: 078EY ISSN: 0024-2519 CITED REFERENCES: BARILAN J Informetric theories and methods for exploring the Internet: An analytical survey of recent research literature LIBRARY TRENDS 50 : 371 2002 COLEMAN A J ED LIBR INFORMATIO GARFIELD E HISTCITE SYSTEM MAPP : 2004 KOHL DF RATINGS OF JOURNALS BY ARL LIBRARY DIRECTORS AND DEANS OF LIBRARY AND INFORMATION-SCIENCE SCHOOLS COLLEGE & RESEARCH LIBRARIES 46 : 40 1985 NISONGER TE The perception of library and information science journals by LIS education deans and ARL library directors: A replication of the Kohl-Davis study COLLEGE & RESEARCH LIBRARIES 66 : 341 2005 PERSSON O BIBLIOMETRIC NOTES 7 : 1 2005 SELLERS SL Evaluation of social work journal quality: Citation versus reputation approaches JOURNAL OF SOCIAL WORK EDUCATION 40 : 143 2004 From David.Watkins at SOLENT.AC.UK Thu Nov 23 03:50:00 2006 From: David.Watkins at SOLENT.AC.UK (David Watkins) Date: Thu, 23 Nov 2006 08:50:00 +0000 Subject: Access problem Message-ID: I use Lotus Notes 5.6 by default, set up on my university server, with only limited tweaking available locally. The upside is rock-steady reliability. There is a 64k limit on the size of any field or paragraph. About once a month I therefore cannot open a post from this list. Do others have this problem? Is there a workaround? (NB It is not possible to forward the incoming posts - same error message) From kretschmer.h at T-ONLINE.DE Sun Nov 26 07:23:48 2006 From: kretschmer.h at T-ONLINE.DE (kretschmer.h@t-online.de) Date: Sun, 26 Nov 2006 13:23:48 +0100 Subject: Website for the New Delhi Conference and 8th COLLNET Meeting March 2007 In-Reply-To: <455DC74A.5010607@cindoc.csic.es> Message-ID: An HTML attachment was scrubbed... URL: From camille.roth at POLYTECHNIQUE.EDU Tue Nov 28 12:53:37 2006 From: camille.roth at POLYTECHNIQUE.EDU (Camille Roth) Date: Tue, 28 Nov 2006 18:53:37 +0100 Subject: CFP: ICFCA workshop on social network analysis & concept lattices Message-ID: (apologies for cross-posting) -------------------- ICFCA 2007 Workshop "Social Network Analysis and Conceptual Structures: Exploring Opportunities" -------------------- http://camille.roth.free.fr/confs/icfcasna.html in conjunction with The 5th International Conference on Formal Concept Analysis (ICFCA 2007) February 12-16, 2007; Clermont-Ferrand, France http://www.isima.fr/icfca07/ CALL FOR PAPERS Recent years have seen a renewed interest in an interdisciplinary effort aimed at analyzing social networks, in which both mathematical sociology and computer science play a key role, relying extensively on graph theory.
This effort has mainly been fueled and supported by significant advances in computing capabilities and electronic data availability for several social systems: scientists, webloggers, online customers, computer-based collaboration-enhancing devices, inter alia. In particular, knowledge networks, i.e. interaction networks where agents produce or exchange knowledge, are the focus of many current studies, both qualitative and quantitative. Among these, community- detection issues such as finding agents sharing sets of identical patterns are a key topic. Social network analysis is proficient in methods aimed at discovering, describing, and plausibly organizing various kinds of social communities. At the same time, conceptual structures can yield a fruitful insight in this regard, be it in relation to affiliation networks (actors belonging to the same organizations, participating in identical events) or to epistemic communities (i.e. agents dealing with identical topics, such as scientific communities or weblogs). And, indeed, some applications of concept (or Galois) lattices in sociology have been proposed since the early 1990s; yet, in that context social aspects of community structures are usually of prime interest: leaders, peripheral members, cooperation within and between different groups. On the other hand, conceptual structures are typically focused around taxonomies -- possibly useful to describe actors in terms of centers of interest, for instance -- rather than focused on interactions. More broadly, notions pertaining to social network analysis seem presently to remain somehow outside the mainstream research of the concept lattice community. The aim of this workshop is to investigate the opportunities for formal concept analysis in social networks by proposing possible bridges between these frameworks and by presenting issues of mathematical sociology which could benefit from conceptual structures, so as to eventually facilitate collaboration between the two fields. Therefore, we particularly welcome submissions of the survey type describing the state of the art in any of the fields listed below along with submissions specifying a concrete problem that still needs an efficient solution. Submissions may but do not have to address the possible use of formal concept analysis in these fields. TARGET AUDIENCE Social scientists using or willing to use formal techniques in any of the fields listed below; researchers in discrete structures and formal concept analysis interested in applications in social sciences. TOPICS Knowledge networks / epistemic networks Collective construction of knowledge, social cognition Social epistemology applied to social networks Social network analysis of communities of practice Information diffusion in social networks Affiliation networks Social network-based methods for community detection Web communities, open-source development communities Web blog analysis Social networking websites Collaboration-enhancing tools (in organizations, on the web, inter alia) Knowledge exchange devices Semantic web and social networks Knowledge management using social data Building semantics from collaborative environments Taxonomies and ontologies for scientific domains Network analysis for folksonomies Systems for folksonomy building Evolution of network structures SUBMISSION PROCEDURE Papers no longer than 16 pages should be submitted no later than January 5, 2007 to sna.fca at gmail.com in Adobe PDF or Postscript format. 
Papers should also be formatted according to the official formatting guidelines of the main conference (LNCS). Short papers are also welcome. IMPORTANT DATES Submission deadline: January 5, 2007 Notification of acceptance: January 22, 2007 ORGANIZERS Sergei Obiedkov (Higher School of Economics, Russia) - sergei.obj at gmail.com Camille Roth (University of Modena, Italy & CREA/CNRS, France) - roth at shs.polytechnique.fr PROGRAM COMMITTEE Alain Degenne (CNRS, France) Vincent Duquenne (University of Paris VI/CNRS, France) Peter Eklund (University of Wollongong, Australia) Linton C. Freeman (UC Irvine, USA) Andreas Hotho (University of Kassel, Germany) Jeffrey H. Johnson (Open University, UK) Cliff Joslyn (Los Alamos National Laboratory, USA) Sergei Kuznetsov (Higher School of Economics & VINITI, Russia) Amedeo Napoli (LORIA/CNRS, France) Jean Sallantin (LIRMM/CNRS, France) Gerd Stumme (University of Kassel, Germany) From loet at LEYDESDORFF.NET Thu Nov 30 13:40:16 2006 From: loet at LEYDESDORFF.NET (Loet Leydesdorff) Date: Thu, 30 Nov 2006 19:40:16 +0100 Subject: FW: [Asis-l] FW: Dr. Eugene Garfield Wins the Online Information Lifetime Achievement Award (fwd) Message-ID: Dear Gene, Congratulations! Loet ________________________________ Loet Leydesdorff Amsterdam School of Communications Research (ASCoR) Kloveniersburgwal 48, 1012 CX Amsterdam Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 loet at leydesdorff.net ; http://www.leydesdorff.net/ ---------- Forwarded message ---------- Date: Thu, 30 Nov 2006 09:54:36 -0500 From: Richard Hill To: asis-l at asis.org Subject: [Asis-l] FW: Dr. Eugene Garfield Wins the Online Information Lifetime Achievement Award _____ From: rodney.yancey at thomson.com [mailto:rodney.yancey at thomson.com] Sent: Thursday, November 30, 2006 9:52 AM To: Richard Hill Subject: Dr. Eugene Garfield Wins the Online Information Lifetime Achievement Award Dr. Eugene Garfield Wins the Online Information Lifetime Achievement Award PHILADELPHIA and LONDON, Nov. 30 /PRNewswire/ -- Thomson Scientific, part of The Thomson Corporation (NYSE: TOC; TSX: TOC) and leading provider of information solutions to the worldwide research and business communities, today announced that Dr. Eugene Garfield is the 2006 recipient of the Online Information Lifetime Achievement Award in recognition of more than 50 years of dedication, leadership and innovation in the information industry. The 2006 International Information Industry Awards were held at the Royal Lancaster Hotel in London, UK on Wednesday, November 29th. Often dubbed the "Father of Scientometrics and Bibliometrics," Dr. Garfield is founder & chairman emeritus of the Institute for Scientific Information (ISI(R)) - now Thomson Scientific. Garfield's career in scientific communication and information science began in 1951 when he joined the Welch Medical Indexing Project at Johns Hopkins University, USA. The project planted the seeds for several major advances in scientific communication and information science that have distinguished Dr. Garfield's career. In 1958, Garfield was contacted by Joshua Lederberg, who was interested in knowing what happened to the citation index Garfield proposed in 1955 in the journal, Science. This, eventually led to a meeting with the National Institutes of Health (NIH) to produce and distribute a Genetics Citation Index, including a multi-disciplinary index to the science literature of 1961. 
Undaunted by the NIH and National Science Foundation's (NSF) disinterest in publishing the latter index, Garfield began regularly publishing the Science Citation Index(R) (SCI(R)) in 1964 through the Institute for Scientific Information. The SCI(R) soon distinguished itself from other literature indexes and was recognized as a basic and fundamental innovation in scientific communication and information science. From 1961 on, Garfield's career is marked by the constant enhancement of existing resources combined with the extraordinary development of new information tools for researchers, including Current Contents(R), plus citation indexes for the social sciences (SSCI(R)) and arts and humanities (A&HCI(R)). During the past decades, as the volume of literature has been growing exponentially, Garfield's innovations have made it possible for researchers to cope with and keep up with articles directly relevant to their interests. Current Contents has become a vital and basic component of clinical research and the research laboratory. The SCI has become an important tool for navigating the scientific literature. With the advent of the Internet the SCI, SSCI, and A&HCI were integrated into the online information solution - Web of Science(R). Web of Science is the leading resource that provides seamless access to current and retrospective multidisciplinary information from the most prestigious, high-impact research journals in the world. Web of Science also provides a unique search method, cited reference searching. With it, users can navigate forward, backward, and through the literature, searching all disciplines and time spans (back to 1900) to uncover all the information relevant to their research. Today, Web of Science is a key component of ISI Web of Knowledge(SM) - the ground-breaking, integrated research platform that facilitates discovery by offering seamless navigation to high-quality, multidisciplinary journal, patent, and web content; evaluation tools; and bibliographic management products. At age 81, Dr. Garfield maintains a heavy schedule of invited speeches and presentations before high-level medical, scientific, and information symposia and conferences. He has been the recipient of numerous awards and recognitions. He has published over 1,000 weekly essays in Current Contents and has also published and edited commentaries by the authors of over 5,000 Citation Classics. Dr Garfield continues to be active in scientific communication and information science. In 1986, he founded The Scientist, a bi-weekly newspaper for research professionals. It reports on news and developments relevant to the professional and practical interests of scientists, providing a unique forum for the discussion of issues important both to the research community and society. Accepting the award, Dr. Garfield expressed his gratitude to Online Information, colleagues, the research and information communities, employees, and friends around the world who supported him throughout his career and inspired his significant contributions to the information industry. He saved his most important thanks for last to his family. About The Thomson Corporation The Thomson Corporation (http://www.thomson.com) is a global leader in providing essential electronic workflow solutions to business and professional customers. 
With operational headquarters in Stamford, Conn., Thomson provides value-added information, software tools and applications to more than 20 million users in the fields of law, tax, accounting, financial services, scientific research and healthcare. The Corporation's common shares are listed on the New York and Toronto stock exchanges (NYSE: TOC; TSX: TOC). Thomson Scientific is a business of The Thomson Corporation. Its information solutions assist professionals at every stage of research and development-from discovery to analysis to product development and distribution. Thomson Scientific information solutions can be found at http://www.scientific.thomson.com. Contacts: The Americas Rodney Yancey +1 215-823-5397 rodney.yancey at thomson.com Europe Kim Yeatman +44 207 424 2474 kim.yeatman at thomson.com SOURCE Thomson Scientific -0- 11/30/2006 /Web site: http://www.scientific.thomson.com http://www.thomson.com / You are receiving this transmission from PR Newswire on behalf of the issuer of the information contained in this email. If you would like to stop receiving information of this nature via email for this issuer, click here , for auto-removal. -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: .txt URL: From rigic at EXCITE.COM Thu Nov 30 16:48:52 2006 From: rigic at EXCITE.COM (Rajko) Date: Thu, 30 Nov 2006 16:48:52 -0500 Subject: FW: [Asis-l] FW: Dr. Eugene Garfield Wins the Online Information Lifetime Achievement Award (fwd) Message-ID: Dear Gene, Congratulations! Ad multos annos! Sincerely, Rajko Igic, Chicago --- On Thu 11/30, Loet Leydesdorff loet at LEYDESDORFF.NET wrote: From: Loet Leydesdorff [mailto: loet at LEYDESDORFF.NET] To: SIGMETRICS at listserv.utk.edu Date: Thu, 30 Nov 2006 19:40:16 +0100 Subject: [SIGMETRICS] FW: [Asis-l] FW: Dr. Eugene Garfield Wins the Online Information Lifetime Achievement Award (fwd) the Online Information Lifetime Achievement Award



From nouruzi at GMAIL.COM Thu Nov 30 18:00:04 2006 From: nouruzi at GMAIL.COM (Alireza Noruzi) Date: Fri, 1 Dec 2006 00:00:04 +0100 Subject: FW: [Asis-l] FW: Dr. Eugene Garfield Wins the Online Information Lifetime Achievement Award (fwd) In-Reply-To: <005001c714ae$fb65b440$1302a8c0@loet> Message-ID: Dear All, I warmly congratulate Dr. Eugene Garfield on winning the award. You know that Dr. Garfield has opened many doors for research and applications in scientometrics, informetrics and bibliometrics. Best regards, Alireza ---------------------------------- Alireza Noruzi Editor-in-Chief of Webology -------------- next part -------------- An HTML attachment was scrubbed... URL: