Baudoin L, Haeffner-Cavaillon N, Pinhas N, Mouchet S, Kordon C "Bibliometric indicators: realities, myth and prospective"
Eugene Garfield
garfield at CODEX.CIS.UPENN.EDU
Thu May 19 13:19:51 EDT 2005
The authors have very kindly provided an extended summary in English, which
follows the abstract below:
Title: Bibliometric indicators: realities, myth and prospective
Author(s): Baudoin L, Haeffner-Cavaillon N, Pinhas N, Mouchet S, Kordon C
Source: M S-MEDECINE SCIENCES 20 (10): 909-915 OCT 2004
Document Type: Review
Language: French
Cited References: 13
Abstract:
The impact factor of scientific journals, calculated by the Institute for
Scientific Information (ISI), is increasingly used to evaluate the
performance of scientists and programmes. Bibliometric indicators,
originally designed for purposes other than individual evaluation, are very
useful tools provided their interpretation is not extrapolated beyond their
limits of validity. Here we present a critical analysis of appropriate uses
and misuses of bibliometric data based on case studies. We also outline
anticipated consequences of new information technologies, such as electronic
journals or open access schemes, on the mode of science production,
evaluation and dissemination in biomedical sciences.
EXTENDED SUMMARY IN ENGLISH PROVIDED BY THE AUTHORS:
Bibliometric Indicators in Biomedical Research
Realities, Myths and Prospective
Abstract
The impact factor of scientific publications, calculated by the Institute
for Scientific Information (ISI), is increasingly used to evaluate the
performance of scientists and research programs. Bibliometric indicators,
originally designed for purposes other than individual evaluation, are
useful tools provided their interpretation is not extrapolated beyond their
limits of validity. Here we present a critical analysis of appropriate uses
and misuses of bibliometric data based on case studies. We also outline
anticipated consequences of new information technologies, such as electronic
journals or open access schemes, on the mode of science production,
evaluation and dissemination in biomedical sciences.
The inflation in the number of scientific papers, together with growing
needs for scientific expertise, has prompted universities and research
institutions to search for readily available criteria for evaluating the
performance of scientists and their projects. From this perspective,
commercial bibliometric indicators appeared to be useful tools, their
apparent objectivity reflecting users' interest in the published results
through citations. However, the widespread use of these indicators has made
it increasingly clear that they can be diverted from the initial purpose of
their inventor, Eugene Garfield.
The shortcomings of the main journal indicators (impact factor, immediacy
index, cited and citing half-life) calculated by ISI have been widely
discussed. Bibliometric indicators give a neutral estimate of the
exploitation of scientific results, but scarcely take into account the
diversity of publication strategies and the requirements of different
disciplines. For example, the impact factors of generalist and specialist
journals differ considerably. The same holds for journals that publish
mainly clinical or review articles.
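For reference, the usual ISI definitions of these indicators (standard
formulas, given here for orientation rather than taken from the article
itself) can be written, for a journal in census year Y, as

    \mathrm{IF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}}
    \qquad
    \mathrm{II}_Y = \frac{C_Y(Y)}{N_Y}

where C_Y(y) is the number of citations received in year Y by the items the
journal published in year y, and N_y is the number of citable items it
published in year y. The cited half-life is the median age of the citations
the journal receives in year Y; the citing half-life is the median age of
the references its own articles cite.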
In biomedical research, journals can be classified into three broad
categories:
- top multidisciplinary journals with very high impact factors (Nature,
Science, PNAS...) that publish original analyses or major discoveries (but
sometimes also results that are questionable or based on superficial,
quickly forgotten analyses).
- field generalists devoted to a single discipline (The Journal of
Immunology, The Journal of Biological Chemistry, Endocrinology, Brain...),
or even more specialized journals (Cerebral Cortex, The Journal of Cognitive
Neuroscience, Neuroimage...).
- finally, low-impact or no-impact journals with a small readership
(Neurotoxicology, Human Movement Science...) that sometimes emerge from
rival schools of thought. They also include non-English-language journals.
Each category therefore occupies a special niche.
Clearly, science cannot advance by relying solely on hot articles.
Innovations often result from findings whose importance was not at first
apparent. The discovery of Helicobacter pylori pathogenesis by B. Marshall
illustrates this point. A first paper, although published in The Lancet, did
not attract much interest. The work only drew attention after the authors
swallowed a culture of Helicobacter to prove that the bacterium was the
cause of gastric ulceration, and the results were published in the
low-impact Medical Journal of Australia (IF < 2). This article received 780
citations, and the discovery revolutionized gastroenterology!
What is the reliability of these indicators? Several technical shortcomings
of the impact factor have been reported previously (inconsistency between
citing and cited items, too short a citation window, etc.). By its very mode
of calculation, the impact factor is not representative of the citation
rates of individual papers. Moreover, the impact factor of small journals is
quite sensitive to the appearance of a few highly cited papers (as
illustrated by Frontiers in Neuroendocrinology, which doubled its impact
factor in 2001 owing to two highly cited articles from 1999). There are also
numerous cases of the journal impact factor being manipulated to secure
better funding or to serve a journal's marketing policy.
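A small worked example (with hypothetical numbers, not those of the
Frontiers in Neuroendocrinology case) makes this small-journal sensitivity
concrete. For a journal publishing 20 citable items per year whose two
previous volumes together attract 60 citations in the census year,

    \mathrm{IF} = \frac{60}{20 + 20} = 1.5
    \qquad\text{versus}\qquad
    \mathrm{IF} = \frac{60 + 120}{20 + 20} = 4.5

once two review articles draw 120 extra citations. The same two articles
would move a journal publishing 200 items per year only from 1.5 to 1.8.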
Many examples demonstrate that one should heed the advice of the impact
factor's designer and not attribute excessive significance to his
indicator. The misuse of impact factors for the evaluation of individual
researchers presents a real danger for the development of science.
In contrast, the use of bibliometric indicators to assess the performance of
a large institute, a university or even a country seems methodologically
better justified. In the original article
(http://www.edk.fr/reserve/revues/ms_papier/e-docs/00/00/05/EA/document_article.md),
several figures illustrate these contentions.
The new landscape of electronic publishing: The interpretation of
bibliometric indicators may soon be left in the wake of upheavals in
scientific publishing. Internet access facilitates literature searches by
scientists and instills new practices of knowledge sharing. Moreover, the
advent of open access schemes challenges the traditional business model of
scientific publishing, in which publishers are often paid twice for article
distribution, first by the producers of knowledge in the form of page
charges and then again by the journals' subscribers. In this new publishing
model, the authors bear the publishing costs and readers have free access.
New initiatives have emerged in the biomedical field, foremost among them
BioMed Central and the Public Library of Science (PLoS).
In 1991, physicists were the first to launch a free preprint server. The
preprint model is less easy to implement in biology because of a greater
risk of being scooped. Biologists are therefore turning to new electronic
journals whose quality is guaranteed by peer review. These journals will be
indexed and, therefore, cited like traditional journals (this is already
widespread). They also cut costs and save time. The National Institutes of
Health in the United States and Signal Hill in Europe are working along the
same lines by setting up quality electronic journals. Reasonable production
costs and open access to electronic journals, as published by BioMed
Central, will perhaps enhance the diversity of publishers and the chances of
survival of small publishers, and above all will improve access to
information in developing countries. There are still many questions to
answer. Will electronic papers be cited more than traditional ones? The
existing evidence seems contradictory.
What impact will these changes have on reading habits and citation
practices, given that what is read will increasingly be limited to what is
easily available online? The growth of knowledge dissemination is prompting
the emergence of new indicators (such as access counts) that should put the
reference indicator, the impact factor, into perspective.
Conclusions and prospective: Because of the historical precedence of the ISI
database and the absence of other quantitative criteria, the impact factor
has gradually gained predominance as a key indicator in scientific
evaluation. But electronic publishing opens up new vistas in evaluation and
should lead to a refinement of information indicators and hence a more
apposite role for the impact factor. Ongoing changes are also likely to
affect the relations between science and society. For example, the logicist
analysis developed by archaeologists can unearth relatively reliable data
embedded in more speculative findings. By reducing their number, logicist
analysis should increase the transparency of information sources. New
indicators could also incorporate societal parameters into a more global
approach, such as all forms of sharing activities, knowledge dissemination,
and training. But these developments will only be effective if public
research institutions and national agencies play their part to the full,
notably by rapidly adapting their assessment procedures to the new order.
Addresses: Kordon C (reprint author), INSERM, Grp Bibliometrie, 101 Rue de
Tolbiac, F-75654 Paris Cedex 13, France
INSERM, Grp Bibliometrie, F-75654 Paris Cedex 13, France
Inst Necker, F-75015 Paris, France
E-mail Addresses: kordon at necker.fr
Publisher: MASSON EDITEUR, 21 Rue Camille Desmoulins, 92789
Issy-les-Moulineaux Cedex 9, France
IDS Number: 862ZW
ISSN: 0767-0974
CITED REFERENCES:
AMIN M, 2000, PERSPECTIVES PUBLISH, V1, P1.
ANDERSON K, PUBLISHING ONLINE ON.
BURKE J, 1995, CONNECTIONS.
GARDIN JC, 2002, EXPLANATORY POWER MO, P267.
GARFIELD E, 1989, CURR CONTENTS, V14, P3.
GARFIELD E, 1996, BRIT MED J, V313, P411.
GARFIELD E, 1998, UNFALLCHIRURG, V101, P413.
JACSO P, 2001, CORTEX, V37, P590.
LAWRENCE PA, 2003, NATURE, V422, P259.
LAWRENCE S, 2001, NATURE, V411, P521.
LEWISON G, 2002, SCIENTOMETRICS, V53, P229.
MARSHALL BJ, 1984, LANCET, V1, P1311.
MARSHALL BJ, 1985, MED J AUSTRALIA, V142, P436.