THES article on research access Friday June 6 2003

Stevan Harnad harnad at ECS.SOTON.AC.UK
Tue Jun 10 17:44:19 EDT 2003


The brief article (full text below) appeared on Friday, June 6, 2003,
in the Times Higher Education Supplement.
        Toll access: http://makeashorterlink.com/?Y5DE124D4
        Toll-free access to fuller versions, with links:
                http://www.ecs.soton.ac.uk/~harnad/Temp/thes.html  and
                http://www.ecs.soton.ac.uk/~harnad/Temp/theshort.html

        "Why I believe that all UK research output should be online"
                        Stevan Harnad

Unlike journalists or book authors, researchers receive no royalties or
fees for their writings. They write for "research impact", the sum of
all the effects of their work on the work of others and on the society
that funds it. So how research is read, used, cited and built on in
further research and applications needs to be measured.

One natural way to measure research impact would be to adopt the
approach of the web search engine Google, which gauges the importance
of a website by rank-ordering search results according to how many
other sites link to it: the more inbound links, the higher the rank.
This works amazingly well, but it is far too crude for measuring
research impact, which is about how much a paper is actually used by
other researchers. There is, however, a cousin of web links that
researchers have been using for decades as a measure of impact:
citations.
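
To make the link-counting idea concrete, here is a minimal sketch in
Python. The link graph is invented for illustration; real engines such
as Google use far more sophisticated weighting than a raw count.

    # Rank pages by how many other pages link to them:
    # the more inbound links, the higher the rank.
    from collections import Counter

    links = {                                  # page -> pages it links to
        "a.example": ["c.example"],
        "b.example": ["c.example", "d.example"],
        "d.example": ["c.example"],
    }

    inbound = Counter(t for targets in links.values() for t in targets)
    for page, n in inbound.most_common():      # highest-ranked first
        print(page, n)                         # c.example tops the list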

Citations reference the building blocks that a piece of research uses to
make its own contribution to knowledge. The more often a paper is used
as a building block, the higher its research impact. Citation counts are
powerful measures of impact. One study has shown that, in the field of
psychology, citation counts predict the outcome of the Research
Assessment Exercise (RAE) with an accuracy of more than 80 per cent.
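
As a hedged illustration of what such prediction amounts to, one could
rank departments by citation count and by RAE grade and compute the
Spearman rank correlation between the two rankings. All figures below
are invented, and this is a sketch, not the methodology of the study
cited.

    # Spearman rank correlation between citation counts and RAE grades
    # (ties ignored for simplicity; toy data only).
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    def spearman(xs, ys):
        rx, ry = ranks(xs), ranks(ys)
        n = len(xs)
        d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
        return 1 - 6 * d2 / (n * (n * n - 1))

    citations = [1200, 950, 400, 300, 150]     # per department (invented)
    rae_grades = [5, 5, 4, 3, 3]               # invented RAE grades

    print(spearman(citations, rae_grades))     # near 1 = strong agreement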

The RAE involves ranking all departments in all universities by their
research impact and then funding them accordingly. Yet it does not count
citations. Instead, it requires universities to spend vast amounts of
time compiling dossiers of all sorts of performance indicators. Teams
of assessors then expend still more time and effort evaluating and
ranking all those dossiers.

In many cases, citation counts alone would save at least 80 per cent of
all that time and effort. But the Google-like idea also suggests ways to
do even better, enriching citation counts with another measure of impact:
how often a paper is read. Web "hits" (downloads) predict citations that
will come later. To be used and cited, a paper first has to be accessed
and read. And downloads are also usage (and hence impact) measures in
their own right.
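
As a sketch of how downloads could serve as an early predictor, one
might fit a simple least-squares line to past (downloads, citations)
pairs and use it to forecast the eventual citations of a new paper.
The numbers below are invented; real download-citation correlations
are noisier than this.

    # Predict later citations from early downloads with a least-squares fit.
    def fit_line(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return slope, my - slope * mx          # slope, intercept

    downloads = [100, 250, 400, 800, 1600]     # first-year downloads (invented)
    citations = [2, 5, 9, 18, 35]              # citations two years on (invented)

    slope, intercept = fit_line(downloads, citations)
    print(slope * 500 + intercept)             # forecast for 500 downloads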

Google also uses "hubs" and "authorities" to weight link counts: not
all links are equal. It means more to be linked to by a high-link site
than by a low-link site. This has a direct analogue in citation
analysis, in which it means more to be cited by a Nobel laureate than
by a new postdoc.
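
The hubs-and-authorities idea is Jon Kleinberg's HITS algorithm. Here
is a minimal sketch of its iteration on an invented citation graph: a
paper's authority is the summed hub weight of the papers citing it,
and a paper's hub weight is the summed authority of the papers it
cites.

    import math

    cites = {                                  # paper -> papers it cites (invented)
        "p1": ["p3", "p4"],
        "p2": ["p3"],
        "p3": ["p4"],
        "p4": [],
    }
    papers = list(cites)
    hub = {p: 1.0 for p in papers}
    auth = {p: 1.0 for p in papers}

    for _ in range(50):                        # iterate to a fixed point
        auth = {p: sum(hub[q] for q in papers if p in cites[q]) for p in papers}
        hub = {p: sum(auth[q] for q in cites[p]) for p in papers}
        for scores in (auth, hub):             # normalise to unit length
            norm = math.sqrt(sum(v * v for v in scores.values())) or 1.0
            for p in scores:
                scores[p] /= norm

    print(sorted(auth.items(), key=lambda kv: -kv[1]))  # most authoritative first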

What is needed if this new world of webmetrics is to be mined and used
to encourage and reward research is not a four-yearly paperwork
exercise. All university research output should be continuously
accessible, and hence assessable, online: not only the references cited
but the full text. Computer programs can then extract a whole spectrum
of impact indicators, adjustable for any differences between
disciplines.
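
As one hedged illustration of what such a program might do, the sketch
below scans the full texts in a toy archive for bracketed reference
tags and counts how often each paper is cited by the others. All paper
identifiers and texts are invented; real extraction parses actual
reference lists, but the principle is the same.

    import re

    archive = {                                # paper id -> full text (invented)
        "harnad-2003": "... see [brody-2002] and [hitchcock-2002] ...",
        "brody-2002": "... building on [hitchcock-2002] ...",
        "hitchcock-2002": "...",
    }

    cited = {pid: 0 for pid in archive}
    for pid, text in archive.items():
        for ref in re.findall(r"\[([a-z-]+-\d{4})\]", text):
            if ref in cited and ref != pid:
                cited[ref] += 1

    print(cited)                               # brody cited once, hitchcock twice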

Nor are time savings, efficiency, power and richness the only, or even
the principal, benefits of these webmetric impact indicators. For the
citation counts of papers whose full texts are freely accessible on the
web are more than 300 per cent higher than those of papers that are
not. So all of UK research stands to increase its impact dramatically
by being put online. Every researcher should have a standardised
electronic CV, continuously updated, with all the RAE performance
indicators listed and every journal paper linked to its full text in
his or her university's online "eprint" archive. Webmetric assessment
engines can do all the rest.

At Southampton University, we have designed (free) software for
creating these RAE CVs and eprint archives, along with Citebase, a
webmetric engine that analyses citations and downloads. The only thing
still needed is a national policy of self-archiving all research
output, to enhance and assess its impact.

Details: http://www.ariadne.ac.uk/issue35/harnad


