Open Research Metrics (fwd)
Stevan Harnad
harnad at ECS.SOTON.AC.UK
Sun Dec 10 13:24:33 EST 2006
---------- Forwarded message ----------
Date: Sun, 10 Dec 2006 08:34:32 -0500
From: Andrew McCallum <mccallum at CS.UMASS.EDU>
To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM at LISTSERVER.SIGMAXI.ORG
Subject: Re: Open Research Metrics
On Research Metrics:
I also believe that current standard impact measures are inadequate,
and that Open Access will enable many new kinds of impact measures.
In a 2006 JCDL paper, we explored several new kinds of impact
measures made possible by access to the full text of papers and by
statistical analysis of "topics".
In the paper we use a new topic model that leverages n-grams to
discover interpretable, fine-grained topics in over a million
research papers. We then use these topic divisions, together with
automated citation analysis, to extend three existing bibliometric
impact measures and to create three new ones: Topical Diversity,
Topical Transfer, and Topical Precedence.
http://www.cs.umass.edu/~mccallum/papers/impact-jcdl06.pdf
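The three new measures are not defined in this message; as a rough illustration only (my own sketch, not the paper's actual formulation), "Topical Diversity" can be approximated as the entropy of the topic distribution over a paper's citing papers:

```python
import math
from collections import Counter

def topical_diversity(citing_topics):
    """Entropy of the topic distribution over citing papers.

    citing_topics: a list of topic labels, one per citing paper.
    Higher entropy means the paper's impact is spread across
    more sub-fields.
    """
    counts = Counter(citing_topics)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total)
                for n in counts.values())

# A paper cited evenly by three sub-fields scores higher than
# one cited only within its own sub-field.
broad = topical_diversity(["nlp", "ir", "ml", "nlp", "ir", "ml"])
narrow = topical_diversity(["nlp"] * 6)
print(broad, narrow)  # broad == log2(3) ~ 1.585; narrow == 0.0
```

The same citing-paper topic distributions could, in principle, feed cross-topic measures such as Topical Transfer; see the paper itself for the real definitions.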
Feedback welcome!
Best,
Andrew
"Bibliometric Impact Measures Leveraging Topic Analysis."
Gideon Mann, David Mimno and Andrew McCallum.
Joint Conference on Digital Libraries (JCDL), 2006.
Abstract:
Measurements of the impact and history of research literature provide
a useful complement to scientific digital library collections.
Bibliometric indicators have been extensively studied, mostly in the
context of journals. However, journal-based metrics poorly capture
topical distinctions in fast-moving fields, and are increasingly
problematic with the rise of open-access publishing. Recent
developments in latent topic models have produced promising results
for automatic sub-field discovery. The fine-grained, faceted topics
produced by such models provide a clearer view of the topical
divisions of a body of research literature and the interactions
between those divisions. We demonstrate the usefulness of topic
models in measuring impact by applying a new phrase-based topic
discovery model to a collection of 300,000 Computer Science
publications, collected by the Rexa automatic citation indexing system.
On Dec 9, 2006, at 6:37 AM, Stevan Harnad wrote:
> On Fri, 8 Dec 2006, Peter Suber wrote:
>
>> If the metrics have a stronger OA connection, can you say something
>> short (by email or on the blog) that I could quote for readers who
>> aren't clued in, esp. readers outside the UK?
>
> Dear Peter,
>
> Sure (and I'll blog this too, hyperlinked):
>
> (1) In the UK (Research Assessment Exercise, RAE) and Australia
> (Research Quality Framework, RQF) all researchers and institutions
> are evaluated for "top-sliced" funding, over and above competitive
> research proposals.
>
> (2) Everywhere in the world, researchers and research institutions
> have research performance evaluations, on which careers/salaries,
> research funding and institutional/departmental ratings depend.
>
> (3) There is now a natural synergy growing between OA self-archiving,
> Institutional Repositories (IRs), OA self-archiving mandates, and the
> online "metrics" toward which both the RAE/RQF and research
> evaluation in general are moving.
>
> (4) Each institution's IR is the natural place from which to derive
> and display research performance indicators: publication counts,
> citation counts, download counts, and many new metrics, rich and
> diverse ones, that will be mined from the OA corpus, making research
> evaluation much more open, sensitive to diversity, adapted to each
> discipline, predictive, and equitable.
>
> (5) OA self-archiving not only allows performance indicators
> (metrics) to be collected and displayed, and new metrics to be
> developed; OA also enhances the metrics themselves (research
> impact), both competitively (OA vs. non-OA) and absolutely (the
> Quality Advantage: OA benefits the best work the most; and the Early
> Advantage), as well as making possible the data-mining of the OA
> corpus for research purposes. (Research Evaluation, Research
> Navigation, and Research Data-Mining are also very closely related.)
>
> (6) This powerful and promising synergy between Open Research and
> Open Metrics is hence also a strong incentive for institutional and
> funder OA mandates, which will in turn hasten 100% OA: their
> connection needs to be made clear, and the message needs to be
> spread to researchers, their institutions, and their funders.
>
> Best wishes,
>
> Stevan
>
> PS Needless to say, closed, internal, non-displayed metrics are also
> feasible, where appropriate.
>