Past performance, peer review, and project selection: A case study in the social and behavioral sciences: PDF version

Peter van den Besselaar p.vandenbesselaar at RATHENAU.NL
Mon Apr 6 03:06:25 EDT 2009


Administrative info for SIGMETRICS (for example unsubscribe): http://web.utk.edu/~gwhitney/sigmetrics.html
Dear David,

The paper is on my website (see url in the mail). Some clarification:

- we take into account only recent past performance, not whole-lifetime performance. One may expect recent performance to relate to the quality of the proposal and to the decision about the proposal. In the case we studied, the funding body does take into account both the quality of the proposal and the quality of the researcher.

- we show that past performance and quality of the proposal are independent (low correlation).

The question we try to answer is: do recent past performance and quality of the proposal (referee scores) explain success? The findings:

1. Researchers with weak past performance and low referee scores are generally unsuccessful.

2. Comparing the successful applicants with an equally large group of the best-scoring rejected applicants shows:
- no difference in referee scores or citations received between the two groups.
- the rejected group has on average more publications in the recent past.
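The two checks above (a correlation between past performance and referee scores, and a comparison of the funded group with an equally large group of best-scoring rejected applicants) can be sketched roughly as follows. This is a purely illustrative sketch on synthetic data; the field names ("funded", "referee", "pubs") are hypothetical and not taken from the study's actual dataset.

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def compare_groups(applicants):
    """Compare funded applicants with the equally large group of
    best-scoring rejected applicants (by referee score).

    applicants: list of dicts with hypothetical keys
    'funded' (bool), 'referee' (score), 'pubs' (recent publications).
    Returns the funded-minus-rejected gap in mean referee score
    and mean publication count.
    """
    funded = [a for a in applicants if a["funded"]]
    best_rejected = sorted(
        (a for a in applicants if not a["funded"]),
        key=lambda a: a["referee"], reverse=True,
    )[:len(funded)]
    mean = lambda group, key: sum(a[key] for a in group) / len(group)
    return {
        "referee_gap": mean(funded, "referee") - mean(best_rejected, "referee"),
        "pubs_gap": mean(funded, "pubs") - mean(best_rejected, "pubs"),
    }
```

On data shaped like the study's findings, `referee_gap` would come out near zero (no difference in referee scores between the groups) while `pubs_gap` would be negative (the best rejected applicants publish more).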

Hope this clarifies.

Best regards,

Peter


Peter van den Besselaar
----------------------------
Professor, Head of Department
Rathenau Instituut
Dept. Science System Assessment

________________________________
From: David Wojick
Date: Sun, 5 Apr 2009 23:26:11 +0200
To: SIGMETRICS at LISTSERV.UTK.EDU<SIGMETRICS at LISTSERV.UTK.EDU>
Subject: Re: [SIGMETRICS] Past performance, peer review, and project selection: A case study in the social and behavioral sciences: PDF version

Dear Peter van den Besselaar,

This is very interesting, but after reading only the abstract I find it puzzling. There seems to be an assumption that past performance should be a significant factor in the success of proposals. One hopes that proposals are selected on their own merit, more or less independently of past performance. If so then the empirical question is whether past performance influences the quality of present proposals, not whether it influences their selection.

Moreover, one should not be surprised to find that past performance does not correlate with present quality, for a variety of reasons. For example, past performance may be based on important discoveries that only occur once or a few times for a given researcher, so new proposals weaken with time. Or the focus of science may shift so that past discoveries, and their performers, are no longer relevant. In other words, the dynamics of science might tend to work against a correlation between past performance and proposal selection. If so then the lack of such correlation is not a criticism of the proposal selection body. Past performance and present quality of proposals are simply independent variables. But perhaps I misunderstand the abstract.

Cheers, David

David Wojick, Ph.D.
391 Flickertail Lane
Star Tannery VA USA
http://www.osti.gov/innovation/

Apr 5, 2009 04:23:27 PM, SIGMETRICS at listserv.utk.edu wrote:


Past performance, peer review, and project selection: A case study in the social and behavioral sciences

Peter van den Besselaar & Loet Leydesdorff

Abstract
Does past performance influence success in grant applications? In this study we test whether the grant allocation decisions of the Netherlands Research Council for the Economic and Social Sciences correlate with the past performances of the applicants in terms of publications and citations, and with the results of the peer review process organized by the Council. We show that the Council is successful in distinguishing grant applicants with above-average performance from those with below-average performance, but within the former group no correlation could be found between past performance and receiving a grant. When comparing the best performing researchers who were denied funding with the group of researchers who received it, the rejected researchers significantly outperformed the funded ones. Furthermore, the best rejected proposals score on average as high on the outcomes of the peer review process as the accepted proposals. Finally, we found that the Council under study!
 successfully corrected for gender effects during the selection process. We explain why these findings may be more general than for this case only. However, if research councils are not able to select the ‘best’ researchers, perhaps they should reconsider their mission. In a final section with policy implications, we discuss the role of research councils at the level of the science system in terms of variation, innovation, and quality control.

PDF version available at:
http://home.medewerker.uva.nl/p.a.a.vandenbesselaar/bestanden/20090327%20magw.pdf


Peter van den Besselaar
---------------------------------------
professor, head of department

Address:
Rathenau Instituut
Dpt. Science System Assessment
PO Box 95366, 2509 CJ Den Haag, The Netherlands

email: p.vandenbesselaar at rathenau.nl

