Past performance, peer review, and project selection: A case study in the social and behavioral sciences: PDF version
Peter van den Besselaar
p.vandenbesselaar at RATHENAU.NL
Tue Apr 7 02:35:40 EDT 2009
Administrative info for SIGMETRICS (for example unsubscribe): http://web.utk.edu/~gwhitney/sigmetrics.html
Dear David,
It is not so much about making a mistake, but about the de facto difference between the idea of competition as a means of selecting the best proposals/researchers and the observed results of the selection process.
We studied a variety of funding schemes, in particular the open competition and the career grants. Whereas in the former the quality of the proposal is dominant, the latter is about selecting the best and most promising (young) researchers. Consequently, one would expect a strong relation between success and referee scores in the open competition, and a strong relation between past performance and success in the career grants.
Interestingly, this is not the case. In both schemes, success within the top half cannot be explained in terms of recent past performance, nor in terms of referee scores. The council is able to remove the weaker proposals and researchers, but not to select the best out of the large group of good ones.
I do not consider this as a mistake, but as a de facto gap between what councils do (removing the tail of the distribution) and what they claim to do (selecting the best - which may be an impossible task anyhow).
In other words, past performance and review scores do enter into the decision making process but do not have systematic influence on the decision taken.
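The pattern described above (the council removes the tail of the distribution, but decisions within the top half are uncorrelated with past performance) can be illustrated with a small sketch. The data, function names, and the point-biserial correlation used here are illustrative assumptions, not the actual method or data of the study:

```python
# Hypothetical sketch: correlate a continuous past-performance score with a
# 0/1 funding decision, both over all applicants and within the top half.
# The scores and decisions below are invented to mimic the reported pattern.
from statistics import mean, pstdev

def point_biserial(scores, funded):
    """Correlation between a continuous score and a 0/1 funding decision."""
    f = [1.0 if x else 0.0 for x in funded]
    ms, mf = mean(scores), mean(f)
    cov = mean((s - ms) * (y - mf) for s, y in zip(scores, f))
    denom = pstdev(scores) * pstdev(f)
    return cov / denom if denom else 0.0

# Illustrative pool: the weakest applicants are all rejected (the "tail" is
# removed), but within the stronger half funding is unrelated to the score.
scores = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
funded = [0, 0, 0, 0, 0, 1, 0, 1, 0, 1]

overall = point_biserial(scores, funded)
median = sorted(scores)[len(scores) // 2]
top = [(s, f) for s, f in zip(scores, funded) if s >= median]
within_top = point_biserial([s for s, _ in top], [f for _, f in top])

print(f"overall r = {overall:.2f}, within top half r = {within_top:.2f}")
# -> overall r = 0.57, within top half r = 0.00
```

On this toy data the overall correlation is clearly positive (because the tail is rejected), while within the top half it vanishes, which is exactly the kind of result that is consistent with removing the weakest rather than selecting the best.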
This has a few implications. As obtaining grants from the research council heavily influences the career of researchers, two problems have to be solved:
- is enough funding available so that good researchers have their grant applications approved on a regular basis?
- is the system open enough, or does someone's position in the network (of decision-makers) explain success in grant applications?
I am currently doing a project that tries to answer the second question.
Best regards
Peter
-----
Peter van den Besselaar
----------------------------
Professor, Head of Department
Rathenau Instituut
Dept. Science System Assessment
________________________________
From: David Wojick
Date: Tue, 7 Apr 2009 00:44:28 +0200
To: SIGMETRICS at LISTSERV.UTK.EDU<SIGMETRICS at LISTSERV.UTK.EDU>
Subject: Re: [SIGMETRICS] Past performance, peer review, and project selection: A case study in the social and behavioral sciences: PDF version
Dear Jim,
I never meant to suggest that past performance plays no role in proposal selection, for of course it does, and it should. That is why I chose the term "more or less." But this role may be either negative or positive, for various reasons, in different cases. This study seems to suggest that it is not strongly positive, while the authors seem to suggest that there is something wrong with this result. That is my puzzlement. The finding may be important but the conclusion seems strange and disconnected from the findings.
Cheers, David
David Wojick, Ph.D.
391 Flickertail Lane
Star Tannery VA USA
http://www.osti.gov/innovation/
Apr 6, 2009 06:39:11 AM, SIGMETRICS at listserv.utk.edu wrote:
Re - David's point that research proposals should be judged on their merit, more or less independently of past performance.
Some (most?) research councils in the UK require applicants to list their previous applications to them, and their success or not at obtaining a grant from them - so it is hard not to imagine that previous performance plays a part in the refereeing process - undesirable as this might be!
Jim
James Hartley
School of Psychology
Keele University
Staffordshire
ST5 5BG
UK
j.hartley at psy.keele.ac.uk<mailto:j.hartley at psy.keele.ac.uk>
http://www.keele.ac.uk/depts/ps/people/JHartley/index.htm
----- Original Message -----
From: David Wojick<mailto:dwojick at HUGHES.NET>
To: SIGMETRICS at listserv.utk.edu<mailto:SIGMETRICS at listserv.utk.edu>
Sent: Sunday, April 05, 2009 10:26 PM
Subject: Re: [SIGMETRICS] Past performance, peer review, and project selection: A case study in the social and behavioral sciences: PDF version
Dear Peter van den Besselaar,
This is very interesting, but after reading only the abstract I find it puzzling. There seems to be an assumption that past performance should be a significant factor in the success of proposals. One hopes that proposals are selected on their own merit, more or less independently of past performance. If so then the empirical question is whether past performance influences the quality of present proposals, not whether it influences their selection.
Moreover, one should not be surprised to find that past performance does not correlate with present quality, for a variety of reasons. For example, past performance may be based on important discoveries that only occur once or a few times for a given researcher, so new proposals weaken with time. Or the focus of science may shift so that past discoveries, and their performers, are no longer relevant. In other words, the dynamics of science might tend to work against a correlation between past performance and proposal selection. If so then the lack of such correlation is not a criticism of the proposal selection body. Past performance and present quality of proposals are simply independent variables. But perhaps I misunderstand the abstract.
Cheers, David
David Wojick, Ph.D.
391 Flickertail Lane
Star Tannery VA USA
http://www.osti.gov/innovation/
Apr 5, 2009 04:23:27 PM, SIGMETRICS at listserv.utk.edu wrote:
Past performance, peer review, and project selection: A case study in the social and behavioral sciences
Peter van den Besselaar & Loet Leydesdorff
Abstract
Does past performance influence success in grant applications? In this study we test whether the grant allocation decisions of the Netherlands Research Council for the Economic and Social Sciences correlate with the past performances of the applicants in terms of publications and citations, and with the results of the peer review process organized by the Council. We show that the Council is successful in distinguishing grant applicants with above-average performance from those with below-average performance, but within the former group no correlation could be found between past performance and receiving a grant. When comparing the best performing researchers who were denied funding with the group of researchers who received it, the rejected researchers significantly outperformed the funded ones. Furthermore, the best rejected proposals score on average as high on the outcomes of the peer review process as the accepted proposals. Finally, we found that the Council under study successfully corrected for gender effects during the selection process. We explain why these findings may be more general than for this case only. However, if research councils are not able to select the ‘best’ researchers, perhaps they should reconsider their mission. In a final section with policy implications, we discuss the role of research councils at the level of the science system in terms of variation, innovation, and quality control.
PDF version available at:
http://home.medewerker.uva.nl/p.a.a.vandenbesselaar/bestanden/20090327%20magw.pdf
Peter van den Besselaar
---------------------------------------
professor, head of department
Address:
Rathenau Instituut
Dpt. Science System Assessment
PO Box 95366, 2509 CJ Den Haag, The Netherlands
email: p.vandenbesselaar at rathenau.nl
-------------- next part --------------