[See part 1, part 2 and part 3 from a few months ago.]
I’m horrified, but not as surprised as I would like to be, by a new paper (Welch 2012) which analyses peer-reviewer recommendations for eight prestigious journals in the field of economics.
The principal finding is that the reviewers' recommendations were made up of 1/3 signal (i.e. consistent judgements on the quality of the manuscript) and 2/3 noise (i.e. randomness). Of that 2/3 noise, 1/3 was down to reviewer bias (some are nicer, some are nastier) and 2/3 seemed to be purely random. In other words, of the total variation in recommendations, about 2/9 was systematic reviewer bias and about 4/9 was pure noise.
And to quote directly from the study:
The bias measured by average generosity of the referee on other papers is about as important in predicting a referee’s recommendation as the opinion of another referee on the same paper.
What this means is that the likelihood of a submission being accepted depends more on a coin-toss than on how good your work is. Which seems to validate my earlier speculation that
The best analogy for our current system of pre-publication peer-review is that it’s a hazing ritual. It doesn’t exist because of any intrinsic value it has, and it certainly isn’t there for the benefit of the recipient. It’s basically a way to draw a line between In and Out. Something for the inductee to endure as a way of proving he’s made of the Right Stuff.
So: the principal value of peer-review is that it provides an opportunity for authors to demonstrate that they are prepared to undergo peer-review.
There’s more discussion of this over on the Dynamic Ecology blog.
It’s also well worth reading Brian McGill’s comment on that post: he quotes multiple reviewers of a manuscript that he submitted, completely contradicting each other. Yes, this is merely anecdote, not data; but I have to admit that it chimes with my own experience.
If this research is correct, and if it applies to science as well as it does to economics, then here is one horrible consequence: it suggests that the best way to get your papers into the high-impact journals that make a career (Science, Nature, etc.) is not necessarily to do great research, but just to be very persistent in submitting everything to them. Keep rolling the dice till you get a double six. I would hate to think that prestige is allocated, and fields are shaped, on that basis.
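To make the "keep rolling the dice" point concrete, here is a minimal simulation sketch. It assumes, purely for illustration, that each review decision is a weighted mix of manuscript quality and uniform noise in the rough 1/3-signal, 2/3-noise proportions reported above; the quality scores, threshold, and weighting are invented parameters, not figures from Welch (2012).

```python
import random

random.seed(42)

def accepted(quality, signal_weight=1/3, threshold=0.8):
    """One round of review: the decision mixes manuscript quality
    with noise (illustrative parameters, not from the paper)."""
    score = signal_weight * quality + (1 - signal_weight) * random.random()
    return score > threshold

def submissions_until_accepted(quality, max_tries=100):
    """How many times must this paper be submitted before one
    acceptance, given a noisy review process?"""
    for attempt in range(1, max_tries + 1):
        if accepted(quality):
            return attempt
    return max_tries

# Compare a mediocre paper (quality 0.5) with a strong one (quality 0.8).
trials = 10_000
mediocre = sum(submissions_until_accepted(0.5) for _ in range(trials)) / trials
strong = sum(submissions_until_accepted(0.8) for _ in range(trials)) / trials
print(f"mean submissions, mediocre paper: {mediocre:.1f}")
print(f"mean submissions, strong paper:   {strong:.1f}")
```

Under these made-up numbers, the mediocre paper still gets accepted eventually; it just takes a few times more submissions than the strong one. Persistence, not quality, does most of the work.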
I’d be really interested to know, from those of you who’ve had papers published in Science or Nature, roughly how many submissions you’ve made for each acceptance in those venues; and to what extent you feel that the ones that were accepted represent your best work.
2012-11-26 08:23:50
Source: http://svpow.com/2012/11/26/well-that-about-wraps-it-up-for-peer-review/
I have been collecting published complaints on this subject. Most people have NO IDEA how corrupt and ineffective this system really is.
http://scripturalphysics.org/4v4a/ADVPWR.html#Addendum
http://scripturalphysics.org/4v4a/ADVPWR.html#SuppressionScientificProcess