Tri States Public Radio Staff
Fri October 4, 2013
Open-Access Journals Hit By Journalist's Sting
Originally published on Fri October 4, 2013 9:36 am
DAVID GREENE, HOST:
This is MORNING EDITION from NPR News. Good morning, I'm David Greene.
RENEE MONTAGNE, HOST:
And I'm Renee Montagne.
Publication is the coin of the realm for scientists. It's how they make their careers.
GREENE: Traditionally, it's been hard to get research published, at least until the last decade and the explosion of open-access journals online.
MONTAGNE: Unlike Nature or the Journal of the American Medical Association, open access journals don't charge readers. Instead, they may charge the researchers they publish - sometimes thousands of dollars.
GREENE: And there's another important difference. Often open-access journals don't do what's known as peer review.
MONTAGNE: Reporter John Bohannon, of Science magazine, found out just how casual the vetting process can be with Web journals, compared to how it's long been with peer-reviewed journals.
JOHN BOHANNON: The basic idea is if you're a journal editor, a paper submission comes in. First, you do a smell test and as long as it looks reasonable, and has the potential to be important enough or interesting enough to be published in your journal, then you send it out to peer reviewers - that's at least one, sometimes as many as five or six independent scientists, and they remain anonymous. And those scientists go to town on that paper.
They try and find anything that could be wrong with it. They nitpick wording. They question the underlying assumptions. And they send all this feedback to the editor, who then sends it on to the scientist anonymously. And it's a back-and-forth. The result, everyone agrees, is that papers come out better because of it.
MONTAGNE: Right, so given the rise of open-access journals, you decided to do an experiment. That is, send in for publication a fake experiment. And, as you describe it, it is a sting operation.
BOHANNON: That's right. So I created a paper that purportedly tests the effect of a chemical on cancer cells. And that chemical was extracted from a lichen; one of those scrubby, moss- like, little things that grow on rocks. And then I created a computer program to scale this up. So I created hundreds of very similar but fake papers from fake African scientists.
MONTAGNE: And then, the key to all of this is that you wrote what amounted to a hopelessly flawed experiment.
BOHANNON: That's right. It was a paper which was credible. It looked like a real paper, not a joke. But if you peer-reviewed it, you would within five minutes see that it was so flawed that it could never be published.
MONTAGNE: Tell me one red flag.
BOHANNON: OK, if you're claiming to have evidence that some chemical is a promising new drug, well, you better have tested it at least on healthy cells. Because even if you show that it hurts cancer cells, how do you know what you have there isn't just a poison? So that's one thing that's just awful about the paper, is that it doesn't compare cancer cells to healthy cells at all.
But another one is right there in the first data graph, which shows this chemical being tested on cancer cells across a huge range of concentrations. And at every single one of those doses, it has the same moderate effect on the cancer cells. And the graph claims to show a dose-dependent effect. Now, any real scientist who's reading it as a peer reviewer will say, hang on, that is the opposite of a dose-dependent effect.
MONTAGNE: And you turned it in to how many journals? And how many approved it and were going to publish it?
BOHANNON: I submitted it to over 300 journals. And, in fact, just this morning - an hour before we're talking - one more acceptance rolled in.
MONTAGNE: And that brought your number of acceptances to what?
BOHANNON: One hundred and fifty-eight acceptances versus 98 rejections.
MONTAGNE: We should say that the journal where you work, Science, does do peer review. These open-access journals also say they do peer review, but what you're finding is that some don't - or at least some don't do it well. What is the takeaway?
BOHANNON: The takeaway shouldn't be that open access is broken and not worth trying. Open access is great and everyone believes that. It's just a question of how to implement it. I mean, there were a lot of bad journals revealed by this experiment. But on the other hand, there were a lot of good ones. I was so happy, for example, to get the rejection letter from a Hindawi journal. Hindawi is this huge operation in Cairo, and has been criticized for being low quality and spamming scientists. And yet they provided great peer review.
MONTAGNE: Also, researchers who submit their work are not necessarily scammers. Some would really benefit from valuable peer review and they're not getting it, which is quite unfair in that respect.
BOHANNON: That's right. Yeah, absolutely, some of the victims here are surely the scientists who have paid good money to have their work peer-reviewed and published in these journals. Many of them, probably naively, think that they're taking part in a scientific enterprise and, in fact, they're just getting duped.
But my hope is that now that we have a map of at least some of the good versus bad journals, scientists can submit their papers to one of the good guys and for the same amount of money, get the real deal.
MONTAGNE: Thank you very much for joining us.
BOHANNON: Thank you, Renee.
MONTAGNE: John Bohannon is a molecular biologist and visiting researcher at Harvard University. He's also a correspondent for Science magazine. And his article, "Who's Afraid of Peer Review?," is in the current issue. Transcript provided by NPR, Copyright NPR.