Scientific journals such as Nature, Science, and PNAS (Proceedings of the National Academy of Sciences) have come to hold a near-monopoly on publishing cutting-edge research across scientific disciplines. Journal submissions face stringent criteria on citations, abstracts, research notes, extensive methods sections, algorithms used, analytical tools, copyrights and reprints - all of which are meant to reinforce the 'peer-review' system. The surge in scientific development across the developing world relies on this system, with researchers in India and Israel among the biggest hopefuls for submissions to the field's most selective journals. Even so, on the 22nd of August, two submissions from the Indian Institute of Technology (IIT) Kanpur were retracted by the journal Biotechnology Advances on the grounds that some material had been plagiarized from, of all the credible sources, Wikipedia. This is not the first time Indian universities have faced plagiarism allegations. Clearly, there is a discrepancy in scientific standards.
This is not helped by the fact that standards never remain static. Scientific journals, though they have never been lax about plagiarism, have changing expectations. In recent years, for instance, spelling out laboratory methods down to the last microliter has become a great deal more necessary than before. A July 2006 editorial in Nature noted that far too few submissions laid out exactly what their reagents were, ascribing the concern to new technologies outstripping old standards. For the editorial, this was especially pertinent to new techniques involving RNAi, a revolutionary development of the past decade that allows gene expression to be selectively silenced. Where the technology behind a result was once mere background to a publication's purpose, explaining it is now the only means by which scientists can truly check one another's work. As a result, long appendices on the technologies used in a lab are now commonplace accompaniments to research papers.
Of course, there are far too many scientists in the world, too few journals, and a glut of submissions, so more obviously transparent experimental write-ups are simply easier for editors and reviewers to deal with. This is great, because it establishes a fairer system for all. PNAS, for instance, recently got rid of backdoor submissions, in which scientists bypassed the ordinary route by sending manuscripts through friends and colleagues who were Academy members. But it isn't all peaches and cream - the system is also still extremely elitist and often counter-productive. Opining in Nature in 2003, Peter A. Lawrence wrote:
"Scientists are increasingly desperate to publish in a few top journals and are wasting time and energy manipulating their manuscripts and courting editors. These trends are fuelled by the increasing pressure in biomedical science to publish in the leading journals. Even our language reflects this obsession — we say that Jim Jargon did well as a graduate student because he published a "Cell paper", illustrating that we now consider the journal to be more important than the scientific message. If we publish in a top journal we have arrived, if we don't we haven't."
Much of this is understandable once one grasps the iron grip that the top journals, and the largely Western universities represented in them, hold over scientific intellectualism elsewhere. As journals' expectations keep changing and the global scientific community adapts to them, it must also be recognized that the rise of scientific communities elsewhere will almost certainly bring an erosion of that power and a re-evaluation of standards. This could be for the better or for the worse. Perhaps we would be better off arguing from the inductive method.