Vol. 16 nº 2 - Apr/May/Jun 2022
Views & Reviews Pages 129 to 134

Individual integrity and public morality in scientific publishing

Authors: Sergio Della-Sala


Descriptors: Open Access Publishing; Predatory Publishers; Plan-S; Integrity; Scientific Publishing.

Science and science reporting are under threat. Knowingly or not, researchers and clinicians are part of this debacle. This is not due so much to the notorious replication crisis as to our acceptance of lowering common morality for personal gain, including the widespread, deplorable phenomenon of predatory publishing. Rather than fiercely countering this loathsome practice, academics are accepting, and often supporting, a masquerade of a solution: paying several thousand dollars to publish each of their own papers. This new policy will create a disparity between richer and poorer disciplines; will concentrate publishing even more in the hands of large, rich, Western institutions, also penalising younger researchers; will kill observational studies and exploratory research; and will make the dissemination of science depend more on finances than on quality. This article calls for the full awareness of the academic community of the risks of the current situation in scientific publishing.



Science and science reporting are under threat. Knowingly or not, researchers and clinicians are part of this debacle. This is not due so much to the notorious replication crisis1 as to our acceptance of lowering common morality for personal gains2. This article aims to urge our community to raise its morality bar and rescue itself from the abyss of ridicule towards which we are heading at full speed. I first list behaviours that we should all avoid, and rules that we should all abide by, and then discuss in more detail the current situation in publishing, which calls for the full awareness of the academic community.


Aim at good science, not at “good results”

Chris Chambers, describing the current methodological sins hampering the thoroughness of scientific publications, laid out a forthright manifesto on how to avoid the pitfalls of favouring “good results” over good science3. The most frequent of such drawbacks are summarised in Table 1; see also the guidance offered by the Committee on Publication Ethics (COPE), the International Committee of Medical Journal Editors (ICMJE), the NIH Office of Research Integrity (ORI), the Guidelines for Responsible Conduct Regarding Scientific Communication of the Society for Neuroscience (SfN), or the Publication Practices & Responsible Authorship guidance of the American Psychological Association (APA).

Data of published papers should be posted, always

Anyone skimming through the daily list of dubious papers highlighted by the laudable Retraction Watch should be alarmed by the sheer volume of blunders and fabrications tarnishing scientific articles. One way to counter this dangerous drift is to require that all data on which a report is based be made available for scrutiny, re-analysis and criticism. Authors should honour this golden rule, reviewers should demand to see the data, editors should insist that the data be transparent, and publishers should assist with their archiving in accessible repositories.

Honorary authorship should be avoided

Too often, names are added to the list of authors even if their contribution does not qualify them for authorship. An author of a scientific paper is anyone who contributed substantially to the study by designing it, collecting a considerable amount of the data reported, or analysing or interpreting them. All authors are accountable for the content of the manuscript they sign. Anybody else associated with the study should be acknowledged for their specific work, but not listed as an author; see, for instance, the recommendations of the ICMJE or the criteria laid out by CRediT (Contributor Roles Taxonomy). In particular, authorship should not be offered as an honorary homage to someone in a position of power, nor should it be used as a bargaining chip to obtain career or other advantages. In short, an author is someone who actively took part in the study, practically or conceptually; hence, for example, offering access to a group of patients does not qualify the clinician as an author (although there is some ambiguity as to what qualifies as “resources” in CRediT). Moreover, if used thoroughly and systematically, CRediT may also provide a mechanism to reveal any unbalanced division of tasks and workload due to gender or other personal demographics of the researchers involved in the study11.

Ethical approval should be detailed

Ethics is relevant. Which ethics body approved the reported study, or permitted the report of the observation, should always be made explicit in the manuscript. Avoid the cliché of simply parroting the mantra phrase, “The study received ethical approval and was conducted according to the Declaration of Helsinki.” Be specific, and consider ethics an integral part of the study process12, not a bureaucratic hurdle to overcome13.

Dissemination should be responsible

Scientists and clinicians blame journalists for poor science reporting in the media. However, exaggeration in the news is often due to the researchers bragging about their findings in academic press releases14. Researchers should publicise their results responsibly, showing their interest without embellishing them or overstating their reach. This becomes particularly relevant when promoting the outcome of an individual study, which has not been vetted by other laboratories or thoroughly replicated. Science should be disseminated only when it is based on solid evidence15, and reports should be comprehensible without trying too hard to be smart or sensational16.

Peer review should be protected

The idea generally held about reviewing is that it would benefit from an overhaul, changing its status from a quasi-hobby to a mandatory duty of each academic. Reviewing (and editorial) time should figure in the workload models of universities, it should be taught formally to early career researchers, and possibly it should be financially rewarding for the individuals or their institutions17.

The process of peer reviewing is not perfect, does not prevent despicable errors, and does not impede very bad research from entering the literature18. Yet, if carried out conscientiously, it is the best quality control system we have for the scientific literature19. The process is as good as we make it. All researchers should do their share in reviewing papers in their field and should do so according to the golden rule that, when wearing the reviewer’s hat, they should behave as they would like others to behave when they are at the receiving end (wearing the author’s hat). Hence, reviewers should offer their feedback reasonably fast20 and should use a polite tone, be honest in their appraisal, and clear in their requests21.

The scientific community should resist the pressure to shorten reviewing times to deadlines incompatible with thoroughness. In his 1978 Commencement Address at Harvard, Aleksandr Solzhenitsyn stated that “hastiness and superficiality are the psychic diseases of the 20th century, and more than anywhere else this disease is reflected in the press”22. This warning duly applies to the current urgency, imposed even by serious publishers, of carrying out editorial duties fast rather than well. This urgency is imposed to compete with the speed at which low-quality outlets are willing to accept papers for publication, often with no questions asked, provided their fees are paid (see below). Genuine publishers should ring-fence quality instead of entering this deranged marketplace.

Indeed, the publishing arena is now marred by a deluge of below-par publications in unscrupulous outlets. Let us retrace our steps to analyse how we got here.


Plan S

At the end of 2018, the cOAlition S initiative launched Plan S, which establishes the principle that academic journals should gradually increase the quota of papers they publish in Open Access (OA), starting at the beginning of 2022. The outcome of this policy is that publishing each single academic paper will be charged several thousand dollars. Individual researchers, the agencies funding their work, or the institutions where they operate will have to bear these expenses. The reaction of the academic world has not been to fight this decision; rather, individual universities, institutions, learned societies and even individual research groups are trying to navigate the system by establishing bilateral deals with the publishing houses, allowing their affiliated researchers to publish their papers at discounted fees. These deals involve packages including a fixed number of papers that each group will be allowed to publish with a particular publisher at no extra cost. The benefit is that all published material will be made available to everyone in OA.

However, the new policy will also carry severe consequences: (1) Institutions will not cover the entire costs of publications, part of which will have to be met by individual researchers, creating a disparity between richer and poorer disciplines23; (2) Publishing rights will be concentrated even more in the hands of large, rich, Western academic institutions, excluding researchers who carry out their studies in less privileged institutions around the world; (3) Observational studies, single cases, exploratory research, serendipitous findings, or any study not fully funded by granting bodies, as well as position papers, viewpoints, discussions, and commentaries, will be discouraged; (4) Younger researchers with less access to large amounts of financial support for their research will be penalised, forcing them to team up with wealthier colleagues to see their results published24; and (5) Publishing will depend more on the availability of finances than on the quality of the work, distorting the concept of merit for careers, appointments and promotions.

This proves to be a typical case of the so-called Cobra Effect, which bedevils well-intended policies that fail to properly consider their unintended consequences.

The cobra effect

The cobra effect loosely refers to the unintended and unforeseen consequences of policies designed in good faith and with a view to bettering the current situation25. The term was originally introduced by Siebert26 to deride the unpredicted effects of poorly thought-through financial incentives. It is based on a likely apocryphal anecdote about an attempt by the British Governor of colonial India to reduce the number of snakes roaming the streets of New Delhi. He ruled that any citizen bringing a dead cobra to the city hall would get a cash reward. In no time, the streets were cleared of snakes. However, people liked the relatively easy money and began to breed cobras in their backyards, only to kill them and cash them in. The British authorities felt ridiculed and abruptly stopped any reward for serpents’ carcasses. Indians did not know what to do with the cobras in their garden cages and freed them. The outcome was that there were many more cobras gliding through the streets of New Delhi than when the original rule had been introduced. The unforeseen consequences of OA resonate with the Cobra Effect.

Open Access

Publishing in OA is on the increase. The lofty founding principles of OA were to counter the power, and fight the revenues, of established private publishing houses by making freely available all papers reporting studies funded by public money27. Initially, the idea was based both on the naïve notion that online publishing would not cost much, and on the hope that such costs could be sustained by international agencies sponsoring scientific publication worldwide, like modern-day Maecenases.

However, there is no such thing as a free lunch, and it soon became clear that the authors themselves had to fork out the expenses of OA, hence draining resources from the research process itself. Moreover, far from decreasing the market dominance of the established publishing companies, OA boosted their income by adding authors’ publishing fees to subscriptions (the so-called “hybrid journal” format), whilst increasing academic costs. The most harmful outcome of OA, though, has been paving the way to predatory publishing.

Predatory publications

Predatory publishing is a pandemic that has infected science dissemination28. It is based on the OA model, whereby authors pay for the privilege of seeing their work in print, but, unlike the original OA vision, without the essential quality controls (Figure 1). These journals do not run proper peer-review processes, nor do they exert sound editorial checking29. The model is very much like that of the vanity press: pay-to-publish. Anything gets published, as hilariously demonstrated by the wonderfully goliardic article by two American scientists who, fed up with the constant email solicitations to submit their work to one or another of these journals, eventually submitted a paper composed only of the sentence “Get me off your f***ing mailing list” repeated for 10 pages and illustrated by figures and graphs using the same text30. The roll-call of such imaginative hoaxes is long and ever increasing (see “List of scholarly publishing stings” in Wikipedia), proving beyond doubt that hundreds of journals operate below morally acceptable quality standards, making a mockery of serious science.

Figure 1. The cover illustration of Cortex, vol. 90, May 2017, drawn by Dario Battisti, depicting “The circus of predatory publishing.” Available from: https://www.sciencedirect.com/science/article/pii/S0010945217301090. Accessed on: Jan 18, 2022.

Yet, blinded by their hubris31, unashamed of taking shortcuts to boost their CVs32, or allured by pecuniary gains33, researchers fall prey to, or collude in, these scams. They do not notice, or they ignore, the flimsiness of the façades supporting these enterprises34, including fake impact factors35 and fake (or incompetent) editorial boards36. They are not deterred by the sloppy or non-existent vetting offered by these outlets. On the contrary, predatory publishers conquer larger and larger slices of the market and, paradoxically, since these articles are OA, they end up being quoted even more than solid studies in non-OA journals37. The existence of such predatory outlets has also nurtured the phenomenon of paper mills, which offer shoddy, patchwork manuscripts for sale to unprincipled authors wishing to advance their careers effortlessly38.

The advent of these predatory outlets represents a real menace to the integrity of science dissemination39. The only way to dissuade scientists and academics from this immoral practice would be to make it disadvantageous to publish in or edit for scam journals, which should count as a negative factor in appointments, advancements, awards and grant funding28. Authors should ignore papers appearing in predatory outlets40, even if those who published their work in such journals, unaware of the con, may feel some cognitive dissonance. However, countering their growth is challenging, not least because respectable publishing houses have launched many OA spin-off journals of their own, rendering the identification of predatory operations more ambiguous41.

The well-meant Plan S and the crooked predatory market are two sides of the same coin: in a market dominated by pay-to-publish, who will have an interest in guaranteeing rigour and quality? Not the publishing companies, which will gain more by publishing more; not the researchers, who may jump at the chance of easy publication; and not the readers who, not realising they may be exposed to drivel, will enjoy free access to journals previously hidden behind paywalls.

Fortunately, not all is contemptible; there are also good OA initiatives, including journals managed by Learned Societies, as well as new formats promoting thorough science, like Pre-Registrations also available in OA regime.


To counter publication biases and poor methodology, the format of Registered Reports was launched nearly a decade ago. A Registered Report is a format of publication whereby the proposed experiments are peer reviewed before the research is carried out and, if accepted, will be published independently of the results. The format was first adopted by Cortex in 201242 and, thanks to the unflinching determination of Chris Chambers, has spread to hundreds of other outlets43. This format guarantees quality and is less prone to the hurry imposed by a quick-and-dirty reviewing style aimed at accepting all submissions, as it constructively assists authors in bettering their study before embarking on data collection. Registered Reports offer a bulwark against the tide of substandard reports, at least until predatory outlets annex this format as well.

The scale and severity of the problem is daunting. The scientific community should actively discourage the shortcuts of deceitful publishing, promote thoughtful and responsible dissemination, and embrace ethical reporting and sharing of data, putting an end to the current pandemic of unsound and immoral practices.


The author thanks Rob McIntosh who commented on an earlier version of this manuscript.


1. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015;349(6251):aac4716. https://doi.org/10.1126/science.aac4716

2. Gross C. Scientific misconduct. Annu Rev Psychol. 2016;67(1):693-711. https://doi.org/10.1146/annurev-psych-122414-033437

3. Chambers CD. The seven deadly sins of psychology: a manifesto for reforming the culture of scientific practice. Princeton, NJ: Princeton University Press; 2019.

4. Simmons JP, Nelson LD, Simonsohn U. False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol Sci. 2011;22(11):1359-66. https://doi.org/10.1177/0956797611417632

5. Botvinik-Nezer R, Holzmeister F, Camerer CF, Dreber A, Huber J, Johannesson M, et al. Variability in the analysis of a single neuroimaging dataset by many teams. Nature. 2020;582(7810):84-8. https://doi.org/10.1038/s41586-020-2314-9

6. Cubelli R, Della Sala S. In search of a shared language in neuropsychology. Cortex. 2017;92:A1-A2. https://doi.org/10.1016/j.cortex.2017.03.011

7. Mirman D, Scheel AM, Schubert A-L, McIntosh RD. Strengthening derivation chains in cognitive neuroscience. Cortex. 2022;146:A1-A4. https://doi.org/10.1016/j.cortex.2021.12.002

8. Della Sala S, Morris RG. When no more research is needed (without further reflection). Cortex. 2020;123:A1. https://doi.org/10.1016/j.cortex.2019.12.018

9. Chambers CD. Verification reports: a new article type at Cortex. Cortex. 2020;129:A1-A3. https://doi.org/10.1016/j.cortex.2020.04.020

10. Della Sala S, Grafman J, Cubelli R. I copy, therefore I publish. Cortex. 2013;49(9):2281-2. https://doi.org/10.1016/j.cortex.2013.08.010

11. Larivière V, Pontille D, Sugimoto CR. Investigating the division of scientific labor using the Contributor Roles Taxonomy (CrediT). Quant Sci Stud. 2021;2(1):111-28. https://doi.org/10.1162/qss_a_00097

12. Della Sala S, Cubelli R. Entangled in an ethical maze. Psychologist. 2016;29(12):930-2.

13. Della Sala S, Cubelli R. According to which declaration was the study conducted? Cortex. 2017;96:A5-A6. https://doi.org/10.1016/j.cortex.2017.09.003

14. Sumner P, Vivian-Griffiths S, Boivin J, Williams A, Venetis CA, Davies A, et al. The association between exaggeration in health related science news and academic press releases: retrospective observational study. BMJ. 2014;349:g7015. https://doi.org/10.1136/bmj.g7015

15. Della Sala S, Cubelli R. No truth can come from a single scientific study. The Future of Science and Ethics. 2017;2(1):73-7.

16. Della Sala S. LAY summaries for Cortex articles. Cortex. 2015;67:A1. https://doi.org/10.1016/j.cortex.2015.03.008

17. Grafman J, Della Sala S. Reviewing for rewards. Cortex. 2002;38:463.

18. Elson M, Huff M, Utz S. Metascience on peer review: testing the effects of a study’s originality and statistical significance in a field experiment. Adv Meth Pract Psychol Sci. 2020;3(1):53-65. https://doi.org/10.1177/2515245919895419

19. Smith R. Peer review: a flawed process at the heart of science and journals. J R Soc Med. 2006;99:178-82. https://doi.org/10.1258/jrsm.99.4.178

20. Della Sala S. Author/reviewer: a case of split personality. Cortex. 2015;69:A1. https://doi.org/10.1016/j.cortex.2015.04.012

21. Mavrogenis AF, Quaile A, Scarlat MM. The good, the bad and the rude peer-review. Int Orthop. 2020;44(3):413-5. https://doi.org/10.1007/s00264-020-04504-1

22. Solzhenitsyn A. A World Split Apart. Harvard University. June 8, 1978 [cited on Jan 18, 2022]. Available from: https://www.solzhenitsyncenter.org/a-world-split-apart.

23. Else H. A guide to Plan S: the open-access initiative shaking up science publishing. Nature. 2021. https://doi.org/10.1038/d41586-021-00883-6

24. Briston K. Plan S: how open access publishing could be changing academia. Biomedical Odyssey. Johns Hopkins Medicine. April 17, 2019 [cited on Jan 18, 2022]. Available from: https://biomedicalodyssey.blogs.hopkinsmedicine.org/2019/04/plan-s-how-open-access-publishing-could-be-changing-academia/

25. Hartley D. The cobra effect: good intentions, perverse outcomes. Psychology Today, Oct. 8, 2016 [cited on Jan 18, 2022]. Available from: https://www.psychologytoday.com/intl/blog/machiavellians-gulling-the-rubes/201610/the-cobra-effect-good-intentions-perverse-outcomes

26. Siebert H. Der Kobra-Effekt. Wie man Irrwege der Wirtschaftspolitik vermeidet. Munich: Deutsche Verlags-Anstalt; 2001.

27. Suber P. Open Access. Cambridge, MA: MIT Press; 2012.

28. Della Sala S. Roll up, roll up! Cortex. 2017;90:A1-A2. https://doi.org/10.1016/j.cortex.2017.02.002

29. Bohannon J. Who’s afraid of peer review? Science. 2013;342(6154):60-5. https://doi.org/10.1126/science.2013.342.6154.342_60

30. Zarrell R. A paper called “Get Me Off Your F***ing Mailing List” was accepted by a science journal. BuzzFeedNews. Nov 21, 2014 [cited on Jan 18, 2022]. Available from: https://www.buzzfeed.com/rachelzarrell/a-paper-called-getme-off-your-fcking-mailing-list-was-accep?utm_term=.xhGv4aGKO#.mdpP5vr1M

31. Frandsen T. Why do researchers decide to publish in questionable journals? A review of the literature. Learn Pub. 2019;32:57-62. https://doi.org/10.1002/leap.1214

32. Bagues M, Sylos-Labini M, Zinovyeva N. A walk on the wild side: ‘Predatory’ journals and information asymmetries in scientific evaluations. Res Policy. 2019;48(2):462-77. https://doi.org/10.1016/j.respol.2018.04.013

33. Cockerell I. China’s ‘paper mills’ are grinding out fake scientific research at an alarming rate. Codastory. Nov 9, 2020 [cited on Jan 18, 2022]. Available from: https://www.codastory.com/waronscience/

34. Siler K, Vincent-Lamarre P, Sugimoto CR, Larivière V. Predatory publishers’ latest scam: bootlegged and rebranded papers. Nature. 2021;598(7882):563-5. https://doi.org/10.1038/d41586-021-02906-8

35. Jalalian M. The story of fake impact factor companies and how we detected them. Electron Physician. 2015;7(2):1069-72. https://doi.org/10.14661/2015.1069-1072

36. Sorokowski P, Kulczycki E, Sorokowska A, Pisanski K. Predatory journals recruit fake editor. Nature. 2017;543(7646):481-3. https://doi.org/10.1038/543481a

37. Serra-Garcia M, Gneezy U. Nonreplicable publications are cited more than replicable ones. Sci. Adv. 2021;7:eabd1705. https://doi.org/10.1126/sciadv.abd1705

38. Bik E. The Tadpole Paper Mill. Science Integrity Digest. 2020 [cited on Jan 18, 2022]. Available from: https://scienceintegritydigest.com/2020/02/21/the-tadpole-paper-mill/

39. Björk B-C, Kanto-Karvonen S, Harviainen JT. How frequently are articles in predatory open access journals cited. 2020 [cited on Jan 18, 2022]. Available from: https://arxiv.org/ftp/arxiv/papers/1912/1912.10228.pdf

40. Cubelli R, Della Sala S. Write less, write well. Cortex. 2015;73:A1-2. https://doi.org/10.1016/j.cortex.2015.05.008

41. Grudniewicz A, Moher D, Cobey KD, Bryson GL, Cukier S, Allen K, et al. Predatory journals: No definition, no defence. Nature. 2019;576(7786):210-2. https://doi.org/10.1038/d41586-019-03759-y

42. Chambers CD. Registered reports: a new publishing initiative at Cortex. Cortex. 2013;49(3):609-10. https://doi.org/10.1016/j.cortex.2012.12.016

43. Chambers C, Tzavella L. The past, present and future of registered reports. Nat Hum Behav. 2022;6:29-42. https://doi.org/10.1038/s41562-021-01193-7

University of Edinburgh, Human Cognitive Neuroscience, Psychology, Edinburgh, UK


Sergio Della Sala
Email: sergio@ed.ac.uk

Received on January 20, 2022
Accepted on January 24, 2022

Disclosure: The author reports no conflicts of interest

Funding: none

