An economic system affects not only the way goods are produced and exchanged but also the culture of the countries in which it is established. Today, one main characteristic of the economic system of the advanced countries is the presence of powerful industries. The nature of industry is to make products; its cultural implication is that industrial societies tend to approach all the problems of individual and social life through products. Scientific research, too, is approached in this way. It must be recognized that the combination of scientific research and industrial production has given a great boost to human progress; at present, however, the industrial approach to scientific research is showing clear signs of crisis.

What I mean by the expression “industrial approach to scientific research” can be illustrated by a series of scientific works published in recent years. One example is what David Kaplan has recently highlighted about the conformism prevailing in science independently of any individual intention [1]. He starts from a study demonstrating a racial imbalance in the funding of research projects by the U.S. National Institutes of Health (NIH) [2]; he argues that such bias cannot be due to racism, given that the projects are subjected to blind review. Rather, it can be explained by looking at the industrial way in which reviewers work when making their decisions: they are specialists in some branch of research (in the medical and biological fields, given the nature of the NIH), they are given little time, and they typically receive complex and copiously documented projects that, even though conceived in their own field, may belong to different micro-specialisations [3]. In short, they cannot really examine the projects in depth and, in order to reduce the probability of error, they are inevitably led to approve those closest to their own cultural background and to what they already know. It is human and understandable; however, projects are designed to be complex and as original as possible precisely in order to improve their chances of success. The implication is that the optimization (which is to say the industrialization) of the project screening process is producing conformism, something antithetical to science.

Another effect of the industrial approach to scientific research is the over-production of scientific articles. If someone asks how many scientific papers are published every year, the honest answer is: “nobody knows” [4]. In the 1970s, it was calculated that the number of scientific publications was doubling every five years; as of 2010, the number of scientific journals in science, technology and medicine was estimated at 25,400, growing by 3.5% a year [5]. This huge output can be traced mainly to the quantitative criteria usually employed to assess researchers: their “rating” typically depends on the number of their publications, which entails two main problems.
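A back-of-the-envelope check (my own arithmetic, not taken from the cited sources) puts the two growth figures on the same scale. For a constant annual growth rate $r$, the doubling time $T$ satisfies

$(1+r)^T = 2 \quad\Longrightarrow\quad T = \frac{\ln 2}{\ln(1+r)},$

so 3.5% a year ($r = 0.035$) gives $T \approx 0.693/0.0344 \approx 20$ years for the number of journals, while the 1970s figure of publications doubling every five years corresponds to $r = 2^{1/5} - 1 \approx 0.15$, i.e. roughly 15% a year.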

The first problem is “the impossibility of being expert”: in the paper of that title (see Note 5), the authors demonstrate that even within a single sub-specialisation (they study diagnostic imaging in cardiology) it is actually impossible for a physician to keep his or her competences up to date and, at the same time, carry out professional work. The second problem is the impossibility of adequately controlling the quality of scientific research. A specific research field is growing around this question [6], and its results show that the reliability of the discoveries claimed by most researchers is extremely low [7].

Fraud in scientific research is a different chapter of the story. From the “Piltdown man” [8] up to the most recent cases regarding stem cells [9], many examples of dishonest scientific research can be listed. However, personal dishonesty is not the point of this paper; the scientific community already has the means to counter such behaviour. The point is the structural problems recalled above, because the increasing conformism, the impossibility of being expert and the low reliability of the scientific literature do not depend on the personal integrity of researchers. What is at stake is the system of industrialized scientific research in the advanced countries which, by treating knowledge as if it were a product, generates these problems through its very structures.

Is there any good news? Certainly. The first is that, within the scientific community, a growing research field is precisely the assessment of, and the controls on, the quality of scientific research and its results; such self-assessment is really promising. Open Access to scientific publications is another possible answer (bearing in mind that Open Access, too, is a complex system and that not everything that looks open is really “open”). The “citizen science” phenomenon (for a first definition and survey, see http://en.wikipedia.org/wiki/Citizen_science) seems to be another good idea, even though many of its implications are not yet clear and would need to be studied in depth. What makes these phenomena interesting is that they can be placed outside a strictly industrialized rationale of scientific research; that is the direction we should take. Our societies need high-quality science and knowledge, and the de-industrialization of research appears, at the moment, to be our best choice, even though it will certainly not be easy to carry out.

 

  1. Kaplan, David (2012). Science and prejudice – The NIH may be biased in ways that harm not only African-American researchers but any whose ideas fall outside the mainstream. Scientific American, Vol. 306, No. 2 (February 2012), p. 7.
  2. Ginther, Donna K. et al. (2011). Race, Ethnicity, and NIH Research Awards. Science, 19 August 2011, Vol. 333, No. 6045, pp. 1015–1019. DOI: 10.1126/science.1196783.
  3. Just to give the reader a yardstick: the classification table of PLoS ONE (http://www.plosone.org/) listed, in 2012, about 5,000 specialisation fields to help authors correctly place the articles they were submitting.
  4. This subject has been treated exhaustively by Tom Lang. My main reference is a series of slides titled “How to Write an Article Reporting Clinical Research”, published online in 2009, whose URL I have not been able to recover. Tom Lang’s website: http://www.tomlangcommunications.com/.
  5. Fraser, Alan G. & Dunstan, Frank D. (2010). On the impossibility of being expert. British Medical Journal 2010; 341: c6815. DOI: 10.1136/bmj.c6815. The authors also specify that 1.5 million articles were published in 2009 and that PubMed (http://www.ncbi.nlm.nih.gov/pubmed), the online database of the US National Library of Medicine, now cites “more than 20 million papers” (24 million are claimed on its home page as of November 2014).
  6. Some examples drawn from the most recent literature: Vasilevsky, Nicole A. et al. (2013). On the reproducibility of science: unique identification of research resources in the biomedical literature. PeerJ 1:e148. DOI: 10.7717/peerj.148. Wager, Elizabeth; Singhvi, Sanjay; Kleinert, Sabine (2013). Too much of a good thing? A study of prolific authors. The Seventh International Congress on Peer Review and Biomedical Publication, Chicago, Illinois (USA), September 8-10, 2013, Plenary Session Abstracts (available at http://www.peerreviewcongress.org/2013/Plenary-Session-Abstracts-9-8.pdf). Peplow, Mark (2014). Social sciences suffer from severe publication bias – Survey finds that ‘null results’ rarely see the light of day. Nature, August 2014. DOI: 10.1038/nature.2014.15787 (available at http://www.nature.com/news/social-sciences-suffer-from-severe-publication-bias-1.15787). Medin, Douglas; Lee, Carol D.; Bang, Megan (2014). Point of view affects how science is done. Scientific American, Vol. 311, Issue 4, September 2014 (available at http://www.scientificamerican.com/article/point-of-view-affects-how-science-is-done/). Ioannidis, John P. A. (2014). Science research needs an overhaul – The current incentive structure often leads to dead-end studies, but there are ways to fix the problem. Scientific American Forum, November 3, 2014 (available at http://www.scientificamerican.com/article/science-research-needs-an-overhaul/).
  7. Tom Lang (see Note 4) states in one of his works that “it is impossible to write a scientific article so badly that it cannot be published”.
  8. On 18 December 1912, the news of what is known as “the greatest [paleontological] hoax in the history of science” appeared in newspapers throughout the world. BBC information about the case: http://www.bbc.co.uk/history/ancient/archaeology/piltdown_man_01.shtml.
  9. A famous one is the 2006 case of the South Korean researcher Hwang Woo-suk, who admitted embezzlement and bioethical violations (the New York Times coverage of the case is at http://topics.nytimes.com/top/reference/timestopics/people/h/hwang_woo_suk/index.html). Probably the latest is the case of the STAP (stimulus-triggered acquisition of pluripotency) phenomenon, described by Japanese authors in two controversial papers published in Nature in January 2014 and retracted in July 2014, after the research institute to which the authors belonged concluded that data had been falsified to uphold the results. The case ended in tragedy, leading to the suicide of one researcher; see, for example, Dennis Normile’s account in AAAS Science Magazine News (http://news.sciencemag.org/asiapacific/2014/08/senior-riken-scientist-involved-stem-cell-scandal-commits-suicide).