Concerns regarding the deterioration of objectivity in molecular biology

Notice: This research summary and analysis were automatically generated using AI technology. For full accuracy, please refer to the original arXiv source.

Scientific objectivity was not a problem in the early days of molecular biology. However, relativism seems to have invaded some areas of the field, damaging the objectivity of its analyses. This review surveys the state of this issue through a number of case studies.


💡 Research Summary

The paper argues that molecular biology, which originally flourished under the rigorous frameworks of physics and chemistry, is now experiencing a decline in scientific objectivity due to the infiltration of relativistic thinking, especially in the era of high‑throughput data generation and bioinformatics. The author begins with a brief historical overview, noting that early pioneers such as Schrödinger, Crick and others envisioned biology as an extension of physical chemistry. As DNA sequencing and microarray technologies emerged, the volume and complexity of data grew dramatically, prompting the involvement of informaticians who introduced sophisticated statistical and computational tools. However, the communication gap between these computational specialists and traditional molecular biologists, rooted in differing disciplinary cultures, has allowed methodological relativism to take hold.

Four case studies illustrate the problem.

  1. Microarray normalization – The paper critiques the widespread use of LOESS‑based smoothing and the Robust Multi‑array Average (RMA) method. Both approaches transform raw fluorescence intensities to fit assumed distributions, effectively “manipulating” data in a way that obscures the original physical meaning. The author claims this violates the principle of falsifiability because the transformed data cannot be independently verified against the original measurements.

  2. RNA‑seq – Initially hailed as a “count‑based” technology that would avoid complex modeling, RNA‑seq still relies on assumptions such as log‑normal distribution of transcript counts and total‑count normalization. The author points out that these assumptions hide a high level of noise (≈84 % of genes near the noise floor) and that the adoption of “zombie analyses” – methods that persist despite being disproven or unvalidated – has led to unrealistic expectations about accuracy and reproducibility.

  3. Multiplicity of statistical tests – In transcriptomic studies thousands of genes are tested simultaneously. The paper argues that the conventional practice of applying false‑discovery‑rate (FDR) corrections or other multiplicity adjustments is often unwarranted because the underlying null hypothesis is rarely plausible in biological contexts. Using Bayes’ theorem, the author shows that when the prior probability of a true effect is low, multiplicity corrections become unnecessary, and in practice they can introduce more error than they remove.

  4. Phylogenetics – The author examines the reliance on distance‑based hierarchical clustering and various tree‑building algorithms. While mathematically these methods generate plausible tree shapes, they do not provide objective evidence of evolutionary relationships, especially in the presence of horizontal gene transfer. The paper stresses that any phylogenetic inference that depends on a single tree model is inherently non‑falsifiable.
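The distribution-fitting step criticized in case 1 can be made concrete with a minimal quantile-normalization sketch (quantile normalization is the core distributional step inside RMA). The arrays and intensity values below are hypothetical, and ties are handled naively; this is an illustration of the principle, not a reimplementation of RMA:

```python
import numpy as np

def quantile_normalize(X):
    """Force every column (array) to share the same empirical distribution.

    Each value is replaced by the mean of the values holding the same
    rank across all arrays. After this step, per-array distributions are
    identical *by construction*, whatever the raw data looked like.
    """
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank of each probe within its array
    mean_quantiles = np.sort(X, axis=0).mean(axis=1)    # shared reference distribution
    return mean_quantiles[ranks]

# Two hypothetical arrays with very different raw intensity scales.
raw = np.array([[100.0, 1.0],
                [200.0, 2.0],
                [400.0, 8.0]])
norm = quantile_normalize(raw)
# Both columns now contain exactly the same set of values, so any real
# between-array difference in overall distribution has been erased --
# the "manipulation" the review objects to.
```

Because the output distributions are identical regardless of input, the transformed values can no longer be checked against the original physical measurements, which is the falsifiability concern the paper raises.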
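The total-count normalization assumption questioned in case 2 can be illustrated with hypothetical counts (the genes, samples, and numbers below are invented for illustration):

```python
import numpy as np

# Hypothetical counts for 4 genes in two libraries. Gene 0 is strongly
# induced in sample B; genes 1-3 are unchanged at the bench.
sample_a = np.array([1000, 100, 100, 100])
sample_b = np.array([5000, 100, 100, 100])

def cpm(counts):
    # Total-count (counts-per-million) normalization assumes the library
    # total is a fair proxy for sequencing depth -- the assumption the
    # review questions.
    return counts / counts.sum() * 1e6

a, b = cpm(sample_a), cpm(sample_b)
# The unchanged genes now *appear* down-regulated in sample B, because
# the induced gene inflated B's library total.
print(b[1] / a[1])  # ratio is well below 1 despite identical raw counts
```

The distortion grows with the fraction of the library consumed by the changed transcripts, so the assumption needs explicit checking per dataset rather than silent adoption.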
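The Bayes'-theorem argument in case 3 can be sketched with a short posterior calculation. The prior, power, and α values below are illustrative choices, not numbers from the paper:

```python
def posterior_true_effect(prior, power, alpha):
    """P(real effect | p < alpha), by Bayes' theorem.

    prior : fraction of tested genes with a real effect
    power : P(p < alpha | real effect)
    alpha : P(p < alpha | no effect), the per-test false-positive rate
    """
    num = prior * power
    return num / (num + (1 - prior) * alpha)

# The informative quantity is this posterior, which depends on the prior
# plausibility of an effect -- not on how the p-value threshold is
# adjusted for multiplicity.
print(posterior_true_effect(prior=0.01, power=0.8, alpha=0.05))
```

With an illustrative prior of 0.01, the posterior is only about 0.14, showing how directly the conclusion hinges on the prior rather than on the multiplicity adjustment itself.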
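The non-falsifiability point in case 4 can be demonstrated with a minimal average-linkage (UPGMA-style) clustering sketch: fed a purely random "distance" matrix containing no evolutionary signal at all, it still returns a fully resolved binary tree. The matrix below is random by construction:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# A random symmetric "distance" matrix with zero diagonal -- it encodes
# no evolutionary signal whatsoever.
D = rng.random((n, n))
D = (D + D.T) / 2
np.fill_diagonal(D, 0.0)

def upgma(D):
    """Minimal UPGMA: repeatedly merge the closest pair, size-weighted
    averaging of the merged rows. Returns the merge order."""
    D = D.copy()
    clusters = [[i] for i in range(len(D))]
    active = list(range(len(D)))
    merges = []
    while len(active) > 1:
        # find the closest active pair (i < j)
        _, i, j = min((D[i, j], i, j) for i in active for j in active if i < j)
        ni, nj = len(clusters[i]), len(clusters[j])
        for k in active:
            if k not in (i, j):
                D[i, k] = D[k, i] = (ni * D[i, k] + nj * D[j, k]) / (ni + nj)
        clusters[i] = clusters[i] + clusters[j]
        active.remove(j)
        merges.append((i, j))
    return merges

tree = upgma(D)
# n leaves always produce n-1 merges: a plausible-looking binary
# "phylogeny" emerges from pure noise.
assert len(tree) == n - 1
```

Because the algorithm succeeds on any input, the resulting tree shape by itself cannot count as evidence for evolutionary relationships; that is the sense in which single-tree inference resists falsification.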

Across all four examples, the central theme is that methodological choices are often driven by convenience or tradition rather than by rigorous testing of underlying assumptions. The author advocates an exploratory data analysis (EDA) mindset, minimal-parameter models, and explicit verification of each assumption, so that falsifiability is preserved. By reinstating these philosophical safeguards, molecular biology can regain the objectivity that underpins reliable scientific progress. The paper concludes with a call for the community to critically assess its analytical tools, to ensure they are compatible with the core principles of scientific objectivity, and to foster interdisciplinary dialogue that bridges the cultural divide between informaticians and experimental biologists.

