The mathematization of the individual sciences - revisited


We recall major findings of a systematic investigation of the mathematization of the individual sciences, conducted by the author in Bielefeld some 35 years ago under the direction of Klaus Krickeberg, and confront them with recent developments in physics, medicine, economics, and spectral geometry.


💡 Research Summary

The paper revisits a systematic study of the mathematization of the individual sciences that was carried out roughly thirty‑five years ago at the University of Bielefeld under the direction of Klaus Krickeberg. The original project, conducted between 1972 and 1975, brought together representatives from all university disciplines (except engineering and medical sciences) to examine how mathematical modeling and calculation had become integral to scientific practice. The author recalls the three broad tendencies identified at that time: (1) the growing complexity of problems and data in each discipline, (2) the consequent need for planned, economical methodological procedures, and (3) the way this methodological emphasis itself fuels further mathematization.

Historical reflections invoke John Dee’s “principal‑derivative” dichotomy, emphasizing the long‑standing tension between pure mathematics and its applied counterpart. The author argues that today many mathematicians remain content to work at an “inner” level, while others avoid “vulgar” applications; a balanced view is required.

The paper then uses physics as a detailed case study because it offers the clearest illustration of the intimate interplay between mathematics and scientific inquiry. It cites Hilbert’s view of probability theory as a chapter of physics, the Frisch–Peierls memorandum that calculated the critical mass for an atomic bomb, and the evolution of theoretical physics in Britain, where many physicists held joint mathematics appointments. To structure the discussion, four modeling purposes are distinguished: (i) production of data and model‑based measurements, (ii) simulation, (iii) prediction, and (iv) control.

In the data‑production role, mathematics is essential for designing experiments (e.g., measuring visco‑elastic constants via piezoelectric transducers) and interpreting indirect measurements. Simulation is portrayed as a double‑edged sword: while it can generate visually impressive and seemingly realistic results, numerical artifacts (digit chopping, hardware quirks) may create spurious structures such as eddies in computational fluid dynamics that have no physical counterpart. The author warns against over‑reliance on visual similarity without rigorous validation.
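The numerical-artifact point can be made concrete with a minimal Python sketch (an illustration, not an example from the paper): even trivial floating‑point arithmetic accumulates rounding error, and in large simulations such errors can compound into structures that look physical but are not.

```python
import math

# Ten repeated additions of 0.1 do not yield exactly 1.0 in binary
# floating point, because 0.1 has no finite base-2 representation.
total = 0.0
for _ in range(10):
    total += 0.1

print(total == 1.0)        # False: a small rounding residual remains
print(abs(total - 1.0))    # tiny but nonzero

# A compensated summation (math.fsum) removes the accumulated error,
# illustrating that the artifact lies in the algorithm, not the data.
print(math.fsum([0.1] * 10) == 1.0)  # True
```

In a fluid-dynamics code, the analogous residuals enter millions of coupled updates per time step, which is why the author insists on validation beyond visual plausibility.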

Prediction is examined through the transition from medieval astronomical tables (based on the Ptolemaic system) to modern numerical weather forecasting, where high‑performance models derived from thermodynamic and hydrodynamic equations now achieve about a 90 % success rate for ten‑day forecasts. Nonetheless, a residual 10 % failure rate is deemed unacceptable for industrial quality control, highlighting the persistent challenge of uncertainty quantification.

Control is identified as the most demanding mathematical task, especially regarding safety. The paper differentiates feasibility, efficiency, and safety in design, noting that while feasibility and efficiency can be addressed through thought experiments, parameter estimation, and optimization, safety requires integrated, often experience‑based, scientific and engineering judgments.

Although the manuscript’s excerpt provides limited detail on medicine, economics, and spectral geometry, the author briefly notes that medical modeling (e.g., pharmacokinetics, diagnostic imaging) relies heavily on mathematical frameworks, while economics remains contested: mathematical models of risk and decision‑making are used, but their scientific status is debated. Spectral geometry is presented as a bridge between pure mathematics and physics, with recent work linking Laplacian spectra to quantum field theory and topological invariants.

Overall, the paper argues that mathematization is no longer a peripheral tool but a central, structuring force across the sciences. It stresses the need for clear purpose classification, vigilant validation of simulations, rigorous handling of prediction uncertainties, and especially robust safety analysis in control problems. The author calls for continued interdisciplinary dialogue, methodological transparency, and ethical scrutiny as mathematics further permeates scientific research and policy decisions.

