The Error of the Dominant Paradigm

  The mathematical–scientific method is not the only path to knowledge, truth, or justice.


With the birth of formal mathematics, first in ancient Greece and later through its modern systematization, human beings began to conceive of thought in complex, abstract terms. The world came to be reduced to numbers, proportions, and ideal forms. Thought itself adopted a deductive, sequential structure.

As the mind observed the rigor of mathematical structures, it eventually came to imitate them: it began to believe that truth must be stable, demonstrable, and contained within closed, self-referential systems—systems sustained by the very rules they define. This was a profound cognitive transformation, but also a double illusion.

On one hand, scientific truth is always provisional, corrigible, and contingent. Each time it seems to reach a definitive point, where its answers appear to be settled truths, it is surpassed and corrected by newer science, newer medicine, newer physics, and so on.

On the other hand, adopting the logical-scientific method as the sole criterion for understanding and judgment radically impoverishes our experience of the world.

In Les Mots et les Choses, Michel Foucault argues that the mathematization of the world was one of the conditions of possibility of the classical episteme. Western knowledge, he says, chose mathematics as the measure of rigor and legitimacy, and modeled every form of truth upon it.

In that historical phase, the mind mirrored itself in the theorem: to think well meant to deduce, to reduce the multiple to the unitary, and even to seek formal elegance. Mathematics became the grammar of rational thought, imposing on knowledge an axiomatic structure and a system of constraints. This model gave rise to modern physics, engineering, and technology—but also to an idea of knowledge as formalization and abstraction rather than experience or interpretation.

In the process of affirming the scientific-mathematical paradigm, many other models of knowledge were pushed into the shadows. Some were considered inferior, others merely ignored, and still others delegitimized because they did not conform to the criteria of falsifiability, precision, and replicability imposed by the mathematical-scientific method.

The very idea of the ontological superiority of the mathematical method is both an epistemological and a historical fallacy.

I do not deny the validity of the scientific method, but I contest its claimed epistemic exclusivity. There exist other modes of knowing, equally legitimate according to coherent yet different criteria.

Among them is the experiential model, in which knowledge is embodied, tacit, and non-formalizable. It is transmitted through practice, the body, and direct contact. We find it in artisanal traditions, in Chinese medicine, in pre-Socratic philosophy, and in indigenous knowledge systems. Michael Polanyi called this dimension tacit knowledge: “we know more than we can tell.” Yet the dominant culture tends to discredit such systems precisely because they do not conform to the very criteria that the scientific-mathematical paradigm established for itself. This is a logical and epistemological short circuit: those models, by definition, do not satisfy such criteria, and there is no rational reason why the scientific paradigm should serve as the measure of truth for all other paradigms, each of which has its own methods for defining it.

There is also the narrative and mythic model, which organizes meaning through symbolic images and archetypal stories. It does not explain how but rather seeks why. Its function is not mechanical but hermeneutic. Within this model arise cosmologies, collective identities, and maps of meaning: a form of knowledge that is effective because it connects.

There is also the analogical and intuitive model, in which truth is not demonstrated but intuited through resonance. This is the case with Taoism, Ayurvedic medicine, alchemy, and Neoplatonism. Knowledge here proceeds not by deduction or verification but by immersion and analogy, as if intelligence resided not only in reasoning but also in perceptive sensitivity.

Some might argue that these systems were rightly displaced because they “do not work,” because they produce errors. But this conviction itself reveals a deep and unexamined assimilation of the scientific-mathematical paradigm—so deep that it produces two major misunderstandings.

The first is the belief that the scientific method is ontologically superior to all others, which is not logically sustainable.

The second is the belief that where other models fail, the scientific method does not.

Every model is valid within its own domain. The scientific method excels in formalization, prediction, and control. Other models operate better in interpretation, embodied experience, sense-making, or in handling irreducible uncertainty.

Not all models are equivalent in every field, but each has its own domain of efficacy and an internal rationality that deserves recognition, not erasure in the name of epistemic universalism.

The fallibility of a model under certain conditions does not discredit it entirely—nor does it automatically validate others. It is therefore a perspectival error to oppose the scientific model to those that “fail,” as if it did not fail itself. Scientific thinking always gives the impression of having arrived at an inevitable truth. But looking back, we see how many past scientific theories were presented as definitive and later revealed to be temporary or wrong: ineffective therapies, flawed hypotheses, unfounded assumptions, grave and sometimes dangerous errors. Nothing guarantees that we are not still living within similar errors, which will only be unmasked in time.

Science has exercised an epistemic dominance disguised as ontological necessity, producing the mistaken equation that what cannot be formalized cannot be known—and what cannot be known scientifically is neither real nor true.

The scientific method is an extraordinarily powerful tool for exploring and understanding the world. But it is neither infallible nor exhaustive, nor ontologically superior. It approaches truth, grasps it in certain domains, but misleads us in others. It is animated by a profound commitment to precision, control, and coherence, but this does not entitle anyone to treat it as the sole guarantor of reality, knowledge, truth, or justice. If something works, and is true for those who live it within a certain model, it is no less real simply because it cannot be reduced to scientific formalism.

Another model, also marginalized, is the dialogical and intersubjective one. This is the model of the Socratic dialogues, of phenomenology, and of maieutic pedagogies. Here, truth is neither subjective nor objective but emerges from the encounter between subjectivities. It is a model well suited to chaotic or variable conditions, where scientific formalization is not possible or accessible. It is also a model we use daily, though we rarely acknowledge it as such. Personally, it is the one, together with the scientific-mathematical model, that I prefer for my own intellectual and professional needs.

Alongside these, there is the systemic-emergent model, found in the sciences of complexity, evolutionary epistemology, and systems theory. In this view, knowledge is an emergent expression of complex adaptive systems. There is no absolute truth, but there are stable local configurations, functional coherences, and dynamic transformations. What is true is what works and creates coherence in a local context. This model has been developed by thinkers such as Prigogine, Varela, Bateson, and Luhmann. In it, truth is a process, not a fixed outcome.

Finally, there is the computational-constructivist model, which links Turing, von Foerster, Maturana, and, today, artificial intelligence. From this perspective, to know is to construct operational models, to simulate environments, and to generate structures. The mind is seen as a computational or autopoietic system—one capable of maintaining its own organization, interacting with the environment without being determined by it, remaining organizationally closed yet energetically open.
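To give the flavor of what "organizationally closed yet energetically open" can mean in computational terms, here is a minimal toy sketch, entirely my own illustration rather than anything taken from Maturana, Varela, or von Foerster: a small system whose internal rules, not the environment, decide how external perturbations are absorbed. Every name, threshold, and rule in it is a hypothetical choice made only for the example.

```python
import random


class ToyAutopoieticSystem:
    """A caricature of an organizationally closed, energetically open system:
    the environment perturbs its structure, but only its own internal rules
    decide how those perturbations are compensated."""

    def __init__(self, norm=10.0):
        self.norm = norm                          # the organization it maintains
        self.components = {"a": norm, "b": norm}  # its current structure

    def perturb(self, intensity):
        """Environmental perturbation: energy comes in, but carries no instructions."""
        for key in self.components:
            self.components[key] += random.uniform(-intensity, intensity)

    def self_produce(self):
        """Internal production rules pull each component back toward the norm."""
        for key, value in self.components.items():
            self.components[key] = value + 0.5 * (self.norm - value)

    def organized(self, tolerance=5.0):
        """The organization persists while every component stays within tolerance."""
        return all(abs(v - self.norm) < tolerance for v in self.components.values())


if __name__ == "__main__":
    system = ToyAutopoieticSystem()
    for step in range(20):
        system.perturb(intensity=2.0)   # the world disturbs the system
        system.self_produce()           # the system re-produces itself
        print(step,
              {k: round(v, 2) for k, v in system.components.items()},
              system.organized())
```

The only point of the sketch is the asymmetry it encodes: the environment supplies disturbances and energy, while what counts as compensation is defined by the system's own organization.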

I continue to affirm the importance of the scientific-mathematical method, to which I attribute precision, rigor, and explanatory power. But I see it as one—perhaps my preferred one—among several possible ways of accessing truth.

If even space and time, once thought absolute, have proved relative to the observer, as in Einstein’s physics, we should not be shocked to admit that truth itself might also be situated relative to the models through which we interrogate it.
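The standard formula for time dilation makes the point concrete: an interval t measured in a clock's own rest frame is measured, by an observer moving at speed v relative to that clock, as

\[
t' = \frac{t}{\sqrt{1 - v^{2}/c^{2}}},
\]

a longer interval whenever v is not zero. A duration once treated as absolute turns out to depend on the frame from which it is interrogated; the formula is recalled here only as an illustration of that shift.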

My intention here is not to be relativistic, but rather pluralistic and reflective. I believe there are conditions and contexts in which two opposing statements can both be true. Truth is not always singular, and justice is not always absolute.


Bibliography

Gaston Bachelard, The Formation of the Scientific Mind, Dedalo, 1976.
Gregory Bateson, Steps to an Ecology of Mind, Adelphi, 1976.
Albert Einstein, Relativity: The Special and the General Theory, Bollati Boringhieri, 2000.
Michel Foucault, The Order of Things: An Archaeology of the Human Sciences, Rizzoli, 1971.
Galileo Galilei, The Assayer, ed. Libero Sosio, BUR, 1992.
Niklas Luhmann, Social Systems: Outline of a General Theory, FrancoAngeli, 1990.
Humberto Maturana and Francisco Varela, Autopoiesis and Cognition: The Realization of the Living, Reidel, 1980.
Humberto Maturana and Francisco Varela, The Tree of Knowledge, Garzanti, 1992.
Michael Polanyi, Personal Knowledge: Towards a Post-Critical Philosophy, Rusconi, 1974.
Ilya Prigogine and Isabelle Stengers, The New Alliance: Metamorphosis of Science, Einaudi, 1981.
Alan Turing, The Essential Turing, Oxford University Press, 2004.
Heinz von Foerster, Understanding Understanding: Essays on Cybernetics and Cognition, Springer, 2003.
