Science conjures up a world, by means not of magic immanent in reality but of rational impulse immanent in mind.
—Gaston Bachelard1

This paper aims to show that Gaston Bachelard’s epistemological position can be considered as a third option between the computational/determinist and the embodied/evolutionary approaches to scientific knowledge. In this regard, I will first introduce the formalist and positivistic backgrounds of the computational sciences and show the limits that these impose on knowledge. The second part of the paper will deal with the anti-computational paradigm, which considers knowledge as the result of an historical process of interaction among adaptive systems, and I will point to the limits that this perspective imposes on reason. In the third and last part, I will present French rationalist epistemology as an alternative which, while preserving the autonomy of reason and its independence from natural evolution, allows us to account for the historicity and the unpredictability of thinking.

1 – Formalism and Computation

Computability originated from logic within the frame proposed by the founding fathers of formalism, Frege and Hilbert, largely focusing on arithmetic. This strategy was meant as a solution to the challenge posed by the introduction of non-Euclidean geometries: instead of deciding which geometry must be considered as matching the real structure of space, Frege and Hilbert chose to provide mathematics with an autonomous and purely rational foundation. To do so, they gave a central role to arithmetic and its “absolute laws” in order to disregard the interaction with physics which was at the core of Riemann’s geometry.2 Mathematics has to be founded on pure logic, or on formal computations over meaningless strings of signs, rather than on psychology, intuition or living actions and interactions. One main motivation for formalists, then, is to avoid any ontological commitment to a problematic realm of objects. As Frege wrote in 1903, while criticizing Hilbert’s approach:

For the formalist, arithmetic is a game with signs which are called empty. That means that they have no other content (in the calculating game) than they are assigned by their behaviour with respect to certain rules of combination (rules of the game).3

One could improve these ideas by treating mathematical theories, their language, axioms and rules, as formal, mathematical objects in their own right. This is exactly what the Hilbert program set out to do, creating the new discipline of meta-mathematics. Accordingly, we should be given a rigorous specification of which arrangements of well-formed formulae count as proofs in a given system, and of what theorem they prove in each case.4 No further issue of truth need arise; nor do we need to assume that there is only one system for a given set of symbols. In this way, mathematics is described as a ‘calculus’ that is not to be used to represent the world as it is in itself but whose value is entirely instrumental. Formalism, in particular Wittgenstein’s Tractatus version, greatly influenced the positivists such as Carnap, against which Bachelard developed his own epistemological theory, as we will see in the last part of this paper.

From the positivist standpoint, then, the proper object of philosophy is the analysis of the structure of thought; however, the study of thought is to be sharply distinguished from the study of the psychological process of thinking. Thus the proper method for analysing thought consists in the analysis of language. There are several tacit assumptions here, as Dummett noticed:5

  • The philosophy of science is not concerned with the actual thought processes of scientists.
  • It is concerned with the analysis of the structures of scientific thought as these are manifest in the language of science, and more specifically as these are manifest in the assertions made about the physical world by scientists, whether these are proposed as theoretical claims or as reports of experimental findings.
  • Concepts are investigated via their function in statements.
  • Logic here means formal logic, so that a complete logical analysis/reconstruction of a scientific theory would take the form of a formal axiomatisation of it.

Scientific knowledge, as expressed in scientific theories and as the concern of the philosophy of science, thus becomes independent of the knowing subject. Reasoning is characterised as formal in the sense that any insight into, and knowledge of, subject matter is irrelevant to the recognition of validity. Formal principles are such that one can imagine an automaton, a ‘reasoning machine’, which carries out inferences in accordance with them: our computers are the realization of this possibility, opened up by the introduction of the notion of formal calculus. Accordingly, the simulation of any sort of rational activity requires the establishment of a set of rules operating on some given data to provide, by computation, the solution to a particular problem.6 For example, in the Laplacean dream implied by computationalism, it is possible to predict the trajectories of celestial bodies according to Newtonian laws and, thanks to the speed of the machine, to deliver a simultaneous description of the totality of the future states of the universe. Computational devices chase the ideal of the perfect prediction of the future, that is, the possibility of describing the deterministic trajectories of particles and bodies as if they could be accelerated to infinity. Accordingly, computational sciences chase an asymptotic ideal of a perfect and non-temporal knowledge, one which assigns the limit of human intellectual capacity to its physical embodiment: the calculus has no specific duration of its own, and the time it takes depends only on the material constraints of the body, that is, of the hardware. In other words, computational sciences present the myth of perfect knowledge as the simultaneous contemplation of the totality of possible experience, that is, the totality of all the possible events that one could experience in an infinite time and which can happen according to a particular given function.
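The Laplacean dream described above can be sketched in a few lines of code: given exact initial conditions and a deterministic law, every future state follows by computation alone. The fragment below is purely illustrative, not anything from the paper; the function name, units and parameters are my own assumptions.

```python
# Minimal sketch of the Laplacean ideal: a deterministic law (here an
# inverse-square central force, i.e. one planet around a fixed sun, in
# arbitrary units) plus exact initial conditions yields every future state.

def simulate_orbit(pos, vel, dt=0.001, steps=10000, gm=1.0):
    """Advance a body under Newtonian gravity with semi-implicit Euler steps."""
    x, y = pos
    vx, vy = vel
    trajectory = []
    for _ in range(steps):
        r3 = (x * x + y * y) ** 1.5
        vx += -gm * x / r3 * dt   # acceleration from the inverse-square law
        vy += -gm * y / r3 * dt
        x += vx * dt
        y += vy * dt
        trajectory.append((x, y))
    return trajectory

# Determinism in action: the same inputs always yield the same "future".
t1 = simulate_orbit((1.0, 0.0), (0.0, 1.0))
t2 = simulate_orbit((1.0, 0.0), (0.0, 1.0))
assert t1 == t2
```

The point of the sketch is only that, once law and data are fixed, the entire future of the modelled system is already contained in the computation; the machine’s speed is the sole remaining constraint.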

However, physical and material constraints – which we can hope to overcome thanks to the evolution of technology – are not the only boundaries of thought. We must not forget that formalism was born to avoid the metaphysical question concerning the relation between our representations and reality, a question which implies the capacity to explain why a mathematical description is able to capture reality, or why the real is such that we can know it. To go through a computation mechanically, following prescribed rules, may lead one to functional solutions, but it does not imply any real knowledge of the simulated system. Accordingly, we have to account for this important limit of the computational sciences: they do not allow us to gain any knowledge of the reason why a particular mathematical function can predict the behaviour of a certain system; the formal or conceptual conditions of representation can be validated only a posteriori. One can verify a simulation by checking whether it actually imitates the behaviour of a system (if it does not, one can recalculate) but, according to this epistemological paradigm, one cannot gain any knowledge of the reason why a system behaves in such a way that a description is possible. Thus, on the one hand, computational sciences promise to overcome the limits of our temporally bounded capacity of determining the future states of physical systems, whereas, on the other hand, they exclude the possibility of explaining the becoming of our knowledge and experience of the world, a reality that is difficult to reduce to a given achievable totality. Moreover, this implies a sort of undecidability concerning the choice of the conceptual axiomatic frame.
For example, to assess the representations of two different scientific theories, one has to verify each of them relative to a particular, contextual axiomatic ‘framework’ which includes a system of rules of proof: given these, some sentences are ‘determinate’, that is, provable or refutable. But if both representations are proven to be consistent with regard to their frames, how do we choose which system to adopt? According to Carnap’s Principle of Tolerance,7 one can adopt whichever system one wishes, and this means that the choice will favour the frame that is considered more efficient in matching the evolution of the simulated physical system. Given this, we see that the vaunted ontological neutrality is a mere facade hiding a radical form of empiricism (and, thus, of skepticism).

2 – The Evolutionary and Anti-Deterministic Position

I am now going to briefly introduce the anti-computational and non-deterministic epistemological paradigm supporting the idea that our representations can and must be grounded on the evolutionary history of material physical interactions. Accordingly, scientific representation is justified as real knowledge (rather than as mere description) since it finds its necessary conditions in the temporal process of becoming and differentiation that is the real. This paradigm, then, addresses the “metaphysical” issue of the real genesis of the conceptual mathematical framework that allows us to get a description of the world, an issue that the computational approach cannot deal with since, for it, thinking must be distinguished from any historical or psychological process.

We can situate the origin of this perspective in Poincaré’s (1906) criticism of Hilbert’s foundational program, which he considered to be viciously circular (a circularity that Gödel would later prove). As Poincaré’s well-known analysis of the “three-body problem” shows, the system of equations that determines the movements of three celestial bodies has no analytic solution; thus, against the Laplacean program of perfect computability, the trajectories turn out to be unpredictable. A physical process may be fully determined by a computable evolution function, which would yield a program; in mathematical physics, however, the problem is that often there is no analytical solution. Because of this peculiar “undecidability” Poincaré opposed Hilbert’s search for complete and decidable knowledge, although he was convinced that the very knowledge of this unpredictability constituted a very important scientific advance.8 Moreover, we have to remember that the formalist computational program was supported by the need to disentangle geometry and physics by showing that the validity of the axiomatic system is totally independent of any issue concerning the structure of reality. Against this program, Riemann’s approach can be seen as a sort of renovation of the relation between geometry and physics, based on the assumption of the interdependency of the form of space and its material content. So, rather than being interested in the deterministic interplay of meaningless signs, the epistemological alternative to formalism and logicism is engaged with the real interaction of material systems whose behaviour cannot be exactly modelled by a system of integrable equations. Accordingly, the goal of science does not consist in describing and predicting the behaviour of systems, but in explaining the reasons for the unpredictability that results from most of the interactions.
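The unpredictability Poincaré identified can be made concrete with a toy computation. The sketch below uses the logistic map rather than the three-body equations themselves (which admit no closed-form solution); the map, the parameter r = 4 and the initial values are illustrative assumptions of mine, chosen only to show how an exponentially growing deviation of trajectories defeats long-run prediction.

```python
# Toy illustration of sensitive dependence on initial conditions: the
# logistic map x -> r*x*(1-x) with r = 4 is fully chaotic, so two
# trajectories starting a billionth apart soon differ macroscopically.

def orbit(x, r=4.0, steps=100):
    """Return the successive states of the logistic map from seed x."""
    xs = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

a = orbit(0.2)               # one initial condition
b = orbit(0.2 + 1e-9)        # a nearby one, differing by a billionth
gaps = [abs(p - q) for p, q in zip(a, b)]
# The initial gap is negligible, yet it is amplified at every iteration
# until the two computed "futures" bear no resemblance to one another.
```

Since any physical measurement carries at least this much error, the determinism of the rule does not translate into predictability of the outcome, which is precisely the point made above.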
From this standpoint, then, the limits of knowledge are not set by the speed of calculation; rather, the knowledge of the uncomputability of the behaviour of real physical systems grounds our human ability to think within a natural, open and non-deterministic evolutionary process. Here, that which appeared as a limit that could be overcome, the limit related to the speed of calculation, turns out to be the only real and unavoidable constraint: for knowledge to be possible, the physical interaction of systems is necessary, and this means that representation is bound to the time of experience and to the historical time of co-evolutionary adaptation. It is not possible to isolate one system from another by pretending that there are no interferences leading to an exponential deviation of the trajectories; nor is it possible to consider the behaviour of a system as a sequence of discrete states. On the contrary, one must consider the irreversible continuum of processes as intrinsically temporal and leading to the emergence of qualitative differences. Thus, the scientist’s goal does not consist in predicting what is essentially unpredictable, but in explaining the reasons for the unpredictability of the outcomes of the interactions among physical systems. Accordingly, this second epistemological position aims to overcome the limit that computational sciences cannot, namely the impossibility of grounding knowledge on a metaphysical necessity, and to move towards an explanation of the reason why reality is such that we can know it. Our representations, then, are not the outcome of a decision concerning the efficacy of a specific conceptual frame; they are necessarily bound to an evolutionary history of actions and interactions within the environment.
Our representations are true not because they are consistent relative to a logical system that happens to produce good predictions, but because they have been selected within a complex history: it is the knowledge of this non-deterministic history which explains why reality is such that we can describe it in an efficient way.

Now, not only can this anti-computational perspective be criticized for transgressing the foundational limit that, according to the computational paradigm, leads to the paradoxical knowledge of the genesis of the conditions of knowledge; most seriously, it can be criticized because it makes human knowledge depend on a non-rational natural process (if we do not want to claim that everything thinks, in some way) from which it cannot be clearly distinguished. As a consequence, human reason risks being reduced to the expression of a sort of natural drift allowing any system to cope with the environment by modifying and varying its structure and behaviour. It is evident that this would make it impossible to justify the human capacity to choose its own ends in an autonomous way and to speculate and act in ways that are not directed to immediate utility. Moreover, it is difficult to understand how knowledge of the temporal process, or natural drift, that enabled knowledge to emerge is possible: if a functional behaviour concerns a local point of view embedded in a specific situation, knowledge concerning the totality of the natural evolutionary process demands a global perspective which cannot be justified.

3 – Gaston Bachelard: A Third Way

In this third and last part, I am going to introduce Gaston Bachelard’s (1884-1962) epistemological position, which can be situated within the unjustly neglected French rationalist lineage starting with Léon Brunschvicg (1869-1944) and including philosophers such as Jean Cavaillès (1903-1944), Albert Lautman (1908-1944) and Jules Vuillemin (1920-2001). They all criticize the positivist epistemological position of the Vienna Circle, according to which an experimental science is not the study of a certain domain of reality, but a coherent set of propositions that involves certain words and certain attributes corresponding to objects of experience and their observable properties. The disjunction between the meaning of the signs and the operations that one is allowed to perform on them entails that the formal study of scientific language is considered the sole object of the philosophy of the sciences. This is a thesis that is difficult to accept for those who think that the task of philosophy consists in establishing a coherent theory of the relationship between logic and the real. There is a physical real, and what has to be explained is why the most developed mathematical theories are needed to interpret it.

Sharing these concerns with his French rationalist fellows, Bachelard emphasises that the institution of a set of rules that, if followed, leads to the formulation of true theories does not imply any scientific discovery or actual knowledge, since consistent theories merely happen to predict the behaviour of a system without being able to explain why it behaves as it does. For Bachelard, knowledge is not a collection of data that can be stored in computer memories and used to perform calculations; rather, knowledge is a state of a rational subject who can explain why the process of discovery he is carrying out is likely to reveal new aspects of reality (and not only the future states of a given system). What is at stake here is the method of production of the mathematical structures that are involved in the genesis of our representation of reality, a method which aims to enlarge the possibilities of experience rather than to enclose it in a predictable totality. As we read in the New Scientific Spirit:

It does not matter how long realism affords the mind the luxury of intellectual repose; the striking fact is that every fruitful scientific revolution has forced a profound revision of the categories of the real. What is more, realism never precipitates such crises on its own. The revolutionary impulse comes from elsewhere, from the realm of the abstract. Mathematics is the wellspring of contemporary experimental thought.9

Bachelard was aware of the becoming of scientific knowledge and of the revolution that was brought about by Riemann’s geometry and Einstein’s relativity, so, instead of dissociating knowledge and history to make of scientific statements universal analytical truths, he tried to answer the following question: how is it possible to explain that a mathematical discovery, which happens at a certain point in time, makes our experience of real objects possible, for example relativistic or quantum objects? As we said, Bachelard thought that formalist epistemologies could not provide such an explanation, because one needs to consider the scientist as an historical subject rather than as impersonal software calculating according to given logical rules. As he wrote in this respect:

What audacity for the philosopher to assume that, abandoning his ego for a moment, he can recreate the entire World. And in any case what authorizes him to think that he can apprehend the simple, naked ego apart from its involvement in the acquisition of objective knowledge? In order to circumvent these basic questions, we shall look not only at the problems of science but also at the psychology of the scientific mind, regarding objectivity not as a primitive given, but as something that is learned with great difficulty.10

It is true that one can introduce new axiomatics but, from a formalist point of view, these are not meant to have any effect on the production of reality, since they only provide us with a different operational frame, which is supposed to be applied to the same basic empirical observational data: thus the result is not a new object, but a different description of the same object that happens to be more or less efficient. However, the anti-computational naturalizing position, which considers knowledge as the emergent property of embodied, interacting and coevolving systems, is not likely to provide an answer to the question about the rational method for enlarging experience. In fact, as we claimed before, from this standpoint one can get an explanation of the historical genesis of the conceptual frame and account for its becoming; however, the newly experienced reality is conceived as the result of a process that, in a certain way, is imposed on reason: if knowledge is the outcome of a history of exchanges and interactions with the environment, then any new representation of this environment is not the expression of the creative autonomy of thinking, but of a natural, unpredictable, non-rational drift which expresses itself within processes of individuation and differentiation. This naturalizing epistemology, which seems to reduce thinking to a non-exclusively human ability of adaptation, would prevent the possibility of conceiving the creative autonomy of reason, which is essential in order to explain the purely rational method of mathematical discovery. In this regard, in The New Scientific Spirit, Bachelard writes:

The philosophers, for their part, hold out to us the idea of communion with an all-enveloping reality, to which the scientist can hope for nothing better than to return, as to a philosophy original and true. But if we really want to understand our intellectual evolution, wouldn’t we do better instead to pay heed to the anxiety of thought, to its quest for an object, to its search for dialectical opportunities to escape from itself, to burst free of its own limits? In a word, wouldn’t we do better to focus on thought in the process of objectification? For if we do, we can hardly fail to conclude that such thought is creative.11

Thus, both epistemologies, the computational and the evolutionist, turn out to be unable to explain how reason can engage in the activity of autonomous conceptual creation, which leads to new real experiences rather than to reinterpretations of them. We could say, then, that Bachelard would have agreed with the formalist effort to ground mathematics on pure rationality rather than on historical material processes while, at the same time, agreeing with the effort within the evolutionary paradigm to provide a genetic explanation of the conceptual mathematical frame in order to legitimize its becoming. So, for example, concerning the radical novelty of Einstein’s relativity, Bachelard is interested in the rational process leading to the constitution of its mathematical condition: Riemann’s geometry. The introduction of this new geometry cannot be explained by referring to intuition and to the functional relation of an embodied subject within its (Euclidean) environment; on the contrary, one should wonder about the inductive value of that purely rational structure which opens up the possibility of experiencing a new reality, a reality that was previously hidden.12 According to Bachelard’s original standpoint, the possibility of experiencing subatomic particles or gravitational waves, for example, is disclosed by the production of the mathematical structure that allows us to actually construct the phenomena, to look for the empirical impressions to be organised in order to build the real object. In this way, the purely rational research concerning mathematical ideas finds its functional value by inducing the possibility of experiencing an enlarged reality. This entails that the real is neither something empirically given, like discrete data that can be used as input for calculation, nor the historical evolutionary process expressing a natural drift which comes to know itself.
Bachelard’s real is rather the product of what he calls a “phenomeno-technology”, which consists in constructing new real experiences according to the conditions provided by what he calls a “noumenology”: the reflective and creative activity by which new mathematical structures are rationally generated. Thus, instead of considering the noumenon as the Kantian inaccessible thing-in-itself, hidden beyond the phenomenal world of our experience, he conceives it as a rational idea, a purely rational mathematical structure that demands to be actualized, that demands to become a real experience. As Bachelard makes explicit:

The dialectical relationship between the scientific phenomenon and the scientific noumenon is not leisurely and remote but rapid and strict; after a few revisions, scientific projects always tend toward effective realization of the noumenon. A truly scientific phenomenology is therefore essentially a phenomeno-technology. Its purpose is to amplify what is revealed beyond appearance. It takes its instructions from construction. Wonderworking reason designs its own miracles.13

Hence, science must be considered as a “phenomeno-technology” establishing the experimental conditions that allow us to experience a reality to which we had no access before the production of the mathematical conceptual frame which makes us speculate about its existence. Conversely, the activity of the mathematician can be said to be a “noumenology” exploring the ideal domain of purely rational structures independently of the concrete possibility of their actualization. What matters here is that, on the one hand, Bachelard offers a philosophical explanation of the way in which reason produces its own ideas and concepts in an autonomous way, independently of any evolutionary or adaptive constraint, and, on the other hand, he explains why we can claim that we know the real and that reality is likely to be described by us within an historical process of discovery.

To go a step further, we can briefly introduce Albert Lautman’s dialectic as a good example of the kind of rational process in which Bachelard’s noumenology would actually consist. Lautman, whose writings have recently been translated into English and collected in the volume Mathematics, Ideas and the Physical Real,14 tried to elaborate a philosophical schema of genesis for mathematical objects, a schema that cannot be reduced to an axiomatic. According to Lautman, Hilbert’s formalist effort, which consists in considering mathematical entities as satisfying the conditions of the axioms,

tends to stabilize the mathematical entities in certain immutable roles and ignores the fact that the abstract entities that arise from the structure of a more concrete domain can, in their turn, serve as a basic domain for the genesis of other entities. It is therefore only within a determined problem that distinct functions can be assigned to different kinds of entities. Any logical attempt that would profess to dominate a priori the development of mathematics therefore disregards the essential nature of mathematical truth, which, on the contrary, is connected to the creative activity of the mind, and participates in its temporal character.15

Thus, the philosopher has neither to extract laws nor to envisage a future evolution; his role consists only in becoming aware of the logical drama which is played out within the theories. The only a priori element that we conceive is given in the experience of the exigency of the problems, anterior to the discovery of their solutions.16

Lautman’s dialectic is then a rational method of discovery that, instead of pointing to the most general in order to arrange and organize the plurality of mathematical objects, distinct elements and domains under a unitary axiomatic, looks for the problem implicit in a certain relation of ideas, such as the discrete and the continuous, the local and the global, and so on. This speculative dialectic, inspired by Plato’s method of division rather than by Aristotle’s search for the genus, is an art of constructing ideal problems as structures which are apt to generate new objects.

To conclude, I would like to go back to the computational sciences and try to look at them from the perspective of Bachelard’s and Lautman’s philosophies of science and mathematics. It seems to me that, instead of considering the calculations of computers as mere simulations which do not provide us with any real knowledge of reality, we can regard them as experimental devices that actualize mathematical structures and offer us the possibility of experiencing a reality that we couldn’t perceive before. We know, for example, that models like cellular automata allow us to visualize patterns of emergence that were very difficult to observe before; in the same way, computer simulations of chaotic dynamical systems give us not only the possibility of making weather forecasts, but also the chance to visualize the reality of the different attractors. Hence, rather than as miraculous devices offering us the perfect knowledge and control of the future, we should consider our computers as realities that have been produced as consequences of a previous theoretical insight into mathematical ideas, a speculative and theoretical insight that these machines actualize without being able to simulate it as a speculative process of discovery. Our computational devices are available real experiences; thus they are products of knowledge rather than producers of knowledge. To be engaged in knowledge production, in fact, means to be engaged with ideal problems, which lead to the production of conceptual and mathematical structures as conditions for experiencing new realities and creating possible worlds. Conversely, to be a product of knowledge means to be the actualization of a set of conceptual conditions allowing for a certain kind of experience: the representation of a particular and a priori given world.
Thus, it seems to me that, instead of the ultimate experience of the totality of possible experience, we should consider our computational machines as one among the many experiences that our non-computational speculative reason can offer us. What we should wonder about, then, is neither the speed at which a computer can simulate the behaviour of a system, nor the natural drift that makes all trajectories diverge, allowing unpredictable differences to emerge: we should rather wonder about our capacity for generating ideal structures that can be actualized in new real experiences which are independent of natural adaptive ends; computational devices are among these new real objects.
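To make the cellular-automaton example mentioned in the conclusion concrete, here is a minimal sketch of an elementary automaton. The choice of Wolfram’s Rule 110 is my own illustrative assumption (the paper names no specific rule). Run from a single live cell, it prints a space-time diagram of exactly the kind of emergent pattern that was hard to observe before such machines existed.

```python
# Elementary cellular automaton, Rule 110: each cell's next state depends
# only on its own state and its two neighbours, via a fixed lookup table.

def rule110_step(cells):
    """One synchronous update of Rule 110 (cells outside the row count as 0)."""
    table = {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
             (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0}
    padded = [0] + cells + [0]
    return [table[(padded[i - 1], padded[i], padded[i + 1])]
            for i in range(1, len(padded) - 1)]

row = [0] * 30 + [1]          # single live cell at the right edge
for _ in range(15):           # print 15 generations as a space-time diagram
    print("".join("#" if c else "." for c in row))
    row = rule110_step(row)
```

The automaton is itself a trivially computable rule, yet the structures it generates are only experienced by actually running it; in the vocabulary used above, the machine actualizes a mathematical structure and thereby enlarges what can be observed.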