“In the transition from consciousness to reality, the ego, the thou and the world arise into existence indissolubly connected and, as it were, at a stroke.”1

The Line without Thickness and Money

The “Big Bang” of Western mathematics dates from Euclid’s definition β: a line is a length without thickness. This invention connects mathematics with Myth while proposing new forms of knowledge, and it organizes human space by separating the visible from the invisible, extracting Platonic Ideas from the world. In his Confessions, Saint Augustine recalls that even the filament of a spider’s web has thickness; it is therefore impossible to draw this idea of a line from sensory experience—God enables us to know this mathematical structure by inscribing it in our memory. Yet without adopting any hypothesis about absolutes, we may reconstruct the entirely human symbolic gesture that inaugurates the invention of the line without thickness. The latter is merely a boundary, the edge that cuts out the figures of Greek geometry, pushing to the limit the feature by which our ancestors at Lascaux drew bison composed solely of lines and contours, in which only humans can see the silhouette of an animal.

Ancient Greek Coin, Phocaea, 5th Century BC. Photo Credit: Doug Smith

Philosophy, mathematics, and metal currency were born almost simultaneously in Ionia between the seventh and sixth centuries BCE.2 These three practices each bear new forms of thought and social life. The invention of money implies categorization—a generalized equivalence, Marx and Keynes would say—which is linked to what philosophy does. As regards mathematics, the Euclidean axiomatic demonstration—this product of the agora—is as important as the invention of purely ideal structures, such as the line and the point, which organize and measure the world while existing solely in language and in gesture. Ideas are grasped through gestural practices, trajectories in space that show (montrer) the demonstration through movements of traced points and lines. The word “theorem” comes from theorein, a “seeing” that refers to “showing,” on the model of the theater, where we gain access to “ideas” through the viewing of mythic events.

Among these mathematical “ideas,” the line as the trace of a continuous gesture is a fundamental notion: the point (semeion) lies at the extremity of a segment (definition γ) and is apprehended, according to Euclid (Theorem 1, Book I), as the sign of a position on a line or of an intersection of two lines. The point is not the fundamental entity: from a spatial-geometrical point of view, it is not the line that is composed of points but the point that is the sign-trace of the relations and movements of which the line is the support. All the figures of Greek geometry are composed of continuous lines and their intersections. On the one hand, figures constituted of lines without thickness make it possible to calculate surfaces; on the other hand, the ideality of the line, a limit notion, explains the eruption of the irrational (alogos) into the calculation of finite figures. The geometry of “pure” lines exceeds the arithmetic logos and organizes the world through relations of reciprocal determination between abstract diagrams rooted in humanity’s gestural and figural experiences—experiences to which the bison of Lascaux bear witness. Coined currency likewise institutes abstract relations of equivalence between objects, relations fixed in symbolic writing (the value of the first coins was marked by geometric figures and their combinations). This relation between geometry and ideality—between abstraction and figural vision—also governs Greek philosophy, whose fundamental notion is the eidos, the Idea.
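A worked example, added here as a gloss rather than drawn from the argument above, makes this eruption tangible: the diagonal of the unit square, the most finite of figures, has no ratio to its side. By the Pythagorean theorem,

\[
d^{2} = 1^{2} + 1^{2} = 2, \qquad d = \sqrt{2} .
\]

If d were a ratio of integers p/q in lowest terms, then p^2 = 2q^2 would force p to be even, say p = 2k; but then q^2 = 2k^2 and q would be even as well, contradicting the assumption that the fraction was in lowest terms. The diagonal is alogos: a finite, drawable figure already exceeds the arithmetic of ratios.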

Uniform Rectilinear Movement

Thanks to a gesture just as audacious as Euclid’s, Galileo introduced into physics the principle of inertia as the fundamental principle of conservation (of the quantity of movement). Galileo’s principle constituted a shift to the limit: since perfect inertia does not exist in the world but is rather an external limit, Galileo, ideally situated on the horizon of all movements, can make them all intelligible as modifications of an ideal limit-state. In the Italian Renaissance, the infinite-ideal encounters the finite—i.e., human—world and natural processes. This revolutionary encounter had been a prime theme of the great medieval theological debates,3 but its spatial representation would arise in the pictorial theology of the Annunciations—the decisive site of the invention of linear perspective.4 At the start of the fifteenth century, the presence of divine infinity within the finitude of a woman—the Virgin—is symbolized by the inscription within the painting of a perspective point where the lines converge “at infinity.” Later, Nicholas of Cusa and Giordano Bruno plunged humanity into a God-Nature infinity that expressed a new view of the human condition: a gaze from infinity that reconciles humanity and divinity, ideality and spatiality—just as the Annunciations articulate the infinitude of God with the corporality of the human creature.


Piero della Francesca, The Annunciation to Mary, c. 1455, fresco, 329 x 193 cm, San Francesco, Arezzo

Renaissance pictorial art is the prime site that plunges figures into a spatial-geometric infinity: according to Pomponius Gauricus, the artist should draw the space on the canvas before situating bodies in it.5 Piero della Francesca is faithful to this principle, whereas with Paolo Uccello and Ambrogio Lorenzetti the geometry of pictorial space follows upon the event depicted. But in both cases it is geometry that organizes the event, giving it a visible and signifying structure. Shortly thereafter, mathematics and physics (with Descartes, Desargues, Leibniz, and Newton) embarked on an analogous journey, rendering the forms and movements of finite material bodies intelligible through infinite limit-principles.

The Invention of Mathematical Space

If the geometrical organization of pictorial space becomes inseparable from the constitution of figures and events, the geometric organization of space in physics becomes inseparable from the constitution of trajectories, which the Aristotelian vision had studied as if they were independent and isolated phenomena. From Descartes to Newton, the space preexisting trajectories is constituted as an a priori structure, which Kant would turn into a principle—no longer of the real, but of knowledge. Yet this preexisting structure of cognition has generally been taken for an ontological reality. Thus there developed an implicit Platonism that turns equations, functions, and forms of writing and of being into ideal things that precede the world—a spontaneous ideology found again in biology in the metaphor of DNA as a “code” preceding and containing the organism, just as God (or evolution) “programmed” it. The birth of the ontology implicit in the Renaissance is also over-determined by the invention of paper money, a subsequent phase in the process of abstraction of value and of social relations.6

The complex articulations between myth, theology, and the formation of scientific thinking constitute the object of historical epistemology. However, the various “spontaneous” Platonisms that persist in the philosophy of mathematics refuse this “gap,” this distanciation, that makes epistemological reflection possible: they reify the constituent human gestures, theological and mathematical, by which the world is rendered intelligible. In reality, the line without thickness, the point-sign, linear perspective, and space given a priori are all merely non-arbitrary constructions, engendered by concrete practices rubbing up against a “real” whose constraints channel our operations of knowledge. But these constructions, whose dominant traits are immateriality and mathematical invariance—and whose power arises not only from their great efficacy in mathematics and physics but also from their mythical and theological origins—are completely inadequate for the biological field. It is this inadequacy that links the epistemological problem to the current orientation of our work.

Intermezzo: Existence and Truth

Any discourse on the “existence” of a mathematical structure (like the line without thickness) misses the problematic of the constitutive gesture. Not only does the study of the brain show that the brain itself imposes boundaries on objects,7 but the organization of the world into continuous lines also arises from the organism’s primary gestures, starting with the circle of retention-protention.8 Yet these biological structures do not already contain the concepts of line and border: they are merely the conditions of possibility of a primary relation to the world, which language, drawing, and writing organize conceptually on the basis of the gesture of drawing figures on a rock.

It is impossible to subject the principle of inertia to empirical verification, which would require realizing in the world an absolute absence of forces. We are dealing with organizing principles that arise from a praxis allowing us to regard the world from the vantage of a shift to the limit. The mathematical structures issuing from a long scientific and theological history may nevertheless be modified through confrontation with the constraints of the real: this is the case with Relativity, where the Riemannian form of space, unifying inertia and gravitation, is ultimately the fruit of a reorganization imposed by the observed invariance of the speed of light. The power of symbolic constructions9 and of shifts to the limit allows us to apprehend the finite from the a-logos, and the a-peiron continues to support the construction of our knowledge of the physical world. The question remains, however, whether these principles enable us to grasp the historicity and variability of what is living.
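Two modern glosses, added here for concreteness and not part of the original text, may fix the ideas. The principle of inertia, as a limit principle of conservation, says that in the absence of any net force the quantity of movement is constant,

\[
\mathbf{F} = \mathbf{0} \;\Longrightarrow\; \frac{d\mathbf{p}}{dt} = \mathbf{0}, \qquad \mathbf{p} = m\mathbf{v},
\]

a state never realized in the world and therefore never verifiable by experiment. And in Relativity, the observed invariance of the speed of light is written into the invariance of the spacetime interval, while inertia and gravitation are unified by letting free fall follow the geodesics of a (pseudo-)Riemannian metric:

\[
ds^{2} = -c^{2}\,dt^{2} + dx^{2} + dy^{2} + dz^{2},
\qquad
\frac{d^{2}x^{\mu}}{d\tau^{2}} + \Gamma^{\mu}_{\;\alpha\beta}\,\frac{dx^{\alpha}}{d\tau}\,\frac{dx^{\beta}}{d\tau} = 0 .
\]

The Riemannian form of space is thus not checked point by point against experience; it is a reorganization of the entire framework imposed by a single observed invariance.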

Space and Laws

It was only in the eighteenth century that the term “law” came to predominate in the vocabulary of the sciences—a projection onto nature of divine law and, according to some, of the law emanating from absolute monarchies.10 There is a link between the takeoff of the notion of law in the natural sciences and the invention of space. The idea of physical laws becomes rigorous only when it is formulated in the form of equations, which is possible only thanks to the construction of the a priori structures of space and time, conditions of possibility of physical and mathematical knowledge. For the regularities of nature to become “laws,” they have to be written as equations or as functions of evolution, and to do this it is necessary to construct an a priori space of mathematical inscription (spatiotemporal parameters). Mathematical physics in the nineteenth century would extend the Cartesian space of Newton, Lagrange, and Laplace into the space of phases: a framework given in advance as a condition of possibility of the determination, by equations, of geodesics—that is to say, of optimal trajectories in the space under consideration. This framework remains an a priori condition even when the determination of trajectories proves incomplete, and when the formalism of the equations describing the trajectories leads to deterministic unpredictability, as in Poincaré’s treatment of the three-body problem, which destroyed Laplace’s myth of complete determination.11

Poincaré plot of Chaotic Pendulum Activity (Single Phase)
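The figure above can be read alongside a minimal numerical sketch, added here for illustration and not part of the original essay: a driven, damped pendulum in a commonly cited chaotic regime (damping 0.5, drive amplitude 1.2, drive frequency 2/3, all assumed values), where two trajectories whose initial angles differ by one part in a million drift apart until the difference is of the order of the motion itself. The equations determine everything and, past a short horizon, predict very little.

# A minimal sketch (standard library only); parameters are a commonly cited
# chaotic regime of the driven, damped pendulum, not taken from the essay.
import math

DAMPING, DRIVE, DRIVE_FREQ = 0.5, 1.2, 2.0 / 3.0

def deriv(t, theta, omega):
    # theta' = omega ; omega' = -damping*omega - sin(theta) + drive*cos(freq*t)
    return omega, -DAMPING * omega - math.sin(theta) + DRIVE * math.cos(DRIVE_FREQ * t)

def rk4_step(t, theta, omega, dt):
    # One fixed-step fourth-order Runge-Kutta step for the pair (theta, omega).
    k1t, k1o = deriv(t, theta, omega)
    k2t, k2o = deriv(t + dt / 2, theta + dt / 2 * k1t, omega + dt / 2 * k1o)
    k3t, k3o = deriv(t + dt / 2, theta + dt / 2 * k2t, omega + dt / 2 * k2o)
    k4t, k4o = deriv(t + dt, theta + dt * k3t, omega + dt * k3o)
    theta += dt / 6 * (k1t + 2 * k2t + 2 * k3t + k4t)
    omega += dt / 6 * (k1o + 2 * k2o + 2 * k3o + k4o)
    return theta, omega

def angle_trajectory(theta0, steps=60000, dt=0.01):
    # Integrate from rest at angle theta0 and record the angle at every step.
    t, theta, omega, angles = 0.0, theta0, 0.0, []
    for _ in range(steps):
        theta, omega = rk4_step(t, theta, omega, dt)
        t += dt
        angles.append(theta)
    return angles

# Two trajectories whose initial angles differ by one part in a million:
# in this chaotic regime their separation grows by many orders of magnitude.
a = angle_trajectory(0.2)
b = angle_trajectory(0.2 + 1e-6)
for i in (1000, 20000, 59999):
    print(f"step {i:5d}: |delta theta| = {abs(a[i] - b[i]):.3e}")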

In the nineteenth century, positivism repressed the theological and metaphysical origin of the “laws of nature” and attributed to them the power to regulate human actions totally. From these illusory postulates derived the catastrophic failure of the theories of economic equilibrium, which Poincaré had already demolished in a letter to Léon Walras. Yet the idea still persists today that economic processes correspond to a Walrasian dynamic, which spontaneously engenders an optimal state when it is not disturbed by external interventions. Here we are dealing with a metaphysics of power that considers society as an ensemble of rational individuals, each aiming at the maximization of profit, whose interactions produce a spontaneous social order regardless of communication among people and of their capacity to construct a common world. This metaphysics implies the idea that people can and should be governed by the laws of numbers, the techniques of calculation, and the norms spontaneously engendered by the statistical treatment of data. The principle is affirmed in a universal normalization of every policy—economic, political, and scientific—by its statistical results, which end up rendering superfluous and negligible both singular variation and qualitative decision. The normativity of statistical data completely forgets that the decisive moments in the history of scientific thought have involved radical metaphysical decisions and a new articulation between the regularities of the real, on the one hand, and mathematical ideality on the other. The decision to organize space on the basis of lines without thickness, to look at the system of celestial bodies from the sun’s point of view, to imagine spaces of variable curvature and continuous deformations of space-time, and finally to articulate number and form (Gestalt) in the various guises of analytic geometry, differential geometry, and algebraic geometry, up to Grothendieck topoi—all these decisions are founded on explicit metaphysical “leaps.” What refutes an economics founded on numbers are precisely the value-bearing choices that correspond, in mathematical physics, to the introduction of new structures and appropriate observables, such as the quantity of movement, energy or entropy, fluidity, or the “color” of quanta.

A Theology Overthrown

It is in biology that the myth of computable, alphanumerical data has produced the greatest distortions throughout the twentieth century, by making DNA a “program” separate from cellular materiality and by reducing organisms to mere “avatars” of genetic information. Whereas biologists like François Jacob12 directly identified genes with alphabetic writing, Francis Collins, director of the National Human Genome Research Institute, publicly asserted in 2000 that “we have caught the first glimpse of our own instruction book, previously known only to God.” This myth fabricates universals from particulars, overturning the infinite-finite relation that had propelled the takeoff of science from Antiquity to the Renaissance. The finite technical activity of the computer programmer, heir of the artisan-clockmaker, is projected onto the action of God or of evolution.

Moreover, a person reduced to a codifiable sequence of bits loses the material thickness that Renaissance painting had conferred on human figures by plunging them into the tridimensional infinitude of the Universe. The relation between the finite human and the mathematical infinite is broken by digital and alphabetic metaphors, which erect into mythological entities images drawn from common sense, the very common sense with which science is supposed to break. These images effect “transfers” of vague, “weak” meanings,13 whose crude dualism effaces the singularity and historicity of the living, which is always this living thing here, with this body and this history, irreducible to the ideal invariance of mathematics and to the a-historic, generic nature of its objects—or at least not reducible in the same terms as in physics.

Strong Consequences of Weak Hypotheses

The image of the DNA-program has had several consequences: first of all, the idea that biology, dependent on the properties of molecules, could be reduced to the laws of physics.14 But which laws? Mathematical physics, from Galileo to quantum theory, has never ceased constructing and modifying its laws by confronting unprecedented phenomena. What I am trying to do in my work, along with Bailly and Montévil, is to articulate certain physical and mathematical theories with phenomena that are specific to life. In science, unification should always be provisional and “local,” not dogmatic and reductionist. But the weak hypothesis has other consequences when transferred from common sense to biology, and its effects on analyses of cancer interest me most at present. This domain was long dominated by the ideology of the computer program, in which cancer was supposedly the result of a deprogramming of DNA provoked by a signal disrupting the instructions given to cells.15

From Triumph to Debacle

Since 1971, enormously funded projects have heralded the final victory over cancer thanks to genetic therapies able to “reprogram” “deprogrammed DNA.”16 These therapies were supposed to rest on the “rock” of chemical and physical laws, and the proximity of the “programming” metaphors to common sense facilitated their success among funding bodies and civil authorities as well as in public opinion. However, a half-century of genetic research has produced no genetic cancer therapy whatsoever, nor even any solid knowledge about the triggering and development of cancer.17 On the contrary, the huge effort to decode the DNA of cancerous cells (undertaken starting in 2000) has demonstrated that the complexity of cancer cannot be reduced to any purely molecular cause. The enormous financial efforts and the harsh repression of alternative hypotheses were both motivated, for long decades, by the idea that any phenotype presupposes complete determination by the genes. By contrast, approaches such as that proposed by the cancer biologists Sonnenschein and Soto,18 and adopted by Longo et al.,19 are based on the Darwinian principle that organisms, including cells, tend to reproduce themselves with variations (a limit-state analogous to Galilean inertia, but specific to life forms). In the absence of effective control by tissues, hormones, etc., reproducing cells may approach the pace of embryogenesis, as happens with certain tumors. Thus cancer does not depend on a “triggering signal” at the molecular level, but on the failure of the regulatory relations among tissue, organism, and ecosystem. These hypotheses, and their therapeutic consequences—which redirect researchers’ attention to prevention and to environmental conditions—have only just begun to spread through the field, after decades of the informational-programming catastrophe.

Working Hypotheses in Biology

And so it is appropriate to go back to Darwin, whose greatness was to have formulated theoretical principles of intelligibility for the biological, on the model of the great creators of mathematical physics. The two Darwinian principles are descent with modification and selection. The challenge is to articulate these principles with the analysis of the organism, unifying ontogenesis and phylogenesis. We have already looked at the role of principles in mathematics and physics, from the geometric structures of space to the principles of conservation. The quest for principles in biology should follow these examples, but the principles specific to physics—founded on invariance, on the conservation of properties such as symmetry, and on optimal trajectories—are inadequate to the reality of living beings. Living systems are in a constant state of critical transition: their symmetries are continually breaking and being reconstituted.20 Thus the Darwinian principle of reproduction-with-variation may be seen as a principle of non-conservation, opposed to and symmetrical with the principles of conservation and invariance in mathematics and physics. The adequate theorization of the biological field therefore demands extensions and intersections of various physical theories; it demands that we think the coexistence of classical and quantum random phenomena in the cell.21 These operations rely on existing physical theories while remaining irreducible to their techniques; they are “points of view,” “perspectives” on the organism, whose unitary and primordial reality furnishes the guiding thread through these different theorizations. The intelligibility of the biological field is possible only through intersections and partial integrations that aim to construct objects-of-knowledge in dialectical relation with the constraints of “raw” experience. In biology this experience plays a singular role, unknown to physical theories: thanks to mathematization, the latter cut generic objects out of the real, and their objectivity depends entirely on the theoretical framework.

In biology, on the other hand, objects are always historical singularities, grasped by conceptual models that are qualitative, provisional, and over-determined by culture and ideologies.22 The centrality of each singular organism, with its own historicity, implies the primacy of variation and of symmetry breaking, overthrowing the mathematical primacy of invariance—a primacy with very powerful knowledge effects, but one that proves an obstacle to understanding life, especially when it is disfigured into the image of DNA as informational invariant and the myth of the “program.” For example, the materiality of each organism, its historical thickness, and the density of its internal and external relations rule out any dualism between “software” and “hardware” of the kind specific to the notion of the computer program. Finally, one of the decisive operators of objectivity in physics is overthrown in biology: the space of phases (the observables and the parameters). Whereas space was fixed a priori as the condition of possibility and immanent norm of physical trajectories, in biological processes, by contrast, the singular trajectories constitute and constantly reorganize the space of possibles (of phases) and the ecosystem, and the observables are themselves results of processes.

If our analysis of living dynamics is pertinent, it poses the problem of how to test the limits of traditional scientific objectivities, of which physics and mathematics represent the paradigms, in the face of the constraints of biological theorization. Overcoming very powerful theoretical practices that are rooted in venerable metaphysical and theological ideas is a radical challenge, but attempts are finally seeing the light of day.

Edited and translated from the Italian by Andrea Cavazzini. Translation from the French by Susan Emanuel. The complete original version, “Le conseguenze della filosofia,” can be downloaded here.