Eng (Sambo) Boris

PhD Student in Computer Science, Team LoVe, LIPN (Université Sorbonne Paris Nord, France) domain_name@hotmail.fr

Girard's conceptual knitting (version 3)

philosophy

Prerequisites: Curry-Howard correspondence, linear logic, proof-nets, basic knowledge of Transcendental Syntax

This text may be extended further depending on how far my understanding goes.

Context: the Curry-Howard correspondence

The Curry-Howard correspondence shows that some logical systems match some computational systems (typically natural deduction for minimal logic and the simply typed lambda-calculus), but does that mean that logic and computation are actually the same thing? It seems to me that the current Curry-Howard correspondence is merely a point where both meet: an intersection between reasoning and computation (e.g. Coq). We will see that even though computation and logic can be entangled/mixed, their nature may be more subtle.
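To make the correspondence concrete, here is a minimal sketch (my own Python encoding, not part of the original text): a simply typed program whose type, read as a formula of minimal logic, is the statement it proves.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# A proof of A -> A in minimal logic corresponds to the identity program.
def identity(x: A) -> A:
    return x

# A proof of (A -> B) -> ((B -> C) -> (A -> C)) corresponds to composition:
# reading the type as a formula, the program below is its proof term.
def compose(f: Callable[[A], B]) -> Callable[[Callable[[B], C]], Callable[[A], C]]:
    return lambda g: lambda x: g(f(x))
```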

The Blind Spot and the logical space

Traditions and the unreachable essence of logic

When looking at the history of formal logic, we can observe that it was studied through an essentialist point of view for a long time. This means that the meaning of the logical rules was supposed to be prior to their existence: the so-called rules of reasoning. The evolution of logic followed the common conceptions/understandings we had of it, leading to different ideas:

This biased study of logic was still useful and produced several results, such as Gödel's incompleteness theorems or connections with Computer Science through the Curry-Howard correspondence and, later, linear logic. It actually shows a strength of essentialism: its authority. We get both order and irrefutability (we can check that we indeed followed the definitions).

Revolution and the chaotic existence of logic

The idea of the Blind Spot (to which Girard often referred) is close to the epistemological rupture of Bachelard and Althusser, and linear logic (especially proof-nets) can be thought of as a paradigm shift in the sense of Kuhn. The idea suggests that our access to knowledge is obstructed by obstacles we have to remove in order to access the “reality” of a concept. A science evolves through “revolutions” and by doubting its prior foundations. Although this idea can easily be misused or misunderstood, it seems quite relevant in the context of formal logic. The point is that our understanding of logic is not wrong, but biased and living in a closed world preventing us from reaching its hidden mechanisms (the blind spots). If we take any logical system, its proofs are limited to its rules and formulas. This is what Girard calls a “bunker”. How can we free logic from our preconceived understanding of it?

The logical space is the space in which logical objects (relative to a given context) live. The first satisfying materialisation of proofs as mathematical objects comes with Gentzen's natural deduction and sequent calculus. The space of proofs is made of well-formed trees constructed with given rules. This space is very restricted because we cannot materialise wrong proofs and study the effect of their existence. We can say that this space is very opaque. Linear logic extended the logical space by making explicit the management of resources at the core of reasoning. We obtain two conjunctions, two disjunctions and two connectives ? and ! related to the management of hypotheses as resources. Later, Girard's proof-nets, which seemed like a mere syntactic hack, actually constitute a further extension of the logical space. Proofs are now very general hypergraphs linking formulas. Some proofs are wrong and others are correct. We can distinguish them with a correctness criterion (typically Danos and Regnier's) by testing them against an appropriate set of tests. Girard's Geometry of Interaction programme goes even further by representing Danos-Regnier correctness graphs by the same kind of objects as proofs. Correctness now lives in the same space as proofs. The Transcendental Syntax is the natural successor of that logical exploration. As we will see, it rests on a strong conceptual knitting. This knitting seems to be an interesting update of Kant's epistemology taking computation into account.
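As an illustration of proofs being tested rather than built correct by construction, here is a small sketch (my own Python encoding and simplifications, not part of the original text) of the Danos-Regnier criterion for MLL proof structures: for every switching of the ⅋ links, the resulting graph must be acyclic and connected.

```python
from itertools import product

def switchings(tensor_links, par_links):
    """Enumerate Danos-Regnier switchings: each par link keeps exactly one
    of its two premise edges; tensor links keep both."""
    fixed = [e for (a, b, c) in tensor_links for e in [(a, c), (b, c)]]
    choices = [[(a, c), (b, c)] for (a, b, c) in par_links]
    for choice in product(*choices):
        yield fixed + list(choice)

def is_tree(nodes, edges):
    """A switching graph is correct iff it is acyclic and connected,
    i.e. a tree: |E| = |V| - 1 and every node is reachable."""
    if len(edges) != len(nodes) - 1:
        return False
    adj = {v: set() for v in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(adj[v] - seen)
    return seen == set(nodes)

def danos_regnier(nodes, axiom_links, tensor_links, par_links):
    """Accept a proof structure iff every switching graph is a tree."""
    return all(is_tree(nodes, list(axiom_links) + sw)
               for sw in switchings(tensor_links, par_links))

# The structure for A ⅋ A^ (axiom link between A and A^, one par node P)
# passes the criterion, while the same structure with a tensor instead
# (A ⊗ A^) creates a cycle and is rejected.
nodes = {"A", "A^", "P"}
print(danos_regnier(nodes, [("A", "A^")], [], [("A", "A^", "P")]))  # True
print(danos_regnier(nodes, [("A", "A^")], [("A", "A^", "P")], []))  # False
```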

A computational update of Kant’s epistemology

Analytic / Answers

In order to give ground to the logical foundations, we have to find a very transparent and objective computational space (at least more so than the sequent calculus, proof-nets or the Geometry of Interaction) from which Logic will emerge. Such a space is said to be “analytic”. The lambda-calculus is an interesting candidate for its simplicity, but not an adequate one because of the hidden complexity of the beta-reduction. Moreover, the beta-reduction is an external procedure and we want all computation to be self-operated/internal in some sense, similarly to how chemical/biological interactions work. We would like a system “which has nothing to hide”. Girard already showed, in the context of his Geometry of Interaction programme, that we could use first-order terms to represent the physical addresses of objects occurring in a proof. He generalises that idea to a model of computation called stars and constellations. These computational objects constitute a system where Logic is reconstructed from our primitive intuition of space: locations/addresses and associations/routing. I believe that Girard's analytics corresponds to an idea of computation which can be described topologically (for instance, Seiller's interaction graphs compute by mere juxtaposition of graphs).
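To give a rough flavour of this model, here is a deliberately simplified toy in Python (my own encoding, not Girard's exact definitions): rays are polarised addresses, stars are bags of rays, and execution fuses stars along dual rays, in the spirit of resolution. In the real model rays are first-order terms and fusion goes through unification.

```python
# A ray is a pair (polarity, address); a star is a tuple of rays; a
# constellation is a list of stars.  Execution fuses two stars along a
# pair of dual rays (+a vs -a) and keeps the remaining rays, mimicking
# how resolution merges two clauses along complementary literals.

def fuse(star1, star2):
    """Try to fuse two stars along one pair of dual rays."""
    for i, (pol1, addr1) in enumerate(star1):
        for j, (pol2, addr2) in enumerate(star2):
            if addr1 == addr2 and pol1 != pol2:
                rest = [r for k, r in enumerate(star1) if k != i]
                rest += [r for k, r in enumerate(star2) if k != j]
                return tuple(rest)
    return None

def execute(constellation):
    """Naively fuse stars until no pair of dual rays remains."""
    stars = list(constellation)
    done = False
    while not done:
        done = True
        for a in range(len(stars)):
            for b in range(a + 1, len(stars)):
                merged = fuse(stars[a], stars[b])
                if merged is not None:
                    stars = [s for k, s in enumerate(stars) if k not in (a, b)]
                    stars.append(merged)
                    done = False
                    break
            if not done:
                break
    return stars

# Two stars sharing the dual rays +x / -x fuse into one star:
print(execute([(("+", "x"), ("-", "y")), (("-", "x"), ("+", "z"))]))
# [(('-', 'y'), ('+', 'z'))]
```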

The analytic space is the space of answers. What we do is nothing more than computation on syntactic objects. As long as we do not interpret the answers we compute, they are meaningless/objective in some sense. The space of answers can be divided into two parts: the constat (result) and the performance (procedure). The constat is the final answer we get and the performance is the procedure leading to that answer. The first one is explicit (terminated, no more operations can be done) and the second one is implicit in the sense that the access to the information it conveys is indirect: we need to start a computation. They correspond to Kant's analytic a posteriori and analytic a priori. In the sequent calculus, the implication => is constative because it is merely accumulated (a simple left/right introduction of a symbol), while |- is performative because sequents are destructed/altered by the cut-elimination procedure. This is a difference unknown to semantic realism, since => and |- are identified by semantics. The underlying meaning of the answers we get is contingent, relative to a subject, typically us, as humans. This leads to the nature of meaning and questions, associated with Kant's synthetic.

Synthetic / Questions

The synthetic is where everything takes its meaning. Giving a meaning to something is arbitrary, dependent on a subject. In formal logic, computation takes place but we associate a proof with the formula or statement it proves and, as subjects, judge whether a proof is a correct or incorrect justification of a statement. If we take the very relevant example of logic programming, what we basically do is connect syntactic objects and perform a computation based on unification. We represent questions with formulas and answers with proofs. Giving meaning is formatting the meaningless, in the same sense that simple types describe the behaviour of the untyped lambda-calculus in terms of input/output relationships. This is Girard's idea of subjective format. A format is restrictive, subjective/arbitrary but useful (typed lambda-calculus). There is no perfect format, but the absence of format leads to chaos (untyped lambda-calculus).
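The question/answer reading can be illustrated with a deliberately propositional toy interpreter (my own Python sketch, not part of the original text): the query plays the role of the formula/question, and the derivation returned plays the role of the proof/answer. Real logic programming works over first-order terms and unification; a unification sketch is given later, in the Comments section.

```python
# Rules are pairs (body, head); a query (the "question") is answered by
# searching for a derivation (the "answer"/proof) built from the rules.
rules = [
    ((), "rainy"),                     # fact: rainy.
    (("rainy",), "wet_ground"),        # wet_ground :- rainy.
    (("wet_ground",), "slippery"),     # slippery :- wet_ground.
]

def prove(goal, rules):
    """Return a proof tree for `goal`, or None if it is not derivable."""
    for body, head in rules:
        if head == goal:
            subproofs = [prove(subgoal, rules) for subgoal in body]
            if all(p is not None for p in subproofs):
                return (goal, subproofs)
    return None

print(prove("slippery", rules))
# ('slippery', [('wet_ground', [('rainy', [])])])
```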

The emergence of meaning is divided into two kinds: the usine (factory) and the usage (use). In natural deduction, what we do is connect rules together in order to produce trees. We have introduction rules corresponding to the creation/definition of symbols and elimination rules corresponding to the practice/use of symbols. Both are materialisations of the usine/usage distinction. Both are strongly subjective since arbitrary, but one can check whether they connect well in a rather objective way, by computation (e.g. by the cut-elimination theorem in sequent calculus). With proof-nets, the usine corresponds to the tests of the correctness criterion. In MLL, we can see the Danos-Regnier correctness criterion as testing the upper part made of axioms with several switchings in the lower part. The usine is represented by the set of tests certifying that we have a proof-net corresponding to a “correct” proof proving a certain formula. However, by connecting proofs with cuts and performing cut-elimination, we can use these proofs and formulas in infinitely many ways, since there are infinitely many formulas to prove. Certainty emerges when the usine and the usage connect well. Moreover, a new “relative” approach to meaning emerges: the meaning of a proof depends on how it interacts with other proofs.

Interaction, testing and certainty

The idea of tests is close to Popper's idea of refutability by experiments. This idea, often associated with Popper, suggests that a claim/theory is valid until one reaches a refutation of it by testing it against an experiment, thus distinguishing a “true science” from a “pseudo-science”. Even though this idea appears perfectly valid, it suffers from its essentialist root. A claim/theory exists within a subjective/arbitrary framework and the idea of refutation itself may be opaque (a test of temperature depends on both the tested/water and the tester/thermometer, but also on their interaction/contact). Moreover, the “testing” of a claim/theory may be infinite, meaning that we may never reach absolute certainty. In the Transcendental Syntax, testing appears as a computation and we are concerned with either finite sets of tests (absolute certainty/apodictic) or samples of an infinite set of tests (reasonable certainty/epidictic). A formula is refuted by a computational interaction with its negation, depending on a chosen definition of orthogonality (seen as a point of view).

Although a proof of a claim and a proof of its negation cannot usually coexist, they virtually appear and have a role in proof-nets and the Transcendental Syntax. This may be linked to Hegel's contradictory foundations, where a concept rests on its own negation, which in turn rests on the concept, in a conflicting relationship. In the Transcendental Syntax, formulas and their negations are sets of constellations (computational objects) which satisfy a certain criterion of orthogonality when interacting together. They both exist as computational entities, but only one “wins” when trying to resolve a conflict. The winner is considered to be “true/valid/provable” in the logical sense.

Regarding certainty in the correctness of proofs: in MLL, the certainty is absolute since the set of tests is finite and cuts are very simple. From a more deontic point of view, the usine corresponds to the set of duties and the usage to the set of rights (something we usually cannot predict or easily list). Cut-elimination is the balance between rights and duties. The usine corresponds to Kant's synthetic a posteriori, since checking/certifying a proof is empirical and operated by testing. The usage corresponds to the synthetic a priori, because it is independent of experience.

Knitting / Separation

Constat/Performance. Both are separated by the computational undecidability of Church and Turing. An implicit computational artefact cannot be reduced to the set of its results, because we cannot always predict their existence unless we do a computation (Turing's halting problem). In other words, the implicit cannot simply be replaced by the explicit. They are linked by computation, a means to go from performance/implicit to constat/explicit.

Usine/Usage. Both are separated by Gödel's incompleteness theorems, which show the construction of unprovable formulas and the gap between provability and truth. The “format” we create for logic cannot be reduced to its use, because the actual use can sometimes escape the boundaries we imagined and the use we expected.

Performance/Usine. The usine has a subjective part because we choose what kind of tests we want to perform for certification/labelling. Some tests are better than others. Nonetheless, it also has an explicit part: performance is part of the usine, because testing itself is performative, explicit and objective: we just have to perform a computation. What is contingent is the interpretation we get from the result.

Performance/Usage. If we take cut-elimination, the result conveys a meaning but it is also a computational performance. The performance is thus part of the usage, but the usage cannot be reduced to it, because a formula is determined by the (usually infinite) set of interactions it can have, something that goes beyond the simple idea of explicit performance. Unlike in semantic realism, meaning here is contextual and subjective rather than objective.

Comments

The logico-functional space. Girard's chosen analytics is actually a variant of Robinson's resolution and very close to resolution-based systems using graphs (for instance Kowalski's connection graphs, aka clause graph resolution). Therefore, we can do logic programming with stars and constellations, but since the Transcendental Syntax reconstructs linear logic, and since from linear logic we can naturally simulate the lambda-calculus, we have a space where functional programming and logic programming meet.

The analytic space of unification. If we understand the idea of analytic space as a computational model where computation can be strictly described topologically (in a sense we have to define), then Robinson's resolution is analytic, because Herbrand's unification can be seen as a connection of wires. A function symbol of arity n is a splitting of a wire into n subwires. Variables represent wire ends which can be connected. The different strands are typed depending on the function symbols. Unification can also be described as pointers traversing the graphs corresponding to terms.
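For concreteness, here is a minimal first-order unification sketch (my own Python encoding, with the occurs-check omitted for brevity): unifying f(X, b) with f(a, Y) amounts to plugging the wires X↦a and Y↦b.

```python
# Variables are strings starting with an uppercase letter, constants are
# lowercase strings, compound terms are tuples (function_symbol, arg1, ..., argn).

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Follow the substitution until reaching a non-variable or a free variable."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst=None):
    """Return a unifier extending `subst`, or None on a clash."""
    subst = {} if subst is None else subst
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return {**subst, t1: t2}
    if is_var(t2):
        return {**subst, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and len(t1) == len(t2) and t1[0] == t2[0]):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(("f", "X", "b"), ("f", "a", "Y")))   # {'X': 'a', 'Y': 'b'}
print(unify(("f", "X"), ("g", "X")))             # None (clash of symbols)
```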

Liberalised typing. The Transcendental Syntax can be seen as a kind of realisability theory with respect to the Geometry of Interaction, with the choice of a convenient realiser (here constellations). Interestingly, the model of stars and constellations can naturally represent logic programs, but also models which are dynamical systems, such as Wang tiles or the abstract Tile Assembly Model (aTAM) used in DNA computing. We can imagine providing a type theory for these models. As constellations are a generalisation of Girard's flows (mostly studied by Bagnol) and of Seiller's interaction graphs, we can imagine applications in implicit computational complexity.
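To hint at the connection with tiling models, here is a toy encoding of Wang tiles (my own Python sketch, purely illustrative): an assembly is valid when adjacent tiles agree on their shared edge colours, the same kind of local matching constraint that constellations express through the unification of rays.

```python
# Each tile is a dict of edge colours; an assembly (a grid of tiles) is
# valid when every pair of adjacent tiles agrees on the shared edge.
def valid_assembly(grid):
    rows, cols = len(grid), len(grid[0])
    for i in range(rows):
        for j in range(cols):
            t = grid[i][j]
            if j + 1 < cols and t["east"] != grid[i][j + 1]["west"]:
                return False
            if i + 1 < rows and t["south"] != grid[i + 1][j]["north"]:
                return False
    return True

a = {"north": 0, "east": 1, "south": 0, "west": 1}
b = {"north": 0, "east": 1, "south": 0, "west": 1}
print(valid_assembly([[a, b]]))  # True: a's east edge matches b's west edge
```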