⊢ Boris ENG

PhD Student in Computer Science, Team LoVe, LIPN (Université Sorbonne Paris Nord, France) [last name][first name]@hotmail.fr

On Intuitionism and Classicism

proof theory history technical

Prerequisite: natural deduction, sequent calculus, intuitionistic logic, classical logic

Introduction

Are there several logics? A unique, right logic? Intuitionism was introduced during the foundational crisis of mathematics at the beginning of the 20th century by mathematicians such as H. Poincaré and L.E.J. Brouwer [1]. It constitutes both a criticism of mathematical practice and an opposition to formalism, in favor of intuition and construction. I provide my understanding of intuitionism and classicism for readers already familiar with their technical interpretation. We start from the question of the reliability of logic and mathematics.

The location of certainty

Practice shows (empirically) that mathematics is remarkably reliable but, in Brouwer's time, the existence of non-Euclidean geometries led to the emergence of doubts. What allows us to consider a proposition as an axiom, or to use a particular rule of deduction? Where does the certainty of mathematics lie? Two schools of philosophy tried to answer that question:

• Formalism (German): it is located “on the paper”, that is, in the symbols we write, in the linguistics of mathematics, which is concrete, elementary and combinatorial.
• Intuitionism (mainly French): it is located in human intuition, that is, we should rely on what is accessible to our intuition, on what we can know as humans with our own knowledge and perception.

They nevertheless have a common point: the rejection of any correlation between mathematical laws and the laws of nature. In other words, they are both subjectivist conceptions of logic and mathematics. We focus on intuitionism and its viewpoint regarding logic.

Subjectivism and constructivism

Science relies on empirical constructions, rooting certainty in observation and experience. From the viewpoint of intuitionism, certainty appears when we can make the existence of mathematical objects explicit (opposing semantic realism and its external existence of mathematical objects). This has several consequences:

• A doubt about infinity. Brouwer rejected actual infinity but accepted potential infinity [3, p.48-49].
• The rejection of the law of excluded middle $\forall P. P \lor \lnot P$, because it automatically provides either a proof of $P$ or a proof of $\lnot P$, both coming “out of nowhere”.
• The rejection of proofs by contradiction (i.e. $\lnot A \Rightarrow \bot \vdash A$) and of several other equivalent laws, for the same reason.
• A proof of existence $\exists x. P(x)$ must provide explicitly a witness $c$ and a proof of $P(c)$.

The rejection of these logical laws was quite surprising, since they were heavily used in mathematical practice. Intuitionism is a subjectivist ideology because it only accepts what we, as humans, can know, understand and construct, opposing any theory of truth and any realism (adherence to a preconception of reality). Intuitionism accepts that we do not necessarily know, and that we have to live with the undecidability of propositions (an inevitability).
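A classic illustration of the witness requirement (my own addition, with the numerics only as a sanity check) is the well-known classical proof that there exist irrational numbers $a$ and $b$ with $a^b$ rational: it case-splits with the excluded middle on whether $\sqrt{2}^{\sqrt{2}}$ is rational, and therefore never exhibits a concrete pair. Here is a quick numerical check of the identity the second case relies on:

```python
# Classical proof: either s = sqrt(2)^sqrt(2) is rational (take a = b = sqrt(2))
# or it is not (take a = s, b = sqrt(2), since s^sqrt(2) = sqrt(2)^2 = 2).
# The excluded middle decides between the two cases without telling us which
# one actually holds, so no witness pair (a, b) is ever produced.
sqrt2 = 2 ** 0.5
s = sqrt2 ** sqrt2                    # the undetermined intermediate value
assert abs(s ** sqrt2 - 2) < 1e-9     # the identity s^sqrt(2) = 2 checks out
```

An intuitionist rejects this proof precisely because, at the end, we still do not know which of the two pairs $(a, b)$ works.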

Brouwer defended pure intuition against a mechanical, belittling view of mathematics based on linguistics, and so opposed any formalist conception of logic, even though his own student A. Heyting (and, independently, A. Kolmogorov) formalised constructivism through the BHK interpretation. Logic was then divided in two: classical and intuitionistic logic.

Now, let us explore the technical content of intuitionism.

Technical interpretation

Classical logic tells us whether a proposition is true or false; intuitionistic logic tells us why it is so. It gives a “proof/witness” of an answer.

Model-theoretic interpretation

Classical logic is naturally interpreted with truth values, 0 or 1, associated to a logical proposition. As the concept of provability changes in the intuitionistic framework, we have to switch to Heyting models or Kripke models, which internalise the “unknowable” but thereby ignore the mechanisms and dynamics of intuitionism.

In Heyting models, we usually interpret propositions with either one of the usual truth values (true/false) or “intermediate truth values”. These intermediate values represent the absence of knowledge that we can assign to undecidable propositions such as the law of excluded middle. With Kripke models, we do something similar by reifying a “world of all possibilities”, allowing us to model the unknown/unknowable. These interpretations cannot express an explanation of the “why”.
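As a concrete illustration (my own sketch, not part of the original), the three-element Heyting algebra on the chain $0 < 1/2 < 1$ already refutes the excluded middle while validating its double negation:

```python
# The three-element Heyting algebra on the chain 0 < 1/2 < 1
# (the middle value 1/2 plays the role of "unknown").
VALS = (0.0, 0.5, 1.0)

def imp(a, b):
    """Heyting implication on a linear order: a -> b."""
    return 1.0 if a <= b else b

def neg(a):
    """Intuitionistic negation: not-a := a -> 0."""
    return imp(a, 0.0)

def lor(a, b):
    """Disjunction is the join (maximum)."""
    return max(a, b)

# Excluded middle is refuted: at a = 1/2 it only reaches 1/2, not 1 ...
em = [lor(a, neg(a)) for a in VALS]                  # [1.0, 0.5, 1.0]

# ... while its double negation is valid at every value.
nn_em = [neg(neg(lor(a, neg(a)))) for a in VALS]     # [1.0, 1.0, 1.0]
```

The value 1/2 assigned to $A \lor \lnot A$ is exactly an “intermediate truth value” standing for the absence of knowledge.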

Proof-theoretic interpretation

The proof-theoretic interpretation is a bit more interesting. We will investigate the case of natural deduction (NJ) and sequent calculus (LJ).

• The system NJ is inherently and naturally intuitionistic: it emulates reasoning with implicit infinity in the use of assumptions and with determinism in the unique conclusion (we focus on constructing the proof of a unique conclusion without “losing focus” on it). It is also constructive because everything is explicitly justified at some point: the proof is made of proofs of sub-formulas. We can make it classical by adding a classical law, but this is an artificial extension which also breaks the introduction/elimination symmetry (called “harmony” by Dummett [4]), since the excluded middle is a kind of introduction of disjunction. We should not break this symmetry, since it is responsible for the good behaviour of proof-reduction.

• The sequent calculus is especially interesting because we can make it intuitionistic or classical by restricting the right-hand side of the sequent to at most one formula, or by allowing arbitrarily many. The system LK has sequents of the shape $\Gamma \vdash \Delta$ for arbitrary sequences of formulas $\Gamma, \Delta$, and LJ of the shape $\Gamma \vdash A$. In other words, we forbid contraction (infinity/duplication) in the conclusion. This peculiarity is neither innocent nor purely technical: the right-hand restriction forces us to focus on a unique conclusion, while laws such as the excluded middle let us shift our focus between several goals. Keeping the focus on the construction of a unique conclusion enforces constructivity.

Computational interpretation

Since the system NJ is isomorphic to the simply-typed lambda-calculus through the Curry-Howard isomorphism, the computational interpretation is straightforward:

• Intuitionistic proofs $\Gamma \vdash A$ are typed functional programs $\Gamma \vdash M : A$.
• Lambda terms can also be seen as a kind of log or annotation describing the structure of the proof.

In particular, intuitionism captures computational decidability. What is provable is what is computable by a sequential functional program. We have a correspondence between “computationally decidable” and “logically decidable” in this framework. The focus on a unique conclusion becomes the construction of a unique result.
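For instance (an illustrative sketch of mine, using Python functions and pairs in place of typed lambda terms), small intuitionistic proofs read directly as programs:

```python
from typing import Callable, Tuple, TypeVar

A, B, C = TypeVar("A"), TypeVar("B"), TypeVar("C")

def and_comm(p: Tuple[A, B]) -> Tuple[B, A]:
    """A proof of A ∧ B ⇒ B ∧ A: a pair is a conjunction, swapping proves it."""
    a, b = p
    return (b, a)

def compose(f: Callable[[A], B]) -> Callable[[Callable[[B], C]], Callable[[A], C]]:
    """A proof of (A ⇒ B) ⇒ ((B ⇒ C) ⇒ (A ⇒ C)): transitivity of implication."""
    return lambda g: lambda a: g(f(a))
```

Running such a function (e.g. `and_comm((1, "x"))`) evaluates the proof term, which corresponds to normalising the proof.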

Classicism strikes back

Is intuitionism weaker due to its rejection of logical rules?

Intuitionism is not weaker. It can even be considered stronger, because intuitionistic logic is a logic where only explicit constructions emerge. Not only does it have a power that classical logic does not have, but the conception of classical truth is not lost either (we can encode classical provability in the intuitionistic framework).

To recover classical truth, there exist several encodings called double-negation translations (which we will write $[\![ \cdot ]\!]$) such that if $A$ is provable classically, then $[\![ A ]\!]$ is provable in intuitionistic logic. These translations usually work by putting double negations $\lnot\lnot$ in the right positions in formulas. Here is an example with a proof of the encoding of the excluded middle in natural deduction NJ and sequent calculus LJ.

Note that we have $\lnot A \equiv A \Rightarrow \bot$ in intuitionistic systems. We also use the notation $\Gamma = \{(A \lor \lnot A) \Rightarrow \bot\}$.

$\infer[\Rightarrow i]{ \infer[\Rightarrow e]{ \infer[ax]{\qquad\qquad\qquad\qquad}{\Gamma \vdash (A \lor \lnot A) \Rightarrow \bot} \qquad \infer[\lor i]{ \infer[\Rightarrow i]{ \infer[\Rightarrow e]{ \infer[ax]{\qquad\qquad\qquad\qquad\qquad}{\Gamma, A \vdash (A \lor \lnot A) \Rightarrow \bot} \qquad \infer[\lor i]{ \infer[ax]{\qquad\qquad\qquad}{\Gamma, A \vdash A} }{\Gamma, A \vdash A \lor \lnot A} }{\Gamma, A \vdash \bot} }{\Gamma \vdash \lnot A} }{\Gamma \vdash A \lor \lnot A} } {(A \lor \lnot A) \Rightarrow \bot \vdash \bot} }{\vdash_{NJ} ((A \lor \lnot A) \Rightarrow \bot) \Rightarrow \bot}$

$\infer[\Rightarrow r]{ \infer[c]{ \infer[\Rightarrow l]{ \infer[\lor r]{ \infer[\Rightarrow r]{ \infer[\Rightarrow l]{ \infer[\lor l]{ \infer[ax]{\qquad}{A \vdash A} }{A \vdash A \lor \lnot A} \qquad \infer[ax]{\qquad}{\bot \vdash \bot} }{\Gamma, A \vdash \bot} }{\Gamma \vdash \lnot A} }{\Gamma \vdash A \lor \lnot A} \qquad \infer[ax]{\qquad}{\bot \vdash \bot} }{(A \lor \lnot A) \Rightarrow \bot, (A \lor \lnot A) \Rightarrow \bot \vdash \bot} }{(A \lor \lnot A) \Rightarrow \bot \vdash \bot} }{\vdash_{LJ} ((A \lor \lnot A) \Rightarrow \bot) \Rightarrow \bot}$

These translations do not only recover the classical power of proofs: they also give a computational interpretation to classical proofs. Why does $\vdash \lnot\lnot(A \lor \lnot A)$ work? Double negation is not a magical technicality coming out of nowhere. It works because adding negations allows us to bypass the restriction to a unique conclusion by taking advantage of the free contraction on the assumptions. Negation acts as a spatial manipulation in a proof (it puts information in the context, similarly to the action of saving data). Double negation makes proofs and programs aware of their context (as we can see in CPS translations).

If we take a close look at the two proofs above, we can see a kind of mechanism of “saving information” at work. The double negation allows us to put information in the context, which is not restricted, so we can reuse it for both the branch with $A$ and the one with $\lnot A$ in $A \lor \lnot A$, whereas in the purely intuitionistic case choosing a disjunct destroys information (the irreversibility of disjunction).
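In lambda-term form (my own sketch, modelling the answer type $\bot$ with an exception), the proof of $\lnot\lnot(A \lor \lnot A)$ is the well-known term $\lambda k.\, k\,(\mathsf{inr}\,(\lambda a.\, k\,(\mathsf{inl}\,a)))$; note that the continuation $k$ is used twice, which is exactly the free contraction on the assumptions:

```python
class Done(Exception):
    """Models reaching the answer type: the consumer of A ∨ ¬A finishes."""
    def __init__(self, value):
        self.value = value

def not_not_em(k):
    """λk. k (inr (λa. k (inl a))) : ¬¬(A ∨ ¬A).

    First we offer the right disjunct ¬A; if its argument is ever supplied,
    we "change our mind" and re-invoke k with the left disjunct inl(a)."""
    return k(("inr", lambda a: k(("inl", a))))

def consumer(disj):
    """A consumer of A ∨ ¬A: if handed ¬A, it applies it to the witness 42,
    which jumps back into not_not_em and lands in the 'inl' branch."""
    tag, payload = disj
    if tag == "inl":
        raise Done(("got A", payload))
    return payload(42)

try:
    not_not_em(consumer)
except Done as d:
    result = d.value    # ("got A", 42): the saved context was replayed
```

The exception here only simulates the jump back into the saved context; in a typed setting this is the usual CPS reading.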

Going further, we can see that classical laws (especially introduction of double-negation) correspond to an ability of freely saving information while maintaining constructivity and sequentiality. The Curry-Howard isomorphism tells us that the computational content of classical logic lies in continuation-passing style and control operators such as call/cc.

Conclusion: classical logic and non-determinism

To conclude, the right-hand restriction on space forbids infinity in the conclusion but enforces constructivity and sequentiality. We may think that some classical truths are lost in this new intuitionistic framework, but an encoding with double negation allows us to simulate classical proofs in a sequential way, by a mechanism of jump/pointer/saving on the left-hand side which acts as a kind of memory.

Still, classical logic remains the most natural way of reasoning for humans. Why? Intuitionism fits the point of view of sequential machines but pure classicism apparently does not. What makes $\vdash A \lor \lnot A$ provable? It seems to be the ability to retract choices (which is destructive for “sequential machines”). The excluded middle is indeed provable if we choose to prove $\lnot A$, then introduce $A$ in the context and change our mind to prove $A$ instead (as suggested by P. Wadler’s devil):

$\infer[\lor r]{ \infer[\lnot r]{ \infer[ax]{\qquad}{A \vdash A} }{\vdash A, \lnot A} }{\vdash_{LK} A \lor \lnot A}$

But since LK lacks confluence, as shown by the critical pair below [2], it is irrelevant in the computational world.

$\infer[cut]{ \infer[RW]{ \infer{\vdots}{A \vdash B} }{A \vdash C, B} \qquad \infer[LW]{ \infer{\vdots}{D \vdash E} }{D, C \vdash E} }{A, D \vdash B, E}$

reduces to either

$\infer[{RW, RC, LW, LC}^\ast]{A, D \vdash B, E}{A \vdash B}$

or

$\infer[{RW, RC, LW, LC}^\ast]{A, D \vdash B, E}{D \vdash E}$

To study logical phenomena further, we can normalise the rules of LK, thanks to the involutivity of negation, so as to only consider right rules, producing a one-sided (monolateral) version of LK. Sequents are now of the shape $\vdash A_1, …, A_n$, corresponding to a kind of space of concurrent interactions with partial/unfinished proofs we can postpone:

• the cut ${\small\infer[cut]{\vdash \Gamma, A \quad \vdash \Delta, \lnot A}{\vdash \Gamma, \Delta}}$ makes $A$ interact with its dual $\lnot A$.
• general infinity (free duplication/contraction and erasure/weakening) creates non-determinism, leading to complex exponentiation and polarisation in Linear Logic.
• the goal is to connect two dual formulas by the axiom $\vdash \Gamma, A, \lnot A$.

Classicism gives more space to work on proofs and seems aware of its environment, both in pure classicism and in negation translations. Intuitionism provides a “why” explaining the observations/results of classical logic. But why is this explanation correct? The “how” still remains. The answer will be partially provided by Linear Logic, which describes the operational behaviour of Logic.

References

[1] Brouwer, L. E. J. (1913). Intuitionism and Formalism. Bulletin of the American Mathematical Society 20:81-96.

[2] Girard, J. Y., Taylor, P., & Lafont, Y. (1989). Proofs and types (Vol. 7). Cambridge: Cambridge University Press.

[3] Kleene, S. C. (1952). Introduction to Metamathematics. North-Holland.

[4] Dummett, M. (1991). The Logical Basis of Metaphysics. Harvard University Press.