In Dual Intuitionistic Logic and a Variety of Negations: The Logic of Scientific Research, Yaroslav Shramko, inspired by Popper, makes an interesting case that co-constructive logic, as the logic of refutation, is the logic of empirical science. In a 2016 paper Structuring Co-constructive Logic for Proofs and Refutations James Trafford, following on a slightly earlier paper Co-constructive Logics for Proofs and Refutations, shows that intuitionistic and co-intuitionistic logics can be brought together in a way that makes a very nice distinction between hypotheses and proofs or refutations. He also shows that a logic supporting both exponentials and co-exponentials collapses into classical logic.

In Trafford's later paper the notion of co-exponential is clearly defined. I am finding these somewhat difficult to think about, so I want to see if I can put together an example here. If we accept that co-intuitionistic logic is the logic of empiricism, then we should be able to find a real-life example of how it works. Here $L_I$ stands for intuitionistic logic and $L_C$ for co-intuitionistic logic.

[image: Trafford's definitions of the exponential in $L_I$ and the co-exponential in $L_C$]

So to help me understand this, I thought I'd try to see if I can formalise a real life example that happened recently. I went to a shop thinking I could use my debit card to pay, but the card was refused. Was it because I did not have enough money on the account, or because something in the payment system did not work?

  • B: Money on my Account
  • C: Banking system working
  • A: Payment with debit card

We could use the following refutations as proofs:

  • b: B - b may be a refutation of B, that is, a refutation that I have money on my account, which I could get by going to the bank's web site and looking at the balance.
  • a1, a2: A - these are refutations of A, that I can pay with my debit card. a1 would be the event of the payment being refused in the shop, a2 the event of being refused also at the cash point with the same card.

Things then get quite difficult to interpret, because we are working in a dual topos, so everything is upside down. Luckily Trafford does explain co-implication using subobject classifiers.

[image: Trafford's explanation of co-implication via subobject classifiers]

Still, it is tricky. I have made a few attempts but I am not sure I got them right.

So the question is: how can we interpret $B_A \oplus A$? Hopefully with a practical example like the one given, to help form the intuitions more carefully.


I start with an explanation of how co-constructive logic relates dually to constructive logic (as a way of verifying that I understand the concepts, and so you can verify my argument too). Then I work out an example using constructive logic that makes sense. Finally I turn that example around to show the benefits of co-constructive logic.

Sequents in constructive and co-constructive logic

Constructive logic is a logic for formalising constructive proofs. It has its origins in mathematics as a way to move away from thinking of the mathematical world as objective, since that raises the question of how one can come to know those abstract objects. Its aim is to concentrate on the construction of proofs.

The inference relations between formulas in constructive logic form what is known as a Heyting algebra. Starting with atomic formulas one can freely generate complex formulas by combining them with $\land, \lor, \to, \lnot$. The logical rules these come with tell us how we can relate the formulas, allowing us to create a category where $A \vdash B$ is represented by an arrow from $A$ to $B$, or rather between the equivalence classes of formulas containing them. This category can be drawn as a lattice-type structure with $\bot$ at the bottom, $\top$ at the top, and the other formulas in between. $\bot$ is the initial object of the category; it represents incoherence and can be thought of as the empty set. $\top$ is the final object and represents the logical truths. From the incoherence $\bot$ everything else follows vacuously. This is described in detail in chapter 2 of Lectures on the Curry-Howard Isomorphism.

Negation is defined as $\lnot p =_{def} p \to \bot$. If $\lnot p$ is provable then there is an arrow from $p$ to $\bot$; but in the lattice the only object with an arrow into $\bot$ is $\bot$ itself (via the identity $1_{\bot}$), so $p$ must lie in the equivalence class $[\bot]$. Under the Curry-Howard isomorphism a witness of $\lnot p$ is a function of type $p \Rightarrow \bot$; since $\bot$ is the empty type, such a function can exist only if $p$ itself is uninhabited.
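
To make this concrete, here is a minimal Haskell sketch of the Curry-Howard reading of negation, using the standard `Void` type for $\bot$ (my own illustration, not taken from the papers discussed):

```haskell
import Data.Void (Void, absurd)

-- Curry-Howard reading: a witness of ¬p is a function from p into the
-- empty type Void (which plays the role of ⊥ and has no elements)
type Not p = p -> Void

-- holding both a proof of p and a witness of ¬p yields an element of
-- Void, from which anything follows (ex falso quodlibet)
contradiction :: p -> Not p -> a
contradiction proofP notP = absurd (notP proofP)
```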

In constructive logic $\Gamma \vdash \alpha$ means that if all the formulas in $\Gamma$ are accepted then we must accept $\alpha$. If we accept that sequent, yet also refute $\alpha$, then we are obliged to refute one of the premises in $\Gamma$. Co-constructive logic, by contrast, builds refutation arguments directly: starting from a refuted set of premises leads to a refuted conclusion. That refuted conclusion can then be used against the conclusion of a constructive proof, and so to refute one of its premises. Instead of inverting the antecedent and consequent, as is done in Urbas' Dual Intuitionistic Logic, Trafford turns the turnstile around, giving us the $\dashv$ symbol.

So co-constructive logic is a logic of refutation. Imagine you are trying to refute someone: you win if you can prove them wrong, so false is good!

If one combines constructive and co-constructive logic one gets classical Boolean logic. In a short 2015 paper, Co-Constructive Logics for Proofs and Refutations, Trafford starts by working with them separately and later shows how they can be brought together. But the key finding is that these two logics are completely dual.

Where constructive logic is modelled by a Heyting algebra, turning all the arrows around gives us a co-constructive logic modelled by a co-Heyting algebra, with $\top$ modelling the formulas that can never be refuted and $\bot$ modelling those that have been refuted, and with the arrows going from the initial object $\top$ to the final object $\bot$.
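
As a sanity check on the duality, the two algebras can be characterised by dual adjunctions (a standard fact about Heyting and co-Heyting algebras, stated here in the notation used below, where $\beta \leftarrow \alpha$ is the subtraction or co-implication):

$ c \le \alpha \to \beta \iff c \land \alpha \le \beta \qquad\qquad \beta \leftarrow \alpha \le c \iff \beta \le \alpha \lor c $

Implication is the right adjoint of meet, while co-implication is the left adjoint of join; turning the order around swaps one for the other.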

The duality is very deep. The article The Evil Twin: The Basics of Complement-Toposes, cited by Trafford, argues that every topos gives rise to both structures.

Some example rules

A full list of rules for the system LDJ can be found in Dual Intuitionistic Logic. Note that there the antecedent and the consequent are swapped: where we write $\Delta \dashv \alpha$, Urbas writes $\alpha \vdash \Delta$.

If a set of assumed premises has $\bot$ as a consequence (the $\bot$ sign is often dropped), then anything follows.

$ \dfrac{\Delta \vdash \bot } {\Delta \vdash \alpha} \textsf{(Weak-R)} $

Similarly, if from the assumption that $\Delta$ is refuted we can refute the statement that can never be refuted ($\top$), then we can refute any other statement on the basis of $\Delta$ (remember that $\top$ is the initial object in a co-Heyting algebra).

$ \dfrac{\Delta \dashv \top } {\Delta \dashv \alpha} \textsf{(Weak-R)} $

Similarly the following rule shows that one must interpret the left side disjunctively.

$ \dfrac{\Delta \dashv \beta} {\Delta, \alpha \dashv \beta} \textsf{(Weak-L)} $

For if from a hypothetically refuted $\Delta$ one can refute $\beta$, then adding an arbitrary $\alpha$ to $\Delta$ will not affect the refutation: the addition is harmless. This also explains why the rule is called a weakening.

Following the Curry-Howard isomorphism, we would like formulas to be understood as types. In constructive logic the witnesses of a formula are the elements of the type, which can be thought of as proofs. In the co-constructive world, on the other hand, witnesses are refutations of the type. One can think of them as being part of the shadow of the type, or its anti-extension, which is the term used by Estrada-González in From (paraconsistent) topos logic to universal (topos) logic.

Next, consider the following fragment of the co-intuitionistic sequent calculus:

$ \dfrac{\Delta \dashv \alpha} {\Delta \dashv \alpha \land \beta} \textsf{($\land$ R)} $

From a hypothesis $\Delta$ containing a refutation of your being in your house I can refute that you are in your room ($\Delta \dashv \alpha$). It follows that from a refutation that you are in your house I can also refute that you are in your room and making a phone call ($\alpha \land \beta$).
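
In the witness terms of the previous section, this rule says that a refutation of one conjunct already refutes the conjunction. A hedged Haskell sketch, where `Ref a` is a placeholder refutation type of my own (not notation from Trafford):

```haskell
-- placeholder: an abstract refutation witness, here just carrying a reason
newtype Ref a = Ref String

-- a conjunction (modelled by a pair type) is refuted as soon as one
-- conjunct is refuted, dually to proving a disjunction from one disjunct
refuteAnd :: Ref a -> Ref (a, b)
refuteAnd (Ref reason) = Ref reason
```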

Co-implication explained

$ \dfrac{\Delta, \alpha \dashv \beta} {\Delta \dashv \beta \leftarrow \alpha} \textsf{($\leftarrow$ R)} $

If we start with the hypothesis that $\Delta$ and $\alpha$ are refuted, then we can refute $\beta$. If we move $\alpha$ to the right, then we are told that from the hypothesis of $\Delta$ being refuted we can refute $\beta \leftarrow \alpha$. But what kind of thing is $\beta \leftarrow \alpha$?

Perhaps the easiest is to start with an empty $\Delta$, giving us the special case

$ \dfrac{\alpha \dashv \beta} {\dashv \beta \leftarrow \alpha} \textsf{($\leftarrow$ R minimal)} $

Let us start with the hypothesis that from a refutation that you are in your house ($\alpha$) I must refute that you are in your room ($\beta$). From this I can conclude that $\dashv \beta \leftarrow \alpha$, which means that $\beta \leftarrow \alpha$ can be refuted without any hypothesis at all. One interpretation is that the possibilities for $\beta \leftarrow \alpha$ are the possibilities for $\beta$ minus those for $\alpha$. In our case that would be the possibilities of your being in your room minus the possibilities of your being in your house. But anything that refutes your being in your house refutes your being in your room too, so every possibility of being in your room is already a possibility of being in your house, and there are no remaining possibilities. This makes $\beta \leftarrow \alpha$ a counter-theorem of co-intuitionistic refutation logic: it can always be refuted.
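
Using the subtraction adjunction written out earlier, the same conclusion can be checked algebraically:

$ \beta \leftarrow \alpha \le \bot \iff \beta \le \alpha \lor \bot \iff \beta \le \alpha $

and $\beta \le \alpha$ holds here because being in your room entails being in your house.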

If we try with only two formulas in the antecedent, such as:

$ \dfrac{\gamma, \alpha \dashv \beta} {\alpha \dashv \beta \leftarrow \gamma} \textsf{($\leftarrow$ R minimal-2)} $

$\gamma$ is the statement that you are talking to your grandmother, $\alpha$ that you are in your house and $\beta$ that you are in your room. What is clear is that if I accept the top line of the rule, then I must accept the bottom one. And the bottom one no longer has $\gamma$ to the left of $\dashv$ on which I can rest a denial. So the result is that from denying $\alpha$ I must deny $\beta$, but without the potential support of $\gamma$. That is, from denying that you are in your house I can refute that you are in your room, but without refuting that you are talking to your grandmother.

Trafford argues on p. 159 of Meaning in Dialogue that $\beta \leftarrow \gamma$ can also be thought of as a function. What kind of function is it?

c is a falsification of $B^- \leftarrow A^-$ iff c is a function that converts each refutation $\alpha^-$ of $A^-$ into a refutation $c(\alpha^-)$ of $B^-$.

where the superscript $^-$ indicates that the propositions are to be read in a negative sense.
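
Read as a type, the quoted clause becomes a function type. A small Haskell sketch in the same placeholder style as before (the types `Home` and `Room` are my own stand-ins for the propositions):

```haskell
-- placeholder refutation witnesses
newtype Ref a = Ref String

-- assumed propositions-as-types for the running example
data Home  -- you are in your house
data Room  -- you are in your room

-- a falsification of (B <- A) converts each refutation of A into a
-- refutation of B
type Falsification b a = Ref a -> Ref b

-- the house/room instance: whatever refutes your being in the house
-- also refutes your being in your room
notInRoom :: Falsification Room Home
notInRoom (Ref why) = Ref ("not in the house: " ++ why)
```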

It is very tempting to think of this function as needing a proof of $\alpha$ in order to return a refutation of $\beta$. But we can't assume we have proof objects here, and that reading seems to lose the symmetry with functions. Indeed this point is brought out very nicely by Kapsner in his 2014 book Logics and Falsifications through this illustration:

Let us suppose that I, in an attempt to show off my fortune telling abilities, say to you on new year’s eve: “If a black cat crosses your path tomorrow, you will not catch a cold all through the year.” You are not quite sure what to make of this prediction, but resolve to watch out for black cats the next day. However, as the day unfolds the whole business slips from your mind. Two weeks later you come down with a bad cold. This is the point where you remember what I had said. You cannot remember whether you saw a black cat or not, and there is surely no one else who followed you around and paid attention to this matter. It seems this is enough to preclude future falsification (or verification, for that matter) of the antecedent. The consequent, on the other hand, is clearly falsified. That is, our current state of affairs is in the set $f_{NoCold} \cup f_{BlackCat}^{\bot}$, and thus in $f_{BlackCat \to NoCold}$. But would we really say that you have falsified my assertion? Would it be right of you to call me and demand that I withdraw it?

  • You said I wouldn’t get sick if a black cat crossed my way on the first of January, and now I have a fever!
  • I’m sorry and surprised to hear it; did you see a black cat, then?
  • I don’t know, but I’m sure you can’t prove that I didn’t!

It is clear that this is not a convincing way to get me to take back what I said. Even if you replied instead

I surely did!

knowing full well that I cannot falsify your claim, I could simply reply

Oh, but your cold shows that you couldn’t have seen one. Probably you’ve mistaken a small dog for a cat.

As we assume that we are uttering our claims under the falsificationistic norm, both of our assertions will stand. What is missing to force me to retract my conditional is clearly the verification of the antecedent.

Getting that verification requires fetching information from the other side of the discussion, as Trafford argues.

Building on Category Theory

I start with an example of a constructive proof in order to then turn it around.

Starting from a constructive proof

Since constructive logic is dual to refutation logic, and since as a programmer I am more used to constructive logic, I thought it would be better to start from there and then turn things around. That also helps test whether the example propositions are well chosen. The following seems to work better:

  • B: Payment with debit card
  • C: Banking system working
  • A: Money on my Account

[diagram of the exponential $B^A$]

Start with a function $f: C \times A \to B$ that takes a proof that the banking system works and a proof that I have money in my account, and returns a proof that I can pay with my debit card. The diagram then tells us that from $f$ we can also get a function $g: C \to B^A$: a function that takes a proof that the banking system is working and returns a function which, given a proof that there is money in my account, returns a proof that I can pay with my debit card.
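
In Haskell this is just currying. A minimal sketch, with hypothetical proof types of my own standing in for $C$, $A$ and $B$:

```haskell
-- hypothetical proof objects for the three propositions
data BankOK  = BankOK   -- C: the banking system is working
data Funds   = Funds    -- A: there is money on my account
data Payment = Payment  -- B: the debit card payment goes through

-- f : C × A -> B
f :: (BankOK, Funds) -> Payment
f (BankOK, Funds) = Payment

-- g : C -> B^A, obtained from f by currying
g :: BankOK -> (Funds -> Payment)
g = curry f
```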

Now clearly, proving that the banking system is working is a very tall order. It would require a huge number of empirical observations to establish. This opens the door to the dual, refutation-based reasoning.

Dual example in co-constructive logic

A co-exponential of objects $A$ and $B$, denoted $B_A$, is defined by an arrow $\ni: B \to B_A \oplus A$ which satisfies the following universal property: for any object $C$ and any arrow $f: B \to C \oplus A$, there is a unique arrow $h: B_A \to C$ as in the following diagram

[co-exponential diagram]

That is, the co-exponential object captures a universal property and can be built from $\ni$ using only information from $B$. So, in terms of types as possibilities, it makes sense to see $B_A$ as all the possibilities of $B$ less those of $A$, thought of as refutation types. $\ni$ can then be seen to take a $B$ and return either one of the possibilities of $B$ less $A$, or one of those of $A$.
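
Since the diagram is not reproduced here, it may help to spell out the universal property it encodes, written in diagrammatic composition order as used further below:

$ \text{for every } f : B \to C \oplus A \text{ there is a unique } h : B_A \to C \text{ such that } f = \;\ni\,; (h \oplus id_A) $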

Here we can clearly see that a function $f: B \to C \oplus A$ tells us that from a failure of a payment we can get either a proof that the banking system is not working or a proof that the account is empty. That seems right. Note how we need to read all the propositions negatively. What the rest of the diagram tells us is that $f$ can be decomposed into two further arrows, $\ni$ followed by $h \oplus id_A$.

To get this right we need to understand what $B_A$ is. $B_A$ interprets the co-implication $B \leftarrow A$: an instance of it is a refutation of B that does not depend on a refutation of A, that is, a debit card transaction failure that occurs even though there is money on the account. So

$\ni: B \to B_A \oplus A $

tells us that from an unsuccessful payment we can deduce either

  • $A$: a refutation of money being on the account,
  • or $B_A$: a payment failure with money still on the account, or rather a payment failure that is established not to involve the money on the account.

Now Trafford wants to go further and apply the Curry-Howard isomorphism here. He argues that by symmetry with constructive logic we should see an instance $f_{b \leftarrow a}: B_A$ as a function that takes a falsification of A to a falsification of B. In this case that would be a function that takes a falsification of money being on the account to a falsification of the debit card working. Can we produce such a function? Sure: that's what banks do when they cut off a debit card. So what $\ni: B \to B_A \oplus A$ is saying is that from a proof of an unsuccessful debit card transaction one can get either a proof that the account is empty or (exclusive or) a function $f_{b \leftarrow a}: B_A$ as just described.

But here we just present such an $f_{b \leftarrow a}$ to the morphism $h: B_A \to C$ to reach the conclusion that the banking system is broken. We don't apply the non-existent refutation of A to it, so the co-implication $f_{b \leftarrow a}$ acts more like a proof than something to which something is applied. And indeed we are using it here for what we may wish to call a co-currying operation, rather than an application operation.

Here it is worth remembering that the arrow $h: B_A \to C$ is unique. Since $id_A$ is also unique, the combination $g = h \oplus id_A$ is also unique (given $f$), and $f =\; \ni; g$. Category theory tells us that we need a function $f_{b \leftarrow a}$ in order to get a result from our $g$ in all circumstances. Oddly enough, we can only use it when it does not apply (e.g. in our use case we do not have a proof that the bank account is empty).
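
Here is a hedged Haskell sketch of that factorisation for the debit card example. Haskell's category is cartesian closed rather than co-cartesian co-closed, so this is only an illustration of the shape of $f =\; \ni; (h \oplus id_A)$, with hypothetical refutation types of my own choosing:

```haskell
-- hypothetical refutation witnesses, read negatively
data PaymentFailed      = PaymentFailed       -- refutes B: payment with debit card
data AccountEmpty       = AccountEmpty        -- refutes A: money on my account
data BankBroken         = BankBroken          -- refutes C: banking system working
data FailedDespiteFunds = FailedDespiteFunds  -- plays the role of B_A

-- ∋ : B -> B_A ⊕ A : split a payment failure into a failure that does not
-- involve the account, or a refutation of there being money on the account
split :: PaymentFailed -> Either FailedDespiteFunds AccountEmpty
split PaymentFailed = Left FailedDespiteFunds  -- assumption: the balance checked out

-- the unique h : B_A -> C : a failure with funds in place points at the bank
h :: FailedDespiteFunds -> BankBroken
h FailedDespiteFunds = BankBroken

-- any f : B -> C ⊕ A then factors as f = (h ⊕ id_A) . ∋
f :: PaymentFailed -> Either BankBroken AccountEmpty
f = either (Left . h) Right . split
```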

To summarize: the advantage of the co-intuitionistic way of reasoning is that it is much easier to get a refutation of a payment, together with a proof that an empty bank account will lead to a non-working debit card (by looking, for example, at the contract for the card), than it is to check that the whole banking system is working.