The simplest logic

Here is a very simple logic. It has two ingredients: propositional variables and the Sheffer stroke (which, following Peirce, we could call the ampheck). The logic uses Polish (prefix) notation to express formulae, which is just to say that the stroke precedes its two arguments and no parentheses are needed (although a formula doesn't have a unique written form up to meaning, because | commutes semantically).

The syntax looks like this: any propositional variable is well-formed, and if p and q are well-formed then |pq is well-formed. We can quickly confirm that every well-formed formula with n propositional variable occurrences has exactly n-1 strokes.

The valuation function is likewise simple: V(|pq)=0 iff V(p)=1 and V(q)=1, and V(|pq)=1 otherwise.
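Since the whole semantics fits in one clause, it is easy to sketch an evaluator. Here is a minimal one in Python; the helper names `evaluate` and `value` are my own, just for illustration.

```python
# A minimal evaluator for Polish-notation stroke formulae.

def evaluate(formula, valuation):
    """Consume one well-formed formula from the front of the string;
    return (truth value, unconsumed remainder)."""
    head, rest = formula[0], formula[1:]
    if head == '|':
        p, rest = evaluate(rest, valuation)
        q, rest = evaluate(rest, valuation)
        # V(|pq) = 0 iff V(p) = 1 and V(q) = 1
        return (0 if p == 1 and q == 1 else 1), rest
    return valuation[head], rest

def value(formula, valuation):
    v, rest = evaluate(formula, valuation)
    assert rest == '', 'leftover symbols: not well-formed'
    return v

print(value('|pq', {'p': 1, 'q': 1}))  # 0
print(value('|pq', {'p': 1, 'q': 0}))  # 1
```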

Finally, what are some interesting deductive systems for this logic? A simple modus ponens-style inference rule is the following:

|pq,p ⇒ |qq

Nicod gives the following as his single NAND rule of inference.[^1]

||rqp,p ⇒ r

We could use either of these to construct a Hilbert-style axiom system.
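Both rules can be spot-checked for soundness by brute force over valuations: whenever every premise comes out true, the conclusion should too. A sketch, with helper names of my own choosing:

```python
from itertools import product

def value(formula, valuation):
    # Evaluate a Polish-notation stroke formula under a {variable: 0/1} map.
    def ev(s):
        head, rest = s[0], s[1:]
        if head == '|':
            a, rest = ev(rest)
            b, rest = ev(rest)
            return (0 if a == 1 and b == 1 else 1), rest
        return valuation[head], rest
    v, rest = ev(formula)
    assert rest == ''
    return v

def sound(premises, conclusion, variables='pqr'):
    # A rule is sound if every valuation that makes all the premises
    # true also makes the conclusion true.
    for bits in product([0, 1], repeat=len(variables)):
        valuation = dict(zip(variables, bits))
        if all(value(f, valuation) == 1 for f in premises):
            if value(conclusion, valuation) != 1:
                return False
    return True

print(sound(['|pq', 'p'], '|qq'))  # True
print(sound(['||rqp', 'p'], 'r'))  # True
```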


Pretty quickly we might get tired of this logic if we're used to thinking in the less sparse language of classical logic. How do the classical connectives translate into the stroke?

  • ~p = |pp
  • p∧q = ||pq|pq
  • p∨q = ||pp|qq
  • p→q = ||qqp
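These equivalences can be verified mechanically by comparing truth tables. A small script (the evaluator and all names are mine, not anything standard):

```python
from itertools import product

def value(formula, valuation):
    # Evaluator for Polish-notation stroke formulae.
    def ev(s):
        head, rest = s[0], s[1:]
        if head == '|':
            a, rest = ev(rest)
            b, rest = ev(rest)
            return (0 if a == 1 and b == 1 else 1), rest
        return valuation[head], rest
    v, rest = ev(formula)
    assert rest == ''
    return v

# Each stroke formula paired with the classical connective it should match.
translations = {
    '|pp':     lambda p, q: 1 - p,          # ~p
    '||pq|pq': lambda p, q: p & q,          # p AND q
    '||pp|qq': lambda p, q: p | q,          # p OR q
    '||qqp':   lambda p, q: (1 - p) | q,    # p IMPLIES q
}
for formula, connective in translations.items():
    for p, q in product([0, 1], repeat=2):
        assert value(formula, {'p': p, 'q': q}) == connective(p, q)
print('all four translations agree with the classical truth tables')
```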

Some Tautologies

  • ||ppp
  • |||pr|pr|pr
  • ||||ppq||ppqp
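A brute-force check confirms tautologies like ||ppp and ||||ppq||ppqp: evaluate under every valuation of the sentence letters and confirm the value is always 1. A sketch (helper names are mine):

```python
from itertools import product

def value(formula, valuation):
    # Evaluator for Polish-notation stroke formulae.
    def ev(s):
        head, rest = s[0], s[1:]
        if head == '|':
            a, rest = ev(rest)
            b, rest = ev(rest)
            return (0 if a == 1 and b == 1 else 1), rest
        return valuation[head], rest
    v, rest = ev(formula)
    assert rest == ''
    return v

def is_tautology(formula):
    # True iff the formula evaluates to 1 under every valuation of its letters.
    letters = sorted(set(formula) - {'|'})
    return all(value(formula, dict(zip(letters, bits))) == 1
               for bits in product([0, 1], repeat=len(letters)))

print(is_tautology('||ppp'))          # True
print(is_tautology('||||ppq||ppqp'))  # True
print(is_tautology('|pq'))            # False
```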

I suspect that there are some interesting relationships between the length of an axiom in this system and the number and arrangement of its distinct propositional variables. I mess around with this when I'm doodling, but I don't have much else to say about it.

Some speculation:

  • All tautologies have at least one sentence letter that occurs an even number of times. Maybe an odd number of such sentence letters? Maybe exactly one?
  • No tautology exists for ranks |, |||, |||| (i.e., 1, 3, and 4 strokes), but one occurs for every other rank.
  • There is an arrangement where the number of | is weakly decreasing and the number of p is weakly increasing.
  • Probably other stuff too!

[^1]: J. G. Nicod, "A Reduction in the Number of the Primitive Propositions of Logic", Proceedings of the Cambridge Philosophical Society 19 (1917), 32-41.

Notes on Trivalent Logic

I've been reading a lot about trivalent logics to help with understanding Peter Vranas' work on a three-valued imperative logic. One thing in particular that I'm puzzled by is how we should understand negation in a trivalent logic. In this post I'm going to write some thoughts on this problem.

Consider that only six of the 27 possible unary truth functions are non-degenerate. By non-degenerate I mean that they do not collapse distinct inputs to the same output; each is a permutation of the three truth values. These are as follows:

INPUT  IDENTITY  1.1  1.2  1.3  2.1  2.2
  -       -       -    0    1    0    1
  0       0       1    -    0    1    -
  1       1       0    1    -    -    0
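The count is easy to confirm by enumeration. A quick sketch: represent each unary function as its triple of outputs for the inputs (-, 0, 1), and keep only the permutations.

```python
from itertools import product

values = ('-', '0', '1')

# Every unary truth function is a triple of outputs for the inputs (-, 0, 1).
all_unary = list(product(values, repeat=3))

# Non-degenerate functions are the permutations: each truth value
# appears exactly once among the outputs.
nondegenerate = [f for f in all_unary if set(f) == set(values)]

print(len(all_unary))      # 27
print(len(nondegenerate))  # 6
```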

It's a little easier to tell what's going on here when you graph the relationship between the truth values. Graphically, the idea is that if each truth function is a relation on the 3 truth values, then these six are the only relations where each truth value has exactly one incoming edge and one outgoing edge. In the case of identity and 1.1-1.3, some of these are the same edge.

Kleene, Priest, and others interpret 1.1 as negation (depending on how we interpret the third truth value), but I'm not sure this is right. For one thing, this interpretation is what lets us preserve double negation elimination as a theorem/rule, and with it De Morgan's laws, etc. I think for a genuinely trivalent logic we should interpret either 2.1 or 2.2 as negation.
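For what it's worth, reading 1.1 as negation does validate De Morgan's laws if conjunction and disjunction are taken as min and max under the ordering 0 < - < 1, the usual strong Kleene reading; that ordering is my assumption here, though it matches the conjunction table described below.

```python
# Strong Kleene conjunction/disjunction as min/max under 0 < - < 1.
order = {'0': 0, '-': 1, '1': 2}

def conj(a, b):
    return min(a, b, key=order.get)

def disj(a, b):
    return max(a, b, key=order.get)

neg = {'0': '1', '-': '-', '1': '0'}  # the unary function 1.1

# Because 1.1 is an involution (double negation cancels), De Morgan's
# laws hold for every pair of inputs.
for a in '0-1':
    for b in '0-1':
        assert neg[conj(a, b)] == disj(neg[a], neg[b])
        assert neg[disj(a, b)] == conj(neg[a], neg[b])
print('De Morgan holds with 1.1 as negation')
```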

We can start to see the effects of this interpretation by looking at the binary truth functions. Since there are 19,683 of them (3^9) in a trivalent logic, we're going to have to rely on symmetries to help make sense of things.

1 2 3 4 5 6
000000 000000 ------ 111111 111111 ------
0----0 011110 -1111- 1----1 100001 -0000-
0-11-0 01--10 -1001- 1-00-1 -1001- 01--10
0-11-0 01--10 -1001- 1-00-1 -1001- 01--10
0----0 011110 -1111- 1----1 100001 -0000-
000000 000000 ------ 111111 111111 ------

This is a collection of the characteristic truth tables for 24 binary truth functions in trivalent logic, divided into six groups of four, based on symmetries between them. Starting with the leftmost set of four (labeled '1'), the top left 3x3 square of truth values is the characteristic truth table for conjunction. Moving counterclockwise in that set of four, we see the characteristic truth tables for conjunction, nonimplication, the Sheffer stroke (NAND), and converse nonimplication. In the fourth collection we have Peirce's arrow (NOR), converse implication, disjunction, and implication.

Likewise, the relationship between the unary function 1.1 and the binary functions in the first and fourth collections is as we would expect: negating one or the other input rotates us around the collection, and a negation in front of the function moves us to the other collection (and, importantly, back again).

What about the other collections of binary truth functions? These are truth functions that we can produce by applying the unary functions in 2.1 and 2.2. These, in some sense, rotate us through the pair of characteristic truth tables in (5 and 6) and (2 and 3), respectively.

OK, so what does all this show us? Honestly, I'm not sure. But I think that these truth tables show that there is a kind of symmetry that would give us two unary truth functions that capture many, and perhaps all, of the properties of negation that we care about. But instead of negation, in a trivalent logic we have left-handed negation and right-handed negation, or something like that.

These "handed" negation functions would have some interesting properties. First of all, they would cancel each other out:

For any proposition P, LRP = RLP = P.

Also, they would obey a triple negation elimination rule:

For any proposition P, LLLP = RRRP = P.
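Taking 2.1 as L and 2.2 as R (which of the two cycles counts as "left" is my arbitrary labeling), both identities check out directly:

```python
# The two cycling unary functions from the table above, as lookup tables.
L = {'-': '0', '0': '1', '1': '-'}  # 2.1
R = {'-': '1', '0': '-', '1': '0'}  # 2.2

for v in '-01':
    assert L[R[v]] == v and R[L[v]] == v        # LRP = RLP = P
    assert L[L[L[v]]] == v and R[R[R[v]]] == v  # LLLP = RRRP = P
print('cancellation and triple elimination both hold')
```

The two facts are really one: L and R are the two 3-cycles on the truth values, so each is the other's inverse, and applying either one three times returns every value to where it started.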

This would imply a modification to De Morgan's laws for operator duality. Duals would no longer be defined in the typical way; instead, each operator would have two intermediate stages (not represented in the truth tables above) that it would have to pass through on the way to its classical dual.

I'm going to have to spend some more time on figuring out what these intermediate steps are, but it seems plausible to me that this is a more thoroughgoing trivalent logic than the traditional interpretation.

Teaching Question

Here are some questions I have about teaching formal systems. Suppose I'm teaching an introductory logic course and as part of that course I'm teaching a unit on propositional logic. There is a standard notation for the logical operators, representing premises and conclusions, etc.

1) When (if ever) is it appropriate to change the standard notation? Suppose I think that representing something in a non-standard way is more intuitive, simpler, more easily graspable than when the very same thing is represented using the standard notation. Which notation should I teach? Do I teach the standard notation because students are likely to encounter it in lots of places, or do I teach the non-standard notation because I think it's better?

2) Similarly, when (if ever) is it appropriate to introduce wholly new notational systems in the classroom? Is it ever appropriate to do so for pedagogical reasons, or should we make modifications or additions in research and only teach them once they become part of the established body of work that constitutes the field of study (by being published in at least one place)? Or not even then, but only once the non-standard notation starts to become assimilated into the standards of the field (once multiple people have published using the previously non-standard notation)?

3) Does it make a difference if the subject is classical logic, or something in a field with less agreement? It seems like the prevalence of classical logic would weigh against using a non-standard notation. And even when there are persistent differences in notation, they are usually mentioned and then ignored. Many instructors will at least mention that the horseshoe and the right arrow are used interchangeably to represent the material conditional, for instance.

It seems to me that there could be some value in exploring alternative notational systems, especially in places where the notation could use updating.