The simplest logic

Here is a very simple logic. It has two ingredients: propositional variables and the Sheffer stroke (which, following Peirce, we could call the ampheck). The logic uses Polish (prefix) notation to express formulae, which is just to say that the syntax fixes the order of symbols in a sentence (although the ordering isn't unique, because | commutes).

The syntax looks like this: any propositional variable is well-formed, and if p and q are well-formed then |pq is well-formed. We can quickly confirm by induction that every well-formed formula with n propositional variable occurrences has exactly n-1 strokes: a bare variable has one occurrence and no strokes, and forming |pq adds the counts of both parts plus one new stroke.

The valuation function is likewise simple: V(|pq)=0 iff V(p)=1 and V(q)=1, and V(|pq)=1 otherwise.
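
To make the syntax and semantics concrete, here's a minimal Python sketch of a parser and valuation function for this notation (the function names are my own, and single-character variable names are assumed):

```python
def parse(s):
    """Parse a Polish-notation formula; return (tree, remaining input)."""
    head, rest = s[0], s[1:]
    if head == "|":
        left, rest = parse(rest)
        right, rest = parse(rest)
        return ("|", left, right), rest
    return head, rest  # a propositional variable

def valuation(tree, v):
    """V(|pq) = 0 iff V(p) = 1 and V(q) = 1; variables look up their value in v."""
    if isinstance(tree, tuple):
        _, left, right = tree
        return 0 if valuation(left, v) == 1 and valuation(right, v) == 1 else 1
    return v[tree]

tree, leftover = parse("|p|qq")
print(valuation(tree, {"p": 1, "q": 1}))  # 1: |qq evaluates to 0, and p NAND 0 is 1
```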

Finally, what are some interesting deductive systems for this logic? A simple modus ponens-style inference rule is the following (it's modus ponens if you read |pq as p→¬q):

|pq,p ⇒ |qq

Nicod gives the following as his single NAND rule of inference.[^1]

||rqp,p ⇒ r

We could use either of these to construct a Hilbert-style axiom system.


Pretty quickly we might get tired of this logic if we're used to thinking in the less sparse language of classical logic. How do the connectives of classical logic translate into this one?

  • ~p = |pp
  • p∧q = ||pq|pq
  • p∨q = ||pp|qq
  • p→q = ||qqp
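
These translations can be checked mechanically by comparing truth tables. A quick Python sketch (`nand` is my helper function, not part of the logic itself):

```python
from itertools import product

def nand(a, b):
    """The Sheffer stroke on 0/1 truth values."""
    return 0 if a == 1 and b == 1 else 1

# Pairs of (classical connective, its Sheffer-stroke translation) as 0/1 functions.
claims = [
    (lambda p, q: 1 - p,       lambda p, q: nand(p, p)),                    # ~p  = |pp
    (lambda p, q: p & q,       lambda p, q: nand(nand(p, q), nand(p, q))),  # p∧q = ||pq|pq
    (lambda p, q: p | q,       lambda p, q: nand(nand(p, p), nand(q, q))),  # p∨q = ||pp|qq
    (lambda p, q: (1 - p) | q, lambda p, q: nand(nand(q, q), p)),           # p→q = ||qqp
]

for classical, sheffer in claims:
    assert all(classical(p, q) == sheffer(p, q) for p, q in product((0, 1), repeat=2))
print("all four equivalences hold")
```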

Some Tautologies

  • ||ppp
  • |||pr|pr|pr
  • ||||ppq||ppqp
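
Claims like these can be checked by brute force over all assignments. A minimal Python sketch (evaluating the prefix string right-to-left with a stack is a standard trick; `is_tautology` is my own name):

```python
from itertools import product

def is_tautology(formula):
    """Brute-force check of a Polish-notation NAND formula."""
    variables = sorted(set(formula) - {"|"})
    for values in product((0, 1), repeat=len(variables)):
        env = dict(zip(variables, values))
        stack = []
        for token in reversed(formula):  # prefix notation evaluates cleanly right-to-left
            if token == "|":
                a, b = stack.pop(), stack.pop()
                stack.append(0 if a == 1 and b == 1 else 1)
            else:
                stack.append(env[token])
        if stack[0] != 1:
            return False
    return True

print(is_tautology("||ppp"))          # True
print(is_tautology("||||ppq||ppqp"))  # True
print(is_tautology("|pq"))            # False
```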

I suspect that there are some interesting relationships between the length of an axiom in this system and the number and arrangement of unique propositional variables. I mess around with this when I'm doodling, but I don't have much else to say about it.

Some speculation:

  • All tautologies have at least one sentence letter that occurs an even number of times. Maybe an odd number of such sentence letters? Maybe exactly one?
  • No tautology exists for ranks |,|||,|||| but one occurs for every other rank.
  • There is an arrangement where the number of | is weakly decreasing and the number of p is weakly increasing.
  • Probably other stuff too!

[^1]: Jean Nicod, "A Reduction in the Number of the Primitive Propositions of Logic", Proceedings of the Cambridge Philosophical Society 19 (1917), 32-41.

research · Adam Edwards · logic
May Sinclair

Mary Amelia St. Clair (1863-1946), possibly better known by her pseudonym May Sinclair, was a philosopher. She was also a novelist, poet, and suffragist at the turn of the twentieth century. She is responsible for coining the term "stream of consciousness" as it applies to a style of novel, in her review of Dorothy Richardson's Pilgrimage.

I know about her work because I picked up a copy of her book A Defense of Idealism (1917) in a used bookstore in Atlanta a few years ago. She wrote a follow-up to that book five years later called The New Idealism (1922).

She begins her Defense as follows:

There is a certain embarrassment in coming forward with an Apology for Idealistic Monism at the present moment. You cannot be quite sure whether you are putting in an appearance too late or much too early.

It does look like personal misfortune or perversity that, when there are lots of other philosophies to choose from, you should happen to hit on the one that has just had a tremendous innings, and is now in process of being bowled out. As long ago as the early 'nineties Idealism was supposed to be dead and haunting Oxford.

So she gets it. The new Hegelians in Britain had just had their whole deal obliterated by the likes of Bertrand Russell and G. E. Moore, so it's odd to see someone come out in defense of the view. This is one thing that makes Sinclair such an interesting figure in the history of philosophy.

I'm troubled by the fact that Sinclair's philosophical work appears to have generated essentially no secondary literature. Like, none. A search on PhilPapers brings up her books, a few articles, and some contemporary reviews of her books that strike me as... uncharitable. In any case, I think these books deserve attention. File this one away in After-The-Dissertation-Is-Done.

I want to close with this quote from one of Sinclair's feminist pamphlets:

We are dealing less with a psychological portent than with a new sociological factor, the SOLIDARITY OF WOMAN. And there is only one other factor that can be compared with it for importance, and that is the SOLIDARITY OF THE WORKING-MAN.

And these two solidarities are one.

Women's rights = workers' rights.

[EDIT]: Here is a good source on Sinclair's philosophical development by Dr. Charlotte Jones, who works on Sinclair's fiction. Philosophers should be engaging with this work!

The Problem of Induction

(This post is inspired by a tweet by Nathan Oseroff in which he says that we should "retire" the word induction because the word has come to mean so many different things in so many different contexts that the so-called problem of induction is just a useless collection of vaguely related concepts. This is my attempt to give a clear definition of induction and the problem everyone has with it. Yes, I'm aware of this comic.)

Deductive inference is a process by which we come to know that some proposition, Q, is true on the basis of other propositions P_1,...,P_n. The hallmark feature of deductive inference is that if it's employed in the "right" way (e.g. in a valid inference) then we can be absolutely certain that Q is true, given P_1,...,P_n.

Inductive inference (or "induction" for short) is a process by which we come to know that some proposition, Q, is true on the basis of other propositions P_1,...,P_n. In that way, deduction and induction are identical. However, unlike deductive inference, inductive inference never guarantees the truth of Q.

That's where we get the infamous "problem" of induction (which isn't actually a problem but a feature of inductive inference). The problem of induction is a challenge to tell me under what conditions, if any, I can know that Q on the basis of some propositions P_1,...,P_n. We take ourselves to know how this works for valid inferences. What about invalid inferences? Trying to answer that question is what generates the "problem" of induction.

The problem of induction is just the problem of evidential support for invalid inferences. The problem of evidential support asks why some propositions stand in a relation of evidential support to others, and whether we can tell when this happens and when it doesn't. We generally take ourselves to have a good answer to this question for deductive inference (most people think the evidential support relation reduces to the relation of validity), so the "problem" with induction is that we don't have a good answer for inductive inference.

As I understand it, there are only two plausible answers to the question of when we can know Q on the basis of P_1,...,P_n (in the case of induction): never and sometimes. (I'm ignoring people who say "always" because I don't think any such people exist.)

People who say "never" are skeptics about induction. People who say "sometimes" are statisticians.

The further, and I think really interesting, question here is whether an "all-purpose" answer to this question exists. Is there a unity of "confirmation" (like Carnap thought) such that deduction and induction all fall under the same broad process? If not, is there at least a good, subject-neutral answer to the question for induction (separate from the one for deduction) that we can apply in cases where deduction won't help us?

Partitions and Graphs

A partition of a set is a way of dividing that set into non-empty, non-overlapping subsets that together exhaust the set. For example, the partitions of the set {A,B,C} are as follows:

  • A|B|C
  • AB|C
  • AC|B
  • BC|A
  • ABC
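
The recursion behind this list is simple: a partition of {A,B,C} either puts A into some block of a partition of {B,C}, or gives A a block of its own. A Python sketch (function name mine):

```python
def partitions(items):
    """Yield every set partition of a list of distinct items."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for smaller in partitions(rest):
        # Add `first` to each existing block in turn...
        for i, block in enumerate(smaller):
            yield smaller[:i] + [[first] + block] + smaller[i + 1:]
        # ...or give `first` a block of its own.
        yield [[first]] + smaller

all_partitions = list(partitions(["A", "B", "C"]))
for p in all_partitions:
    print("|".join("".join(block) for block in p))
print(len(all_partitions))  # 5
```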

The Bell numbers (OEIS/A000110) describe the number of unique partitions for a set of n elements. The sequence grows pretty fast:

1, 1, 2, 5, 15, 52, 203, 877, 4140, 21147, 115975, 678570, 4213597, 27644437, 190899322, 1382958545, ...
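
One convenient way to generate the sequence is the Bell triangle: each row starts with the last entry of the previous row, and each later entry is the previous entry plus the entry above it. A Python sketch:

```python
def bell_numbers(count):
    """Return the first `count` Bell numbers via the Bell triangle."""
    bells, row = [], [1]
    for _ in range(count):
        bells.append(row[0])
        nxt = [row[-1]]          # new row starts with the previous row's last entry
        for value in row:
            nxt.append(nxt[-1] + value)  # each entry adds the one above it
        row = nxt
    return bells

print(bell_numbers(10))  # [1, 1, 2, 5, 15, 52, 203, 877, 4140, 21147]
```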

Set partitions are interesting for a number of reasons. One reason is that set partitions are in one-to-one correspondence with equivalence relations on a set, so there are B_n distinct equivalence relations on a set of n elements. This puts an upper bound on measurement, since we can understand these equivalence relations as modes of indistinguishability.

Another reason is that the Bell numbers also count the ways of grouping n labeled nodes into the connected components of a graph (ignoring the internal structure of each component). E.g. for 3 labeled nodes, there are 5 such groupings:

  • A B C
  • A-B C
  • A-C B
  • B-C A
  • A-B-C

Set partitions also describe the possible ways of dividing a set of events into equivalent outcomes for Bayesian learning. If I want to know how to think about the possible outcomes of some experiment, for example, I partition the set of possible observations. However, my choice of partition matters a lot to how I should set my credences. This suggests that you should follow a rule, or be otherwise internally consistent in some way, in how you partition the set of possible observations. What I'd like to know is whether we can give well-defined conditions under which this requirement is met.