Here are three suggestions for fun holiday reading — none of them new books, as it happens, but ones I’ve particularly enjoyed in the last few weeks. So, warmly recommended if you haven’t yet read them and are looking for pleasing distractions:

• Robert Harris, An Officer and a Spy. I was, to be honest, pretty disappointed by Harris’s latest, Conclave, whose flat-footed plot twist I found so implausible as to make me a bit annoyed I’d spent the time reading the book. A strange lapse of form. By contrast, this fictional recounting of the Dreyfus affair is both un-put-downable and satisfying.
• Sarah Dunant, In the Company of the Courtesan. Mrs Logic Matters has been recommending for ages that I try Dunant’s books set in Renaissance Italy: and she’s absolutely right. This is the first in the series, and my first, and is just a terrific read.
• Tim Parks, Italian Ways. A serendipitous find in a charity shop, this is notionally about the vagaries of various railway journeys round Italy that Tim Parks took. There’s a lot of sharp though mostly affectionate observation from a long-time resident in Italy, written with a novelist’s way with words. As you rattle through the book, you learn some history, and (if you know the country at all) will nod with pleased recognition at Italian foibles. An unexpected delight.

Any suggested holiday reads you’d like to share?

Added The day after I posted this, the Sunday Times printed its recommendations for summer holiday reading. They mention three other books I’ve particularly admired in recent months and which you certainly should read sometime: Rose Tremain’s The Gustav Sonata, Julian Barnes’s The Noise of Time and Sarah Perry’s The Essex Serpent. But I’m not sure that either of the first two has quite the page-turning lightness you want for holiday relaxation — and while The Essex Serpent is absolutely gripping, surely you should keep it for a more Dickensian time of year, perhaps to read round the fire as the nights draw in …

Posted in This and that | 3 Comments

## Slow logic

I’m all for taking things slowly. But this is getting ridiculous.

But then, despite all my best resolutions, the lead up to the General Election here and its aftermath have been quite ridiculously distracting. No wonder that the writing for the second edition of my intro logic text has been going correspondingly slowly. Though in some ways the slowness has been rather enjoyable: for it gives me space to think, and I feel I am still learning and getting clearer about various issues as I go along.

I suppose you might think that, having taught the stuff for forty years, I really would have achieved settled views by now on the logical basics and on how best to expound them. But you know how it is: in the hurly burly of teaching, especially when your mind is focused on more challenging courses, you let yourself get away with conventional almost-truths, or park concerns for when you have time to return to think harder about them. And then somehow another teaching year comes round before you have had the time to settle down to do the re-thinking, and the extensive re-writing of overheads, that you promised yourself. You tinker and satisfice for another year. Well, retirement with no more teaching duties brings me the time to try to do better. We’ll have to see whether the time is being well-spent!

But just today, I got CUP’s happy approval of my suggested cover image for the second edition. You’ll have to wait to find out what I’ve chosen: I’ll only say that it’s an abstract painting by a well-known early-twentieth century artist. I now merely have to get the following four hundred pages into good enough shape not to disgrace the cover …

Posted in This and that | 9 Comments

## Dummett on Frege: a modest proposal

I’ve been wanting to think a bit about Frege on assertion (and the assertion sign, content stroke, judgement stroke, etc.). So after revisiting Frege himself — not at his best, here — it was natural to turn to Dummett’s great book. Except that his chapter on Assertion is a full 69 pages of continuous prose, without any sectioning, and without any guide posts. Which really isn’t forgivable. Paragraph by paragraph Dummett can be extremely clear: but it is only too easy to lose the overall thread.

What we need — a modest proposal —  is a third edition of Frege: Philosophy of Language where a sensitive and thoughtful editor divides the whole book into sections (and perhaps subsections too) on the model of Frege: Philosophy of Mathematics. After all, it isn’t that Dummett had a religious objection to chunking up his prose.  Even better, the editor could also provide a detailed old-style analytic table of contents. Then the reader could much more easily find their way through the book, and be greatly helped to see the structure of those long chapters as they unfold. (Here’s a rough-and-ready chunking of Ch. 2 which I did as a student hand-out, which gestures to what would be possible.)

Posted in This and that | 2 Comments

## Pavel Haas Quartet at St John’s, Smith Square (and on BBC)

To London, to see the Pavel Haas Quartet at St. John’s, Smith Square. First, Beethoven Op. 127 (with, in particular, an absolutely magical performance of the great Adagio). And then, after the interval, the most joyful performance of Dvořák’s Piano Quintet No.2 with Denis Kozhukhin. The audience was totally delighted, especially when the broadly smiling players responded to the prolonged applause by playing the Scherzo-Furiant for us again. A wonderful evening.

The PHQ were also on the radio the day before and you can listen here. From 1.17, they play the first movement of Op. 127. And from 1.37, there is a short but very charming interview with Veronika Jarůšková and Peter Jarůšek, and then the quartet play the opening movement of Shostakovich’s second quartet (which they promise to record next year).

## Postcard from Cornwall

A photo taken along the South West Coast Path, as it runs to St Anthony Head in Cornwall. It’s a particularly beautiful stretch of the path, and a quick ferry ride takes us across the bay from the apartment we rent this time of year to join the path. Balm for the soul to be here. Back to logic next week.

## Gerhard Gentzen’s Shorthand Notes

Springer have just published an exceptional volume that should quite certainly be in any university library that has any kind of logic collection. From the blurb of Saved from The Cellar – Gerhard Gentzen’s Shorthand Notes on Logic and Foundations of Mathematics: “Attempts at locating [Gentzen’s] lost manuscripts failed at the time [immediately after his death], but several decades later, two slim folders of shorthand notes were found. In this volume, Jan von Plato gives an overview of Gentzen’s life and scientific achievements, based on detailed archival and systematic studies, and essential for placing the translations of shorthand manuscripts that follow in the right setting. The materials in this book are singular in the way they show the birth and development of Gentzen’s central ideas and results, sometimes in a well-developed form, and other times as flashes into the anatomy of the workings of a unique mind.”

You can freely download part of Jan von Plato’s introduction here (pp. 65 onwards give outlines of what is in Gentzen’s notebooks). Also the “Look inside” facility on the Amazon page here enables you to look at some of the translated notebook pages. The editing and translating of these notes looks to be a major achievement, which should be of great interest, and not just to pure historians of logic.

## Maddy on sets, categories, and the multiverse

Here are three large-scale issues. First, in what good sense or senses, if any, does ZFC provide a foundation for mathematics? Second, can category theory provide an alternative(?), or better(?), foundation for some or all of mathematics? Third, should set theory be understood as the study of a single universe, or as the study of a multiverse of set-theoretic universes? Each of these questions is many-faceted, and there are evident connections between them.

Penelope Maddy takes them all on in a substantial paper ‘Set-theoretic Foundations’ (forthcoming in a festschrift for Woodin). She writes: “I won’t pretend to sort out all these complex and contentious matters, but I do hope to compile a few relevant observations that might help bring illumination somewhat closer to hand.” Written with her customary lucidity, Maddy does indeed do some useful sorting out, it seems to me. Worth a read if you haven’t seen the paper yet.

Posted in This and that | 2 Comments

## Empty domains #3 – from Vej Kse

The following was posted as a comment to Empty domains #1 by Vej Kse — but I think it is particularly interesting and worth highlighting as a stand-alone post.

There is an elegant and natural (from a user’s perspective) way to allow empty domains, which has been known in categorical logic and modern type theories (based on de Bruijn’s Automath and Martin-Löf’s type theory) for over 40 years.

It consists in considering formulae only relative to a given context (list) of variables (which must contain all free variables of the formula, and can be empty if the formula is closed). In many-sorted logic, each variable is attributed a sort in the context. In a way, “standard” first-order logic (the one taught in most logic textbooks) amounts to working in a context with a countable number of variables.

In natural deduction for such a system, we want to prove a formula $\varphi$ in a variable context $\Gamma$ from a list of formulae $\Psi$ (a hypothesis context), where all the formulae involved are well-formed in the context $\Gamma$. Just as proving an implication in a given hypothesis context amounts to proving the consequent from the context expanded with the antecedent (the variable context held fixed), proving a universal quantification $\forall x \varphi (x)$ in a variable context amounts to proving $\varphi(x)$ in the context expanded with $x$ (the hypothesis context held fixed).

The main reason for using a variable context, in my view, is not that it allows empty domains, but that it models more faithfully how mathematicians reason. We do not start our informal proofs by assuming we are given a countable sequence of variables ranging over our domains of interest; when we write proofs, variable introduction parallels hypothesis introduction: “Let x be a real number …” vs. “Let’s assume the axiom of choice.” Moreover, mathematicians keep track of the variables they have introduced (stored in the “variable context” part of their memory) just as they keep track of the hypotheses they have introduced (stored in the “hypothesis context” part of their memory). “Standard” first-order logic fails to model that memory of the variables introduced.

The standard proof in “standard” first-order logic of $\exists x. \varphi(x)$ from $\forall x. \varphi(x)$ is odd for mathematicians precisely because a variable appears out of nowhere, without ever being introduced.
1. Let’s assume $\forall x \varphi(x)$;
2. Eliminate $\forall$ by applying it at $y$, which gives us $\varphi(y)$.
3. Introduce $\exists$, which gives us $\exists x. \varphi(x)$.
In correcting a student’s work I would write in red next to the second step: “What is this $y$? You never introduced it.” The proof should introduce the variable $y$ before using it, which also means that it cannot just be forgotten at the end of the proof, even though it doesn’t appear in the conclusion (it’s still in the memory of the reader). To remove it from the context, a $\forall y$ must be added, giving $\forall y (\forall x \varphi(x) \implies \exists x \varphi(x))$.
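The same discipline is enforced in proof assistants descended from the type theories mentioned above. In Lean 4, for instance (a sketch with my own variable names, not from any of the sources cited), the inference from a universal to an existential only goes through once there is a witness to hand — here supplied by an explicit `Inhabited` assumption, which plays exactly the role of the $y$ that must be introduced before use:

```lean
-- Dropping the [Inhabited α] assumption makes this unprovable:
-- over a possibly-empty type there is no witness to point to.
example {α : Type} (φ : α → Prop) [Inhabited α]
    (h : ∀ x, φ x) : ∃ x, φ x :=
  ⟨default, h default⟩  -- `default` plays the role of the introduced y
```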

Advantages of working with variable contexts besides allowing empty domains and being more faithful to ordinary reasoning:
– Leaner semantics: No need to assign values to infinitely many unused variables; we only have to assign values to the variables in the finite context.
– Separation of concerns: The quantifier rules no longer have side-conditions; these become well-formedness conditions on the contexts (all variable names in a context must be distinct, so that when a context is extended the added name must be new) and on the formulae (they must be well-formed in the contexts in which they are used, in particular containing only variable names present in those contexts).
– The structure of the quantifier introduction and elimination rules is more transparent: it takes the form of an adjunction, like for the other connectives.
– It repairs the symmetry between assumptions and variables, implication and universal quantification, conjunction and existential quantification (broken by their very different treatments in “standard” first-order logic), paving the way to the propositions-as-types (Curry-Howard) correspondence and modern type theories.

One caveat: prenex normal forms depend on the inhabitedness assumption (and on the law of excluded middle), so we can still use them in Peano arithmetic, for instance, just not in general first-order logic.

Here are a few books in which first-order logic is presented using variable contexts:
– Peter T. Johnstone: “Sketches of an elephant: A topos theory compendium” (Chapter D1);
– Bart Jacobs: “Categorical logic and type theory” (Chapter 4);
– Paul Taylor: “Practical foundations of mathematics” (Section 1.5);
– Richard Bornat: “Proof and disproof in formal logic: An introduction for programmers”.

Posted in This and that | 7 Comments

## Empty domains #2

Here is a long quotation from Oliver and Smiley in their Plural Logic, motivating their adoption of a universally free logic (i.e. one allowing empty domains and empty terms).

If empty singular terms are outlawed, we are deprived of logic as a tool for assessing arguments involving hypothetical particular things. Scientists argued about the now-discredited planet Vulcan just as they argued about the now-accepted proto-continent Pangaea; intellectuals argued about the merits of Ern Malley, the non-existent poet of the great Australian literary hoax. Sometimes of course we may know that the particular thing exists, but at others we may wish to remain agnostic or we may even have good grounds to believe that it doesn’t exist. Indeed, our reasoning may be designed precisely to supply those grounds: ‘Let N  be the greatest prime number. Then … contradiction. So there is no such thing as N’.

Now consider the effect of ruling out an empty domain. For many choices of domain —people, tables, chairs—the requirement that the domain be necessarily non-empty is frankly absurd. One may retreat to the idea that the non-emptiness of the domain is a background presupposition in any argument about the relevant kind of thing. But then we shall be deprived of logic as a tool for assessing arguments involving hypothetical kinds. We may indeed be certain that there is a human hand or two, but for other kinds this may be a matter of debate: think of the WIMPs and MACHOs of dark matter physics, or atoms of elements in the higher reaches of the Periodic Table. Sometimes we will have or will come to have good grounds to believe that the kind is empty, like the illusory canals on Mars or the sets of naive set theory, and again this may be the intended outcome of our reasoning: ‘Let’s talk about sets. They are governed by a (naive) comprehension principle … contradiction. So there are no such things.’

It may be replied that some domains are necessarily non-empty, say the domain of natural numbers. It follows that the absolutely unrestricted domain of things is necessarily non-empty too. But even if this necessity could be made out, and what’s more made out to be a matter of logical necessity, we would still not want the argument ‘∀xFx , so ∃xFx ’ to be valid. As we see it, the topic neutrality of logic means that it ought to cover both unrestricted and restricted domains, so it ought not to validate a pattern of argument which certainly fails in some restricted cases, even if not in all.

I think there are at least two issues here which we might initially pause over. First, in just what sense ought logic to be topic neutral? And second (even if we accept some kind of topic neutrality as a constraint on what counts as logic) how far does a defensible form of neutrality require neutrality about e.g. all(?) questions of existence and/or whether our language actually hooks up to the world? But leave those issues to another post — spoiler alert: I’m not minded to go with Oliver and Smiley’s arguments for a universally free logic. For the moment, what interests me is that — whatever the force of the arguments — there is a straight parallelism between the considerations which Oliver and Smiley adduce for allowing empty names and for allowing empty domains. Certainly, I see no hint here (or in the surrounding text) that they conceive the cases as raising significantly different issues.

In the previous post, I noted that Wilfrid Hodges very briskly offers similar reasons of topic neutrality to argue for allowing empty domains, but goes on to present a logical system which doesn’t allow for empty names. And, having in mind similar considerations to Oliver and Smiley’s, I suggested that this half-way-house is not a comfortable resting place: if his (and their) reason for allowing empty domains is a good one, then we should equally allow (as Oliver and Smiley do) empty names. So the story went.

However, it was suggested that Hodges’s position is much more natural than I made out. Taking parts of two comments, Bruno Whittle puts it this way:

On a plausible way of thinking about things, to give the meaning of a name is to provide it with a referent. So to ban empty names is simply to restrict attention to those which have been given meanings. On the other hand,  …  I would say that to give the meaning of a quantifier—to ‘provide it with a domain’—is to specify zero or more objects for it to range over. To insist that one specify at least one seems barely more natural than insisting that one provide at least two (i.e. that one really does provide objects plural!)

But is it in general supposed to be a plausible way of thinking about singular terms that to give one a meaning is to provide it with a referent? Is this claim supposed to apply, e.g., to functional terms? If it doesn’t, then coping with ‘empty’ terms constructed from partial functions, for example, will still arguably need a free logic (even if we have banned empty unstructured names). While if it is still supposed to apply, then I think I need to know more about the notion of meaning in play here.

Set that aside, however. Let’s think some more about what is meant by “outlawing” or “banning” empty names and empty domains in logic — given that (or so Oliver and Smiley insist) we can find ourselves inadvertently using them, and reasoning with them. To be continued.

Posted in This and that | 6 Comments

## Empty domains #1

From wild mathematical universes back to nothing much at all: the question of empty domains in first-order logic. Not that I have anything very insightful to say:  but I want to get my mind clearer before committing myself to a line in IFL2.

In his nice book Logic (1977), Wilfrid Hodges introduces a tableau system with the following four rules for dealing with quantifiers. Two rules allow us to push a negation past a quantifier — so given $\neg\forall x\varphi(x)$ on a branch you add   $\exists x\neg\varphi(x)$, and dually. Then the rules for dealing with bare quantifiers are as follows.  Given $\forall x\varphi(x)$ on a branch, you can add $\varphi(n)$ for any $n$ already featured on the branch. While given $\exists x\varphi(x)$ on a branch, you can add $\varphi(n)$ for some $n$ which is not already used on the branch.
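To see how the rules interact, here is a toy sketch in Python (my own encoding of formulas as nested tuples and my own naming conventions, not anything from Hodges’s book). The point to notice is that Hodges’s universal rule can only fire when a name already appears on the branch, so on a name-free branch it is simply inert:

```python
# Convention for this sketch: names are single letters 'a'..'m',
# bound variables are 'x'..'z'; formulas are nested tuples.

def names_in(formula):
    """Collect the names occurring in a formula."""
    if isinstance(formula, str):
        return {formula} if len(formula) == 1 and formula in "abcdefghijklm" else set()
    return set().union(*(names_in(part) for part in formula))

def substitute(formula, var, name):
    """Replace the variable by the name throughout the formula."""
    if formula == var:
        return name
    if isinstance(formula, tuple):
        return tuple(substitute(part, var, name) for part in formula)
    return formula

def universal_rule(branch, quantified):
    """Hodges's ∀-rule: instantiate only with names already on the branch."""
    _, var, body = quantified
    branch_names = set().union(*(names_in(f) for f in branch))
    return [substitute(body, var, n) for n in sorted(branch_names)]

# Trying to refute {∀xFx, ¬∃xFx}: after pushing the negation in, the
# branch holds ∀xFx and ∀x¬Fx -- but no names, so the rule yields
# nothing, the branch stays open, and ∀xFx ⊢ ∃xFx is not derivable.
branch = [("forall", "x", ("F", "x")),
          ("forall", "x", ("not", ("F", "x")))]
print(universal_rule(branch, branch[0]))  # []

# Once a name is on the branch, the rule fires as usual:
print(universal_rule(branch + [("F", "a")], branch[1]))
# [('not', ('F', 'a'))]
```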

Compared with the standard rules for tableaux as in e.g. Jeffrey’s Formal Logic: Its Scope and Limits (1967/1981), Hodges’s quantifier instantiation rules are a tad neater, indeed elegantly symmetric — simply: instantiate universals with old names, existentials with new names.

Using ‘$\vdash$’ for provability in Hodges’s system, we notably do not get $\forall x\varphi(x) \vdash \exists x\varphi(x)$. But that is all to the good, Hodges thinks. That inference fails if we allow the empty domain. “Many logicians add one further derivation rule, which has the effect of decreeing that every domain of quantification has at least one individual in it. From our point of view this would be a mistake. A domain of quantification can perfectly well be empty — as it well may be if the conversation turns to discussing intelligent life on other planets.”
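Hodges’s point about the empty domain can be checked mechanically. A minimal sketch (my own toy semantics, not from the book): over an empty domain, a universal generalization is vacuously true while the matching existential is false, so the inference from $\forall x\varphi(x)$ to $\exists x\varphi(x)$ has a true premiss and a false conclusion:

```python
def forall(domain, pred):
    # A universal claim holds vacuously when the domain is empty.
    return all(pred(x) for x in domain)

def exists(domain, pred):
    # An existential claim needs at least one witness, so it fails
    # outright in the empty domain.
    return any(pred(x) for x in domain)

empty_domain = []
intelligent_life = lambda planet: True  # any predicate will do

print(forall(empty_domain, intelligent_life))  # True  (vacuously)
print(exists(empty_domain, intelligent_life))  # False (no witness)
```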

At first blush this is very neat — a more inclusive logic and more elegant rules. What’s not to like? But there is a price to pay. As David Bostock points out in his Intermediate Logic (OUP 1997), in Hodges’s system we do get e.g. $\forall x\varphi(x) \vdash \varphi(n)$ and $\varphi(n) \vdash \exists x\varphi(x)$, while — as we have just seen — not $\forall x\varphi(x) \vdash \exists x\varphi(x)$: so transitivity of derivability fails; and there are other nasty effects. Such consequences do seem pretty unhappy. Bostock diagnoses the source of the trouble to be Hodges’s half-way-house position of wanting to accommodate empty domains while his logic requires that names have denotations. Allow empty names too and of course both $\forall x\varphi(x) \vdash \varphi(n)$ and $\varphi(n) \vdash \exists x\varphi(x)$ fail, and transitivity can be restored.

And indeed, leaving aside nasty effects like transitivity failure, it is difficult to see why it should be a desirable feature of a logic that it allows empty domains but not empty names. After all, if “the conversation turns to discussing intelligent life on other planets” it could equally (back in the nineteenth century) turn to discussing the intramercurial planet Vulcan, or (later) turn to discussing Ern Malley (among poets) or Nat Tate (among painters) or Frederick R. Ewing (among novelists). And if our logic has functions, then we can find ourselves unwittingly discussing partial functions, so using more terms which fail to have references. If we want to avoid the mishap of talking of things that don’t exist, then surely we’ll want to avoid not just wholesale foul-ups (empty domains) but local foul-ups (empty terms). If we push Hodges’s rationale for allowing empty domains to its natural conclusion, then, it seems we have to go the full revisionary hog and be free logicians. But free logics — while not horrendous — haven’t the elegant simplicity of Hodges’s system. Do we really have to end up there? Not in an introductory book, that’s for sure! To be continued …

Footnote: what is the natural deduction analogue of Hodges’s system? I think this works: take the usual sort of $\exists$-elimination rule (given $\exists x\varphi(x)$, start a subproof with $\varphi(n)$ with $n$ new …), let the $\forall$-elimination rule allow instantiation only with respect to names already appearing (not in finished subproofs), and add the two rules: from $\neg\forall x\neg\varphi$ infer $\exists x\varphi$, and its dual. But I guess this isn’t an entirely natural presentation of a natural deduction system, if we like thinking, Gentzen-style, of introduction rules as being more fundamental, as meaning-fixing, and of elimination rules as to be justified by their harmony with them.

Posted in IFL, Logic | 10 Comments