Up to London to go to another Pavel Haas Quartet concert. They played Igor Stravinsky’s short Concertino for String Quartet (new to me, and I’ll need to listen again to get more out of it), followed by a wonderful performance of the Ravel quartet. After the interval, the Pavel Haas were joined by Pavel Nikl — who was their original violist, and only left the Quartet early last year for reasons of family illness. They played Dvořák’s String Quintet in E flat major, Op. 97, to rapturous applause — indeed an appropriate response. They have recently recorded this Quintet (together with the Dvořák Piano Quintet — the CD is released soon); the music was very obviously in their hearts and in their fingers, and was played with great warmth and their usual stunning ensemble.

Although it isn’t the same as being there, you can listen to the concert for another four weeks on the BBC site, here.

The answer to the quiz question? The first group, from George Herbert on, unlike members of the second group, might keep company together in an anthology of poets who were all members of Trinity, Cambridge. OK, if you are fussy, it might be stretching a point, I suppose, to count the versifying A.A. Milne as a poet: but let’s be ecumenical, as his verse certainly has added to the gaiety of nations!

And, yes, a substantial anthology of *Trinity Poets* has recently been published by Carcanet, edited by Adrian Poole and Angela Leighton. It features almost fifty poets from the sixteenth to the twenty-first centuries, many familiar, but also quite a few new to me. Like many a poetry anthology, it affords the delight of serendipitous discovery, not least among the most recent poets. So while this is a book I bought with a certain self-mocking collegial devotion, I found myself dipping into it with considerable unforced pleasure.

Where might George Herbert, Andrew Marvell, John Dryden, Lord Byron, Alfred Lord Tennyson and A. E. Housman keep company with A. A. Milne and Vladimir Nabokov? And why haven’t John Donne, John Milton, Alexander Pope, Percy Bysshe Shelley, Robert Browning or John Betjeman been invited?

Answer in the next post.

Dark times, in too many ways. Words can fail us. But Haydn’s inexhaustible humanity can be a comfort and inspiration, no? So let me recommend a recent CD of his music that I have so very much enjoyed, the Chiaroscuro Quartet returning to the Opus 20 quartets to complete their recording (BIS 2168 — you can also stream through Apple Music, and find their previous four discs there too).

Four friends, occasionally coming together to play concerts and record, performing with delight and bold inventiveness. Their use of gut strings makes for wonderful timbres, now earthy, now confiding, now echoing a viol consort. This is extraordinary playing, and not just from Alina Ibragimova who leads the quartet: the sense of ensemble and the interplay of voices puts some full-time quartets to shame. Richly rewards repeated listening.

- Elaine Landry, *Categories for the Working Philosopher* (Nov 2017).
- Geoffrey Hellman and Stewart Shapiro, *Varieties of Continua: From Regions to Points and Back* (Jan 2018).
- Tim Button and Sean Walsh, *Philosophy and Model Theory*.

Enthusiasts for category theory may also be interested in Olivia Caramello’s book *Theories, Sites, Toposes: Relating and studying mathematical theories through topos-theoretic ‘bridges’* (Nov 2017) — though I have in the past found it difficult to grasp her project, which sounds as if it *ought* to be of interest to those with foundational interests.

All these books are available for pre-order in the sale.

The revised, surely-more-natural, disjunction elimination rule mentioned in the last post is, of course, Neil Tennant’s long-standing proposal — and the quote about the undesirability of using explosion in justifying an inference like disjunctive syllogism is from him. This revision has, I think, considerable attractions. Even setting aside issues of principle, it should appeal just on pragmatic grounds to a conservative-minded logician looking for a neat, well-motivated, easy-to-use set of natural deduction rules to sell to beginners.

But, in Tennant’s hands, of course, the revision serves a very important theoretical purpose too. For suppose we want to base our natural deduction system on paired introduction/elimination rules in the usual way, but also want to avoid explosion. Then the revised disjunction rule is just what we need if we are e.g. to recover disjunctive syllogism from disjunction elimination. And Tennant does indeed want to reject explosion.

Suppose, just suppose, that even after our logic teachers have tried to persuade us of the acceptability of explosion as a logical principle, we continue to balk at this — while remaining content e.g. with the usual remaining negation and conjunction rules. This puts us in the unhappy-looking position of accepting the seemingly quite harmless A ⊢ A ∨ B and A ∨ B, ¬A ⊢ B while wanting to reject A, ¬A ⊢ B. So we’ll have to reject the unrestricted transitivity of deduction. Yet as Timothy Smiley put it over 50 years ago, “the whole point of logic as an instrument, and the way in which it brings us new knowledge, lies in the contrast between the transitivity of ‘entails’ and the non-transitivity of ‘obviously entails’, and all this is lost if transitivity cannot be relied on”. Or at least, all *seems* to be lost. So, at first sight, restricting transitivity is a hopeless ploy. Once we’ve accepted those harmless entailments, we just have to buy explosion.
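If I understand the dialectic, the two harmless entailments here are an instance of disjunction introduction and an instance of disjunctive syllogism, and the trouble comes from pasting their proofs together. Set out in sequent form, the Lewis-style argument can be sketched like this (⊢ for entailment, Cut for the transitivity step):

```latex
% Two seemingly harmless entailments:
%   (1)  A |- A v B           (disjunction introduction)
%   (2)  A v B, ~A |- B       (disjunctive syllogism)
% Pasting them together by transitivity (Cut) yields explosion:
\[
\frac{A \vdash A \lor B \qquad A \lor B,\ \neg A \vdash B}
     {A,\ \neg A \vdash B} \;\text{(Cut)}
\]
```

Note that the premisses of the pasted proof, A and ¬A, are an explicitly contradictory pair.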

But maybe, after all, there is wriggle room (in an interesting space explored by Tennant). For note that when we paste together the proofs for the harmless entailments we get a proof which starts with an *explicit* contradictory pair A, ¬A. What if we insist on some version of the idea that (at least by default) proofs *ought* to backtrack once an explicit contradiction is exposed, and then one of the premisses that gets us into trouble needs to be rejected? In other words, in general we cannot blithely argue past contradictions and carry on regardless. Then our two harmless proofs cannot be pasted together to get explosion.

But how can we restrict transitivity like this without hobbling our logic in the way that Smiley worried about? Well suppose, just suppose, we can arrange things so that if we have a well-constructed proof of C from premisses X, and also a well-constructed proof of D from C together with premisses Y, then there is EITHER a proof of D from X, Y (as transitivity would demand) OR ELSE a proof of absurdity from X, Y. Then perhaps we can indeed learn to live with this.
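Schematically, the hoped-for restricted form of transitivity can be put like this (a sketch, as I understand Tennant-style cut; the primed sets mark that in fact only subsets of the original premisses may be needed):

```latex
% Restricted transitivity: pasting a proof of X |- C with a proof of C, Y |- D
% always yields EITHER the expected conclusion OR an exposed contradiction:
\[
\frac{X \vdash C \qquad C,\, Y \vdash D}
     {X' \cup Y' \vdash D \quad\text{or}\quad X' \cup Y' \vdash \bot}
\qquad (X' \subseteq X,\ Y' \subseteq Y)
\]
```

The second disjunct is no defect: when pasting would expose a contradiction lurking in the combined premisses, we get a proof of that very fact instead.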

Now, Neil Tennant has been arguing for decades, gradually revising and deepening his treatment, that we *can* arrange things so that we get that restricted form of transitivity. In other words, we can get an explosion-free system in which we can paste proofs together when we ought to be able to, or else combine the proofs to expose that we now have contradictory premisses (indeed that looks like an epistemic gain, forcing us to highlight a contradiction when one is exposed). On the basis of his technical results, Tennant has been arguing that we *can* and *should* learn to live without explosion and with restricted transitivity (whether we are basically classical or intuitionist in our understanding of the connectives). And he has at long last brought everything together from many scattered papers in his new book *Core Logic* (just out with OUP). This really is most welcome.

Plodding on with my own book, I sadly don’t have the time to comment further on Tennant’s book for now. But here is just one (obvious!) thought. We don’t have to buy his view about the *status* of core logic (in its classical and constructivist flavours) as getting it fundamentally *right* about validity (in the two flavours). We can still find much of interest in the book even if we think of enquiry here more as a matter of exploring the costs and benefits as we trade off various desiderata against each other, with no question of there being a right way to weight the constraints. We can still want to know how far we can go in preserving familiar classical and constructivist logical ideas (including disjunctive syllogism!) while avoiding the grossest of “fallacies of irrelevance”, namely explosion. Tennant arguably tells us what the costs and benefits are if we follow up one initially attractive way of proceeding — and this is a major achievement. We need to know what the bill really is before we can decide whether or not the price is too high.

The standard Gentzen-style disjunction elimination rule encodes the uncontroversially valid mode of reasoning, argument-by-cases:
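In standard natural-deduction notation the rule looks like this (a sketch, with square brackets marking the case-assumptions discharged at the final step):

```latex
% Gentzen-style disjunction elimination (argument by cases):
% given A v B, a derivation of C from A, and a derivation of C from B,
% conclude C, discharging both case-assumptions.
\[
\frac{A \lor B
      \qquad
      \begin{array}{c}[A]\\ \vdots\\ C\end{array}
      \qquad
      \begin{array}{c}[B]\\ \vdots\\ C\end{array}}
     {C}\;(\lor\mathrm{E})
\]
```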

Now, with this rule in play, how do we show A ∨ B, ¬A ⊢ B? Here’s the familiar line of proof: argue by cases from A ∨ B. In the first case, the assumption A together with ¬A yields absurdity, and explosion then delivers B; in the second case, B is immediate; so, either way, B. But familiarity shouldn’t blind us to the fact that there is something really rather odd about this proof, invoking as it does the explosion rule, ex falso quodlibet. After all, as it has been well put:

Suppose one is told that A ∨ B holds, along with certain other assumptions X, and one is required to prove that C follows from the combined assumptions X, A ∨ B. If one assumes A and discovers that it is inconsistent with X, one simply stops one’s investigation of that case, and turns to the case B. If C follows in the latter case, one concludes C as required. One does not go back to the conclusion of absurdity in the first case, and artificially dress it up with an application of the absurdity rule so as to make it also “yield” the conclusion C.

Surely that is an accurate account of how we ordinarily make deductions! In common-or-garden reasoning, drawing a conclusion from a disjunction by ruling out one disjunct surely *doesn’t* depend on jiggery-pokery with explosion. (Of course, things will look even odder if we don’t have explosion as a primitive rule but treat instances as having to be derived from other rules.)
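For reference, the familiar explosion-involving derivation of B from A ∨ B and ¬A can be sketched as a proof tree like this (⊥ as the mark of absurdity):

```latex
% Disjunctive syllogism via vE plus ex falso quodlibet (EFQ):
% in the A-case, the contradiction with ~A is artificially dressed up
% with EFQ so that this case too "yields" B.
\[
\dfrac{A \lor B
       \qquad
       \dfrac{\dfrac{[A]^{1} \quad \neg A}{\bot}\,(\neg\mathrm{E})}{B}\,(\mathrm{EFQ})
       \qquad
       [B]^{1}}
      {B}\;(\lor\mathrm{E},1)
\]
```

It is exactly the EFQ step in the left-hand case that the quoted complaint targets.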

Hence there seems to be much to be said — if we want our natural deduction system to encode very natural basic modes of reasoning! — for revising the disjunction elimination rule to allow us, so to speak, simply to eliminate a disjunct that leads to absurdity. So we want to say, in summary, that if both limbs of a disjunction lead to absurdity, then ouch, we are committed to absurdity; if one limb leads to absurdity and the other to *C*, we can immediately, without further trickery, infer *C*; if both limbs lead to *C*, then again we can derive *C*. So officially the rule becomes: argue by cases from A ∨ B, with one subproof from the assumption A and the other from the assumption B, each ending either in *C* or in absurdity — where if both the subproofs end in absurdity so does the whole proof, but if at least one subproof ends in *C*, then the whole proof ends in *C*. On a moment’s reflection, isn’t this, pace the tradition, *the* natural rule to pair with the disjunction introduction rules? (Or at least natural modulo worries about how to construe “⊥” — I think there is much to be said for taking this as an absurdity marker, on a par with the marker we use to close off branches containing contradictions on truth trees, rather than as a wff that can be embedded in other wffs.)
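Set out schematically, the revised rule might be displayed like this (a sketch; each subproof is allowed to end either in C or in the absurdity marker ⊥):

```latex
% Revised disjunction elimination: a case that hits absurdity is simply closed.
% If both subproofs end in ⊥, the conclusion is ⊥;
% if at least one subproof ends in C, the conclusion is C.
\[
\frac{A \lor B
      \qquad
      \begin{array}{c}[A]\\ \vdots\\ C/\bot\end{array}
      \qquad
      \begin{array}{c}[B]\\ \vdots\\ C/\bot\end{array}}
     {C/\bot}\;(\lor\mathrm{E}^{*})
\]
```

With this rule, disjunctive syllogism is immediate: the A-case ends in ⊥, the B-case ends in B, so the whole proof ends in B, with no detour through explosion.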

It might be objected that this rule offends against some principle of purity to the effect that the basic, really basic, rules for a connective like disjunction should not involve (even indirectly) another connective, negation. But it is unclear what force this principle has, and in particular what force it should have in the selection of rules in an introductory treatment of natural deduction.

The revised disjunction rule isn’t a new proposal! — it has been on the market a long time. In the next post, credit will be given where credit is due. But just for the moment, I am interested in the proposal as a stand-alone recommendation, not as part of any larger revisionary project. The proposed variant rule is obviously as well-motivated as the standard rule, and it makes elementary proofs shorter and more natural. So what’s not to like?

I’d like to have had the time now to carefully read Jan von Plato’s new book and comment on it here, as the bits I’ve dipped into are very interesting. But, after a holiday break, I must get my nose back down to the grindstone and press on with revising my own logic book.

So, for the moment, just a brief note to flag up the existence of this book, in case you haven’t seen it advertised. Von Plato’s aim is to trace something of the history of theories of deduction (and, to a lesser extent, of theories of computation). After ‘The Ancient Tradition’ there follow chapters on ‘The Emergence of Foundational Studies’ (Grassmann, Peano), ‘The Algebraic Tradition of Logic’, and on Frege and on Russell. There then follow chapters on ‘The Point of Constructivity’ (Finitism, Wittgenstein, intuitionism), ‘The Göttingers’ (around and about Hilbert’s programme), Gödel, and finally two particularly interesting chapters on Gentzen (on his logical calculi and on the significance of his consistency proof).

This isn’t a methodical tramp through the history: it is partial and opinionated, highlighting various themes that have caught von Plato’s interest. And it’s all the better for that. The book retains the flavour of a thought-provoking and engaging lecture course, which makes for readability. It has been elegantly and relatively inexpensively produced by Princeton UP: you’ll certainly want to make sure your university library has a copy.
