It’s tough being a philosophy student …

The last tripos scripts marked! And by happy chance, the answers I marked today were among the best of the season, and cheered me up a lot. (I’m Chair of the Examining Board for Parts IB and II this year so I’ve meetings to organize this coming week, and still have to examine three M.Phil. dissertations: but at least I’ve read all the relevant undergraduate scripts for the year. And I’m on leave next summer term, so that’s it for two years. Terrific.)

It strikes me again while marking that it’s quite tough being a philosophy student these days: the disputes you are supposed to get your head around have become so sophisticated, the to and fro of the dialectic often so intricate. An example. When I first started teaching, Donnellan’s paper on ‘Reference and Definite Descriptions’ had quite recently been published — it was state of the art. An undergraduate who could talk some sense about his distinction between referential and attributive uses of descriptions was thought to be doing really well. Just think what we’d expect a first class student to know about the Theory of Descriptions nowadays (for a start, Kripke’s response to Donnellan, problems with Kripke’s Gricean manoeuvres, etc.). True there are textbooks, Stanford Encyclopedia articles, and the like to help the student through: but still, the level of sophistication we now expect from our best undergraduates is daunting.

Quackery and the philosophy of science

I’ve just added a link to Prof. David Colquhoun’s blog on quackery, now renamed Improbable Science (after a bit of a legal kerfuffle, recounted in today’s Guardian). Great stuff.

And indeed, there are some really terrific internet resources out there, doing their best to counteract the tide of unreason that sometimes seems to be lapping at our doors. A question though. You’d have thought that philosophers of science would have a strong interest in joining in enthusiastically with the public defence of scientific reason, and doing their bit to chip away at the complete bollocks of creationism, intelligent design, and all the rest. Where are they all? Maybe it is just incompetent googling on my part, but I haven’t noticed any vigorous internet interventions. But I’d be very happy to be proved wrong … any pointers to blogs for me to link to?

Church’s Thesis 13: Gödel on Turing

Phew! At last, I can warmly recommend another paper in Church’s Thesis After 70 Years.

Although Gödel was initially hesitant, by about 1938 he was praising Turing’s work as establishing the “correct definition” of computability. Yet in 1972 he wrote a short paper on undecidability which includes a section headed “A philosophical error in Turing’s work”, where he takes issue with Turing’s implicit assumption of the finitude of the human mind. Question: is this a change of view on Gödel’s part (from seeing Turing’s work as successful to seeing it as fatally flawed)?

Surely not. What Gödel was praising in 1938 was Turing’s analysis of a finite step-by-step computational procedure. (Recall the context: Turing was originally fired up by the Entscheidungsproblem, which is precisely the question whether there is a finitistic procedure that can be mechanically applied to decide whether a sentence is a first-order logical theorem. So it is analysis of such procedures that is called for, and that was of concern to Gödel too.) What the later Gödel was resisting in 1972 was Turing’s further thought that, in the final analysis, human mental procedures cannot go beyond such finite mechanical procedures (he was inclined to think that, in the words of his Gibbs Lecture, the human mind “infinitely surpasses the powers of any finite machine”). There is no change of mind, just a change of topic.

That’s at any rate what I have previously taken to be the natural view, without doing any careful homework to support it (but it is of course the view that fits with the important distinction I’ve been stressing in these comments, between CT as a claim about effective computability, and bolder theses about what can be computed by machines and/or minds). And so it is very good to see that now Oron Shagrir has given a careful and convincing defence of this view, with lots of detailed references, in his (freely downloadable!) paper “Gödel on Turing on computability”. Good stuff!

Church’s Thesis 12: Kreisel, Church

I’m going to pass over the next three papers in the Olszewski collection pretty quickly, for various reasons.

First, there’s a piece on “Analog computation and Church’s Thesis” by Jerzy Mycka: I just don’t know anything about this stuff, and am certainly not in a position to comment on this piece, though it looks decent enough.

Then there is a paper by Piergiorgio Odifreddi on “Kreisel’s Church”. Now, I find Kreisel’s writings intriguing and irritating in roughly equal measure (all sorts of interesting things seem to be going on, but he just can’t or won’t write flat-footed technical exposition and direct, transparent, plain-spoken philosophical commentary). So I’d hoped that Odifreddi might take up one or two of Kreisel’s themes from his discussions of CT and give them a careful development and exploration — for Odifreddi’s wonderful book on Recursion Theory shows how clear and judicious he can be. But regrettably, he does something quite different: he takes us on a lightning survey of Kreisel’s many relevant articles, with a lot of quotations and some references to related later work. But if you were puzzled by Kreisel before, you’ll stay pretty puzzled — though you’ll have a longer bibliography!

Next, Adam Olszewski contributes a short piece on “Church’s Thesis as interpreted by Church”. Here Olszewski does at least pick up on the point I noted that Murawski and Wolenski slurred over without comment. CT is nowadays usually taken to be a claim about the co-extensiveness of the two notions of effective computability and recursiveness; but the Founding Fathers were wont to talk of the notions being identical, or of one notion being a definition of the other. Church himself in 1936 uses both identity talk and definition talk. But actually, it isn’t too likely that — at that early date — Church had a clearly worked out position in mind on the status of the correlation he was proposing between computability and recursiveness, and I don’t think we can read too much into his choice of words here. The headline is that Church, at least originally, seemed to take the modest view of the correlation as having something like the status of a Carnapian explication, while Turing thought, more boldly, that it could be forced on us by an analysis of the very notion of a computation. This is familiar stuff, however, and Olszewski’s brief remarks don’t in any way change the familiar picture.

Diversions: Stripes, Bach, cats and soup

The hundreds — nay, countless thousands — who regularly look at this blog for its sheer entertainment value may be feeling just a bit short-changed, what with all those heavy-duty posts on Church’s Thesis interspersed with moans about tripos marking. What kind of way is that to treat a devoted readership? So here are some assorted diversions (the Monday Colour Section, if you like).

  1. First, you might have missed this about causation, late on BBC2 a few nights ago: and there’s a new CD out in a few days!
  2. Not to your taste? Then here’s an even better reason to be cheerful — dates for a world tour announced.
  3. Then there is the silly website of the day.
  4. Recipe of the day (from my all-time favourite restaurant — worth a detour!)
  5. And announced today, more about Leopard … looks truly amazing.

Church’s Thesis 11: Its status, once more

As you can imagine, it is going to be pretty difficult, by this stage in the game, to say something both novel and illuminating about ‘The status of Church’s Thesis’. And certainly, Roman Murawski and Jan Wolenski don’t really succeed in their paper of that title.

They don’t help their cause by seeming to vacillate from the start about how to understand Church’s Thesis itself. Their official story is clear enough: CT is the claim that a function is effectively computable if and only if it is (partial) recursive. Fine: by my lights, that’s the right thing to say. But note, this makes CT a doctrine in the realm of reference. But then Murawski and Wolenski immediately offer, without any comment on the mismatch, quotations from Kalmar and Kleene which talk of CT as stating ‘the identity of two notions’ or as supplying ‘previously intuitive terms … with a certain precise meaning’ — talk which is only really appropriate if CT is taken as a doctrine in the realm of sense. Of course, our belief that the extensions of ‘effectively computable function’ and ‘recursive function’ are the same may well be grounded in beliefs about the relation between the senses of those expressions: but we surely ought to be clear here when we are talking about sense, and when we are talking about reference.

Anyway, Murawski and Wolenski proceed to distinguish four lines that have been taken about the status of CT, of increasing plausibility according to them: (1) it is an empirical hypothesis, (2) it is an axiom or theorem, (3) it is a definition, (4) it is an explication.

But they seem simply confused in supposing that anyone has held (1), i.e. that CT — as they have officially defined it — is an empirical hypothesis. Rather, it is claims like “every humanly computable function is recursive”, or “every function computable by a physically realizable machine is recursive”, that have been thought to be empirical. And as I’ve stressed before in this series of postings, CT is to be distinguished from those quite different claims. So put (1) aside.

Now, as Mendelson famously stressed, the easy half of CT looks to have a perfectly good informal proof (it is an informal theorem, if you like). Running the argument for total functions: the initial functions are plainly effectively computable; definitions by composition and recursion preserve effective computability; and definition by regular minimization preserves effective computability too. But all total recursive functions are definable from initial functions by composition, recursion and/or regular minimization. So all total recursive functions are effectively computable. Mutatis mutandis for partial recursive/partial computable functions. QED. So if we are to reject (2) we need a good reason to suppose that it is impossible to run a comparable argument for the difficult half of CT. But Murawski and Wolenski change the subject twice. First, they talk about what would be needed for a formal proof of CT and the problem of showing that the adopted axioms for computability “do in fact reflect exactly the properties of computability (intuitively understood)”, which looks too much like the problem of establishing CT. But whatever the force of those remarks about formal provability, they don’t in themselves sabotage the possibility of an argument for the hard half of CT which, while informal, is as cogent as the familiar argument for the easy half. Then secondly, Murawski and Wolenski go off at a tangent talking about the role of CT in proofs: but that’s irrelevant to (2).
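To make the shape of that easy-half argument vivid, here is a minimal illustrative sketch in Python (the function names are mine, not anything from the paper): given procedures which compute the ingredient functions, each of the three constructions yields an obvious step-by-step procedure which computes the newly defined function.

    # Why the recursive constructions preserve effective computability: given
    # procedures computing g and h, each construction below computes the new
    # function in finitely many effective steps. (Illustrative sketch only.)

    def compose(f, g):
        """Composition: x |-> f(g(x))."""
        return lambda x: f(g(x))

    def prim_rec(g, h):
        """Primitive recursion: F(x, 0) = g(x), F(x, y+1) = h(x, y, F(x, y))."""
        def F(x, y):
            value = g(x)
            for i in range(y):         # y effective evaluation steps
                value = h(x, i, value)
            return value
        return F

    def regular_min(g):
        """Regular minimization: x |-> the least y with g(x, y) = 0.
        Regularity means such a y always exists, so the search terminates."""
        def F(x):
            y = 0
            while g(x, y) != 0:        # terminates, by the regularity assumption
                y += 1
            return y
        return F

    # For instance, addition defined by recursion from the successor function:
    add = prim_rec(lambda x: x, lambda x, i, value: value + 1)
    assert add(3, 4) == 7

The hard half is of course another matter: the claim there is that nothing effectively computable escapes this net.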

We can skip over (3), since such remarks about CT being a definition as we find in the Founding Fathers don’t really seem to distinguish that thought from (4). The real options do indeed seem to be (2) — understood as a claim about there being an informal proof for CT — versus (4). Murawski and Wolenski worry a bit about the kind of explication CT could provide, given that they say “it replaces a modal property of functions by one that is non-modal”. But while there may be worries about the notion of explication here, that isn’t one of them. For “f is effectively computable” is in the current context equivalent to “there is an algorithm which computes f”. So there isn’t anything excitingly modal going on.

Hemlock all round?

I remember Geoffrey Hunter (the author of the rather good Metalogic) telling me years ago that at the beginning of his intro logic course, having explained the idea of a valid argument, he gave out a sheet of examples to see which arguments beginners naively judged to be valid and which not. Then, at the end of the course, he gave out the same example sheet, asked which arguments were valid … and people on average did worse.

Well, you can see why. Students learn some shiny new tools and are then tempted to apply them mindlessly: so, e.g., faced with inferences involving conditionals, despite all your warnings, they crank a truth-table and out comes a silly answer.
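A toy illustration of the kind of thing I mean (my own example, not one of Hunter’s): treat ‘if’ as the material conditional, and the inference from not-(P → Q) to P comes out valid on the truth-tables. So the dutiful beginner who cranks the tables ends up endorsing, e.g., ‘It isn’t true that if there is a good God then the prayers of the wicked are answered; so there is a good God’.

    from itertools import product

    def implies(p, q):
        # the material conditional: false only when p is true and q is false
        return (not p) or q

    # Search for a countermodel to: not-(P -> Q), therefore P.
    countermodels = [(p, q) for p, q in product([True, False], repeat=2)
                     if (not implies(p, q)) and (not p)]

    print(countermodels)   # prints [] -- no countermodel, so the tables say "valid"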

Likewise, students uncorrupted by philosophy could of course reel off a list of scientific theories that have been seriously proposed in the past but which — they’d agree — have been shown to be wrong (the phlogiston theory of combustion, the plum pudding model of the atom, and so on and so forth). But teach students some philosophy of science and ask them whether a theory can be falsified, and they now firmly tell you that it can’t be done (merrily arguing from the flaws of Popper’s falsificationism to the impossibility of showing any theory is false). Sigh. Of course, the same students will — in another answer — also tell you that scientific realism is in deep trouble because of the pessimistic induction from all those false past theories …

We try not to addle our students’ brains, but I fear we do.

Church’s Thesis 10: Precision and pretension

So, we’re halfway through my blogview of Church’s Thesis After 70 Years edited by Adam Olszewski, Jan Wolenski and Robert Janusz: I’ve commented on eleven papers, and there are eleven to go. I’m afraid that so far I’ve not been wildly enthusiastic: the average level of the papers is not great. But there are cheering names yet to come: Odifreddi, Shapiro, Sieg. So I live in hope …

But the next paper — Charles McCarty’s ‘Thesis and Variation’ — doesn’t exactly raise my spirits either. The first couple of sections are pretentiously written, ill-focused remarks about ‘physical machines’ and ‘logical machines’ (alluding to Wittgenstein). The remainder of the paper is unclear, badly expounded stuff about modal formulations of CT (in the same ball park, then, as Horsten). Surely, in this of all areas of philosophy, we can and should demand direct straight talking and absolute transparency: and I’ve not the patience to wade through authors who can’t be bothered to make themselves totally clear.

At least the next piece, five brisk sides by Elliott Mendelson, is clear. He returns to the topic of his well known 1990 paper, and explains again one of his key points:

I do not believe that the distinction between “precise” and “imprecise” serves to distinguish “partial recursive function” from “effectively computable function”.

To be sure, we offer more articulated definitions of the first notion: but, Mendelson insists, we only understand them insofar as we have an intuitive understanding of the notions that occur in those definitions. Definitions give out at some point where we are (for the purposes at hand) content to rest: and in the end, that holds as much for “partial recursive function” as for “effectively computable function”.

Mendelson’s point then is that the possibility of establishing the ‘hard’ direction of CT can’t be blocked just by saying that the idea of a partial recursive function is precise while the idea of an effectively computable function isn’t, so that there is some sort of categorial mismatch. (Actually, though I take Mendelson’s point, I’d want to stress a somewhat different angle on it. For CT is a doctrine about the co-extensiveness of two concepts. And there is nothing to stop one concept having the same extension as another, even if the first is in some good sense relatively ‘imprecise’ and the second is ‘precise’ — any more than there is anything to stop an ‘imprecise’ designator like “those guys over there” in the circumstances picking out exactly the same people as “Kurt, Stephen, and Alonzo”.)

As to the question whether the hard direction can actually be proved, Mendelson picks out Robert Black’s “Proving Church’s Thesis”, Philosophia Mathematica 2000, as the best recent discussion. I warmly agree, and I take up Robert’s story in the last chapter of my book.

Damned squiggles

I discovered earlier today that some symbols weren’t showing correctly in a Windows browser. Sorry about that. I think I’ve sorted it and revised a couple of posts accordingly. But I won’t be able to check on a Windows machine again for a couple of days.

Church’s Thesis 9: Epistemic arithmetic, algorithms, and an argument of Kleene’s

Ok, yes, yes, I should be marking tripos papers. But I need to take a break before plunging enthusiastically back to reading yet more about gruesome ravens … So, let’s return to Horsten’s paper on ‘Formalizing Church’s Thesis’.

After talking about ‘formalizing’ CT in an intuitionistic framework, Horsten goes on to discuss briefly an analogous shot at formalizing CT in the context of so-called epistemic arithmetic. Now, I didn’t find this very satisfactory as a stand-alone section: but if in fact you first read another paper by Horsten, his 1998 Synthese paper ‘In defense of epistemic arithmetic’, then things do fall into place just a bit better. EA, due originally to Shapiro, is what you get by adding to first order PA an S4-ish modal operator L: and the thought is that this gives a framework in which the classicist can explicitly model something of what the intuitionist is after. So Horsten very briefly explores the following possible analogue of ICT in the framework of EA:

L∀x∃yLA(x, y) → ∃e∀x∃m∃n[T(e, x, m) ∧ U(m, n) ∧ A(x, n)]

(Here, as I read it, T is Kleene’s T-predicate, so that T(e, x, m) says that m codes a halting computation of the machine with index e on input x, and U(m, n) says that n is the output extracted from that computation: the consequent thus says that some machine effectively computes a witness for A, while the antecedent wraps the ∀∃ claim in the knowability operator L.)

But frankly, that supposed analogue seems to me to have very little going for it (for a start, there look to be real problems understanding the modality when it governs an open formula yet is read as being something to do with knowability/informal provability). Horsten’s own discussion seems thoroughly inconclusive too. So I fear that this looks to be an exercise in pretend precision where nothing very useful is going on.

Horsten’s next section on ‘Intensional aspects of Church’s Thesis’ is unsatisfactory in another way. He talks muddily about ‘an ambiguity at the heart of the notion of algorithm’. But there is no such ambiguity. There is the notion of an algorithmic procedure (intensional if you like); and there is the notion of a function in fact being computable by an algorithmic procedure (so an extensional notion in that, if f has that property, it has it however it is presented). The distinction is clear and unambiguous.
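One crude way to see the point, in an illustration of my own: two manifestly different algorithmic procedures can compute one and the same function, and it is the function, however presented, which thereby counts as algorithmically computable.

    def factorial_iterative(n):
        """One algorithmic procedure for computing the factorial function."""
        result = 1
        for k in range(2, n + 1):
            result *= k
        return result

    def factorial_recursive(n):
        """A different procedure (a different intensional object) which
        computes the very same function."""
        return 1 if n == 0 else n * factorial_recursive(n - 1)

    # Different procedures, same extension: it is the function n! which is
    # thereby shown to be computable by an algorithm, however we pick it out.
    assert all(factorial_iterative(n) == factorial_recursive(n) for n in range(10))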

Let’s move on, then, to the next paper in the Olszewski collection, ‘Remarks on Church’s Thesis and Gödel’s Theorem’ by Stanisław Krajewski. This is too short to be very helpful, so I won’t comment on it — except to say that he mentions the usually overlooked argument from Kleene’s Normal Form Theorem and (a dispensable use of) Church’s Thesis to Gödel’s Theorem that I trailed here. The argument, as Krajewski points out, is due to Kleene himself, a fact which I now realize I didn’t footnote in my book. Oops! I think the book is being printed round about now, so it’s too late to do anything about it. I had better start a corrections page on the web site …
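For readers who don’t want to chase up that link, here in barest outline (and glossing over exactly how the conditions on the theory should be stated) is the sort of argument in question:

  1. Kleene’s Normal Form Theorem gives us an effectively enumerable set which is not recursive: for instance K = {e : ∃z T(e, e, z)}, with T Kleene’s T-predicate, where a short diagonal argument shows that no recursive function decides membership in K.
  2. Now suppose S is a sound, effectively axiomatized theory which can express such membership claims. If S were negation-complete, then by churning out its theorems and waiting to see which of a given membership claim or its negation turns up, we would have an effective procedure for deciding membership in K.
  3. By Church’s Thesis, that effective procedure would be recursive, contradicting 1. So S is negation-incomplete.

And the appeal to CT at the last step is the dispensable bit: with more work, the informal decision procedure can be traded in for an explicitly constructed recursive function.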
