Today is the birthday (1908) of Willard Van Orman Quine, a US philosopher and logician squarely in the analytic tradition, and certainly one of the most influential philosophers of the twentieth century. Western philosophy is every bit as technical as Western science, so I am going to have to struggle to explain Quine’s influence. Quine worked first in symbolic logic and then moved into the philosophy of language and of meaning. These are all areas that fascinate me, but can seem like a gigantic waste of time because they have zero practical application except to amuse and confound smart people.

Quine grew up in Akron, Ohio, where he lived with his parents and older brother. His father was a manufacturing entrepreneur (founder of the Akron Equipment Company, which produced tire molds) and his mother was a schoolteacher. He received his B.A. in mathematics from Oberlin College in 1930, and his Ph.D. in philosophy from Harvard University in 1932.  Apart from a stint away during World War II, lecturing on logic in Brazil (in Portuguese) and deciphering coded messages for military intelligence, Quine spent the remainder of his life at Harvard.

Quine started his academic career working on formal logic, which is the area where Bertrand Russell worked to establish rigorous foundations of mathematics. This gets us quickly into an extremely technical field, so I will content myself with saying that the vast majority of people think that mathematics is about as solid as it gets, yet it is not. If you accept certain basic propositions, such as 2 + 2 = 4, all is well with the world. Once you accept certain basic propositions, then you can build the vast edifice of mathematics. But proving that 2 + 2 = 4 from scratch – without first accepting axioms that themselves cannot be proven – is not only difficult, it is impossible. Sure, you can take 2 apples and add another 2 apples, and you have 4 apples, but that is an empirical demonstration, not a proof. Can you prove that 2 + 2 = 4 without apples or any other objects? Can you even define what 2 is, or, more importantly, what a number is? Are numbers real things, or simply convenient abstractions? Russell used formal logic to find answers to these questions, and failed. Quine wrote three textbooks and numerous academic papers on formal logic, and taught the subject for his entire career. In Mathematical Logic he showed that much of what Russell’s Principia Mathematica took more than 1,000 pages to say can be said in 250, and in the last chapter he examines Gödel’s incompleteness theorem https://www.bookofdaystales.com/kurt-godel/  and Tarski’s indefinability theorem. In highly informal terms I will tell you that Gödel proved – definitively – that any consistent system rich enough to contain arithmetic inevitably contains statements that are true, but cannot be proven to be true within that system, and Tarski showed that truth in such a system cannot be defined within it. Some of the greatest mathematicians in the world proved, beyond question, that mathematics rests on foundations that have to be accepted because they cannot be proven. Any different from building a religion on a spiritual force whose existence cannot be proven?
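To sharpen that point: once a formal system’s axioms are accepted, a statement like 2 + 2 = 4 becomes mechanically provable – the unprovable part is the axioms themselves. In the Lean proof assistant (my illustration, not anything Quine or Russell wrote), the proof is a single line, because once the natural numbers and addition are defined, both sides compute to the same value:

```lean
-- Given the inductive definition of the natural numbers and of
-- addition, 2 + 2 = 4 holds by pure computation (reflexivity).
example : 2 + 2 = 4 := rfl
```

Russell and Whitehead, lacking such machinery and digging much deeper into foundations, famously needed hundreds of pages of Principia Mathematica before 1 + 1 = 2 could be proven.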

Quine then extended his investigations of logic into discussions of language. In particular, he was led to doubt the tenability of the distinction between “analytic” and “synthetic” statements commonly made in the philosophy of language. Analytic statements are true simply by definition, for example, “Bachelors are unmarried men.” Synthetic statements are true or false because of facts in the world: “There is a black cat sitting on the mat.” Quine’s chief objection to analyticity concerns the notion of synonymy (sameness of meaning): an analytic sentence is supposed to be one whose truth survives the substitution of a synonym for one of its terms. The objection to synonymy hinges upon the problem of collateral information. We intuitively feel that there is a distinction between “All unmarried men are bachelors” and “There have been black cats,” but a competent English speaker will assent to both sentences under all conditions because such speakers also have access to collateral information. In the case of black cats this collateral information has to do with the historical existence of black cats. But Quine maintains that there is no principled distinction between generally known collateral information (such as the existence of black cats) and the conceptual or analytic information needed to agree that bachelors are unmarried men. One of the common questions used to elucidate this position is: “Is the pope a bachelor?” Quine argues that there is no distinction between those truths which are universally and confidently believed and those which are necessarily true.

Quine may be best known in some circles for his thoughts on the indeterminacy of translation. Can we ever be sure that we understand what a person speaking another language is saying?  As an anthropologist, this question interests me greatly, but where I part company with Quine is that he uses thought experiments based on imaginary languages, but anthropologists of language can address his concerns more directly using real languages. Quine’s investigations hinge on ontological relativity, that is, the idea that for any empirical observation there are multiple explanations (theories).

Let us consider statements in English first. What do words refer to? Quine says:

How can we talk about Pegasus? To what does the word ‘Pegasus’ refer? If our answer is, ‘Something,’ then we seem to believe in mystical entities; if our answer is, ‘nothing’, then we seem to talk about nothing and what sense can be made of this? Certainly when we said that Pegasus was a mythological winged horse we make sense, and moreover we speak the truth! If we speak the truth, this must be truth about something. So we cannot be speaking of nothing.

We already have a conundrum here because it is difficult enough in English to agree concerning what words are referring to. The problem is compounded when you try to translate sentences in another language into English, because you have to take into account what words refer to in another language as well as what they refer to in English. Quine’s thesis is that no unique interpretation of a foreign language is possible, because a ‘radical interpreter’ has no way of telling which of many possible meanings the speaker has in mind. Quine uses the example of the word “gavagai” uttered by a native speaker of the unknown language Jungle upon seeing a rabbit. A speaker of English could do what seems natural and translate this as “Look, a rabbit.” But other translations would be compatible with all the evidence he has: “Look, food”; “Let’s go hunting”; “There will be a storm tonight” (if the locals have superstitions about rabbits and storms); “Look, a momentary rabbit-stage”; “Look, an undetached rabbit-part.” Some of these might become less likely – that is, become more unwieldy hypotheses – in the light of subsequent observation.

Frankly I find all of this ruminating quite pointless. Yes, it’s certainly true that there is slippage of meaning when translating one language to another. Nuances are perpetually lost in all manner of ways, and there are dozens of ways in which mistakes can be made. But anthropological field linguists have been dealing with such problems for over a century, and somehow they manage to come up with grammars and dictionaries for new languages that can be used to develop fluency. The fact that there is always going to be a degree of uncertainty (indeterminacy) is neither news nor earth shattering.

I had lunch with Quine and a number of other luminaries of the philosophical world back in the 1970s when he was attending an annual conference at my university. The group was talking about the philosophical problems associated with language acquisition and even then, as a raw doctoral candidate in anthropology, I was perplexed as to why he and others were speculating about issues that were being addressed more fruitfully by neuroscientists, anthropologists and the like. It made me think that Western analytic philosophy was sheer speculating – at great length – about ideas in a vacuum. This tradition leads to some fascinating mind puzzles, but ultimately has no value for me beyond exercising my brain. This was perhaps not the best conclusion to reach given that I was married to an analytic philosopher of language at the time.

Quine spent 70 of his 92 years at Harvard, so a Harvard recipe is in order on his birthday. One with a small linguistic twist appealed to me, so I thought of Harvard beets. If I told you we had Harvard beets for dinner what would you think? Were the beets grown at Harvard? Or are they cooked in a style common to Harvard? Or what? This is a simple question in the philosophy of language concerning modifiers. How do we know that baby shoes are shoes for babies, but crocodile shoes are made out of crocodile skin, not shoes for crocodiles (any more than baby shoes are made from baby skin)? We can be reasonably sure that Harvard beets are beetroots cooked in some fashion, but what does the modifier “Harvard” refer to? The simple answer is that it is a way of cooking beets in a sweet and sour sauce, but why Harvard? Why not Princeton or Chicago? For that question there is no answer. Cookbooks say that “Harvard” refers to the crimson color of the beets, and crimson is the university color for Harvard. That is a terrible answer because by that token, beets cooked in any fashion, or eaten raw, could be called Harvard beets because they are all crimson. Anyway, this recipe calls for roasting beetroots and then preparing a thick sweet and sour sauce for them.

Harvard Beets

Ingredients

1 ½ lbs medium-sized fresh beets
⅓ cup sugar
2 tsp cornstarch
¼ cup cider vinegar
¼ cup water
1 tbsp unsalted butter
salt

Instructions

Brush excess dirt off the beets, and trim the tops and roots, leaving about 1” and taking care not to break the skin. Wrap them in foil and bake them for 1 hour in a 400˚F oven. Remove them from the oven and let them cool to the touch. While they are still a little warm, cut off the tops and roots and peel them. Then cut them into cubes.

Mix the sugar, cornstarch, vinegar and water in a saucepan and bring to a boil, whisking until thickened. Remove from the heat and whisk in the butter.

Add the beets to the sauce and heat them through gently over low heat. Serve warm.

Today is the birthday (1815) of George Boole, English mathematician, philosopher and logician. He worked in the fields of differential equations and algebraic logic, and is best known as the author of The Laws of Thought (1854) which contains Boolean algebra. Without Boolean logic we would not have digital computers. Let me try to break that thought down for you (a little). There is an important philosophical issue here summed up in the question: “How do humans think?” What Boole called “The Laws of Thought” are actually the laws of mathematical logic. Well . . . I think we all know that humans are not logical. Humans are not very complicated digital computers – not even very, very, very complicated digital computers. Computers can emulate human thought in a lot of ways. They can become very skilled at chess, for example. They can also be very good at problem solving, using algorithms that can be better than human methods. But human thought processes are qualitatively different in important ways. Let’s explore. First, a smattering of history.

Boole was born in Lincoln, the son of John Boole (1779–1848), a shoemaker, and Mary Ann Joyce. He had a primary school education, and received lessons from his father, but had little further formal or academic teaching. William Brooke, a bookseller in Lincoln, may have helped him with Latin, which he may also have learned at the school of Thomas Bainbridge. He was self-taught in modern languages. At age 16 Boole became the breadwinner for his parents and three younger siblings, taking up a junior teaching position in Doncaster at Heigham’s School. He also taught briefly in Liverpool.

Boole participated in the Lincoln Mechanics’ Institution, which was founded in 1833. Edward Bromhead, who knew John Boole through the institution, helped George Boole with mathematics texts, and he was given the calculus text of Sylvestre François Lacroix by the Rev. George Stevens Dickson of St Swithin’s, Lincoln. Boole had no teacher, but after many years mastered calculus. At age 19, Boole successfully established his own school in Lincoln. Four years later he took over Hall’s Academy in Waddington, outside Lincoln, following the death of Robert Hall. In 1840 he moved back to Lincoln, where he ran a boarding school. Boole became a prominent local figure and an admirer of John Kaye, the bishop. With E. R. Larken and others he set up a building society in 1847. He associated also with the Chartist Thomas Cooper, whose wife was a relation.

From 1838 onwards Boole was making contacts with sympathetic British academic mathematicians and reading more widely. He studied algebra in the form of the symbolic methods that were understood at the time, and began to publish research papers on calculus and algebra. Boole’s status as mathematician was soon recognized with his appointment in 1849 as the first professor of mathematics at Queen’s College, Cork (now University College Cork (UCC)) in Ireland. He met his future wife, Mary Everest, there in 1850 while she was visiting her uncle John Ryall who was Professor of Greek. They married some years later in 1855. He maintained his ties with Lincoln, working there with E. R. Larken in a campaign to reduce prostitution.

It’s hard to explain briefly how Boole’s algebra, now known as (the foundations of) Boolean algebra, revolutionized mathematics and logic. Anyone who studies mathematics or computer science needs to know some of the basics of Boolean algebra – created by a man who finished primary school only, and otherwise studied mathematics on his own without teachers. Astonishing. Boolean algebra is the branch of algebra in which the values of the variables are the truth values true and false, usually denoted as 1 and 0 respectively. In elementary algebra (the kind you start with in school), the values of the variables are numbers, and the main operations are addition and multiplication. In basic Boolean algebra the operations are the conjunction “and” denoted as ∧, the disjunction “or” denoted as ∨, the negation “not” denoted as ¬, and the implication “if … then” denoted as →. In other words, Boolean algebra is a formal system for describing logical relations in the same way that ordinary algebra describes numeric relations – and it needs only 4 operations and 2 values. With this simple basis you can perform (or describe) any logical procedure that you want – and it can become extremely sophisticated. If you know any set theory, you’ll also recognize the basic operations there too, and if you’ve done any computer programming, you know how important this algebra is.
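Because the four operations act on just two values, their entire behavior can be written out as truth tables. Here is a minimal Python sketch (my illustration; the function names are my own, not standard notation):

```python
# The four basic Boolean operations over the truth values
# True (1) and False (0).
def conj(p, q):      # conjunction "and" (∧)
    return p and q

def disj(p, q):      # disjunction "or" (∨)
    return p or q

def neg(p):          # negation "not" (¬)
    return not p

def implies(p, q):   # implication (→), defined as (not p) or q
    return (not p) or q

# Print the full truth table: p, q, p∧q, p∨q, p→q
for p in (True, False):
    for q in (True, False):
        print(int(p), int(q),
              int(conj(p, q)), int(disj(p, q)), int(implies(p, q)))
```

Note that → is not even an independent primitive here: it is definable from ¬ and ∨, which is part of why so few building blocks suffice for the whole system.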

What’s more important for the modern world, all logical procedures can be turned into electric systems using this algebra. Crudely put, an electric pulse that turns a switch on can be called “1” (true), and no electrical pulse can be called “0” (false). You can also build logic gates that emulate the Boolean operations: circuits that take electrical pulses as input, perform the operation, and produce an output. For example, if you send an electric pulse (1/true) into a “not” gate, no pulse (0/false) comes out the other end. A digital computer’s chip consists of billions and billions of these logic gates set up in complex ways so that when you enter input, it goes through these gates and emerges as output. To make the input usable by the computer it has to be translated into binary code first. Binary mathematics uses only 1s and 0s, which become electrical pulses.
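The gate picture can be simulated directly: represent a pulse as 1 and no pulse as 0, and each gate becomes a small function. The half adder below (a standard textbook circuit, my choice of example) shows how a handful of gates already performs binary arithmetic, adding two binary digits to produce a sum bit and a carry bit:

```python
# Gates on pulses: 1 = pulse (true), 0 = no pulse (false).
def NOT(a):
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

# A half adder built purely from these gates. The sum bit is
# "a or b, but not both" (exclusive or); the carry bit is "a and b".
def half_adder(a, b):
    total = AND(OR(a, b), NOT(AND(a, b)))  # XOR built from OR, AND, NOT
    carry = AND(a, b)
    return total, carry

print(half_adder(1, 1))  # 1 + 1 = 10 in binary: sum bit 0, carry bit 1
```

Chain half adders together and you can add numbers of any size; a real chip does essentially this, billions of gates at a time.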

This all gets very complicated very quickly, so I won’t go on about the computing side any more. What I will say, though, is that when it was discovered that human brains consist of neurons (brain cells) linked by synapses in complex networks that pass electric pulses around, seemingly like logic gates in a digital computer, both neuroscientists and psychologists started thinking of the brain as a computer. A single neuron is either firing (sending a pulse) or not – that is, it is either 1 or 0. Nest all the neurons together in complex ways and it appears that you have a flesh-and-blood digital computer. Many physical and social scientists believed that what evolution had created naturally, humans had stumbled on artificially. In time, therefore, computers could be built that were exactly like human brains, and eventually you’d have robots that were indistinguishable from humans.

It doesn’t take a whole lot of thinking (using our non-digital brains) to see the problem here. For example, a properly functioning computer does not forget things; properly functioning humans forget things all the time – including important things. A properly functioning computer does not make mathematical errors; humans make them all the time. Put crudely again, computers are logical; humans are not. Logic is only a fraction of our thinking process – and, in my experience, not a very common one. That’s why characters such as Mr Spock in the Star Trek series (a person who is strictly logical) are so weird. In part this is because our brains are much more than logic circuits, and we still don’t really understand how our brains work. We do know that we don’t work by logic, and neither do our brains. Attempts to reduce personal and social systems of thought to Boolean algebra have yielded interesting results in all manner of fields – linguistics, psychology, sociology, anthropology, philosophy, history, etc. – but all have failed because human thought just isn’t digital, let alone logical.

Let’s move Boolean algebra into the food sphere. Here’s a logical operation: “If I have flour and water, I can make dough.” This contains the Boolean “and” as well as the implication “if … then.” If I have flour (flour = true) AND if I have water (water = true), then I can make dough (dough = true), or in symbolic form: flour ∧ water → dough. Sorry to readers who know any Boolean algebra or symbolic logic for the slight oversimplification here.
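The dough formula can be checked mechanically. A minimal Python sketch (my illustration – the ingredient variables are hypothetical, and material implication is defined in the standard way):

```python
# Material implication: p → q is false only when p is true and q is false.
def implies(p, q):
    return (not p) or q

# Hypothetical pantry state: I have flour and water, so I can make dough.
flour, water, dough = True, True, True

# flour ∧ water → dough holds for this assignment of truth values:
print(implies(flour and water, dough))  # True
```

The only way the implication fails is the case where flour and water are both true but dough is false – exactly the situation the English sentence rules out.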

Let’s be just slightly more complex: sugar ∧ water ∧ heat → toffee, which we can translate as, “If I have sugar and water and a source of heat, I can make toffee.” OK, let’s do it. I have sugar and water and a source of heat. This recipe is extremely simple. I used to do it when I was 8 years old.

You will need:

1 cup sugar

¼ cup water

Put the water and sugar in a saucepan and heat gently, stirring constantly, to dissolve the sugar. When the sugar is completely dissolved, turn the heat to high and let the mixture boil. Keep a very close eye on it. At this stage you do not have to stir. After about 20 minutes (depending on your heat source) the syrup will begin to show little strands of brown as the sugar caramelizes. This is the critical stage. Begin stirring, and when the whole mixture is brown IMMEDIATELY remove it from the heat. I then pour it onto a marble slab, where it cools and hardens into toffee. You can also use toffee molds if you want.

If you get experienced at toffee making you can select the darkness that you want. Darker toffees need to cook a bit longer, and are more flavorful and more brittle. Be careful though – it’s an easy step from brown to black. Black is not good.

Today is the birthday (1834) of John Venn, English mathematician and logician, primarily remembered for his use of diagrams, which we now call Venn diagrams, to help explain concepts in set theory. Venn was not exactly a giant in his field, but I’d settle for having something reasonably commonplace named after me. Venn diagrams have served me very well in my own work.

Venn was born in Hull, and educated at private schools in London before studying mathematics at Cambridge University, at Gonville and Caius College where he subsequently became a fellow and then head of the college.

Venn’s father was an Anglican clergyman and Venn followed suit in the late 1850s, as was normal for fellows at Cambridge at the time. In fact, after receiving his degree in 1857 he did parish work for a few years before devoting himself full time to mathematics. Even after he left the clergy in the 1880s he continued being involved in the church, although he found strict Anglicanism incompatible with logic and mathematics.

Venn’s first major publication was The Logic of Chance (1866), a significant accounting of the laws of probability. He then turned to George Boole’s work in logic and produced Symbolic Logic in 1881. It was in this work that he introduced Venn diagrams which he had been using as a teaching device for several years:

I began at once somewhat more steady work on the subjects and books which I should have to lecture on. I now first hit upon the diagrammatical device of representing propositions by inclusive and exclusive circles. Of course the device was not new then, but it was so obviously representative of the way in which any one, who approached the subject from the mathematical side, would attempt to visualise propositions, that it was forced upon me almost at once.

As Venn notes, other mathematicians, notably Gottfried Leibniz and Leonhard Euler, had used similar diagrams earlier, but Venn popularized them as well as extending their application to a wide variety of fields outside of mathematics and logic, and making their application more rigorous than previous attempts.

Venn diagrams don’t actually serve a technical function in mathematics or logic, but they do make certain concepts easier to grasp by displaying them visually. Here’s a simple example showing the Greek, Russian, and Roman alphabets each contained within circles which are drawn to overlap. Symbols in common between two of the alphabets are shown at the intersections of their circles, and symbols common to all three are shown in the central intersection.
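In set-theoretic terms, each circle is a set and each region of the diagram is an intersection or difference. A Python sketch of the alphabet example (the letter inventories below are a small illustrative sample written with ASCII stand-ins for the shared letter shapes, not the complete alphabets):

```python
# Capital-letter *shapes* appearing in each alphabet (illustrative
# sample only; shapes shared across scripts are written as ASCII).
greek   = {"A", "B", "E", "Z", "H", "I", "K", "M", "N", "O", "P", "T", "X", "Y"}
russian = {"A", "B", "C", "E", "H", "K", "M", "O", "P", "T", "X", "Y"}
roman   = {"A", "B", "C", "E", "H", "I", "K", "M", "N", "O", "P", "T", "X", "Y"}

# Central region of the Venn diagram: shapes common to all three.
common = greek & russian & roman

# A two-way overlap: shapes shared by Russian and Roman but not Greek.
russian_roman_only = (russian & roman) - greek

print(sorted(common))
print(sorted(russian_roman_only))
```

The `&` and `-` operators are Python’s set intersection and difference – the same operations the overlapping circles display visually.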

Venn diagrams also have the possibility of excluding items from any of the circles, such as in the example below of a diagram concerning “people I know” and the use of social media. Harry uses neither Facebook nor Twitter so fits inside the rectangle representing people I know but outside the circles representing social media.

Venn was elected to the Royal Society in 1883 and continued to publish other works, including The Principles of Empirical or Inductive Logic (1889) and volumes on the history of Cambridge and a list of its alumni, compiled with the aid of his son, John Archibald Venn.

Venn died on April 4, 1923, in Cambridge at the age of 88. He is memorialized by a stained glass window at his old college.

When dealing with mathematical subjects before, I’ve focused on mathematical objects. The Venn diagram is not a mathematical object per se, but it does lend itself to cooking ideas. This site gives an idea for a Venn diagram pie: http://www.quirkbooks.com/post/happy-pi-day-make-venn-pie-agram  It was created for Pi Day (3/14 in countries that use month/day format), so it is more about being a pie than being an accurate Venn diagram. But you can take the original and modify it.

The site shows you how to cut two disposable pie pans to make the Venn diagram shape, and the crust conforms to the general idea of sets – no crust and full crust intersect to make a lattice crust. The recipe fails with the fillings. It just suggests using three different ones. It shouldn’t be too difficult to come up with categories such as fruit and dairy, so that one side is fruit, no dairy, the other is dairy, no fruit, and the middle is fruit and dairy. I’ll leave it to you.

Today is the birthday (1674) of Isaac Watts, an English Christian minister, hymn writer, theologian and logician. Although not a household name these days he has been called the “Father of English Hymnody,” credited with around 750 hymns many of which remain in use today and have been translated into numerous languages.

Watts was born in Southampton and brought up in the home of a committed religious Nonconformist. His father, also Isaac Watts, had been incarcerated twice for his views. He attended King Edward VI School in Southampton where he had a classical education.

From an early age, Watts displayed a propensity for rhyme. Once, he responded when asked why he had his eyes open during prayers:

A little mouse for want of stairs
ran up a rope to say its prayers.

He was caned for this attempt at humor.

Because he was a Nonconformist, Watts could not attend Oxford or Cambridge, which were restricted by law to Anglicans, as were government positions at the time. He went to the Dissenting Academy at Stoke Newington in 1690. Much of the remainder of his life centered on that village, which is now part of Inner London.

Following his education, Watts was called as pastor of a large independent chapel in London, where he helped train preachers, despite his poor health. Isaac Watts held religious opinions that were more non-denominational or ecumenical than was common for a Nonconformist at that time. He had a greater interest in promoting education and scholarship than preaching particular creeds.

Watts lived with the Nonconformist Hartopp family at Fleetwood House, on Church Street in Stoke Newington and worked with them as a private tutor. Through them he became acquainted with their immediate neighbours, Sir Thomas Abney and Lady Mary. Watts eventually lived for a total of 36 years in the Abney household, most of the time at Abney House, their second residence. (Lady Mary had inherited the Manor of Stoke Newington in 1701 from her late brother, Thomas Gunston.)

On the death of Sir Thomas Abney in 1722, the widow Lady Mary and her last unmarried daughter, Elizabeth, moved all her household to Abney Hall from Hertfordshire. She invited Watts to continue with their household. Consequently he lived at Abney Hall until his death in 1748. Watts particularly enjoyed the grounds at Abney Park, which Lady Mary planted with two elm walks leading down to an island heronry in the Hackney Brook. Watts often sought inspiration there for the many books and hymns he wrote. Watts died in Stoke Newington in 1748, and was buried in Bunhill Fields.

It may not be too exaggerated a claim to say that we owe Christian hymn singing to Watts. Before Watts, Christian singing, such as it was, was based on the poetry of the Bible, mostly the Psalms. This tradition grew out of John Calvin’s practice of encouraging setting vernacular translations of Biblical verses to music for congregational singing. Before Calvin’s time congregational singing was virtually unknown. Watts introduced extra-Biblical poetry to church singing as part of his evangelical efforts and, thus, opened up a new era of Protestant hymnody which other poets quickly picked up on.

Watts also introduced a new way of rendering the Psalms in verse for church services. Watts proposed that the metrical translations of the Psalms as sung by Protestant Christians should give them a specifically Christian perspective. While he granted that David [to whom authorship of many of the Psalms is traditionally ascribed] was unquestionably a chosen instrument of God, Watts claimed that his religious understanding could not have fully apprehended the truths later revealed through Jesus Christ. The Psalms should therefore be “renovated” as if David had been a Christian, or as Watts put it in the title of his 1719 metrical Psalter, they should be “imitated in the language of the New Testament.”

Watts made the Christian experience personal in his hymns. He frequently used the first person pronoun as in, for example, “When I Survey the Wondrous Cross.” One of my personal favorites – which I used often in services – is “We’re Marching to Zion.”  Here it is, not sung quite as lustily as I encouraged, but not bad:

Watts is perhaps better known in the Shape Note tradition of the Southern U.S. than in contemporary worship. There are dozens of Watts’s hymns in shape-note hymnals old and new. For example:

Besides writing hymns, Isaac Watts was also a theologian and logician. Watts wrote a textbook on logic which was particularly popular down to the 19th century: Logic, or The Right Use of Reason in the Enquiry After Truth With a Variety of Rules to Guard Against Error in the Affairs of Religion and Human Life, as well as in the Sciences. This was first published in 1724, and it was printed in twenty editions. Watts wrote this work for beginners of logic, and arranged the book methodically. He divided the content of his elementary treatment of logic into four parts: perception, judgement, reasoning, and method, which he treated in this order. In Watts’ Logic, there are notable departures from other works of the time, and some notable innovations. The influence of British empiricism may be seen, especially that of contemporary philosopher and empiricist John Locke. Logic includes several references to Locke and his Essay Concerning Human Understanding. Watts distinguished between judgments and propositions, unlike some other logic authors. According to Watts, judgment is “to compare… ideas together, and to join them by affirmation, or disjoin them by negation, according as we find them to agree or disagree.” He continues, “when mere ideas are joined in the mind without words, it is rather called a judgement; but when clothed with words it is called a proposition.” Watts’ Logic follows the scholastic tradition and divides propositions into universal affirmative, universal negative, particular affirmative, and particular negative.

By stressing a practical and non-formal part of logic, Watts gave rules and directions for any kind of inquiry, including the inquiries of science and the inquiries of philosophy. These rules of inquiry were given in addition to the formal content of classical logic common to text books on logic from that time. Watts’ conception of logic as being divided into its practical part and its speculative part marks a departure from the conception of logic of most other authors. Logic became the standard text on logic at Oxford, Cambridge, Harvard and Yale, being used at Oxford for well over 100 years (ironic given that he was barred from that institution).

Whatever you cook today, you should belt out a Watts hymn in the process (it is Sunday, after all). Here’s a recipe for roast turkey roughly contemporary with Watts, taken from The Cookbook of Unknown Ladies:

Take a large turkey. After a day kild, slit it down ye back, & bone it & then wash it. Clean stuf it as much in ye shape it was as you can with forc’d meat made of 2 pullits yt has been skin’d, 2 handfulls of crumbs of bread, 3 handfulls of sheeps sewit, some thyme, & parsley, 3 anchoves, some pepper & allspice, a whole lemon sliced thin, ye seeds pick’d out & minced small, a raw egg. Mix all well together stuf yr turkey & sow it up nicely at ye back so as not to be seen. Then spit it & rost it with paper on the breast to preserve ye coler of it nicely. Then have a sauce made of strong greavy, white wine, anchoves, oysters, mushrooms slic’d, salary first boyl’d a littile, some harticholk bottoms, some blades of mace, a lump of butter roll’d in flower. Toss up all together & put ym in yr dish. Don’t pour any over ye turkey least you spoyl ye coler. Put ye gisard & liver in ye wings. Put sliced lemon & forc’d balls for garnish.

By contemporary standards this recipe is rather rich. The stuffing is made of chicken [pullits] and breadcrumbs (plus suet), and the gravy is laden with all manner of things – anchovies, oysters, mushrooms, celery, and artichokes. 18th century English cooking was dominated by meat and protein. Fruits and vegetables came in a distant second, and were never eaten raw as this practice was considered bad for one’s health.

Today is the birthday (1889) of Ludwig Josef Johann Wittgenstein, Austrian-British philosopher who worked primarily in logic, the philosophy of mathematics, the philosophy of mind, and the philosophy of language. One of my great heroes. During his lifetime he published just one slim book, the 75-page Tractatus Logico-Philosophicus (1921), one article, one book review, and a children’s dictionary. He spent years editing his voluminous manuscripts into the magnum opus, Philosophical Investigations, which was published posthumously in 1953. It became a classic, ranking Wittgenstein with the powerhouses of modern philosophy. Bertrand Russell described him as “the most perfect example I have ever known of genius as traditionally conceived; passionate, profound, intense, and dominating.”

Wittgenstein was born in Vienna into one of Europe’s richest families and inherited a large fortune from his father in 1913. He gave considerable sums to poor artists, and in a period of severe depression after World War I, he gave away his entire fortune to his brothers and sisters. Three of his brothers committed suicide, and Wittgenstein contemplated it too. He left academia several times: serving as an officer on the front line during World War I, where he was decorated a number of times for his courage; teaching in schools in remote Austrian villages; and working during World War II as a hospital porter in London, where he told patients not to take the drugs they were prescribed, and where he largely managed to keep secret the fact that he was one of the world’s most famous philosophers. He described philosophy, however, as “the only work that gives me real satisfaction.”

His philosophy is often divided into an early period, exemplified by the Tractatus, and a later period, articulated in the Philosophical Investigations. In his early work Wittgenstein was concerned with the logical relationship between propositions and the world, and believed that by providing an account of the logic underlying this relationship he had solved all philosophical problems. In the later period Wittgenstein rejected many of the assumptions of the Tractatus, arguing that the meaning of words is best understood as their use within a given context. This is classic:

Think of the following use of language: I send someone shopping. I give him a slip marked ‘five red apples’. He takes the slip to the shopkeeper, who opens the drawer marked ‘apples’, then he looks up the word ‘red’ in a table and finds a colour sample opposite it; then he says the series of cardinal numbers—I assume that he knows them by heart—up to the word ‘five’ and for each number he takes an apple of the same colour as the sample out of the drawer.—It is in this and similar ways that one operates with words—”But how does he know where and how he is to look up the word ‘red’ and what he is to do with the word ‘five’?” Well, I assume that he acts as I have described. Explanations come to an end somewhere.—But what is the meaning of the word ‘five’? No such thing was in question here, only how the word ‘five’ is used.

The very last piece is priceless. What is “five-ness”? This is the kind of question I used to ask my students (along with “what is blue-ness?” etc.), much to their frustration. We can use these words effectively, but definition is elusive. Look up “left” or “right” in the dictionary and see what you get. Here’s one of my common questions for students, following Wittgenstein: “prove to me that 2 + 2 = 4.” Most would show me by putting items (such as fingers) into two groups of two and then combining them to make four. But I would reply by pointing out that this was merely demonstrating (using), not proving. At that point they were all baffled, as well they should be. You can (sort of) prove it mathematically, but you need a sophisticated understanding of set theory and other complicated machinery. Even then you have to accept certain things on faith, such as that zero and the natural numbers exist at all! By the way, these were anthropology classes; I am not a philosopher. Wittgenstein has a long reach.
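For the curious, here is a rough sketch of what such a formal proof looks like – my gloss, not anything Wittgenstein wrote – using Peano’s approach, in which each numeral is a chain of successors of zero (so 2 is s(s(0)) and 4 is s(s(s(s(0))))), and addition is defined by just two rules: n + 0 = n, and n + s(m) = s(n + m).

```latex
\begin{align*}
2 + 2 &= 2 + s(s(0))   && \text{definition of } 2 \\
      &= s(2 + s(0))   && \text{second addition rule} \\
      &= s(s(2 + 0))   && \text{second addition rule} \\
      &= s(s(2))       && \text{first addition rule} \\
      &= s(s(s(s(0)))) && \text{definition of } 2 \\
      &= 4             && \text{definition of } 4
\end{align*}
```

Notice that even this little derivation rests on exactly the unproven assumptions mentioned above: that zero exists, that every number has a successor, and that the two addition rules are legitimate definitions. No apples required – but no escape from faith in the axioms, either.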

Wittgenstein’s influence has been felt in nearly every field of the humanities and social sciences, yet there are diverging interpretations of his thought and, therefore, of its value. In the words of his friend and colleague Georg Henrik von Wright:

He was of the opinion that his ideas were generally misunderstood and distorted even by those who professed to be his disciples. He doubted he would be better understood in the future. He once said he felt as though he were writing for people who would think in a different way, breathe a different air of life, from that of present-day men.

There is a story told of someone going up to Wittgenstein and saying, “What a lot of morons they were back in the Middle Ages. They looked up at the dawn every morning and thought what they were seeing was the Sun going around the Earth, when every school kid knows that the Earth goes around the Sun,” to which Wittgenstein replied, “Yes, but I wonder what it would have looked like if the Sun had been going around the Earth?” (Incidentally, Einstein showed that without a frame of reference external to the universe it is just as legitimate to say that the sun goes round the earth as that the earth goes round the sun. Very few people understand the implications of relativity).

Here are some of my favorite quotes from the Tractatus and Philosophical Investigations in no particular order:

I don’t know why we are here, but I’m pretty sure that it is not in order to enjoy ourselves.

A serious and good philosophical work could be written consisting entirely of jokes.

Whereof one cannot speak, thereof one must be silent.

Hell isn’t other people. Hell is yourself.

The real question of life after death isn’t whether or not it exists, but even if it does what problem this really solves.

Nothing is so difficult as not deceiving oneself.

If people never did silly things nothing intelligent would ever get done.

Don’t for heaven’s sake, be afraid of talking nonsense! But you must pay attention to your nonsense.

Only describe, don’t explain.

I am sitting with a philosopher in the garden; he says again and again ‘I know that that’s a tree’, pointing to a tree that is near us. Someone else arrives and hears this, and I tell him: ‘This fellow isn’t insane. We are only doing philosophy.’

If we take eternity to mean not infinite temporal duration but timelessness, then eternal life belongs to those who live in the present.

Not how the world is, but that it is, is the mystery.

The limits of my language are the limits of my mind. All I know is what I have words for.

Our life has no end in the way in which our visual field has no limits.

A man will be imprisoned in a room with a door that’s unlocked and opens inwards; as long as it does not occur to him to pull rather than push.

Problems are solved, not by giving new information, but by arranging what we have always known.

Never stay up on the barren heights of cleverness, but come down into the green valleys of silliness.

If you and I are to live religious lives, it mustn’t be that we talk a lot about religion, but that our manner of life is different. It is my belief that only if you try to be helpful to other people will you in the end find your way to God.

Philosophy is a battle against the bewitchment of our intelligence by means of language.

How small a thought it takes to fill a life.

I act with complete certainty. But this certainty is my own.

If anyone is unwilling to descend into himself, because this is too painful, he will remain superficial in his writing.

The world of the happy is quite different from that of the unhappy.

It is a dogma of the Roman Church that the existence of God can be proved by natural reason. Now this dogma would make it impossible for me to be a Roman Catholic.

If I thought of God as another being like myself, outside myself, only infinitely more powerful, then I would regard it as my duty to defy him.

To imagine a language is to imagine a form of life.

I hope Wittgenstein will approve of today’s “recipe.” It comes in the form of a story. Norman Malcolm, Wittgenstein’s friend and former student, wrote this about Wittgenstein’s stay with him in Ithaca, New York in 1949:

My wife gave him some Swiss cheese and rye bread for lunch, which he greatly liked. Thereafter he more or less insisted on eating bread and cheese at all meals, largely ignoring the various dishes that my wife prepared. Wittgenstein declared that it did not much matter to him what he ate, so long as it always remained the same. When a dish that looked especially appetizing was brought to the table, I sometimes exclaimed “Hot Ziggety!” — a slang phrase that I learned as a boy in Kansas. Wittgenstein picked up this expression from me. It was inconceivably droll to hear him exclaim “Hot Ziggety!” when my wife put the bread and cheese before him.

So . . . bread and cheese it is. Have yourself a ploughman’s lunch. Here’s one I made for Plough Monday 2012.