Today is the birthday (1815) of George Boole, English mathematician, philosopher, and logician. He worked in the fields of differential equations and algebraic logic, and is best known as the author of *The Laws of Thought* (1854), which contains Boolean algebra. Without Boolean logic we would not have digital computers. Let me try to break that thought down for you (a little). There is an important philosophical issue here summed up in the question: “How do humans think?” What Boole called “The Laws of Thought” are actually the laws of mathematical logic. Well . . . I think we all know that humans are not logical. Humans are not very complicated digital computers – not even very, very, very complicated digital computers. Computers can emulate human thought in a lot of ways. They can become very skilled at chess, for example. They can also be very good at problem solving, using algorithms that can be better than human methods. But human thought processes are qualitatively different in important ways. Let’s explore. First, a smattering of history.

Boole was born in Lincoln, the son of John Boole (1779–1848), a shoemaker, and Mary Ann Joyce. He had a primary school education and received lessons from his father, but had little further formal academic teaching. William Brooke, a bookseller in Lincoln, may have helped him with Latin, which he may also have learned at the school of Thomas Bainbridge. He was self-taught in modern languages. At age 16 Boole became the breadwinner for his parents and three younger siblings, taking up a junior teaching position in Doncaster at Heigham’s School. He also taught briefly in Liverpool.

Boole participated in the Lincoln Mechanics’ Institution, which was founded in 1833. Edward Bromhead, who knew John Boole through the institution, helped George Boole with mathematics texts, and he was given the calculus text of Sylvestre François Lacroix by the Rev. George Stevens Dickson of St Swithin’s, Lincoln. Boole had no teacher, but after many years mastered calculus. At age 19, Boole successfully established his own school in Lincoln. Four years later he took over Hall’s Academy in Waddington, outside Lincoln, following the death of Robert Hall. In 1840 he moved back to Lincoln, where he ran a boarding school. Boole became a prominent local figure and an admirer of John Kaye, the bishop. With E. R. Larken and others he set up a building society in 1847. He associated also with the Chartist Thomas Cooper, whose wife was a relation.

From 1838 onwards Boole was making contacts with sympathetic British academic mathematicians and reading more widely. He studied algebra in the form of the symbolic methods that were understood at the time, and began to publish research papers on calculus and algebra. Boole’s status as a mathematician was soon recognized with his appointment in 1849 as the first professor of mathematics at Queen’s College, Cork (now University College Cork (UCC)) in Ireland. He met his future wife, Mary Everest, there in 1850 while she was visiting her uncle John Ryall, who was Professor of Greek. They married in 1855. He maintained his ties with Lincoln, working there with E. R. Larken in a campaign to reduce prostitution.

It’s hard to explain briefly how Boole’s algebra, now known as (the foundations of) Boolean algebra, revolutionized mathematics and logic. Anyone who studies mathematics or computer science needs to know some of the basics of Boolean algebra – created by a man whose formal schooling ended at primary school, and who otherwise studied mathematics on his own without teachers. Astonishing. Boolean algebra is the branch of algebra in which the values of the variables are the truth values true and false, usually denoted as 1 and 0 respectively. In elementary algebra (the kind you start with in school), the values of the variables are numbers, and the main operations are addition and multiplication. In basic Boolean algebra the operations are the conjunction “and,” denoted ∧; the disjunction “or,” denoted ∨; the negation “not,” denoted ¬; and the implication “implies,” denoted →. In other words, Boolean algebra is a formal system for describing logical relations in the same way that ordinary algebra describes numeric relations – and it needs only 4 operations and 2 values. (Strictly speaking, even implication is redundant: “a → b” is just shorthand for “¬a ∨ b.”) With this simple basis you can perform (or describe) any logical procedure that you want – and it can become extremely sophisticated. If you know any set theory, you’ll also recognize the basic operations there too, and if you’ve done any computer programming, you know how important this algebra is.
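If you want to play with these four operations yourself, here is a minimal sketch in Python, which has the truth values built in as `True` and `False` (the function names are just my own labels, not standard library functions):

```python
# The four basic Boolean operations, sketched as tiny Python functions.

def conj(a, b):
    # Conjunction: a AND b - true only when both are true
    return a and b

def disj(a, b):
    # Disjunction: a OR b - true when at least one is true
    return a or b

def neg(a):
    # Negation: NOT a - flips true to false and back
    return not a

def implies(a, b):
    # Implication: a -> b, which is equivalent to (NOT a) OR b
    return (not a) or b

# A quick truth table for implication:
for a in (True, False):
    for b in (True, False):
        print(f"{a} -> {b} is {implies(a, b)}")
```

Note that implication is only false in one case: when the premise is true and the conclusion is false, which is exactly what the `(not a) or b` definition captures.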

What’s more important for the modern world is that all logical procedures can be turned into electrical systems using this algebra. Crudely put, an electrical pulse that turns a switch on can be called “1” (true), and no electrical pulse can be called “0” (false). You can also build logic gates that emulate the 4 Boolean operations: electrical circuits into which you send pulses (input). The gates perform the operation and produce an output. For example, if you send an electric pulse (1/true) into a “not” gate, no pulse (0/false) comes out the other end. A digital computer’s chip consists of billions and billions of these logic gates set up in complex ways, so that when you enter input, it passes through these gates and emerges as output. To make the input usable by the computer it has to be translated into binary code first. Binary arithmetic uses only 1s and 0s, which become electrical pulses (and their absence).
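You can get a feel for how gates combine by simulating them in software. Here is a toy sketch, assuming the usual convention that 1 stands for a pulse and 0 for no pulse; the half adder at the end is a classic example (not anything specific to Boole) of wiring simple gates together to do binary arithmetic:

```python
# Toy logic gates: 1 = electrical pulse, 0 = no pulse.

def not_gate(a):
    # Pulse in -> no pulse out, and vice versa
    return 1 - a

def and_gate(a, b):
    # Output a pulse only if both inputs carry a pulse
    return a & b

def or_gate(a, b):
    # Output a pulse if either input carries a pulse
    return a | b

def half_adder(a, b):
    # Adds two binary digits by wiring the gates above together.
    # The sum bit is XOR, built here from AND, OR, and NOT.
    total = and_gate(or_gate(a, b), not_gate(and_gate(a, b)))
    carry = and_gate(a, b)
    return total, carry

print(half_adder(1, 1))  # -> (0, 1): in binary, 1 + 1 = 10
```

A real chip does essentially this, except the “functions” are transistor circuits and there are billions of them.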

This all gets very complicated very quickly, so I won’t go on about the computing side any more. What I will say, though, is that when it was discovered that human brains consist of neurons (brain cells) linked by synapses into complex networks, passing electrical pulses among themselves seemingly like logic gates in a digital computer, both neuroscientists and psychologists started thinking of the brain as a computer. A single neuron is either firing (sending a pulse) or not – that is, it is either 1 or 0. Nest all the neurons together in complex ways and it appears that you have a flesh and blood digital computer. Many physical and social scientists believed that what evolution had created naturally, humans had stumbled on artificially. In time, therefore, computers could be built that were exactly like human brains, and eventually you’d have robots that were indistinguishable from humans.

It doesn’t take a whole lot of thinking (using our non-digital brains) to see the problem here. For example, a properly functioning computer does not forget things; properly functioning humans forget things all the time – including important things. A properly functioning computer does not make mathematical errors; humans make them all the time. Put crudely again, computers are logical; humans are not. Logic is only a fraction of our thinking process – and, in my experience, not a very common one. That’s why characters such as Mr Spock in the Star Trek series (a character who is strictly logical) are so weird. In part this is because our brains are much more than logic circuits, and we still don’t really understand how our brains work. We do know that we don’t work by logic, and nor do our brains. Attempts to reduce personal and social systems of thought to Boolean algebra have yielded interesting results in all manner of fields – linguistics, psychology, sociology, anthropology, philosophy, history, etc. etc. – but all have failed because human thought just isn’t digital, let alone logical.

Let’s move Boolean algebra into the food sphere. Here’s a logical operation: “If I have flour and water, I can make dough.” This contains the Boolean “and” as well as the implication, “implies.” If I have flour (flour = true) AND if I have water (water = true), then I can make dough (dough = true), or in symbolic form: flour ∧ water → dough. Sorry to readers who know any Boolean algebra or symbolic logic for the slight oversimplification here.
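For the programmers in the audience, the kitchen rule translates directly into code. This is just an illustration with names I made up; the point is that flour ∧ water → dough is exactly a conjunction feeding an implication:

```python
# "flour AND water -> dough" as a Boolean check.
# (Variable and function names are mine, purely for illustration.)

def can_make_dough(have_flour, have_water):
    # The conjunction: dough is possible only if both are true
    return have_flour and have_water

print(can_make_dough(True, True))    # -> True: flour and water, so dough
print(can_make_dough(True, False))   # -> False: no water, no dough
```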

Let’s be just slightly more complex: sugar ∧ water ∧ heat → toffee, which we can translate as, “If I have sugar and water and a source of heat, I can make toffee.” OK, let’s do it. I have sugar and water and a source of heat. This recipe is extremely simple. I used to do it when I was 8 years old.

You will need:

*1 cup sugar*

*¼ cup water*

Put the water and sugar in a saucepan and heat gently, stirring constantly, to dissolve the sugar. When the sugar is completely dissolved, turn the heat to high and let the mixture boil. Keep a very close eye on it. At this stage you do not have to stir. After about 20 minutes (depending on your heat source) the sugar will begin to show little strands of brown as it caramelizes. This is the critical stage. Begin stirring until the whole mixture is brown, then IMMEDIATELY remove it from the heat. I then pour it onto a marble slab, where it cools and hardens into toffee. You can also use toffee molds if you want.

If you get experienced at toffee making you can select the darkness that you want. Darker toffees need to cook a bit longer, and are more flavorful and more brittle. Be careful though – it’s an easy step from brown to black. Black is not good.