Jan 24 2018

The original Apple Macintosh (branded as Mac since 1998) first went on sale on this date in 1984. The Macintosh was the company’s first mass-market personal computer that featured a graphical user interface, built-in screen, and mouse. Apple sold the Macintosh alongside its popular Apple II family of computers for almost ten years before the latter was cancelled in 1993. Early Macintosh models were expensive, hindering its competitiveness in a market already dominated by the Commodore 64 for consumers, as well as the IBM Personal Computer and its accompanying clone market for businesses. Macintosh systems did, however, find a successful market in education and desktop publishing, and kept Apple as the second-largest PC manufacturer for the next decade.

The Macintosh project was begun in 1979 by Jef Raskin, an Apple employee who envisioned an easy-to-use, low-cost computer for the average consumer. He wanted to name the computer after his favorite variety of apple, the McIntosh, but the spelling was changed to “Macintosh” for legal reasons: the original spelling was already in use by McIntosh Laboratory, Inc., the audio equipment manufacturer. Steve Jobs asked McIntosh Laboratory to release the name with its changed spelling so that Apple could use it, but the request was denied, forcing Apple to buy the rights to the name.

Raskin

The original Macintosh featured a radically new graphical user interface. Users interacted with the computer through a metaphorical desktop of icons representing real-life items, instead of the abstract textual commands of MS-DOS-based IBM PCs and clones, and a mouse allowed the user to point at the icons and click on them to open files or applications.

In 1978 Apple began to organize the Apple Lisa project, aiming to build a next-generation machine similar to an advanced Apple III or the yet-to-be-introduced IBM PC. In 1979, Steve Jobs learned of the advanced work on graphical user interfaces (GUIs) taking place at Xerox PARC. He arranged for Apple engineers to be allowed to visit PARC to see the systems in action. The Apple Lisa project was immediately redirected to use a GUI, which at that time was well beyond the state of the art for microprocessor capabilities; the Xerox Alto had required a custom processor that spanned several circuit boards in a case the size of a small refrigerator. Things changed dramatically with the introduction of the 16/32-bit Motorola 68000 in 1979, which offered at least an order of magnitude better performance than existing designs and made a software GUI machine a practical possibility. The basic layout of the Lisa was largely complete by 1982, at which point Jobs’s constant suggestions for improvements led to his removal from the project.

At the same time that the Lisa was becoming a GUI machine in 1979, Jef Raskin started the Macintosh project. The design at that time was for a low-cost, easy-to-use machine for the average consumer. Instead of a GUI, it was to use a text-based user interface that allowed several programs to run and be switched between easily, with special command keys on the keyboard to access standardized commands in the programs. Raskin was authorized to start hiring for the project in September 1979, and he immediately asked his long-time colleague, Brian Howard, to join him. The team eventually grew to 15 engineers and programmers, with Steve Jobs ultimately leading the project. In a 2013 interview, Steve Wozniak said that he had been leading the initial design and development phase of the Macintosh project until 1981, when he was injured in an airplane crash and temporarily left the company, leaving Jobs in charge. In that same interview, Wozniak said that the original Macintosh “failed” under Jobs, and that it was not until Jobs left that it became a success. He attributed the eventual success of the Macintosh to people like John Sculley, “who worked to build a Macintosh market when the Apple II went away.” Sculley had been CEO of Pepsi but was lured to Apple by Jobs with the now immortal line: “Do you want to sell sugared water for the rest of your life? Or do you want to come with me and change the world?”

Jobs and Sculley

Burrell Smith built the first Macintosh board to Raskin’s design specifications: it had 64 kilobytes (kB) of RAM, used the 8-bit Motorola 6809E microprocessor, and was capable of supporting a 256×256-pixel black-and-white bitmap display. Bud Tribble, a member of the Mac team, was interested in running the Apple Lisa’s graphical programs on the Macintosh, and asked Smith whether he could incorporate the Lisa’s Motorola 68000 microprocessor into the Mac while still keeping the production cost down. By December 1980, Smith had succeeded in designing a board that not only used the 68000, but also ran it faster, at 8 MHz rather than the Lisa’s 5 MHz. This board could also support a 384×256-pixel display. Smith’s design used fewer RAM chips than the Lisa’s, which made production of the board significantly more cost-efficient. The final Mac design was self-contained and held the complete QuickDraw picture language and interpreter in 64 kB of ROM – far more than most other computers, which typically had around 4 to 8 kB. It had 128 kB of RAM, in the form of sixteen 64-kilobit (kb) RAM chips soldered to the logic board. Though there were no memory slots, the RAM was expandable to 512 kB by soldering in sixteen IC sockets to accept 256 kb RAM chips in place of the factory-installed ones. The final product’s screen was a 9-inch (230 mm), 512×342-pixel monochrome display, exceeding the size of the planned screen.
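The memory figures are easy to verify for yourself. Here is a quick sketch in Python of the arithmetic implied above – nothing Mac-specific, just chips times bits:

```python
# Quick check of the Macintosh RAM figures: 16 chips at 64 kilobits each.
chips = 16
kilobits_per_chip = 64
print(chips * kilobits_per_chip / 8)   # 128.0 kB as shipped

# The solder-in upgrade swapped in 256-kilobit chips.
kilobits_per_chip = 256
print(chips * kilobits_per_chip / 8)   # 512.0 kB after the upgrade
```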

Burrell’s innovative design, combining the low production cost of an Apple II with the computing power of the Lisa’s Motorola 68000 CPU, began to attract Jobs’s attention. Realizing that the Macintosh was more marketable than the Lisa, he began to focus on the project. Raskin left the team in 1981 over a personality conflict with Jobs. When Jobs was forced out of the Lisa team in 1982, he devoted his entire attention to the Macintosh. Jobs’s leadership of the Macintosh project did not last. Following an internal power struggle with then-new Apple CEO John Sculley, Jobs resigned from Apple in 1985.

In 1982, Regis McKenna was brought in to shape the marketing and launch of the Macintosh. The launch of the Macintosh pioneered many different tactics that are used today in launching technology products, including the “multiple exclusive,” event marketing (credited to John Sculley, who brought the concept over from Pepsi), creating a mystique around a product and giving an inside look into a product’s creation. After the Lisa’s announcement, John Dvorak discussed rumors of a mysterious “MacIntosh” project at Apple in February 1983. The company announced the Macintosh 128K—manufactured at an Apple factory in Fremont, California—in October 1983, followed by an 18-page brochure included with various magazines in December.

The Macintosh was introduced by a $1.5 million Ridley Scott television commercial, “1984.” It most notably aired during the third quarter of Super Bowl XVIII on January 22, 1984, and is now often called a “watershed event” and a “masterpiece.” McKenna called the ad “more successful than the Mac itself.” “1984” used an unnamed heroine to represent the coming of the Macintosh (indicated by a Picasso-style picture of the computer on her white tank top) as a means of saving humanity from the “conformity” of IBM’s attempts to dominate the computer industry. The ad alludes to George Orwell’s novel Nineteen Eighty-Four with its dystopian future ruled by a televised “Big Brother.” Take a look:

Two days after “1984” aired, the Macintosh went on sale, and came bundled with two applications designed to show off its interface: MacWrite and MacPaint. It was first demonstrated by Steve Jobs in the first of his famous Mac keynote speeches, and though the Mac garnered an immediate, enthusiastic following, some labeled it a mere “toy.”

Because the operating system was designed largely around the GUI, existing text-mode and command-driven applications had to be redesigned and the programming code rewritten. This was a time-consuming task that many software developers chose not to undertake, which can be regarded as the main reason for an initial lack of software for the new system. In April 1984, Microsoft’s MultiPlan migrated over from MS-DOS, with Microsoft Word following in January 1985. In 1985, Lotus Software introduced Lotus Jazz for the Macintosh platform after the success of Lotus 1-2-3 for the IBM PC, although it was largely a flop. Apple introduced the Macintosh Office suite the same year with the “Lemmings” commercial: infamous for insulting its own potential customers.

Apple spent $2.5 million purchasing all 39 advertising pages in a special, post-election issue of Newsweek, and ran a “Test Drive a Macintosh” promotion, in which potential buyers with a credit card could take home a Macintosh for 24 hours and return it to a dealer afterwards. While 200,000 people participated, dealers disliked the promotion, the supply of computers was insufficient for demand, and many were returned in such a bad condition that they could no longer be sold. This marketing campaign caused CEO John Sculley to raise the price from $1,995 to $2,495 (about $5,900 when adjusted for inflation in 2017). The computer sold well, nonetheless, reportedly outselling the IBM PCjr which also began shipping early that year. By April 1984 the company sold 50,000 Macintoshes and hoped for 70,000 by early May and almost 250,000 by the end of the year.

Price was a major factor separating IBM PC clones and Macs at that time – and still is. I could not afford $2,495 in 1984 on my assistant professor’s salary, which was around $18,000 per annum. Adjusting for inflation only tells half the story: a Mac would have cost me around 15% of my annual salary before taxes, and there is no way I could have afforded that. Genuine IBM PCs were even more expensive, but clones were much cheaper. I got an IBM clone for $1,000, which still necessitated a loan, but an affordable one. 95% of my work on a computer was (and is) word processing, making a GUI largely unnecessary. Also, the clone I bought had a 16-color monitor, and there were scores of applications available (including games in full color). For me it was a no-brainer. I had been a programmer in the 1970s on mainframes, so the text-based interface was no problem. I was an early warrior in the Mac versus PC wars that have continued unabated down to today.

The McIntosh apple has to be on the menu for today’s anniversary. The fruit, which ripens in late September, has red and green skin, a tart flavor, and tender white flesh. In the 20th century it was the most popular cultivar in Eastern Canada and New England, and it is considered an all-purpose apple, suitable both for cooking and for eating raw. John McIntosh discovered the original McIntosh sapling on his Dundela farm in Upper Canada in 1811. He and his wife bred it, and the family started grafting the tree and selling the fruit in 1835. In 1870 it entered commercial production, and it became common in northeastern North America after 1900. Apple crisp, a dessert popular in both the US and Canada, is a natural use for Macs. It is dead easy to make, consisting of baked chopped apples topped with a crisp streusel crust. It is similar to apple crumble, but not the same: apple crumble uses rolled oats in the topping. You can use fine Graham cracker crumbs in place of the flour in apple crisp.

Apple (McIntosh) Crisp

Ingredients

6 McIntosh apples, peeled, cored, and diced into ½” pieces
½ lemon, juiced
1 tsp ground cinnamon
½ tsp freshly grated nutmeg
2 tbsp granulated white sugar
½ cup flour
½ cup brown sugar
½ stick/2oz butter, cubed

Instructions

Preheat the oven to 400˚F/200˚C.

In a 9 by 12-inch baking dish, combine the apples, lemon juice, cinnamon, nutmeg and white sugar.

In a small bowl, mix the flour, brown sugar, and butter together using the tines of a fork and your fingers, working until the ingredients are evenly distributed and the mix resembles coarse sand. Or, as I do, pulse the mix in a food processor.

Sprinkle the topping evenly over the apples and bake for 15 to 20 minutes, or until the apples are just tender and the topping is golden brown.

Serve warm, plain, or with vanilla ice cream.

May 17 2017

On this date in 1902, archaeologist Valerios Stais, examining pieces of rock that had been retrieved from the Antikythera shipwreck in Greece two years earlier, found one piece with a gear wheel embedded in it. Stais initially believed it was an astronomical clock, but most scholars at the time considered the device to be an anachronism of some sort, too complex to have been constructed during the same period as the other pieces that had been discovered with it (dated around the 1st and 2nd centuries BCE). Nope!! What is now called the Antikythera mechanism is, in fact, an ancient Greek analogue computer and orrery used to predict astronomical positions and eclipses for calendrical and astrological purposes, as well as a four-year cycle of athletic games that was similar, but not identical, to an Olympiad, the cycle of the ancient Olympic Games. Nothing like it would re-emerge in Europe for 15 centuries. There is so much about the ancient world that remains a mystery (Stonehenge, the Pyramids, etc.).

The Antikythera mechanism was housed in a 340 mm (13 in) × 180 mm (7.1 in) × 90 mm (3.5 in) wooden box, but a full analysis of its form and uses has only recently been performed. In fact, after Stais discovered it, it was ignored for 50 years; then scientists of various stripes, including historians of science, gradually looked into it, and research into the mechanism is ongoing. Derek J. de Solla Price of Yale became interested in it in 1951, and in 1971 both Price and the Greek nuclear physicist Charalampos Karakalos made X-ray and gamma-ray images of the 82 fragments.

The mechanism is clearly a complex clockwork device composed of at least 30 meshing bronze gears. Using modern computer X-ray tomography and high-resolution surface scanning, a team led by Mike Edmunds and Tony Freeth at Cardiff University was able to look inside fragments of the crust-encased mechanism and read the faintest inscriptions that once covered the outer casing of the machine. Detailed imaging suggests the mechanism dates back to around 150–100 BCE and had 37 gear wheels, enabling it to follow the movements of the moon and the sun through the zodiac, predict eclipses, and even recreate the irregular orbit of the moon. That motion, known as the first lunar anomaly, was first described by the astronomer Hipparchus of Rhodes in the 2nd century BCE, so it is possible that he was consulted in the machine’s construction. The remains were found as one lump, later separated into three main fragments, which are now divided into 82 separate fragments after conservation work. Four of these fragments contain gears, while inscriptions are found on many others. The largest gear is approximately 140 mm (5.5 in) in diameter and originally had 224 teeth.
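A quick back-of-envelope check, using standard modern astronomical values rather than anything measured from the mechanism itself, shows the arithmetic the gearing exploits: 19 solar years, 235 synodic months, and 254 sidereal months all span almost exactly the same number of days, so a gear train embodying the ratio 254:19 can drive a moon pointer from a yearly drive wheel.

```python
# The Metonic relation the mechanism's gearing exploits (all values in days).
tropical_year = 365.2422
synodic_month = 29.530589    # new moon to new moon
sidereal_month = 27.321661   # moon back to the same stars

print(19 * tropical_year)    # 6939.60
print(235 * synodic_month)   # 6939.69
print(254 * sidereal_month)  # 6939.70

# One turn of a year wheel should therefore turn a moon pointer
# 254/19 = 13.368... times, the ratio that reconstructions of the
# gear trains realize.
print(254 / 19)
```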

It is not known how the mechanism came to be on the sunken cargo ship, but it has been suggested that it was being taken from Rhodes to Rome, together with other looted treasure, to support a triumphal parade being staged by Julius Caesar. The mechanism is often referred to as the first known analogue computer, although the quality and complexity of its manufacture suggest that it had undiscovered predecessors made during the Hellenistic period.

In 1974, Price concluded from the gear settings and inscriptions on the mechanism’s faces that it was made about 87 BCE and lost only a few years later. Jacques Cousteau and associates visited the wreck in 1976 and recovered coins dated to between 76 and 67 BCE. Though its advanced state of corrosion has made it impossible to perform an accurate compositional analysis, it is believed the device was made of a low-tin bronze alloy (of approximately 95% copper, 5% tin). All its instructions are written in Koine Greek, and the consensus among scholars is that the mechanism was made in the Greek-speaking world.

In 2008, continued research by the Antikythera Mechanism Research Project suggested the concept for the mechanism may have originated in the colonies of Corinth, since the researchers identified the calendar on the Metonic Spiral as coming from Corinth or one of its colonies in Northwest Greece or Sicily. Syracuse was a colony of Corinth and the home of Archimedes, which, so the Antikythera Mechanism Research Project argued in 2008, might imply a connection with the school of Archimedes. But the ship carrying the device also contained vases in the style common in Rhodes of the time, leading to a hypothesis that the device was constructed at an academy founded by the Stoic philosopher Posidonius on that island. Rhodes was a busy trading port in antiquity, and also a center of astronomy and mechanical engineering, home to the astronomer Hipparchus, active from about 140 BCE to 120 BCE. That the mechanism uses Hipparchus’s theory for the motion of the moon suggests the possibility that he designed, or at least worked on, it. Finally, the Rhodian hypothesis gains further support from the recent decipherment of text on the dial referring to the dating (every four years) of the relatively minor Halieia games of Rhodes. In addition, it has recently been argued that the astronomical events on the parapegma (almanac plate) of the Antikythera mechanism work best for latitudes in the range of 33.3–37.0 degrees north; Rhodes is located between the latitudes of 35.5 and 36.25 degrees north.

Using analysis of the existing fragments, various attempts have been made, on paper and in metal, to reconstruct a working model of the mechanism.

Some of the earliest extant Greek recipes mention the use of cheese. In book 9 of Homer’s Odyssey, Odysseus meets the Cyclops Polyphemus, who, on returning to his cave with his sheep and goats from the fields, milks them and makes cheese with the milk. Feta is made from sheep’s milk or a mix of sheep’s and goat’s milk, so some food historians conjecture that feta or something akin to it may date from the 8th century BCE (Homer’s era).

One of the oldest Greek recipes, although hard to interpret accurately, calls for fish baked with cheese and herbs. I don’t have the necessary ingredients to hand to experiment at the moment, and the recipes for baked or fried fish and feta that I have available all call for New World ingredients such as tomatoes and zucchini. My suggestion would be to coat a roasting pan with olive oil, lay in some Mediterranean fish fillets, and top them with crumbled feta mixed with either yoghurt or breadcrumbs, seasoned with dill, salt, and pepper. Garlic and onions would make good seasonings as well. Bake at 375˚F for 20 to 25 minutes and serve with boiled potatoes and a green salad.

If you don’t want to be quite so adventurous, fill halved pitas with a mix of feta, chives and herbs, drizzle with olive oil, and grill briefly until the pitas are golden and the feta is soft.

May 7 2017

Today, the first Sunday in May, is World Laughter Day. The first celebration was actually on January 10, 1998, in Mumbai, and was arranged by Dr. Madan Kataria, founder of the worldwide Laughter Yoga movement. Now there are special World Laughter Day events in at least 105 countries worldwide. Kataria, a family doctor in India, was inspired to start the Laughter Yoga movement in part by the facial feedback hypothesis, which postulates that a person’s facial expressions can have an effect on their emotions. There is also some scientific evidence that laughter is medically helpful. Kataria’s speculation is that laughter has a beneficial effect whether it is forced or natural. I can understand the hypothesis, although I have no evidence to support it other than anecdote. It is, of course, fundamental to yoga that body posture influences mental state. I think that this is unquestionably true, but whether it applies to deliberate laughter is not clear to me. However, I see no reason why we can’t deliberately provoke actual laughter. If I want to laugh I can watch this video, for example. It cracks me up – every single time.

Why this particular cat fail clip should make me laugh so reliably is not clear, and brings up the whole question of the nature of humor which has been studied endlessly and with little profit. Incongruity is one facet of humor, as in this case. The cat so clearly wants to jump up on the shelf, and fails. But . . . it does not jump and miss; its “jump” is not even worthy of the name. It just falls off the table. It is the combination of obvious desire and epic failure that appeals to me; that, and the fact that I know cats and their desires very well.

As a graduate student I wrote a paper on incongruity in comic strips for my sociolinguistics class. My (lame) hypothesis involved showing that sometimes cartoonists tried to be funny by making their characters say things that were grossly out of character, such as children being wise well beyond their years or, conversely, adults talking like children. The latter is the stock in trade of the immensely popular television series The Big Bang Theory, which I detest precisely for that reason. The premise that highly intelligent men typically act like children in their social lives annoys me beyond words. First, the premise is demonstrably false, and, second, seeing grown men acting like boys does not amuse me.

Although some animals, especially non-human primates, exhibit physical behaviors that look like laughter, I find it highly unlikely that animals are capable of actual laughter. Chimpanzees and orangutans sometimes display laughter-like behavior when they are enjoying themselves, but human laughter extends well beyond simple enjoyment. It is much more complex. Much of human laughter comes from language, and this is outside of non-human capability.

There is no question that laughter can be infectious. This classic English music hall song, The Laughing Policeman, relies on infection for success (or failure):

I’ve always enjoyed provoking laughter from my students when I teach. It’s not a deliberate strategy; I can’t help myself. I see the funny side of things. In fact I see the funny side of just about everything when I am with other people. But here’s the thing: for me laughter is sociable. If I watch a movie by myself that amuses me, I don’t laugh, but if I am with other people, I do. Back in my college days no one had a television, but we had a television room and we would pack it on certain occasions, such as when Monty Python came on. The place would be in hysterics from start to finish, and I would laugh along with the others.

This point reminds me that laughter is intensely culturally specific. I had many colleagues in the US who did not find Monty Python funny in the slightest. On the other side of the coin, when I was in China I could not for the life of me figure out what Chinese jokes were all about, and they were perplexed at my humor. There was also the complication that Chinese university students generally think it is impolite to laugh out loud in class.

I had two separate ideas for recipes today. The first was to talk about “joke” dishes, that is, dishes that look like one thing but are actually another. Here, for example, is a “grilled cheese” sandwich that is actually toasted pound cake slices with a yellow icing for filling:

However, I’ve covered this idea before several times. So, instead I want to look at amusing recipes. I found this online.

It’s a recipe generated by a computer program trying to emulate the activity of neural networks – that is, getting a computer to learn how to think the way humans think. The recipes were produced by Janelle Shane using char-rnn, an open-source program on GitHub that she (and others) can customize to build their own neural networks. She gave it a cookbook to analyze and then asked it to produce new recipes. Granting computers human intelligence has a long way to go. I think we’re safe from a robot takeover for a while. Or . . . maybe they are already ingenious enough to know how to chop beer. Frightening.
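char-rnn itself is a recurrent neural network (a Torch project by Andrej Karpathy) and far too much to reproduce here, but the core idea – learn which characters tend to follow which, then generate new text one character at a time – can be sketched with a much simpler character-level Markov chain. This is a stand-in, not Shane’s method, and the tiny corpus below is a hypothetical placeholder for her cookbook:

```python
import random
from collections import defaultdict

# A character-level Markov chain: for each context of `order` characters,
# record which characters followed it in the training text, then sample.

def train(text, order=4):
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, seed, length=150):
    order = len(seed)
    out = seed
    for _ in range(length):
        followers = model.get(out[-order:])
        if not followers:          # dead end: no continuation seen in training
            break
        out += random.choice(followers)
    return out

# A stand-in corpus; Shane trained on a real cookbook.
corpus = ("Brown the onions in butter. Add the flour and stir. "
          "Add the stock and simmer. Season with salt and pepper. "
          "Brown the beef in oil. Add the onions and stir well. ")

model = train(corpus.lower(), order=4)
print(generate(model, seed="brow"))
```

With so little training text the output mostly parrots the corpus, but scale the idea up (and swap the chain for a neural network) and you get exactly the plausible-yet-deranged recipes Shane published.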

Here’s another recipe that will keep you guessing:

Pears Or To Garnestmeam

meats

¼ lb bones or fresh bread; optional
½ cup flour
1 teaspoon vinegar
¼ teaspoon lime juice
2  eggs

Brown salmon in oil. Add creamed meat and another deep mixture.

Discard filets. Discard head and turn into a nonstick spice. Pour 4 eggs onto clean a thin fat to sink halves.

Brush each with roast and refrigerate.  Lay tart in deep baking dish in chipec sweet body; cut oof with crosswise and onions.  Remove peas and place in a 4-dgg serving. Cover lightly with plastic wrap.  Chill in refrigerator until casseroles are tender and ridges done.  Serve immediately in sugar may be added 2 handles overginger or with boiling water until very cracker pudding is hot.

Yield: 4 servings

Nov 2 2016


Today is the birthday (1815) of George Boole, English mathematician, philosopher and logician. He worked in the fields of differential equations and algebraic logic, and is best known as the author of The Laws of Thought (1854) which contains Boolean algebra. Without Boolean logic we would not have digital computers. Let me try to break that thought down for you (a little). There is an important philosophical issue here summed up in the question: “How do humans think?” What Boole called “The Laws of Thought” are actually the laws of mathematical logic. Well . . . I think we all know that humans are not logical. Humans are not very complicated digital computers – not even very, very, very complicated digital computers. Computers can emulate human thought in a lot of ways. They can become very skilled at chess, for example. They can also be very good at problem solving, using algorithms that can be better than human methods. But human thought processes are qualitatively different in important ways. Let’s explore. First, a smattering of history.

Boole was born in Lincoln, the son of John Boole (1779–1848), a shoemaker and Mary Ann Joyce. He had a primary school education, and received lessons from his father, but had little further formal and academic teaching. William Brooke, a bookseller in Lincoln, may have helped him with Latin, which he may also have learned at the school of Thomas Bainbridge. He was self-taught in modern languages. At age 16 Boole became the breadwinner for his parents and three younger siblings, taking up a junior teaching position in Doncaster at Heigham’s School. He also taught briefly in Liverpool.


Boole participated in the Lincoln Mechanics’ Institution, which was founded in 1833. Edward Bromhead, who knew John Boole through the institution, helped George Boole with mathematics texts, and he was given the calculus text of Sylvestre François Lacroix by the Rev. George Stevens Dickson of St Swithin’s, Lincoln. Boole had no teacher, but after many years mastered calculus. At age 19, Boole successfully established his own school in Lincoln. Four years later he took over Hall’s Academy in Waddington, outside Lincoln, following the death of Robert Hall. In 1840 he moved back to Lincoln, where he ran a boarding school. Boole became a prominent local figure and an admirer of John Kaye, the bishop. With E. R. Larken and others he set up a building society in 1847. He associated also with the Chartist Thomas Cooper, whose wife was a relation.


From 1838 onwards Boole was making contacts with sympathetic British academic mathematicians and reading more widely. He studied algebra in the form of the symbolic methods that were understood at the time, and began to publish research papers on calculus and algebra. Boole’s status as a mathematician was soon recognized with his appointment in 1849 as the first professor of mathematics at Queen’s College, Cork (now University College Cork (UCC)) in Ireland. He met his future wife, Mary Everest, there in 1850 while she was visiting her uncle John Ryall, who was Professor of Greek. They married some years later, in 1855. He maintained his ties with Lincoln, working there with E. R. Larken in a campaign to reduce prostitution.

It’s hard to explain briefly how Boole’s algebra, now known as (the foundations of) Boolean algebra, revolutionized mathematics and logic. Anyone who studies mathematics or computer science needs to know some of the basics of Boolean algebra – created by a man who finished primary school only, and otherwise studied mathematics on his own without teachers. Astonishing. Boolean algebra is the branch of algebra in which the values of the variables are the truth values true and false, usually denoted as 1 and 0 respectively. In elementary algebra (the kind you start with in school), the values of the variables are numbers, and the main operations are addition and multiplication. In basic Boolean algebra the operations are the conjunction “and” denoted as ∧, the disjunction “or” denoted as ∨, the negation “not” denoted as ¬, and the implication “if … then” denoted as →. In other words, Boolean algebra is a formal system for describing logical relations in the same way that ordinary algebra describes numeric relations – and it needs only these 4 operations and 2 values. With this simple basis you can perform (or describe) any logical procedure that you want – and it can become extremely sophisticated. If you know any set theory, you’ll also recognize the basic operations there too, and if you’ve done any computer programming, you know how important this algebra is.
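For readers who like to see things run, here is a minimal sketch of the four operations in Python, whose `and`, `or`, and `not` are Boole’s operations directly; implication has no built-in operator, but p → q is equivalent to ¬p ∨ q:

```python
# Boole's four basic operations, with implication built from "not" and "or".
def conj(p, q): return p and q          # p ∧ q
def disj(p, q): return p or q           # p ∨ q
def neg(p):     return not p            # ¬p
def implies(p, q): return (not p) or q  # p → q, equivalent to ¬p ∨ q

# Truth table over the two values (True = 1, False = 0).
print("p      q      p∧q    p∨q    ¬p     p→q")
for p in (True, False):
    for q in (True, False):
        row = (p, q, conj(p, q), disj(p, q), neg(p), implies(p, q))
        print("  ".join(f"{v!s:<5}" for v in row))
```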


What’s more important for the modern world, all logical procedures can be turned into electric systems using this algebra. Crudely put, an electric pulse that turns a switch on can be called “1” (true), and no electrical pulse can be called “0” (false). You can also build logic gates that emulate the 4 Boolean operations: you send electrical pulses in as input, the gate performs its operation, and it produces an output. For example, if you send an electric pulse (1/true) into a “not” gate, no pulse (0/false) comes out the other end. A digital computer’s chip consists of billions and billions of these logic gates set up in complex ways, so that when you enter input, it goes through the gates and emerges as output. To make the input usable by the computer it has to be translated into binary code first; binary arithmetic uses only 1s and 0s, which become electrical pulses.
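To make that concrete, here is a toy sketch in Python of gates wired into a “half adder,” the smallest unit of binary arithmetic: XOR of two input bits gives the sum bit, AND gives the carry bit. Real chips chain vast numbers of these (plus more gates) to add whole binary numbers:

```python
# Toy logic gates on bits (1 = pulse, 0 = no pulse).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))  # built from AND/OR/NOT

def half_adder(a, b):
    """Add two single bits; return (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")  # 1 + 1 -> carry 1, sum 0 (binary 10)
```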


This all gets very complicated very quickly, so I won’t go on about the computing side any more. What I will say, though, is that when it was discovered that human brains contain synapses in complex networks that pass electric pulses between neurons (brain cells), seemingly like logic gates in a digital computer, both neuroscientists and psychologists started thinking of the brain as a computer. A single neuron is either firing (sending a pulse) or not – that is, it is either 1 or 0. Network all the neurons together in complex ways and it appears that you have a flesh-and-blood digital computer. Many physical and social scientists believed that what evolution had created naturally, humans had stumbled on artificially. In time, therefore, computers could be built that were exactly like human brains, and eventually you’d have robots that were indistinguishable from humans.

Nerve cells firing, artwork

It doesn’t take a whole lot of thinking (using our non-digital brains) to see the problem here. For example, a properly functioning computer does not forget things; properly functioning humans forget things all the time – including important things. A properly functioning computer does not make mathematical errors; humans make them all the time. Put crudely again, computers are logical; humans are not. Logic is only a fraction of our thinking process – and, in my experience, not a very common one. That’s why characters such as Mr Spock in the Star Trek series (a person who is strictly logical) are so weird. In part this is because our brains are much more than logic circuits, and we still don’t really understand how our brains work. We do know that we don’t work by logic, and nor do our brains. Attempts to reduce personal and social systems of thought to Boolean algebra have yielded interesting results in all manner of fields – linguistics, psychology, sociology, anthropology, philosophy, history, etc. – but all have failed because human thought just isn’t digital, let alone logical.

Let’s move Boolean algebra into the food sphere. Here’s a logical operation: “If I have flour and water, I can make dough.” This contains the Boolean “and” as well as the implication “→.” If I have flour (flour = true) AND if I have water (water = true), then I can make dough (dough = true), or in symbolic form: flour ∧ water → dough. Sorry to readers who know any Boolean algebra or symbolic logic for the slight oversimplification here.
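As a sketch, the dough rule runs directly in Python – the Boolean “and” is Python’s `and`:

```python
# flour ∧ water → dough, as runnable Boolean algebra.
def can_make_dough(flour, water):
    return flour and water            # the conjunction flour ∧ water

print(can_make_dough(True, True))     # both ingredients: True, dough happens
print(can_make_dough(True, False))    # no water: False, no dough
# Longer recipes just extend the conjunction with more ingredients.
```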

Let’s be just slightly more complex: sugar ∧ water ∧ heat → toffee, which we can translate as, “If I have sugar and water and a source of heat, I can make toffee.” OK, let’s do it. I have sugar and water and a source of heat. This recipe is extremely simple. I used to do it when I was 8 years old.


You will need:

1 cup sugar

¼ cup water

Put the water and sugar in a saucepan and heat gently, stirring constantly, to dissolve the sugar. When the sugar is completely dissolved, turn the heat to high and let the mixture boil. Keep a very close eye on it; at this stage you do not have to stir. After about 20 minutes (depending on your heat source) the mixture will begin to show little strands of brown as the sugar caramelizes. This is the critical stage. Begin stirring until the whole mixture is brown, then IMMEDIATELY remove it from the heat. I then pour it onto a marble slab, where it cools and hardens into toffee. You can also use toffee molds if you want.

If you get experienced at toffee making you can select the darkness that you want. Darker toffees need to cook a bit longer, and are more flavorful and more brittle. Be careful though – it’s an easy step from brown to black. Black is not good.