Nov 28, 2018
 

The first observation of a pulsar took place on this date in 1967, by Jocelyn Bell, a postgraduate student in astrophysics at Cambridge University; it was later confirmed by her doctoral adviser Antony Hewish. She observed pulses separated by 1.33 seconds that originated from the same location in the sky and kept to sidereal time (time measured by the stars rather than the sun). In the search for an explanation, the short period of the pulses ruled out most known astrophysical sources of radiation, such as ordinary stars, and because the pulses kept sidereal time they could not be human radio frequency interference. When observations with another telescope confirmed the emission, instrumental effects were ruled out as well.

At this point, Bell said of herself and Hewish that “we did not really believe that we had picked up signals from another civilization, but obviously the idea had crossed our minds and we had no proof that it was an entirely natural radio emission. It is an interesting problem—if one thinks one may have detected life elsewhere in the universe, how does one announce the results responsibly?” Even so, they nicknamed the signal LGM-1, for “little green men”. It was not until a second pulsating source was discovered in a different part of the sky that the “LGM hypothesis” was entirely abandoned. The first pulsar was later dubbed CP 1919, and is now known by a number of designators including PSR 1919+21 and PSR J1921+2153. Although CP 1919 emits in radio wavelengths, pulsars have subsequently been found to emit pulses in visible light, X-ray, and gamma ray wavelengths. The word “pulsar” is a portmanteau of ‘pulsating’ and ‘quasar’, and first appeared in print in March 1968 in the Daily Telegraph.

The existence of neutron stars was first proposed by Walter Baade and Fritz Zwicky in 1934, when they argued that a small, dense star consisting primarily of neutrons would result from a supernova. Lodewijk Woltjer proposed in 1964 that such neutron stars might contain magnetic fields as large as 10¹⁴ to 10¹⁶ G. In 1967, shortly before the discovery of pulsars, Franco Pacini suggested that a rotating neutron star with a magnetic field would emit radiation, and even noted that such energy could be pumped into a supernova remnant around a neutron star, such as the Crab Nebula. After the discovery of the first pulsar, Thomas Gold independently suggested a rotating neutron star model similar to that of Pacini, and explicitly argued that this model could explain the pulsed radiation observed by Bell and Hewish. The discovery of the Crab pulsar later in 1968 seemed to provide confirmation of the rotating neutron star model of pulsars. The Crab pulsar has a 33-millisecond pulse period, which was too short to be consistent with other proposed models for pulsar emission. Moreover, the Crab pulsar is so named because it is located at the center of the Crab Nebula, consistent with the original prediction of Baade and Zwicky.

In 1974, Antony Hewish and Martin Ryle became the first astronomers to be awarded the Nobel Prize in Physics, with the Royal Swedish Academy of Sciences noting that Hewish played a “decisive role in the discovery of pulsars”. That Hewish received the prize while Bell, who made the initial discovery as his doctoral student, did not has been a point of controversy ever since. Over two years Bell had helped build the Interplanetary Scintillation Array, which was used to observe quasars, and it was she who first noticed the anomaly in the Array’s data, sometimes reviewing as much as 96 feet (29 m) of paper data per night. Bell later said that she had to be persistent in reporting the anomaly in the face of skepticism from Hewish, who initially insisted that it was due to interference from human sources. She also spoke of meetings held by Hewish and Ryle to which she was not invited. In 1977, she commented on the issue:

First, demarcation disputes between supervisor and student are always difficult, probably impossible to resolve. Secondly, it is the supervisor who has the final responsibility for the success or failure of the project. We hear of cases where a supervisor blames his student for a failure, but we know that it is largely the fault of the supervisor. It seems only fair to me that he should benefit from the successes, too. Thirdly, I believe it would demean Nobel Prizes if they were awarded to research students, except in very exceptional cases, and I do not believe this is one of them. Finally, I am not myself upset about it – after all, I am in good company, am I not!

She is certainly in good company. As I reported here — http://www.bookofdaystales.com/world-diabetes-day/ — the 1923 Nobel Prize in medicine was awarded to Frederick Banting and J.J.R. Macleod for the discovery of insulin. Banting was the lead researcher, and Macleod was the director of the lab where Banting made his discovery, assisted by Charles Best, a medical student. Macleod was on holiday when Banting and Best were conducting their experiments. Yet Banting and Macleod got the Nobel, and Best, the student, was left out even though he made major contributions and Macleod did next to nothing. Likewise, Bell was overlooked, while Hewish, whom she had to convince that pulsars were real, got the honor.

Bell has since received many prestigious honors. This year she was awarded the Special Breakthrough Prize in Fundamental Physics, worth three million dollars (£2.3 million), for her discovery of radio pulsars. The Special Prize, in contrast to the regular annual prize, is not restricted to recent discoveries. She donated all of the money “to fund women, under-represented ethnic minority and refugee students to become physics researchers.”

If you have star-shaped cookie cutters you can make star cookies today, or something of the sort. But, in honor of the Crab pulsar in the Crab nebula, you might consider a nebula cake. This site gives a thorough recipe.

https://www.sprinklebakes.com/2016/04/black-velvet-nebula-cake.html

Apr 25, 2018
 

Today is sometimes called World DNA Day although it is not an official holiday for any organization. It commemorates the day in 1953 when James Watson, Francis Crick, Maurice Wilkins, Rosalind Franklin and colleagues published papers in the journal Nature on the structure of DNA. Furthermore, on this date in 2003 (the 50th anniversary) it was declared that the Human Genome Project was very close to complete, and “the remaining tiny gaps” were considered too costly to fill. I have often used the first paper on DNA published by Crick and Watson (originally 2 handwritten pages) as a caution to my students that a paper does not have to be long to be good. Of course, Crick and Watson were not anthropology undergraduates. Some scientific or mathematical discoveries can be put on paper briefly, because that is one of the hallmarks of science itself – reducing complex datasets to elegant and simple equations or formulae. At one level, the structure of DNA is simple to understand; it is the ramifications of that structure that are so wondrously complex and fascinating.

With an “alphabet” of just four nucleotides that form pairs at the center of the DNA molecule – cytosine [C], guanine [G], adenine [A] and thymine [T] – we can “spell” the genetic code of every living thing on earth. The implications of this fact completely revolutionized biology (and allied sciences) in my lifetime. DNA analysis changed the way we think about species and evolution, for starters. With DNA analysis we can trace the lines of descent of species, and the relationships between them, with far greater precision than before. Old taxonomic classifications of species, and the evolutionary lines they implied, have been upended by the introduction of DNA analysis. The broad strokes remain the same, of course, but there has been an enormous amount of shuffling around inside those broad strokes. DNA has also confirmed what anthropologists have known for a long time: race is not a biological fact. Human biological variation exists on a continuum and, therefore, there are no biological markers – DNA or otherwise – for specific races. Race is a cultural, not a biological, classification system.

DNA analysis was as important an addition to the criminal forensic science toolkit as fingerprinting was 100 years ago. In fact, DNA matching is much more accurate than fingerprinting and has been used, not only to convict criminals, but also to free the wrongly convicted (including from death row). Analysis of DNA can be used to identify potential medical hazards for an individual, and some methods are available now for repairing DNA. Without doubt, the identification of the structure of the DNA molecule was the single biggest step forward in the biological sciences in the 20th century. What is well known within the scientific community, but less well known by the general public, is that Crick and Watson should be given a great deal of credit, but by no means all of it, for the discovery.

Isaac Newton was not being very original when he said, “If I have seen further than others, it is by standing upon the shoulders of giants,” but he was displaying suitable humility. James Watson was not quite so humble when he wrote The Double Helix: A Personal Account of the Discovery of the Structure of DNA, published in 1968. When I read it, shortly after it was published, I was struck by Watson’s arrogance, not to mention his obvious sexism. Since then I have noted that I was far from alone in that opinion. As Watson notes, repeatedly, the race was on in the early 1950s to be the first to publish an accurate account of the structure of DNA, and this was not a particular priority for Crick. The race to be the first in certain scientific fields can be intense because the rewards, in terms of money, prestige, and influence, are so high. Great minds can disagree as to whether competition is the best road to discovery. It is certainly the norm in Western science, deriving from cultures that seem to thrive on competition.

In The Double Helix, Watson acknowledges, but downplays, the contributions of others in the search for the structure of DNA, and occasionally makes egregious, and unwarranted, remarks, such as his comment that Rosalind Franklin, whose research was vital for Crick and Watson’s final discovery, was not quite attractive enough to be called “pretty.” What does that remotely have to do with anything? He never says anything about the appearance of male colleagues. I could say that Francis Crick has the appearance of a bad Spike Milligan impersonator, but you would not, on the basis of that remark, think I had keen insight into anything – scientific or otherwise.

The Nobel committee was, thankfully, more open minded than Watson in awarding the prize jointly to Crick, Watson, and Maurice Wilkins. Nobels may be shared by no more than three scientists, and they must be living at the time of the award. These conditions excluded Rosalind Franklin (who had died) and Raymond Gosling, her graduate assistant, whose X-ray diffraction imagery of the DNA molecule was fundamental to Crick and Watson’s breakthrough. Today’s anniversary celebration in this post is not of the original announcement of the discovery, which was made in the Eagle pub in Cambridge on 28th February, but of the publication in Nature of related papers by Crick and Watson, Maurice Wilkins, Rosalind Franklin and Raymond Gosling, and others, acknowledging the need for collaborative effort to achieve significant results.

Understanding the structure of DNA has led to genetic engineering technologies in a variety of fields. One of the best-known and most controversial applications of genetic engineering is the creation and use of genetically modified crops or genetically modified livestock to produce genetically modified food. Crops have been developed to increase production, increase tolerance to abiotic stresses, alter the composition of the food, or produce novel products.

The first genetically modified crops to be grown commercially on a large scale provided protection from insect pests or tolerance to herbicides. Fungal- and virus-resistant crops have also been developed or are in development. These traits make insect and weed management easier and can thereby increase crop yield. GM (genetically modified) crops that directly improve yield by accelerating growth or making the plant hardier (by improving salt, cold or drought tolerance) are also under development. In 2016, salmon were genetically modified with growth hormones to reach normal adult size much faster.

GMOs (genetically modified organisms) have been developed that modify the quality of produce by increasing the nutritional value or providing more industrially useful qualities or quantities. The Amflora potato produces a more industrially useful blend of starches. Soybeans and canola have been genetically modified to produce healthier oils. The first commercialized GM food was a tomato that had delayed ripening, increasing its shelf life.

This site fascinates me, although its claims seem a bit far-fetched, and they could use a decent copy editor: https://thespoon.tech/personalizing-food-directed-by-your-dna/  The Commonwealth Scientific and Industrial Research Organization (CSIRO) has just launched a three-year study into the personalized fabrication of smart food systems. The basic idea is to examine a person’s DNA, find any flaws in it, and then – via technology not yet invented – create foods that are optimal in restoring that person’s DNA to health. Imagine that you have a machine, perhaps rather like a refrigerator, that takes a blood sample, analyzes it, then manufactures foods for your next meal that are perfectly matched to your current genetic needs. This machine would be something like a 3-D printer, except that instead of making objects it would make foods. Sounds like something out of the Jetsons, I know, and I expect that the technology is still some way in the future despite the optimism of the writers.

There are two things that trouble me about this sci-fi scenario. First, I do not trust biochemical engineers to come up with the right food for me from a machine. Bioengineering does not have a great track record, and there are countless mistakes that have been made. Second, and rather related to the first, I cannot imagine a machine-made food product more satisfying – or healthier – than the food I buy from markets and cook for myself. I may be unusual, but I do not believe I am unique in paying attention to my day-to-day appetites to find the foods that are the best choices for my body at that moment. Surely everyone at one time or another has had the experience of getting sick, hankering after specific things (or nothing at all), and finding that those foods turn out to be what the body needs in those circumstances to get better. I am well aware that this is not a trustworthy system by any means, but I believe we could do a lot better by uncovering natural physiological solutions to our dietary needs than by dreaming of a food-making machine.

Oct 21, 2017
 

Today is the birthday (1833) of Alfred Bernhard Nobel, a Swedish chemist best known for inventing dynamite and for establishing the Nobel prizes. The two are inextricably entwined, so, at the risk of repeating what you already know, I’ll dribble on for a while about Nobel, explosives, bombs, guns, and prizes before giving you a recipe.

Alfred Nobel was born in Stockholm, the third son of Immanuel Nobel (1801–1872), an inventor and engineer, and Carolina Andriette (Ahlsell) Nobel (1805–1889). The couple married in 1827 and had 8 children, only 4 of whom survived past childhood. As a boy Alfred Nobel was interested in engineering, particularly explosives, learning the basic principles from his father at a young age. Because of various business failures, Nobel’s father moved to Saint Petersburg in 1837 and was successful there as a manufacturer of machine tools and explosives. He invented modern plywood and started work on the torpedo. In 1842, the family joined him in the city. Now prosperous, his parents were able to send Nobel to private tutors and the boy excelled in his studies, particularly in chemistry and languages, becoming fluent in English, French, German and Russian. For 18 months, from 1841 to 1842, Nobel went to the only school he ever attended as a child, the Jacobs Apologistic School in Stockholm.

As a young man, Nobel studied with chemist Nikolai Zinin; then, in 1850, went to Paris for further studies. There he met Ascanio Sobrero, who had invented nitroglycerin three years earlier. Sobrero strongly opposed the use of nitroglycerin, as it was unpredictable, exploding when subjected to heat or pressure. But Nobel became interested in finding a way to control and use nitroglycerin as a commercially viable explosive, since it had much more power than gunpowder. At age 18, he went to the United States for 1 year to study chemistry, working for a short period under John Ericsson, who designed the American Civil War ironclad USS Monitor.

The Nobel family factory produced armaments for the Crimean War (1853–1856), but had difficulty switching back to regular domestic production when the fighting ended, and the firm filed for bankruptcy. In 1859, Nobel’s father left his factory in the care of the second son, Ludvig Nobel (1831–1888), who greatly improved the business. Nobel and his parents returned to Sweden from Russia, and Nobel devoted himself to the study of explosives, and especially to the safe manufacture and use of nitroglycerin. Nobel invented a detonator in 1863, and in 1865 designed the blasting cap. On 3 September 1864, a shed used for preparation of nitroglycerin exploded at the factory in Heleneborg, Stockholm, killing five people, including Nobel’s younger brother Emil. Undeterred by this tragedy and by more minor accidents, Nobel went on to build further factories, focusing on improving the stability of the explosives he was developing.

Nobel found that when nitroglycerin was incorporated in an absorbent inert substance like kieselguhr (diatomaceous earth) it became safer and more convenient to handle, and this mixture he patented in 1867 as “dynamite.” Nobel demonstrated his explosive for the first time that year, at a quarry in Redhill, Surrey, in England. To help re-establish his name and improve the image of his business after the earlier controversies associated with the dangerous explosives, Nobel had also considered naming the highly powerful substance “Nobel’s Safety Powder,” but settled on “dynamite” instead, from the Greek word for “power” (δύναμις, dynamis).

Nobel later combined nitroglycerin with various nitrocellulose compounds, similar to collodion, but settled on a more efficient recipe combining another nitrate explosive, and obtained a transparent, jelly-like substance that was a more powerful explosive than dynamite. ‘Gelignite’, or blasting gelatin, as it was named, was patented in 1876, and was followed by a host of similar combinations, modified by the addition of potassium nitrate and various other substances. Gelignite was more stable, more easily transported, and more conveniently formed to fit into bored holes, like those used in drilling and mining, than the previously used compounds, and it was adopted as the standard technology for mining in the late 19th century, bringing Nobel significant wealth, though at a cost to his health. An offshoot of this research resulted in Nobel’s invention of ballistite, the precursor of many modern smokeless powder explosives and still used as a rocket propellant.

In 1888 Alfred’s brother Ludvig died while visiting Cannes and a French newspaper erroneously published Alfred’s obituary. It condemned him for his invention of dynamite and is said to have brought about his decision to leave a better legacy after his death. The obituary stated, “Le marchand de la mort est mort” (“The merchant of death is dead”) and went on to say, “Dr. Alfred Nobel, who became rich by finding ways to kill more people faster than ever before, died yesterday.” Nobel, who never had a wife or children, was disappointed with what he read and was subsequently concerned with how he would be remembered.

Nobel’s brothers, Ludvig and Robert, had exploited oilfields along the Caspian Sea and became hugely rich in their own right. Nobel invested in these and amassed great wealth through the development of these new oil regions. During his life Nobel was issued 355 patents internationally and by the time of his death his business had established more than 90 armaments factories, despite his stated belief in pacifism.

On 27 November 1895, at the Swedish-Norwegian Club in Paris, Nobel signed his last will and testament and set aside the bulk of his estate to establish a set of prizes, to be awarded annually without distinction of nationality. After taxes and bequests to individuals, Nobel’s will allocated 94% of his total assets, 31,225,000 Swedish kronor (SEK), to establish the five Nobel Prizes. This converted to £1,687,837 (GBP) at the time. In 2012, the capital was worth around SEK 3.1 billion (USD 472 million, EUR 337 million).

The first three of these prizes are awarded for eminence in physical science, in chemistry, and in medical science or physiology; the fourth is for literary work “in an ideal direction” and the fifth prize is to be given to the person or society that renders the greatest service to the cause of international unity, in the suppression or reduction of standing armies, or in the establishment or furtherance of peace congresses.

On December 10, 1896, Alfred Nobel, who had long suffered from a heart ailment, had a stroke and died. His family, friends and colleagues were unaware that he had left most of his wealth in trust to fund what are now known as the Nobel Prizes. He is buried in Norra begravningsplatsen in Stockholm.

Today, as it happens, is Apple Day. The celebration started in England and has since spread to other countries where apples are grown. If I am still at it next year I’ll write a blog post on Apple Day on this day. For now, let’s have a recipe for Swedish apple pie in honor of Nobel. Swedish apple pie is actually a cross between a pie and a pudding, but it is one of my favorites for a quick dessert. Granny Smith apples are probably the best for this recipe, but use whatever good baking apple suits your fancy. Don’t use eating apples. See if you can find true cinnamon as well; it beats the generic cassia you get in cheap spice racks in supermarkets. You’ll need to go online, but it’s worth it – trust me.

Swedish Apple Pie

Ingredients

1 ½ lb apples – peeled, cored, and sliced
1 cup plus 1 tbsp sugar
1 cup flour
1 tsp powdered cinnamon
¾ cup melted butter
1 egg, beaten

Instructions

Preheat the oven to 350˚F/175˚C.

Toss the apple slices with 1 tablespoon of sugar, and pour them into a pie plate.

Thoroughly mix together 1 cup of sugar with the flour, cinnamon, butter, and egg in a bowl. Spread this mixture evenly over the top of the pie.

Bake in the preheated oven for about 45 minutes, or until the apples have cooked and the topping is golden brown.

Jul 26, 2016
 

Today is the birthday (1856) of George Bernard Shaw, who preferred simply Bernard Shaw but is often referred to now as Shaw or GBS. He was an Irish playwright, critic and polemicist whose influence on Western theatre, culture and politics has extended from the 1880s to the present day. He wrote more than sixty plays, including perennial favorites such as Man and Superman (1902), Pygmalion (1912) and Saint Joan (1923). Pygmalion was the basis for My Fair Lady, of course. Shaw was the leading dramatist of his generation, and is the only person to have won both a Nobel Prize in Literature and an Oscar.

Shaw was born in Dublin, and moved to London in 1876, where he struggled to establish himself as a writer and novelist. By the mid-1880s he had become a respected theatre and music critic. Following a political awakening, he joined the gradualist Fabian Society and became its most prominent pamphleteer. Shaw had been writing plays for years before his first public success, Arms and the Man in 1894. He sought to introduce a new realism into English-language drama, using his plays as vehicles to disseminate his political, social and religious ideas. By the early 20th century his reputation as a dramatist was secured with a series of critical and popular successes that included Major Barbara, The Doctor’s Dilemma, and Caesar and Cleopatra.

Shaw’s views were, let us say, controversial. On the more mundane side, he wanted a reform of the system of writing English, including an end to the use of the apostrophe. One certainly can’t quarrel with his demonstrations that English spelling lacks logic, and is an impediment to literacy. He promoted eugenics, and opposed vaccination and organized religion. He courted unpopularity by denouncing both sides in the First World War as equally culpable, and castigated British policy on Ireland in the postwar period. By the late 1920s he spoke favorably of dictatorships on the right and left, expressing admiration for both Mussolini and Stalin. In the final decade of his life he was largely a recluse, but continued to write prolifically.  He refused all state honors including the Order of Merit in 1946.

I don’t believe that there is any need to ramble on about Shaw’s life or his beliefs. I’m not particularly keen on his plays, but I do like In Good King Charles’s Golden Days, because it gives him the opportunity to explore key themes of the Enlightenment period. It’s a discussion play in which the issues of nature, power, and leadership are debated between King Charles II (‘Mr Rowley’), Isaac Newton, George Fox and the artist Godfrey Kneller, with interventions by three of the king’s mistresses (Barbara Villiers, 1st Duchess of Cleveland; Louise de Kérouaille, Duchess of Portsmouth; and Nell Gwynn) as well as his queen, Catherine of Braganza.

This little exchange at the beginning gives the flavor:

MRS BASHAM.  And you have been sitting out there forgetting everything else since breakfast.  However, since you have one of your calculating fits on I wonder would you mind doing a little sum for me to check the washing bill.  How much is three times seven?

NEWTON.  Three times seven?  Oh, that is quite easy.

MRS BASHAM.  I suppose it is to you, sir; but it beats me.  At school I got as far as addition and subtraction; but I never could do multiplication or division.

NEWTON.  Why, neither could I: I was too lazy.  But they are quite unnecessary: addition and subtraction are quite sufficient.  You add the logarithms of the numbers; and the antilogarithm of the sum of the two is the answer.  Let me see: three times seven?  The logarithm of three must be decimal four seven seven or thereabouts.  The logarithm of seven is, say, decimal eight four five.  That makes one decimal three two two, doesnt it?  What’s the antilogarithm of one decimal three two two?  Well, it must be less than twentytwo and more than twenty.  You will be safe if you put it down as–

Sally returns.

SALLY.  Please, maam, Jack says it’s twentyone.

NEWTON.  Extraordinary!  Here was I blundering over this simple problem for a whole minute; and this uneducated fish hawker solves it in a flash!  He is a better mathematician than I.
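
Newton’s shortcut checks out, by the way. For anyone who wants to see the arithmetic behind the exchange, here it is with rounded base-10 logarithms (my working, not Shaw’s):

\log_{10} 3 \approx 0.477, \qquad \log_{10} 7 \approx 0.845

\log_{10}(3 \times 7) = \log_{10} 3 + \log_{10} 7 \approx 0.477 + 0.845 = 1.322

3 \times 7 = 10^{1.322} \approx 21

So the antilogarithm of 1.322 does indeed lie between twenty and twenty-two, and rounds to Jack’s twenty-one.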

Let me add a few more quotes from Shaw’s other works that I like:

Life isn’t about finding yourself. Life is about creating yourself.

Progress is impossible without change, and those who cannot change their minds cannot change anything.

Without art, the crudeness of reality would make the world unbearable.

A life spent making mistakes is not only more honorable, but more useful than a life spent doing nothing.

The man with a toothache thinks everyone happy whose teeth are sound. The poverty-stricken man makes the same mistake about the rich man.

A broken heart is a very pleasant complaint for a man in London if he has a comfortable income.

Everything happens to everybody sooner or later if there is time enough.

Human beings are the only animals of which I am thoroughly and cravenly afraid.

Atrocities are not less atrocities when they occur in laboratories and are called medical research.

There is no sincerer love than the love of food.

The last quote is often repeated. Shaw was well known for his vegetarianism, inspired by his desire to avoid harm to animals. In his day his avoidance of meat was heavily remarked upon because it was so unusual. I had no luck discovering what Shaw’s favorite dish was, if he had one, but I figured an Irish vegetarian dish would be suitable.

In digging around I found this 8th century Irish poem, “The Hermit’s Song” or “Marbán to Guaire”, all about wild foods in Ireland:

To what meals the woods invite me
All about!
There are water, herbs and cresses,
Salmon, trout.
A clutch of eggs, sweet mast and honey
Are my meat,
Heathberries and whortleberries for a sweet.
All that one could ask for comfort
Round me grows,
There are hips and haws and strawberries,
Nuts and sloes.
And when summer spreads its mantle
What a sight!
Marjoram and leeks and pignuts,
Juicy, bright.

Pignuts are mentioned at the tail end, so let’s begin there. The pignut, Conopodium majus, is a small perennial herb whose underground part resembles a chestnut and is sometimes eaten as a wild or cultivated root vegetable. The plant has many English names (many of them shared with Bunium bulbocastanum, a related plant with similar appearance and uses) including kippernut, cipernut, arnut, jarnut, hawknut, earth chestnut, groundnut, and earthnut. From its popularity with pigs come the names pignut, hognut, and, more indirectly, Saint Anthony’s nut, for Anthony the Great or Anthony of Padua, both patron saints of swineherds. The plant is common through much of Europe and parts of North Africa. It grows in woods and fields, and is an indicator of long-established grassland.

Pignuts are favorites of wild food foragers. You can find a good description here:

https://cumbriafoodie.com/2011/06/04/pignuts-a-little-hidden-gem-for-the-forager/

Pignuts remind me a little of Jerusalem artichokes, although they are smaller and the taste is rather different. Because I love leeks so much, and because marjoram, leeks, and pignuts are mentioned in the same line in the poem, why not make a soup of all three? I’d normally use chicken stock as the base, but because I want to be vegetarian here I’ll use vegetable stock. Quantities are not important as long as you have equal portions of pignuts and leeks. Jerusalem artichokes or salsify will work in place of pignuts, but will have to be cut into chunks.

© Pignut and Leek Soup

Ingredients

½ kg pignuts, washed and peeled
½ kg leeks, washed and sliced thickly
vegetable stock
fresh marjoram, finely chopped
salt and pepper

Instructions

Place the pignuts in a heavy pot and cover with stock. Season to taste with marjoram, salt, and freshly ground black pepper. Bring to a simmer and cook gently, covered, for about 30 minutes. Add the leeks and cook for another 15 minutes or so. Add more stock if needed, but don’t make the soup too thin. Cooking times really depend on how you like your vegetables. I like mine al dente. Add more fresh marjoram at the very end, and serve in deep bowls with crusty bread.