Displaying colored chicken eggs has been an Easter custom for a very long time; exactly how long is a matter of debate. Decorating eggs in general is an ancient art, and eggs have been an enduring symbol of death and rebirth in numerous Mesopotamian cultures for thousands of years. Thus their association with Easter seems perfectly natural. What intrigues me is how diverse the traditions are these days.
There seems to me to be some merit in the speculation that boiled eggs were eaten at Easter for practical reasons. In the Middle Ages eggs were forbidden during the Lenten fast in some traditions, but, it being springtime, the chickens did not stop laying. You can keep eggs for quite some time without spoilage, but not forever – three weeks is about the limit. Boiling them allows you to keep them a little longer, and then at Easter, when the Lenten fast is over, they can be eaten. Boiling them with certain natural dye materials, such as onion skins or some tree barks, adds a whole new dimension – including additional decoration.
Let me just interject a quick note here about refrigerating versus not refrigerating fresh eggs. People in the US refrigerate EVERYTHING, including many items that should NOT be refrigerated. Chocolate, bread, and tomatoes, for example, will degrade much more quickly if refrigerated – but people do it anyway (not me!!). Eggs are complicated. Generally they are refrigerated in the US, but not in Europe. There is a reason for the difference. Eggs in the US are scrupulously washed before storage, and the washing removes a thin protective film which they acquire from the hen in the laying process, making the shells porous and open to invasion by harmful bacteria. So after washing they must be refrigerated. Eggs in Europe are not washed, so the protective film is preserved and they can be safely stored at room temperature. I prefer room temperature eggs for cooking under most circumstances, so when I lived in the US I had to take them out of the refrigerator some time before using them. Here in Italy there is no need – likewise when I lived in Argentina and China. Trying to change habits in the US is almost certainly a lost cause.
There are so many different ways to decorate eggs that it would take me a fortnight to enumerate them all. One simple, very traditional, way is to affix a pattern to the eggs before boiling them in colored water so that the stain penetrates only the bare surface of the eggs. Pace eggs in the north of England are made this way (“pace” being a dialect form of “pasch,” from Latin pascha, ultimately from Hebrew pesach, “Passover” – the root that also gives the common Romance words for Easter such as Pascua, Pasqua, or Pâques). Pace egging was a longstanding tradition in rural England involving a death and resurrection play and a begging song. This traditional version comes from Burscough in Lancashire:
In eastern European countries, notably Ukraine, a tradition of dyeing eggs in highly developed patterns using a wax-resist method (batik), known in Ukraine as pysanky, has evolved into an art form that is still popular, with many regional variations.
Similar traditions have evolved throughout Mediterranean and Slavic cultures, which sometimes also display the finished eggs on Easter “trees.”
There is also a rather rarer tradition throughout Europe of carving lacey patterns into the uncolored shells. This is incredibly delicate work that requires years of practice.
Chocolate eggs are a relative newcomer to the Easter scene; they were not possible until the perfection of techniques for making solid chocolate in the 19th century, allied with industrial processes for making hollow shapes.
Of course you can make decorative or artistic egg-shaped forms for Easter out of any material from marzipan to gold.
There’s probably no need to extol the enormous versatility of the chicken egg. Instead I’ll showcase a dish I made several years ago based on a 14th century recipe: poached egg with a saffron and ginger flavored Hollandaise. You should be able to work it out without a detailed recipe from me.
For Easter breakfast or brunch you can whip up a frittata, tortilla, omelet, or quiche if plain eggs are too bland for you. Later you can have a baked egg custard, pancake, flan, or egg-anything-you-want. Let’s instead consider the virtues of eggs other than chicken eggs.
Duck. Duck eggs are not easy to find in the West, but in Chinese markets they are as common as chicken eggs and can be used in much the same way. I bought them all the time in Yunnan. They are a little more flavorful than chicken eggs – perhaps earthier.
Quail. Once quail eggs were hard to find in the West, but I have no trouble getting them in northern Italy now. They’re a little fiddly to cook with. You can boil them, but peeling them is a chore. I usually fry them, but you’ll need quite a few if you are making a meal of them!!! In China they have special utensils for frying them in a row on a stick. This is a great street snack. Usually I chose the option of dusting them with a hot spicy powder. The fun is in the size more than the taste. They’re not so different from chicken eggs in that regard.
Goose. The goose egg is larger than duck or chicken eggs and is decidedly more robust in flavor. They’re hard to find and I don’t care to go to the trouble these days because I’m not a fan of the taste.
Ostrich. I’ve never seen ostrich eggs for sale outside of Africa, and even there they are not common. Ostriches don’t produce very many eggs and breeders generally use them to make more ostriches. They are gigantic with an exceedingly tough shell that takes a hammer, or the like, to break into. One egg will serve more than one person – scrambled or made into an omelet. They are delicious if you can ever get hold of one that is fresh enough to eat.
Passover begins at sundown today this year (2017). This post is the last concerning the three major moveable Jewish holy days, the others being Sukkot and Shavuot, which I have already covered. According to Torah prescriptions, Jews were required to celebrate these three festivals in Jerusalem, Passover being the most central to tradition. Jesus, as a faithful Jew, is reported to have traveled to Jerusalem for Passover at least once (when he fell afoul of the law and was executed). Hence Passover and Easter are inextricably linked, but since early Medieval times the Christian church has gone to great lengths to make sure that their observances do not coincide. Given that Passover can fall on any day of the week, but Easter must fall on a Sunday, it’s not all that difficult to keep them apart. The fact that they are so close together at all this year is relatively rare.
I simply cannot imagine that the entire Jewish population in antiquity downed tools and traveled to Jerusalem three times a year. It makes no sense in practical terms. Who’s going to mind the sheep or the shop whilst everyone is making a beeline for Jerusalem? I can see it happening a few times in a lifetime, but not every single year. Passover is, however, very deeply embedded in Jewish history and tradition and continues to be an important aspect of Jewish identity to this day. Observant and non-observant Jews of all stripes have a Passover seder, at the very least, every year with varying degrees of commitment to established religious practice. Not to do so would be the equivalent of a family of Christian background not celebrating Christmas. It does happen of course. Preparing a seder is a lot of work. But almost all of the Jews that I know, even the most vehemently non-religious, mark Passover in some way or another.
If I get too deeply mired in discussing the history and evolution of Passover we’ll be here all year. So I’ll try to keep it simple (dangerously teetering on the edge of the simplistic). My views on the matter are not very popular among Jews anyway — nor most Christians either. It was one of those great turning points in my life when I learned as a first year theology student at Oxford that Biblical historians and archeologists simply did not believe that the slavery in Egypt of the Israelites, the exodus under Moses, the wandering in the desert for 40 years, and the ultimate conquest of Canaan, had any basis in historical fact. Say what ???? That’s pretty fundamental to Jewish (and Christian) belief. People who’ve barely cracked the Bible know about parting the Red Sea and the like. BUT . . . extra-Biblical sources for any of this narrative are non-existent, and archeology flatly contradicts all of the details. The current explanation for the appearance of the Israelites in the Levant that has the most favor among archeologists and historians (the ones who have no religious or ethnic axes to grind, that is), is that the putative 12 tribes of Israel were at the outset a loosely confederated group of related Semitic peoples who had migrated into the land from various places and unified for a time against other indigenous cultures. The centrality of Judah and Jerusalem were a consequence of the defeat and expulsion of the northern tribes by Assyria which left only the tribes of Judah and Benjamin in the south intact and soldiering on. Through a combination of relative isolation and shrewd political maneuvering they were able to tough it out a little longer until they were crushed and deported by the forces of Babylon.
The two periods that, for me (and a great many other Biblical historians), are crucial in understanding how Passover emerged and evolved as central to Jewish tradition and identity are the reforms of Josiah (reigned 640–609 BCE) and the Babylonian Exile, which are inextricably linked. Until Josiah was king of Judah the nation had managed to stave off attack by neighboring empires such as Egypt and Assyria by being relatively subservient and compliant – paying tribute, accepting multiple religious traditions and the like – as ways of keeping a low profile. Under Josiah that all changed. He came to the throne at the age of 8 and ruled for 31 years. During this time the neighboring empires were struggling with one another for supremacy and went through periods of waxing and waning fortunes. This situation left Judah in a relatively strong position to assert itself. It had no chance against the likes of Egypt or Babylon when they were at full strength, but when they were weak(er) powerful people in Judah could entertain visions of grandeur. Hence Judah under Josiah, swayed by politician-scholars, created a bold new identity and was (seemingly) ready to take on the world.
During Josiah’s middle years Judah underwent a nativist revolution led by a group now called the Deuteronomists (after one of the texts they wrote). Nativism involves stripping a culture of what it perceives as “foreign” elements (religion, literature, language, clothing, foodways, etc.) and highlighting the “original” (or “native”) core as it is perceived. According to the Hebrew Bible, in his 18th regnal year (when he was 26), Josiah ordered tax money to be used to renovate the Temple, and during the renovation a “Book of the Law” (sefer ha-torah) was “discovered.” Modern scholars now generally believe that the “discovery” was a plant by the Deuteronomists and the book they “discovered” was one they had written: either Deuteronomy itself or a portion of it. Josiah took the book seriously, was horrified to discover all the laws in it that were not being followed (and the penalties for such crimes against God), and immediately set about stripping away all practices that were foreign and contrary to the law, and establishing all the laws that were enshrined in the document. Among other things, the law prescribed that Passover should be held in Jerusalem every year on a certain date, with explanations concerning why it was to be observed, and how. When the Temple renovations were complete and all the foreign cults removed (and their priests executed), Josiah held a massive celebratory Passover.
Thus the story of the Israelite slavery in Egypt, the attempts by Moses to free the people from bondage, the various plagues that God sent to convince the Pharaoh to release the people, and, finally, God’s commandment to an angel to kill every firstborn male in Egypt who lived in a house whose doorposts were not smeared with the blood of a sacrificed lamb, became an indelible part of the history and identity of the Jewish people – commemorated every year with the ritual slaughter and consumption of sacrificial lambs. My (not terribly well supported) conjecture is that Josiah’s great Passover was the first, and that it has been celebrated every year since following the rules laid down in Deuteronomy and other books of the Torah. The symbolism of bondage and release received a boost a generation later when the Babylonian army defeated Judah, destroyed the Temple, and deported the bulk of the population to Babylon in the period now known as the Exile or the Captivity. During this seminal period I believe that classic Jewish belief solidified. Following the return to Jerusalem, the Jews suffered multiple conquests by empires including the Greek and Roman which, again, strengthened the symbolism until in 70 CE the Romans essentially wiped out the population of Judah, destroyed the Second Temple (built after the return from the Exile) and scattered the Jews across Europe and the world with no homeland. This new Diaspora once more reinforced the Passover message of bondage, alienation, and oppression – offering an eventual release, which was partially granted by the creation of the state of Israel after 2 millennia of separation from the land.
The Passover meal, the seder, is, of course, central to the celebration. Where it was once made up of (ritually slaughtered) lamb, which recalled the blood of lambs saving the people in bondage, bitter herbs, recalling the bitterness of slavery, and unleavened bread, recalling the haste with which the people left Egypt with no time to let the bread rise, now all but the unleavened bread are tokens. The classic seder dish, often using a special platter reserved for that one night, consists typically of a roasted lamb shank or chicken wing, a roasted boiled egg, 2 kinds of bitter herbs, a leafy herb to be dipped in salt water, and a brown sweet paste of ground fruit and nuts. Each has symbolic meaning which is explained during the meal. There are also three whole matzot, which are stacked and separated from each other by cloths or napkins. The middle matzoh will be broken and half of it put aside for the ritual of the afikoman (a game played with children to maintain their interest and help in the process of understanding the symbolism). The top and other half of the middle matzoh will be used for the hamotzi (blessing over bread), and the bottom one will be used for the korech (Hillel sandwich).
It always seems to me a shame at these meals that these elements are merely symbolic. They are all great food items. What’s not to love about lamb, roast eggs, salty greens, horseradish, and unleavened bread washed down with cups of wine? These days the principal seder dishes vary according to the underlying ethnicity of the family. I’ve only ever attended eastern Ashkenazi seders where matzoh ball soup, gefilte fish, and brisket reign supreme. There are recipes galore for these classics all over the place. Matzoh brei is a lesser known Passover treat, a sweet interlude built around the central unleavened bread.
2 sheets matzoh
2 large eggs
salt and pepper
vegetable oil
jam or syrup
Break the matzoh into small pieces and place in a bowl. Cover with very hot water and let steep for about 30 seconds, then drain thoroughly. Meanwhile beat the eggs in a separate bowl with salt and pepper to taste.
Heat enough vegetable oil in a skillet for very shallow frying (2 or 3 tablespoons) over medium-high heat.
Combine the eggs and matzoh and mix thoroughly. Divide into 4, shaping each into a thin, flat pancake.
Fry the pancakes one at a time until golden on both sides, about one minute per side (turning only once).
Serve slightly broken up with whatever jam or syrup you prefer.
The Pontifical Swiss Guard takes today’s date as the official date of its foundation. In September 1505 a contingent of 150 Swiss soldiers started their march towards Rome, under the command of Kaspar von Silenen, and entered the city on 22 January 1506 to take up their duties as the pope’s guard. Tourists to the Vatican are well aware of the guys who look like they might be refugees from a Renaissance Fayre. Don’t be fooled. These guys are not toy soldiers; they are the real deal. Furthermore, their swords and halberds are not toys either. They are razor sharp and the halberdiers know how to use them.
Swiss mercenaries were fierce and highly respected in 14th and 15th century Europe, somewhat in contrast with the general popular image of Switzerland as a nation of peace-loving clockmakers. Pope Sixtus IV (1471–1484) made an alliance with the Swiss Confederacy and built barracks in Via Pellegrino after foreseeing the possibility of recruiting Swiss mercenaries. The pact was renewed by Innocent VIII (1484–1492) in order to use them against the Duke of Milan. Alexander VI (1492–1503) later actually used the Swiss mercenaries during their alliance with the King of France. During the time of the Borgias, however, the Italian Wars began in which the Swiss mercenaries were a fixture in the front lines among the warring factions, sometimes for France and sometimes for the Holy See or the Holy Roman Empire. The mercenaries enlisted when they heard King Charles VIII of France was going to war with Naples. Among the participants in the war against Naples was Cardinal Giuliano della Rovere, the future Pope Julius II (1503–1513), who was well acquainted with the Swiss, having been Bishop of Lausanne years earlier.
The expedition failed, in part thanks to new alliances made by Alexander VI against the French. When Cardinal della Rovere became Pope Julius II in 1503, he asked the Swiss Diet to provide him with a constant corps of 200 Swiss mercenaries. This was made possible through the financing of the German merchants from Augsburg, Bavaria, Ulrich and Jacob Fugger, who had invested in the Pope and saw it fit to protect their investment. There have been a few short periods when the Swiss Guard was disbanded for one reason or another, so they cannot claim to have an absolutely unbroken record. But, even so, they are one of the oldest standing armies in existence. They are also the smallest.
The force has varied greatly in size over the years. Its most significant hostile engagement was on 6 May 1527, when 147 of the 189 Guards, including their commander, died fighting the troops of Holy Roman Emperor Charles V in the last stand of the Swiss Guard during the Sack of Rome, in order to allow Clement VII to escape through the Passetto di Borgo, escorted by the other 42 guards. The site of the last stand is located on the left side of St Peter’s Basilica, close to the Campo Santo Teutonico (German Graveyard). Clement VII was forced to replace the Swiss Guard with a contingent of 200 German mercenaries (Custodia Peditum Germanorum). Ten years later, under Pope Paul III, the Swiss Guard was reinstated, under commander Jost von Meggen.
After the end of the Italian Wars, the Swiss Guard ceased to be used as a military combat unit in the service of the pope and its role became mostly that of the protection of the person of the pope and of a ceremonial guard. However, twelve members of the Pontifical Swiss Guard of Pius V served as part of the Swiss Guard of admiral Marcantonio Colonna in the Battle of Lepanto in 1571.
The office of commander of the Papal Guard came to be a special honor in the Catholic part of the Swiss Confederacy. It became strongly associated with the leading family of Lucerne, Pfyffer von Altishofen. Between 1652 and 1847, nine out of a total of ten commanders were members of this family (the exception being Johann Kaspar Mayr von Baldegg, also of Lucerne, who served 1696–1704).
In 1798, commander Franz Alois Pfyffer von Altishofen went into exile with the deposed Pius VI. After the death of the pope on 29 August 1799, the Swiss Guard was disbanded and only reinstated by Pius VII in 1801. In 1808, Rome was again captured by the French and the guard was disbanded again. Pius VII was exiled to Fontainebleau. The guard was reinstated under the same commander, Karl Leodegar Pfyffer von Altishofen, when the pope returned from exile in 1814. The guard was disbanded yet again in 1848, when Pius IX fled to Gaeta, but the guard was reinstated when the pope returned to Rome in the following year.
In the later 19th century, the Swiss Guard developed into a purely ceremonial function. Guards in the Vatican were “Swiss” only in name, mostly born in Rome to parents of Swiss descent and speaking the Roman Trastevere dialect. The modern Swiss Guard is the product of the reforms pursued by Jules Repond, commander during 1910–1921. Repond proposed to recruit only native citizens of Switzerland and he introduced rigorous military exercise. He also attempted to introduce modern arms, but Pius X only permitted the presence of firearms if they were not functional. Repond’s reforms and strict discipline were not well received by the corps, culminating in a week of open mutiny in July 1913. In his project to restore the Swiss Guard to its former prestige, Repond also dedicated himself to the study of historical costume, with the aim of designing a new uniform that would be both reflective of the historical Swiss costume of the 16th century and suited for military exercise. The result of his studies was published as Le costume de la Garde suisse pontificale et la Renaissance italienne (1917). Repond designed the distinctive Renaissance-style uniforms still worn by the modern Swiss Guard. The introduction of the new uniforms was completed in May 1914.
The foundation of Vatican City as a modern sovereign state was negotiated in the Lateran Treaty of 1929. The duties of protecting public order and security in the Vatican lay with the Papal Gendarmerie Corps, while the Swiss Guard, the Palatine Guard and the Noble Guard served mostly ceremonial functions. The Palatine and Noble Guards were disbanded by Paul VI in 1970, leaving the Swiss Guard as the only ceremonial guard unit of the Vatican. At the same time, the Gendarmerie Corps was transformed into a Central Security Office, with the duties of protecting the Pope, defending Vatican City, and providing police and security services within its territory, while the Swiss Guard continued to serve primarily ceremonial functions. Paul VI in a decree of 28 June 1976 defined the nominal size of the corps at 90 men. This was increased to 100 men by John Paul II on 5 April 1979.
Since the assassination attempt on John Paul II of 13 May 1981, a much stronger emphasis has been placed on the guard’s non-ceremonial roles. The Swiss Guard was developed into a modern guard corps equipped with modern small arms, and members of the Swiss Guard in plain clothes now accompanied the pope on his travels abroad for his protection.
To be considered for the guard a man must be between the ages of 19 and 30, unmarried, Swiss by birth, and Catholic. Most also hold university or professional degrees. At minimum they must have completed basic training in the Swiss military. Service in the guard may be from 2 to 25 years. Guards may marry after three years’ service if they are over the age of 25.
Although their duties are largely ceremonial, the guards are all fit and well trained, and their armory looks like a weapons museum stocked with everything from cutlasses and muskets to the latest in automatic pistols and rifles – all in good condition and all routinely used.
To celebrate the Swiss Guard I’ve chosen a very popular recipe, originally from the German-speaking cantons of Switzerland, but now widespread, and considered a mainstay of Swiss cuisine: rösti. These may look like conventional US hash browns, but they are infinitely more toothsome.
660 g Yukon Gold potatoes
125 g unsalted butter
7 g kosher salt
150 g crème fraîche
Preheat oven to 375 °F / 190 °C.
Clean and peel the potatoes. Shave potatoes lengthwise on a mandoline (approximately 0.03 in / 1 mm thick). Stack the slices in several small piles and cut them into shreds.
Place the shredded potatoes into cold water and thoroughly rinse away the excess starch. Then drain the shreds and dry them with paper towels.
Melt the butter, add the salt, and toss the shreds in the butter mixture. Do not do this until you are ready to start cooking or the salt will draw water from the potatoes.
Heat a large (10 in / 25 cm) nonstick, ovenproof frying pan over medium-high heat. Add the prepared potato shreds to the hot pan. Press them flat so that they completely cover the bottom of the pan. Cook until the bottom surface becomes golden and crisp (about 5 minutes).
Put the pan in the pre-heated oven and bake for 10–15 minutes. Remove the pan from the oven, and then carefully flip the rösti over. Bake for another 15 minutes.
Some cooks add an additional step here, although the rösti is already ready to eat. Remove the rösti from the pan and place it directly on a baking rack set over a baking sheet. Return this assembly to the oven and bake for another 5 minutes. This makes the surface very crisp.
Remove from the oven and let the rösti rest for 5 minutes. Then cut into wedges and serve with a garnish of crème fraîche and chopped chives.
On this date in 1889 the world’s first jukebox was installed at the Palais Royale Saloon in San Francisco. It became an overnight sensation, and soon spread to public meeting places around the world. I had to do a double take when I first saw the date because these were the very early days of the Edison phonograph. It is correct, however. The first jukebox was constructed by the Pacific Phonograph Company. Four stethoscope-like tubes were attached to an Edison Class M electric phonograph fitted inside an oak cabinet. The tubes operated individually, each being activated by the insertion of a coin, meaning that four different listeners could be plugged in to the same song simultaneously. Towels were supplied to patrons so they could wipe off the end of the tube after each listening.
The machine was originally called the “nickel-in-the-slot player” by Louis Glass, the entrepreneur who installed it at the Palais Royale. (A nickel then had the buying power of $1.08 today.) It came to be known as the jukebox only later, although the origin of the word remains a bit vague. Coin-operated music boxes and player pianos were the first forms of automated coin-operated musical devices. These instruments used paper rolls, metal disks, or metal cylinders to play a musical selection on the instrument, or instruments, enclosed within the device.
Early designs of the jukebox, upon receiving a coin, unlocked a mechanism, allowing the listener to turn a crank that simultaneously wound the spring motor and placed the reproducer’s stylus in the starting groove. Frequently, exhibitors would equip many of these machines with listening tubes (acoustic headphones) and array them in “phonograph parlors”, allowing the patron to select between multiple records, each played on its own machine. Some machines even contained carousels and other mechanisms for playing multiple records. Most machines were capable of holding only one musical selection, the automation coming from the ability to play that one selection at will. In 1918 Hobart C. Niblack patented an apparatus that automatically changed records, leading to one of the first selective jukeboxes being introduced in 1927 by the Automated Musical Instrument Company, later known as AMI.
In 1928, Justus P. Seeburg, who was manufacturing player pianos, combined an electrostatic loudspeaker with a record player that was coin-operated, and gave the listener a choice of eight records. This Audiophone machine was wide and bulky, and had eight separate turntables mounted on a rotating Ferris wheel-like device, allowing patrons to select from eight different records. Later versions of the jukebox included Seeburg’s Selectophone, with 10 turntables mounted vertically on a spindle. By maneuvering the tone arm up and down, the customer could select from 10 different records.
Greater levels of automation were gradually introduced. As electrical recording and amplification improved there was increased demand for coin-operated phonographs. The term jukebox came into use in the United States beginning in 1940, perhaps derived from the familiar usage “juke joint”, derived from the Gullah word “juke” or “joog” meaning disorderly, rowdy, or wicked.
Song-popularity counters told the owner of the machine the number of times each record was played (A and B side were generally not distinguished), with the result that popular records remained, while lesser-played songs could be replaced.
Wallboxes were an important, and profitable, part of any jukebox installation. Serving as a remote control, they enabled patrons to select tunes from their table or booth. One example is the Seeburg 3W1, introduced in 1949 as companion to the 100-selection Model M100A jukebox. Stereo sound became popular in the early 1960s, and wallboxes of the era were designed with built-in speakers to provide patrons a sample of this latest technology.
Jukeboxes initially played music recorded on wax cylinders, but the shellac 78 rpm record dominated in the early part of the 20th century. The Seeburg Corporation introduced an all 45 rpm vinyl record jukebox in 1950; since the 45s were smaller and lighter, they became dominant for the last half of the 20th century. 33⅓ rpm records, CDs, and videos on DVDs were all introduced and used in the last decades of the century. MP3 downloads and Internet-connected media players came in at the start of the 21st century. The jukebox’s history has followed the wave of technological improvements in music reproduction and distribution. With its large speakers, facilitating low-frequency reproduction, and large amplifier, the jukebox played sound with higher quality and volume than listeners could achieve at home.
Jukeboxes were most popular from the 1940s through the mid-1960s, particularly during the 1950s. By the middle of the 1940s, three-quarters of the records produced in America went into jukeboxes. While often associated with early rock and roll music, their popularity extends back much earlier, including classical music, opera and the swing music era.
Styling progressed from the plain wooden boxes in the early thirties to beautiful light shows with marbleized plastic and color animation in the Wurlitzer 850 Peacock of 1941. But after the United States entered the war, metal and plastic were needed for the war effort. Jukeboxes were considered “nonessential”, and no new ones were produced until 1946. The 1942 Wurlitzer 950 featured wooden coin chutes to save on metal. At the end of the war, in 1946, jukebox production resumed and several “new” companies joined the fray. Jukeboxes started to offer visual attractions: bubbles, waves, circles of changing color which came on when a sound was played.
Models designed and produced in the late 20th century required more panel space to present the growing number of record titles for selection, reducing the space available for decoration and leading to less ornate styling in favor of functionality and lower maintenance. Traditional jukeboxes once were an important source of income for record publishers. Jukeboxes received the newest songs first. They played music on demand without commercials. They offered a means to control the music listened to beyond what was available through the technology of their heyday. The invention of the transistor in the 1950s, which made the portable radio possible, was a key factor in the demise of the jukebox. Nowadays jukeboxes have largely been replaced by personal digital audio players offering easy and free access to music on demand.
Back in the 1970s and 1980s I would come across jukeboxes quite often in pubs, bars, and diners, and could be moved once in a while to put something on. They were cheap, and choosing music for public consumption had its attractions. Sometimes you would come across a place with a really old or special selection of records and would feel compelled to play something. My wife and I often played jukeboxes in diners where every booth had its own individual selection box. Of course, the association of jukeboxes and diners goes back a long way – as evidenced in the prominence of the jukebox in the diner in the sitcom Happy Days:
This association conjures up corned beef hash and poached eggs for me, because that was my commonest diner breakfast when I was out with my wife. But I used to make it myself quite often too. You can get corned beef hash in cans in the U.S., but homemade is much, much better.
You’ll need to start by cutting off a good hunk of cooked corned beef and chopping it reasonably fine with a knife. Dice a cooked potato, along with a medium green bell pepper, and a medium onion. Heat a little olive oil in a heavy skillet, add the peppers and onion and sauté until soft. Next add the corned beef and potatoes, stir the mixture well and moisten with a little beef stock. Cover and let steam for a few minutes. Then uncover and continue to cook and stir until the hash is heated through and moist but not mushy. Serve on a heated platter topped with one or two poached eggs, buttered toast, and hot sauce on the side.
On this date in 1493, on his second voyage to the New World, Christopher Columbus landed on the island now called Puerto Rico, naming it San Juan Bautista (Saint John the Baptist).
The first settlement, Caparra, was not founded until August 1508 (by Juan Ponce de León, a lieutenant under Columbus). November 19 is now an official holiday in Puerto Rico called Día del Descubrimiento (Discovery Day). For Hispanic peoples the word “discovery” is apt enough, but it’s worth stating again that it was a discovery for Europeans only. Indigenous peoples already knew it was there!! The word “discovery” is deeply ethnocentric. Unfortunately there are no longer any indigenous peoples left in Puerto Rico to protest, as there are elsewhere in the Americas. The deep irony is that the once proud and haughty slave-owning Spanish conquistadors in the Caribbean and the southeast and southwest of the U.S. have now become an exploited minority. What goes around comes around. I suggest that white men pay attention.
Ponce de León had been the leader of the Higüey massacre on Hispaniola (now Haiti and the Dominican Republic). In 1502 the newly appointed governor, Nicolás de Ovando, arrived in Hispaniola. The Spanish Crown expected Ovando to bring order to a colony in disarray, and Ovando interpreted this as authorizing the subjugation of the native Taínos. Thus, he authorized the Jaragua Massacre in November 1503. In 1504, when Taínos overran a small Spanish garrison in Higüey on the island’s eastern side, Ovando assigned Ponce de León to crush the rebellion, a campaign whose brutality friar Bartolomé de las Casas attempted to report to the Spanish authorities. Ovando rewarded Ponce de León by appointing him frontier governor of the newly conquered province, also named Higüey. Ponce de León received a substantial land grant which authorized sufficient Indian slave labor to farm his new estate.
Ponce de León prospered in this new role. He found a ready market for his farm produce and livestock at nearby Boca de Yuma, where Spanish ships stocked supplies before the long voyage back to Spain. In 1505 Ovando authorized Ponce de León to establish a new town in Higüey, which he named Salvaleón. In 1508 King Ferdinand (Queen Isabella, who had opposed the exploitation of the natives, had died in 1504) authorized Ponce de León to conquer the remaining Taínos and exploit them in gold mining.
As provincial governor, Ponce de León had occasion to meet with the Taínos who visited his province from neighboring San Juan (Puerto Rico) which had not, as yet, been colonized by the Spanish. They told him stories of a fertile land with much gold to be found in the many rivers. Inspired by the possibility of riches, Ponce de León requested and received permission from Ovando to explore the island. His first reconnaissance of the island is usually dated to 1508 but there is evidence that he had made a previous exploration as early as 1506. This earlier trip was done quietly because the Spanish crown had commissioned Vicente Yáñez Pinzón to settle the island in 1505. Pinzón did not fulfill his commission and it expired in 1507, leaving the way clear for Ponce de León.
His earlier exploration had confirmed the presence of gold and gave him a good understanding of the geography of the island. In 1508, Ferdinand II of Aragon gave permission to Ponce de León for the first official expedition to the island. This expedition, consisting of about 50 men in one ship, left Hispaniola on July 12, 1508 and eventually anchored in San Juan Bay, near today’s city of San Juan. Ponce de León searched inland until he found a suitable site about two miles from the bay. Here he erected a storehouse and a fortified house, creating the first settlement in Puerto Rico, Caparra. Although a few crops were planted, they spent most of their time and energy searching for gold. By early 1509 Ponce de León decided to return to Hispaniola. His expedition had collected a good quantity of gold but was running low on food and supplies.
The expedition was deemed a great success and Ovando appointed Ponce de León governor of San Juan Bautista. This appointment was later confirmed by Ferdinand II on August 14, 1509. He was instructed to extend the settlement of the island and continue mining for gold. He returned to the island, bringing with him his wife and children. Ponce de León parceled out the native Taínos amongst himself and other settlers using a system of forced labor known as the repartimiento system, under which natives were distributed to Spanish officials to be used as slave labor. On December 27, 1512, under pressure from the Roman Catholic Church, Ferdinand II of Aragon issued the Laws of Burgos, which modified the repartimiento into a system called encomiendas, aimed at ending the exploitation. The laws prohibited the use of any form of punishment toward the indigenous people, regulated their work hours, pay, hygiene, and care, and ordered them to be catechized.
In 1511, the Taínos revolted against the Spanish. Cacique (chief) Urayoán, as ordered by Agüeybaná II, had his warriors drown the Spanish soldier Diego Salcedo to determine whether the Spaniards were immortal. After drowning Salcedo, they kept watch over his body for three days to confirm his death and then revolted. The revolt was easily crushed by Ponce de León and within a few decades much of the native population had been decimated by disease, violence, and a high rate of suicide. As a result, Taíno culture, language, and traditions were generally destroyed, and were claimed to have “vanished” 50 years after Christopher Columbus arrived.
The Roman Catholic Church, realizing the opportunity to expand its influence, also participated in colonizing the island. On August 8, 1511, Pope Julius II established three dioceses in the New World, one in Puerto Rico and two on the island of Hispaniola under the archbishop of Seville. The Canon of Salamanca, Alonso Manso, was appointed bishop of the Puerto Rican diocese. On September 26, 1512, before his arrival on the island, the first school of advanced studies was established by the bishop. Taking possession in 1513, he became the first bishop to arrive in the Americas. Puerto Rico would also become the first ecclesiastical headquarters in the New World during the reign of Pope Leo X and the general headquarters of the Spanish Inquisition in the New World.
As part of the colonization process, African slaves were brought to the island in 1513. Following the decline of the Taíno population, more slaves were brought to Puerto Rico; however, the number of slaves on the island paled in comparison to those on neighboring islands. Also, attempts were made early in the colonization of Puerto Rico to wrest control of the island from Spain. The Caribs, a raiding ethnic group of the Caribbean, attacked Spanish settlements along the banks of the Daguao and Macao rivers in 1514 and again in 1521, but each time they were easily repelled by superior Spanish firepower. The rest, as they say, is history.
The other European powers realized the potential of Puerto Rico and attempted to wrest control of it from Spain for centuries. Successes were small and varied in intensity, but ultimately Spain retained control until the late 19th century. In 1890, Captain Alfred Thayer Mahan, a member of the Navy War Board and leading U.S. strategic thinker, wrote The Influence of Sea Power upon History, in which he argued for the creation of a large and powerful navy modeled after the British Royal Navy. Part of his strategy called for the acquisition of colonies in the Caribbean. These would serve as coaling and naval stations, as well as strategic points of defense after construction of a canal across the Panama Isthmus. Since 1894, the Naval War College had been formulating plans for war with Spain, and by 1896, the Office of Naval Intelligence had prepared a plan which included military operations in Puerto Rican waters.
On March 10, 1898, Dr. Julio J. Henna and Robert H. Todd, leaders of the Puerto Rican section of the Cuban Revolutionary Party, began to correspond with United States President William McKinley and the United States Senate in hopes that they would consider including Puerto Rico in the intervention planned for Cuba. Henna and Todd also provided the US government with information about the Spanish military presence on the island. On April 24, Spanish Minister of Defense Segismundo Bermejo sent instructions to Spanish Admiral Cervera to proceed with his fleet from Cape Verde to the Caribbean, Cuba and Puerto Rico. In May, Lt. Henry H. Whitney of the United States Fourth Artillery was sent to Puerto Rico on a reconnaissance mission. He provided maps and information on the Spanish military forces to the US government that would be useful for an invasion.
The Spanish–American War broke out in late April 1898. The US strategy was to seize Spanish colonies in the Atlantic—Puerto Rico and Cuba—and their possessions in the Pacific—the Philippines and Guam. On May 10, Spanish forces at Fort San Cristóbal under the command of Capt. Ángel Rivero Méndez in San Juan exchanged fire with the USS Yale under the command of Capt. William C. Wise. Two days later, on May 12, a squadron of 12 US ships commanded by Rear Admiral William T. Sampson bombarded installations at San Juan. On June 25, the USS Yosemite blockaded San Juan harbor. On July 18, General Nelson A. Miles, commander of US forces, received orders to sail for Puerto Rico and to land his troops. On July 21, a convoy with nine transports and 3,300 soldiers, escorted by USS Massachusetts, sailed for Puerto Rico from Guantánamo. General Nelson Miles landed unopposed at Guánica, on the southern coast of the island, on July 25, 1898, with the first contingent of American troops. Opposition was met in the southern and central regions of the island, but by the end of August the island was under United States control.
On August 12, peace protocols were signed in Washington, and Spanish Commissions met in San Juan on September 9 to discuss the details of the withdrawal of Spanish troops and the cession of the island to the United States. On October 1, an initial meeting was held in Paris to draft the Peace Treaty, and on December 10, 1898, the Treaty of Paris was signed (ratified by the US Senate on February 6, 1899). Spain renounced all claim to Cuba, ceded Guam and Puerto Rico with its dependent islets to the United States, and transferred sovereignty over the Philippines to the United States in exchange for a payment of $20,000,000 ($570 million in 2016 dollars). General John R. Brooke became the first United States military governor of the island. Henceforth Puerto Rico was under U.S. control, and one of the simmering issues now is whether Puerto Rico should become the 51st state (or 52nd if Washington D.C. gets statehood).
Puerto Rican cuisine has been influenced by an array of cultures, including the Taíno, Spanish, and West African in earlier times, and the United States more recently. Puerto Rican cuisine has much in common with other Caribbean and Latin American cuisines, but, of course, it is distinctive, and it has found its way to the United States and beyond with emigrants. Locals call their cuisine cocina criolla. By the end of the 19th century, before U.S. control, traditional Puerto Rican cuisine was well established. El Cocinero Puertorriqueño, the island’s first cookbook, was published in 1849. The piña colada, which originated in Puerto Rico, is now the national drink.
One distinctive sauce from Puerto Rico is mojito isleño, which most likely originated in Salinas, nicknamed “La Cuna del Mojito Isleño” (“the cradle of mojito isleño”). It is used primarily over fish and shellfish, but can be used with meat or pasta if you want. It is similar to the sauce used in Italy for spaghetti alla puttanesca.
½ cup olive oil
2 green bell peppers, seeded, trimmed and chopped
2 onions, peeled and chopped
2 cloves garlic, peeled and chopped
1 bunch fresh cilantro, washed and chopped
salt and pepper
2 cups canned crushed tomatoes
hot sauce (optional)
½ cup chopped pimento-stuffed green olives
Heat the olive oil in a large skillet over medium heat and sauté the bell pepper and onion until soft but not browned. Add the garlic and cilantro, and season to taste with salt and pepper. Cook, stirring occasionally, for five minutes over low heat.
Add the crushed tomatoes and simmer for about 10 minutes. You can add hot sauce to taste if you wish.
Add the olives, stir to combine and remove from the heat.
You can use the sauce warm over fish, meat, or pasta, or chilled as a dip.
On this date in 1873, Yale, Princeton, Columbia, and Rutgers universities met to draft the first code of American football rules. Longtime readers will know that I do not like using “American” as an adjective referring to the United States, but I’ll let my rule slide here because “American football” is a well-understood term. This meeting of university representatives to draft the rules that eventually led to the current game cannot be considered the main event in the history of the game, but it was a milestone. Fair warning: I don’t like the game at all. Two rules that were developed later and are hallmarks of the game – blocking and the forward pass – completely ruin the game for me. The idea that some players have no role other than to block players on the other team, meaning that they rarely, if ever, touch the ball, seems completely ludicrous to me. It puts an emphasis on brute force and raw strength over skill in ball handling, which is limited to only a small number of players. I like football games where every player has the ability, and need, to touch the ball during the game, and in which interfering with a player who is not in possession of the ball is not allowed. The forward pass does not sit well with me either. In all the classic English ball sports, lingering around the goal or making forward passes to players ahead of defenders was always considered unsporting and is now illegal (the offside rule).
Forms of traditional football have been played throughout Europe and beyond since antiquity. Many of these involved handling of the ball, and scrummage-like formations. Several of the oldest examples of football-like games include the Greek game of Episkyros and the Roman game of Harpastum. Over time many countries across the world have also developed their own national football-like games. For example, New Zealand had Ki-o-rahi, Australia marn grook, Japan kemari, China cuju, Georgia lelo burti, the Scottish Borders Jeddart Ba’ and Cornwall Cornish hurling, Central Italy Calcio Fiorentino, South Wales cnapan, East Anglia Campball and Ireland had caid, which was an ancestor of Gaelic football.
Archaic forms of football in England, typically classified as mob football, were played between neighboring towns and villages, involving an unlimited number of players on opposing teams, who would clash in a heaving mass of people struggling to drag a ball of some sort by any means possible to markers at each end of a town. By some accounts, in some such events any means could be used to move the ball towards the goal, as long as it did not lead to manslaughter or murder. These antiquated games went into sharp decline in the 19th century when the Highway Act 1835 was passed banning the playing of football on public highways. What arose instead were football games at various public (i.e. private) boarding schools, notably Rugby. Football was adopted by these public schools as a way of encouraging competitiveness and keeping boys fit. Each school drafted its own rules, which varied widely between different schools and were changed over time with each new intake of pupils. Two schools of thought developed regarding rules. Some schools favored a game in which the ball could be carried (as at Rugby, Marlborough and Cheltenham), while others preferred a game where kicking and dribbling the ball was promoted (as at Eton, Harrow, Westminster and Charterhouse). The division into these two camps was partly the result of circumstances in which the games were played. For example, Charterhouse and Westminster at the time had restricted playing areas; the boys were confined to playing their ball game within the school cloisters, making it difficult for them to adopt rough and tumble running games. Out of this diversity of games and rules evolved a number of games both in England and abroad. I’ve dealt with several here:
Let’s now turn to American football, which evolved in the United States from a Rugby-style of football. Early football games in the United States appear to have had much in common with the traditional mob football played in England. The games remained largely unorganized until the 19th century, when intramural games of football began to be played on college campuses. Each school played its own variety of football. Princeton University students played a game called “ballown” as early as 1820. A Harvard tradition known as “Bloody Monday” began in 1827, which consisted of a mass ballgame between the freshman and sophomore classes. In 1860, both the town police and the college authorities agreed that Bloody Monday had to go. The Harvard students responded by going into mourning for a mock figure called “Football Fightum,” for whom they conducted funeral rites. The authorities held firm and it was a dozen years before football was once again played at Harvard. Dartmouth played its own version called “Old division football,” the rules of which were first published in 1871, though the game dates to at least the 1830s. All of these games, and others, shared certain common features. They remained largely “mob” style games, with huge numbers of players attempting to advance the ball into a goal area, often by any means necessary. Rules were simple; violence and injury were common. The violence of these mob-style games led to widespread protests and a decision to abandon them. Yale, under pressure from the city of New Haven, banned the play of all forms of football in 1860.
The game began to return to college campuses by the late 1860s. Yale, Princeton, Rutgers University, and Brown University began playing the popular “kicking” game during this time. In 1867, Princeton used rules based on those of the London Football Association. A “running game,” resembling rugby football, was taken up by the Montreal Football Club in Canada in 1868. On November 6, 1869, Rutgers University faced Princeton University (then known as the College of New Jersey) in a game that was played with a round ball and used a set of rules suggested by Rutgers captain William J. Leggett, based on the Football Association’s first set of rules. It is still usually regarded as the first game of intercollegiate American football, even though it bore no resemblance to the modern game. The game was played at a Rutgers field. Two teams of 25 players attempted to score by kicking the ball into the opposing team’s goal. Throwing or carrying the ball was not allowed, but there was plenty of physical contact between players. The first team to reach six goals was declared the winner. Rutgers won by a score of six to four. A rematch was played at Princeton a week later under Princeton’s own set of rules (one notable difference was the awarding of a “free kick” to any player who caught the ball on the fly, a feature adopted from the Football Association’s rules; the fair catch kick rule has survived into the modern American game). Princeton won that game by a score of 8–0. Columbia joined the series in 1870, and by 1872 several schools were fielding intercollegiate teams, including Yale and Stevens Institute of Technology.
By 1873, the college students playing football had made significant efforts to standardize their fledgling game. Teams had been scaled down from 25 players to 20. The only way to score was still to bat or kick the ball through the opposing team’s goal, and the game was played in two 45-minute halves on fields 140 yards long and 70 yards wide. On October 20, 1873, representatives from Yale, Columbia, Princeton, and Rutgers met at the Fifth Avenue Hotel in New York City to codify the first set of intercollegiate football rules. Before this meeting, each school had its own set of rules and games were usually played using the home team’s own particular code. At this meeting, a list of rules, based more on the Football Association’s rules than the rules of the recently founded Rugby Football Union, was drawn up for intercollegiate football games.
Harvard refused to attend the rules conference organized by the other schools and continued to play under its own code. While Harvard’s voluntary absence from the meeting made it hard for them to schedule games against other U.S. universities, it agreed to a challenge to play McGill University, from Montreal, in a two-game series. Inasmuch as Rugby football had been transplanted to Canada from England, the McGill team played under a set of rules which allowed a player to pick up the ball and run with it whenever he wished. Another rule, unique to McGill, was to count tries (the act of grounding the football past the opposing team’s goal line; it is important to note that there was no end zone during this time), as well as goals, in the scoring. In the Rugby rules of the time, a touchdown only provided the chance to try to kick a free goal from the field. There were no points for the touchdown which was, and still is, called in rugby a “try,” that is, a “try at goal.”
Harvard quickly took a liking to the Rugby game, and its use of the try which, until that time, was not used in American football. The try would later evolve into the score known as the touchdown. On June 4, 1875, Harvard faced Tufts University in the first game between two U.S. colleges played under rules similar to the McGill/Harvard contest, which was won by Tufts. The rules included each side fielding 11 men at any given time, the ball was advanced by kicking or carrying it, and tackles of the ball carrier stopped play. Further elated by the excitement of McGill’s version of football, Harvard challenged its closest rival, Yale. The two teams agreed to play under a set of rules called the “Concessionary Rules”, which involved Harvard conceding something to Yale’s soccer and Yale conceding a great deal to Harvard’s rugby. They decided to play with 15 players on each team. On November 13, 1875, Yale and Harvard played each other for the first time ever. Among the 2,000 spectators attending the game that day was the future “father of American football,” Walter Camp. Camp, who enrolled at Yale the next year, was torn between an admiration for Harvard’s style of play and the misery of Yale’s defeat (4-0), and became determined to avenge it. Spectators from Princeton also carried the game back home, where it quickly became the most popular version of football.
Walter Camp is widely considered to be the most important figure in the development of American football. As a youth, he excelled in sports including track athletics, baseball, and association football, and after enrolling at Yale in 1876, he earned varsity honors in every sport the school offered. Following the introduction of rugby-style rules to American football, Camp became a fixture at the Massasoit House conventions where rules were debated and changed. Dissatisfied with what seemed to him to be a disorganized mob, he proposed his first rule change at the first meeting he attended in 1878: a reduction from fifteen players to eleven. The motion was rejected at that time but passed in 1880. The effect was to open up the game and emphasize speed over strength. Camp’s most famous change, the establishment of the line of scrimmage and the snap from center to quarterback, was also passed in 1880. Originally, the snap was executed with the foot of the center. Later changes made it possible to snap the ball with the hands, either through the air or by a direct hand-to-hand pass. Rugby league followed Camp’s example, and in 1906 introduced the play-the-ball rule, which greatly resembled Camp’s early scrimmage and center-snap rules. In 1966, Rugby league introduced a four-tackle rule based on Camp’s early down-and-distance rules.
Camp’s new scrimmage rules revolutionized the game, though not always as intended. Princeton, in particular, used scrimmage play to slow the game, making incremental progress towards the end zone during each down. Rather than increase scoring, which had been Camp’s original intent, the rule was exploited to maintain control of the ball for the entire game, resulting in slow, unexciting contests. At the 1882 rules meeting, Camp proposed that a team be required to advance the ball a minimum of five yards within three downs. These down-and-distance rules, combined with the establishment of the line of scrimmage, transformed the game from a variation of rugby football into the distinct sport of American football.
Camp was central to several more significant rule changes that came to define American football. In 1881, the field was reduced in size to its modern dimensions of 120 by 53 1⁄3 yards (109.7 by 48.8 meters). Several times in 1883, Camp tinkered with the scoring rules, finally arriving at four points for a touchdown, two points for kicks after touchdowns, two points for safeties, and five for field goals. Camp’s innovations in the area of point scoring influenced rugby union’s move to point scoring in 1890. In 1887, game time was set at two halves of 45 minutes each. Also in 1887, two paid officials—a referee and an umpire—were mandated for each game. A year later, the rules were changed to allow tackling below the waist, and in 1889, the officials were given whistles and stopwatches.
The last, and arguably most important, innovation, which would at last make American football uniquely “American,” was the legalization of interference, or blocking, a tactic which was illegal under the rugby-style rules, and remains so. At first, U.S. players would find creative ways of aiding the runner by pretending to accidentally knock into defenders trying to tackle the runner. When Walter Camp witnessed this tactic being employed against his Yale team, he was at first appalled, but by the next year he had adopted the blocking tactics for his own team.
So much for history. Here’s one of my favorite monologues by a young (and largely unknown) Andy Griffith – “What it Was, Was Football” – produced in 1953. The rural North Carolina accent alone is priceless, let alone the cheerful innocence of the country bumpkin.
Eating and football are natural twins in the United States. So-called tailgate parties are legendary in most stadium parking lots, where people come hours early and set up picnic and BBQ areas around their cars. The “tailgate” part comes from the old-fashioned use of pickup trucks for transport whose tailgate can be folded down to make a table for preparing and serving food. What I like about the idea of a tailgate party is that it is an outdoor picnic in autumn or winter. Most households in the U.S. see Labor Day (beginning of September) as the end of the picnic and BBQ season and shut up shop until the following Memorial Day in May. But I love cooking and eating outdoors in the colder weather.
October was a great month for outdoor cooking for me because in the northeastern U.S. it is normally a dry month with warm, sunny days and starry nights – perfect for gathering around a roaring fire as light fades. This is the time for pig roasts and big gatherings. One year I held a campfire birthday party for my son (http://www.bookofdaystales.com/badger/) where his friends got to roast hot dogs and marshmallows on sticks whilst I cooked up a giant pot of chili over the coals (supplemented with fire-roasted potatoes and apples). It wasn’t elegant, of course, but great fun for everybody. Break the mold – eat outdoors today.
One commercial food was actually created specifically for tailgate parties – Palmetto Cheese, which was developed by Sassy Henry for tailgating at Atlanta Braves games. When Sassy and her husband, Brian, moved to Pawleys Island, South Carolina, they bought the Sea View Inn, where the cook, Vertrella Brown, created the original recipe – a spread made from cheddar cheese, cream cheese, mayonnaise and spices. Brown’s image can be found on the label of Palmetto Cheese. In 2006, Sassy and Vertrella’s pimento cheese recipe made the leap from the Sea View Inn menu to the first 20 packages put up for sale at Independent Seafood in Georgetown, South Carolina. Now it is widely available at major chains in the U.S. and comes in Original, Jalapeño and Bacon. I’m not a fan of pre-made spreads, but it’s your choice. Me? You’ll find me out back at the fire pit.
Today, the first Monday in September, is Labor Day in the United States and Canada. Ostensibly it honors the North American labor movement and the contributions that workers have made to the strength, prosperity, and well-being of their countries. Nowadays, however, the trade union and labor movement ties are relatively weak, but the day makes for a three-day weekend which people use as a last hurrah of summer.
Beginning in the late 19th century, as the trade union and labor movements grew, trade unionists in the US (where I will focus) proposed that a day be set aside to celebrate labor. “Labor Day” was promoted by the Central Labor Union and the Knights of Labor, which organized the first parade in New York City. By the time it became an official federal holiday in 1894, thirty U.S. states officially celebrated Labor Day.
In 1882, Matthew Maguire, a machinist, first proposed a Labor Day holiday while serving as secretary of the Central Labor Union (CLU) of New York. Some maintain that Peter J. McGuire of the American Federation of Labor put forward the first proposal in May 1882, after witnessing the annual labour festival held in Toronto in Canada. In 1887 Oregon became the first state of the United States to make Labor Day an official public holiday.
Following the deaths of workers at the hands of United States Army and United States Marshals Service during the Pullman Strike of 1894, the United States Congress unanimously voted to approve legislation to make Labor Day a national holiday and President Grover Cleveland signed it into law six days after the end of the strike. Cleveland supported the creation of the national holiday in an attempt to shore up support among trade unions following the Pullman Strike. The date of May 1 was an alternative date, celebrated then (and now) as International Workers Day, but President Cleveland was concerned that observance of Labor Day on May 1 would encourage Haymarket-style protests and would strengthen socialist and anarchist movements that, though distinct from one another, had rallied to commemorate the Haymarket Affair on International Workers’ Day.
The form for the celebration of Labor Day was outlined in the first proposal of the holiday: A street parade to exhibit to the public “the strength and esprit de corps of the trade and labor organizations,” followed by a festival for the workers and their friends and families. This became the pattern for Labor Day celebrations. Speeches by prominent men and women were introduced later, as more emphasis was placed upon the civil significance of the holiday. Still later, by a resolution of the American Federation of Labor convention of 1909, the Sunday preceding Labor Day was adopted as Labor Sunday and dedicated to the spiritual and educational aspects of the Labor movement.
Nowadays most of the overtly labor and union activities are muted in the US, and the day is seen primarily as a time for family gatherings. When I began working as a professor in New York in 1980 I was expected to work on Labor Day because students moved into the dormitories over the weekend and needed advising before commencing classes after Labor Day. This practice did not sit well with the faculty, especially in the Social Sciences – many of whom simply refused to work that day. The problem was solved about 10 years later when the university moved the start of term to the end of August, which allowed Labor Day to be a proper holiday for everyone.
Family barbecues and picnics are the order of the day. I always got a bit tired of big gatherings that tended to feature the same cast of characters year after year – hot dogs, hamburgers, potato salad, and beans. People in the US have a bad habit of looking down their noses at UK cuisine whilst overlooking the numbingly bland and repetitive aspects of their own cooking. For me, Labor Day was an opportunity to build a big blaze in my fire pit and grill or roast whatever I felt like.
Roasted corn on the cob was always a big family favorite to complement meats and other vegetables. If you want to be dead simple and lazy, leave the cobs unshucked, remove as much tassel as you can without breaking the husk, and whack the cobs on a grill over your fire until the husks are charred and the insides are steaming. It’s best to place the cobs over medium heat for this. When cooked just shuck and enjoy.
If you want to have a bit more finesse, and less mess, shuck the corn cobs and wrap them in heavy aluminum foil along with a knob of butter. Then grill them in the same manner. Either way, the cobs will char a little, adding to the flavor. Cooking times vary, but usually 25 minutes is sufficient if you have a steady fire going. With a foil wrapping you can check regularly before serving by opening the foil just a crack. Make sure you rotate all the cobs periodically so that cooking is even.
Either way, serve the cobs with extra butter, and a salt shaker for those who want it.
Today is National Banana Split Day in the US. I don’t normally celebrate these semi-fake food holidays, but this one seems more worthy than most because it celebrates the actual creation of the dish, and the history is fairly well documented, although murky in places.
The generally accepted account is that David Evans Strickler, a 23-year-old apprentice pharmacist at Tassel Pharmacy, located at 805 Ligonier Street in Latrobe, Pennsylvania, who enjoyed inventing sundaes at the store’s soda fountain, invented the banana split in 1904. The sundae originally cost 10 cents, twice the price of other sundaes, but caught on with students of nearby Saint Vincent College. News of the new variety of sundae quickly spread by word-of-mouth and through correspondence far beyond Latrobe. A popular recipe published in 1907 called for a lengthwise split banana, two scoops of ice cream at each end and a spoonful of whipped cream in between with a maraschino cherry on top. One end was covered with chopped mixed nuts and the other with chopped mixed fruits.
The city of Latrobe celebrated the 100th anniversary of the invention of the banana split in 2004 and, in the same year, the National Ice Cream Retailers Association (NICRA) certified the city as its birthplace. It is now the location of an annual Great American Banana Split Celebration and of the original soda fountain where the first banana split was made. The Great American Banana Split Celebration is held throughout the downtown Latrobe area in late August with food, fun and events for kids and adults to enjoy.
Shortly after its invention by Strickler, a Boston ice cream entrepreneur came up with the same sundae, with one (minor) flaw — he served his banana splits with the bananas unpeeled until he discovered that people preferred them peeled. I’d say the guy might have needed an IQ test.
Wilmington, Ohio also claims an early connexion. In 1907, restaurant owner Ernest Hazard wanted to attract students from Wilmington College during the slow days of winter. He staged an employee contest to come up with a new ice cream dish. When none of his workers were up to the task, he split a banana lengthwise, threw it into an elongated dish and created his own dessert. The town commemorates the event each June with its own Banana Split Festival.
Walgreens is credited with spreading the popularity of the banana split. The early drug stores operated by Charles Rudolph Walgreen in the Chicago area adopted the banana split as a signature dessert. Fountains in the stores proved to be drawing cards, attracting customers who might otherwise have been just as satisfied having their prescriptions filled at some other drug store in the neighborhood.
I am not sure if it is the original form, but for the classic banana split you cut a banana in half lengthwise and place it in a long dish called a “boat.” Then add three scoops of ice cream (vanilla, chocolate, and strawberry) in a row between the split banana halves. In no particular order spoon pineapple, strawberry and chocolate sauces over the ice creams. Top with whipped cream, garnish with chopped nuts, and finish with a maraschino cherry on top.
I took my son to Latrobe to get his first banana split for his 9th birthday. Actually, horrible father that I am, I took him out of school for the day and ended up spending three days on the road visiting geological sites, historic factories, and an ecological park. Two years later I took him out of school completely and home schooled him until he went off to university. That way, when we were studying the US Civil War I could stick him in the car and tour famous battle sites, and when we studied geology we went to glaciated areas, descended a coal mine, and collected a ton of rocks. We also applied physics, chemistry, biology, and mathematics to cooking, of course.
My son’s first ever banana split was his own creation because the store let him pick what he wanted – flavors of ice cream, sauces, and garnishes. Only the banana and the whipped cream were obligatory. Oh, and the cherry. So it’s cook’s choice again today. Just remember to be fabulous.
Today is Madras Day, an annual festival that commemorates the founding of the city of Madras (now Chennai) in Tamil Nadu. 22nd August 1639 is the widely agreed upon date for the purchase of the village of Madraspatnam or Chennapatnam by East India Company factors Andrew Cogan and Francis Day from Damerla Venkatapathy, the viceroy of the Vijayanagar Empire. There has been some debate as to whether the deed of purchase was dated 22nd July or 22nd August, the latter being widely accepted. The motives of the celebrations have sometimes been criticized by academicians and state government organizations who feel that it gives undue importance to the city’s colonial heritage. Nonetheless the celebrations are popular and growing.
The first recorded celebration of the founding of Madras was its tercentenary commemoration in 1939. Unlike later anniversaries, the celebrations were officially sponsored by the British government and a special tercentenary commemoration volume was issued with essays on the different aspects of Madras city written by leading experts of the time. An exhibition of pictures, portraits, maps, records and coins was inaugurated by Diwan Bahadur S. E. Runganadhan, the Vice-Chancellor of Madras University, and there was a short-play writing competition as well. The 350th anniversary in 1989 was celebrated with the opening of a commemorative monument titled “Madras 350” built in the Classical Style by Frankpet Fernandez at the junction of the Poonamallee High Road and the New Avadi Road. Other major events included the commissioning of a book by S. Muthiah titled Madras — The Gracious City by the Murugappa Group which also organized the first Madras Quiz which has continued to the present day.
Modern Chennai had its origins as a colonial city and its initial growth was closely tied to its importance as an artificial harbor and trading center. When the Portuguese arrived in 1522, they built a port and named it São Tomé, after the Christian apostle St. Thomas, who by legend is said to have preached there between the years 52 and 70. The region then passed into the hands of the Dutch, who established themselves near Pulicat just north of the city in 1612. Both groups strove to grow their colonial populations and although their joint populations had reached about 10,000 when the British arrived, they remained substantially outnumbered by the local Indian population.
In the 17th century the English East India Company decided to build a factory on the east coast and in 1626 selected as its site Armagon (Dugarazpatnam), a village around 35 miles (56 km) north of Pulicat. The calico cloth from the local area, although in high demand locally, was of poor quality and not suitable for export to Europe. The English also soon realized that, as a port, Armagon was unsuitable for trade purposes. Francis Day, one of the officers of the company, who was then a Member of the Masulipatam Council and the Chief of the Armagon Factory, made a voyage of exploration in 1637 down the coast as far as Pondicherry with a view to choosing a site for a new settlement.
At that time the Coromandel Coast was ruled by Peda Venkata Raya, from the Aravidu Dynasty of Vijayanagara Empire based in Chandragiri-Vellore. Damarla Venkatadri Nayaka, local governor of the Vijayanagar Empire and Nayaka of Wandiwash (Vandavasi), ruled the coastal part of the region, from Pulicat to the Portuguese settlement of San Thome. He had his headquarters at Wandiwash, and his brother Ayyappa Nayakudu resided at Poonamallee, a few miles to the west of Madras, where he looked after the affairs of the coast. Ayyappa Nayakudu persuaded his brother to lease the sandy strip to Francis Day and promised him trade benefits, army protection, and Persian horses in return. Francis Day wrote to his headquarters at Masulipatam for permission to inspect the proposed site at Madraspatnam and to examine the possibilities of trade there. Madraspatnam seemed favorable during the inspection, and the calicoes woven there were much cheaper than those at Armagon (Durgarazpatam).
On 22 August 1639, Francis Day secured the Grant from Damarla Venkatadri Nayaka, Nayaka of Wandiwash, giving over to the East India Company a three-mile (4.8 km) long strip of land containing a fishing village called Madraspatnam, copies of which were endorsed by Andrew Cogan, the Chief of the Masulipatam Factory. The Grant was for a period of two years and empowered the Company to build a fort and castle on about 2 sq miles (5 km²) of the strip of land.
The English merchants at Masulipatam were satisfied with Francis Day’s work. They requested Day and Damarla Venkatadri Nayaka to wait until the sanction of the superior English Presidency of Bantam in Java could be obtained for their action. The main difficulty for the English in those days was lack of money. In February 1640, Day and Cogan, accompanied by a few merchants and writers, a garrison of about twenty-five European soldiers and a few other European artificers, besides a Hindu powder-maker named Naga Battan, proceeded to the land which had been granted and started a new English factory there. They reached Madraspatnam on 20 February 1640.
Day and Cogan can be considered the founders of Madras/Chennai. They began construction of Fort St George on 23 April 1640, along with houses for their residence. Their small fortified settlement quickly attracted other East India traders, and as the Dutch position collapsed under pressure from hostile Indian powers, they too slowly joined the settlement. By 1646 the settlement had reached 19,000 people, and with the Portuguese and Dutch populations at their forts, substantially more. To further consolidate their position, the Company combined the various settlements around an expanded Fort St. George, which, including its citadel, also embraced a larger outside area surrounded by an additional wall. This area became the Fort St. George settlement. As stipulated by the Treaty signed with the Nayak (local administrator), the British and other Europeans were not allowed to decorate the outside of their buildings in any color but white. As a result, over time, the area came to be known as ‘White Town’.
According to the treaty, only Europeans, principally Protestant British settlers, were allowed to live in this area, while outside of this confine non-Indians were not allowed to own property. However, other national groups, chiefly French, Portuguese, and other Catholic merchants, had separate agreements with the Nayak which allowed them in turn to establish trading posts, factories, and warehouses. As the East India Company controlled the trade in the area, these non-British merchants established agreements with the Company for settling on Company land near “White Town” per agreements with the Nayak. Over time, Indians also arrived in ever greater numbers and soon the Portuguese and other non-Protestant Christian Europeans were outnumbered. Following several outbreaks of violence by various Hindu and Muslim Indian communities against the Christian Europeans, White Town’s defenses and its territorial charter were expanded to incorporate most of the area which had grown up around its walls, thereby incorporating most of its Catholic European settlements. In turn, the non-European merchants and their families and workers, almost entirely Muslim or from various Hindu communities, were resettled outside of the newly expanded “White Town”. This was also surrounded by a wall. To differentiate this non-European and non-Christian area from “White Town,” the new settlement was called “Black Town.” Collectively, the original Fort St. George settlement, “White Town” and “Black Town,” were called Madras.
During the course of the late 17th century, both plague and genocidal warfare reduced the population of the colony dramatically. Each time, the survivors fell back upon the safety of Fort St George. As a result, owing to the frequency of outbursts of racial and national violence against the Europeans and especially the English, Fort St George with its impressive fortifications became the nucleus around which the city grew and rebuilt itself. Several times throughout the life of the colony, the Fort became the last refuge of Europeans and their allied Indian communities from various savage pogroms initiated by several Indian rulers and powers, which resulted in the almost complete destruction of the town. Each time the town (and later city) was rebuilt and repopulated with new English and European settlers. The Fort still stands today, and a part of it is used to house the Tamil Nadu Legislative Assembly and the Office of the Chief Minister. Elihu Yale, after whom Yale University is named, was British governor of Madras for five years. Part of the fortune that he amassed in Madras as part of the colonial administration became the financial foundation for Yale University.
The city has changed its boundaries as well as the geographic limits of its quarters several times, principally as a result of destructions of the town by surrounding Hindu and Muslim powers. For instance, Golkonda forces under General Mir Jumla conquered Madras in 1646, massacred or sold into slavery many of the Christian European inhabitants and their allied Indian communities, and brought Madras and its immediate surroundings under his control. Nonetheless, the Fort and its surrounding walls remained under the control of the British, who slowly rebuilt their colony with additional colonists despite another mass murder of Europeans in Black Town by anti-colonialists agitated by Golkonda, and plague in the 1670s. In 1674, the expanded colony had nearly 50,000 mostly British and European colonists and was granted its own corporate charter, thereby officially establishing the modern-day city. Eventually, after additional provocations from Golkonda, the British pushed back until Golkonda was defeated.
After the fall of Golkonda in 1687, the region came under the rule of the Mughal Emperors of Delhi who in turn granted new Charters and territorial borders for the area. Subsequently, charters were issued by the Mughal Emperor granting rights to the English East India Company in Madras and formally ending the official capacity of local rulers to attack the British. In the later part of the 17th century, Madras steadily progressed during the period of the East India Company and under many Governors. Although most of the original Portuguese, Dutch, and British population had been killed during the genocides of the Golkonda period, under Mughal protection large numbers of British and Anglo-American settlers arrived to replenish these losses. As a result, during the Governorship of Elihu Yale (1687–92), the large number of British and European settlers led to the most important political event, the formation of the institution of a Mayor and Corporation for the city of Madras. Under this Charter, the British and Protestant inhabitants were granted the rights of self-government and independence from company law. In 1693, a parwana (warrant) was received from the local Nawab granting the towns Tondiarpet, Purasawalkam and Egmore to the company, which continued to rule from Fort St. George. Present-day parts of Chennai such as Poonamallee (ancient Tamil name – Poo Iruntha alli) and Triplicane (ancient Tamil name – Thiru alli keni) are mentioned in Tamil bhakti literature of the 6th – 9th centuries. Thomas Pitt became the Governor of Madras in 1698 and governed for eleven years. This period witnessed remarkable development of trade and an increase in wealth resulting in the building of many fine houses, mansions, housing developments, an expanded port and city complete with new city walls, and various churches and schools for the British colonists and missionary schools for the local Indian population.
The idea to celebrate the birth of the city every year was crystallized when journalists Shashi Nair and Vincent D’Souza met S. Muthiah at his residence for coffee. It was based on the success of another event called the Mylapore Festival, which D’Souza had been organizing every year in January. The trio decided to start celebrating Madras Day in 2004. According to them, “the primary motive of celebrating the new Madras Day was to focus on the city, its past and its present.” The celebrations started with about five events in 2004 but grew gradually. The second, in 2005, had events throughout the week. In 2007, a commemorative postal cover was released by the Chief Postmaster-General of Tamil Nadu Circle at a function at Fort St George as a part of the Madras Day celebrations, thereby inaugurating a tradition that continued through later celebrations. In 2008, there were 60 events. The 2010 celebrations lasted over a week, and some extended into the following week as well. Since then they have continued to grow.
Outside of south Asia, Indian restaurants frequently feature Madras curry, and supermarkets sell Madras curry powders and pastes. To a degree this is a misnomer, and the designation is for foreigners only. Madras curries are quite varied in their ingredients and spices. Still, modern Indian cooks do resort to such mixes when they are in a hurry. Chennai is locally better known for its street food.
You’d do better to buy a plane ticket than to try to make it at home, although I have been moderately successful when I have been able to get the right ingredients. When I lived near Slough (then in Buckinghamshire) in the 1960s, there was a substantial south Asian immigrant population there with a large food market catering to their cooking needs. I was a constant visitor.
Dosa, crepes made with a fermented urad dhal (black lentils) and rice batter, are common street food which you can make at home with reasonable success (and a lot of experience). This site does a better job than I can to explain the process with detailed descriptions and pictures.
Today is the birthday (1865) of John Radecki (also known as Johann and Jan Radecki) who was a master stained glass artist who was born in Poland but spent most of his professional life working in Australia. He is considered one of the finest stained glass artists of his era. Rather than dwelling exclusively on Radecki, I’m going to take a peek at stained glass manufacture in general, although this will just be a peek.
Colored glass has been produced since ancient times. Both the Egyptians and the Romans excelled at the manufacture of small colored glass objects. Phoenicia was also important in glass manufacture with its chief centers in Sidon, Tyre and Antioch. In early Christian churches of the 4th and 5th centuries, there are a few remaining windows which are filled with ornate patterns of thinly-sliced alabaster set into wooden frames, giving a stained-glass-like effect. Evidence of stained glass windows in churches and monasteries in Britain can be found as early as the 7th century. The earliest known reference dates from 675 when Benedict Biscop imported workmen from France to glaze the windows of the monastery of St Peter which he was building at Monkwearmouth. Hundreds of pieces of colored glass and lead, dating back to the late 7th century, have been discovered there and at Jarrow.
In the Middle East, the glass industry of Syria continued during the Islamic period with major centers of manufacture at Ar-Raqqah, Aleppo and Damascus and the most important products being highly transparent colourless glass and gilded glass, rather than colored glass. The production of colored glass in Southwest Asia existed by the 8th century, at which time the alchemist Jābir ibn Hayyān, in Kitab al-Durra al-Maknuna, gave 46 recipes for producing colored glass and described the technique of cutting glass into artificial gemstones.
Stained glass, as an art form, reached its height in the Middle Ages when it became a major pictorial form used to illustrate the narratives of the Bible. In the Romanesque and Early Gothic period, from about 950 to 1240, the untraceried windows demanded large expanses of glass which of necessity were supported by robust iron frames, such as may be seen at Chartres Cathedral and at the eastern end of Canterbury Cathedral. As Gothic architecture developed into a more ornate form, windows grew larger, affording greater illumination to the interiors, but were divided into sections by vertical shafts and tracery of stone. This elaboration of form reached its height of complexity in the Flamboyant style in Europe, and windows grew still larger with the development of the Perpendicular style in England.
During the Renaissance stained glass work flourished, beginning with the windows in Florence Cathedral. The stained glass includes three ocular windows for the dome and three for the facade which were designed from 1405 to 1445 by several of the most renowned artists of this period: Ghiberti, Donatello, Uccello and Andrea del Castagno. Each major ocular window contains a single picture drawn from the Life of Christ or the Life of the Virgin Mary, surrounded by a wide floral border, with two smaller facade windows by Ghiberti showing the martyred deacons, St Stephen and St Lawrence. One of the cupola windows has since been lost, and that by Donatello has lost nearly all of its painted details.
In Europe, stained glass continued to be produced with the style evolving from the Gothic to the Classical, which is well represented in Germany, Belgium and the Netherlands, despite the rise of Protestantism. In France, much glass of this period was produced at the Limoges factory, and in Italy at Murano, where stained glass and faceted lead crystal are often coupled together in the same window. Ultimately, the French Revolution brought about the neglect or destruction of many windows in France. During the Reformation, in England large numbers of Medieval and Renaissance windows were smashed and replaced with plain glass. The Dissolution of the Monasteries under Henry VIII and the injunctions of Thomas Cromwell against “abused images” (that is, veneration), resulted in the loss of thousands of windows. Few remain undamaged. With this wave of destruction the traditional methods of working with stained glass died and were not to be rediscovered in England until the early 19th century.
Some Medieval and Renaissance stained glass techniques (and glass making techniques in general) have, in fact, been completely lost despite continued efforts to re-create them. The late 19th and early 20th centuries saw their own renaissance, to which John Radecki was a major contributor.
Radecki was born 2 August 1865 in Łódź in Poland, son of Pavel Radecki, a coal miner, and his wife Victoria, née Bednarkiewicz. Jan trained at a German art school at Poznań. With his parents and four siblings he migrated to Australia, reaching Sydney in January 1882. The family settled in Wollongong in New South Wales, where he and his father worked in the coal mines. His parents had two more children in Australia. Jan moved to Sydney in 1883 where he attended art classes. He boarded with the Saunders family from England and on 17 May 1888 married their daughter Emma. He became a naturalized Australian citizen under the name John in November 1904 (aged 39).
From 1885 Radecki had been employed by Frederick Ashwin, who taught him to work with glass. In the 1890s the two men crafted stained glass windows entitled ‘Sermon on the Mount’ (St Paul’s Church, Cobbitty) and ‘Nativity’ (St Jude’s, Randwick). Other works included a window at Yanco Agricultural College, produced in 1902 by F. Ashwin & Co. reputedly to Radecki’s design, and the chancel window (1903) in St Clement’s, Mosman. His first major independent work was the ‘Te Deum’ window in Christ Church St Laurence, Sydney, in 1906. Ashwin and Radecki also collaborated on windows in St James’s, Forest Lodge, and St John’s, Campbelltown.
After Ashwin’s death in 1909, Radecki became chief designer for J. Ashwin & Co, in partnership with Frederick’s brother John; he was proprietor of the company from John Ashwin’s death in 1920 until 1954. The firm was the largest glassmaking establishment in Sydney, with a high reputation. Radecki’s work included windows in such churches as St John the Evangelist’s, Campbelltown, St Patrick’s, Kogarah, St Joseph’s, Rockdale, St Matthew’s, Manly, and Our Lady of Dolours’, North Goulburn, Scots Kirk, Hamilton, Newcastle, and the Presbyterian Church, Wollongong.
Here’s a small gallery.
Certainly Radecki spent most of his life in Australia, but I guarantee that like most European immigrants he retained his Polish roots all of his life despite assimilating into Australian culture. So, I am going to give a recipe for a classic Polish dish, golonka, or pork knuckle. Central Polish cuisine is a mix of Slavic and German traditions, and classic golonka is much the same as the German Schweinshaxe. I’ll give you a recipe although there’s really not much need. The main difficulty is finding the pork knuckle. You’ll need a good pork butcher. Also, you need a fresh one, not smoked or pickled. That’s the real challenge.
Pork knuckle is classic poor food which has since been elevated to gourmet status. Knuckle is actually quite delicious if cooked properly, but you have to take time. If you go whole hog (sorry!), it’s a three-day process: 1. marinating, 2. poaching, 3. roasting. Two days works for me.
1 large fresh pork hock per person
1 bay leaf
6 black peppercorns
2 juniper berries
1 large carrot
1 large onion, quartered
1 rib celery
2 cloves garlic, peeled and minced
1 tbsp chopped fresh parsley
1 tsp caraway seeds
10 oz Polish beer
4 tbsp honey
Put the hocks in a large, heavy pot, cover with water or light stock, add the vegetables and flavorings (including salt to taste), and gently simmer on low heat for at least 2 hours, or until the meat is falling from the bone. This might take 3 hours or longer depending on the meat.
Remove the hocks from the stock with a slotted spoon and reserve the liquid.
Preheat the oven to 375°F/190°C.
Place the hocks in a deep baking pan.
Mix the beer and honey together in a small saucepan and add 2 tablespoons of the reserved cooking broth. (The rest you should use for soup or stock). Heat the glaze to dissolve the honey, then pour it over the hocks.
Bake the hocks for about 40 minutes, or until the glaze is golden.
Serve with boiled potatoes. If you want, you can make a gravy with the reserved cooking liquid by adding a roux or cornstarch as thickener.