San Francisco is often considered to have a large homosexual community, something that statistics back up. But how long has there been a homosexual community in San Francisco? Here, Alison McLafferty tells us the history of the male homosexual community in San Francisco - and that it goes back a very long way.

“The Miner’s Ball,” by Andre Castaigne, depicting a dance among miners during the 1849 California Gold Rush.


Ask almost anyone in the United States to list the first things that come to mind when they think of that glorious “City by the Bay,” San Francisco, and--along with exorbitant rent, candy-colored Victorian houses, and aging hippies-- they will invariably mention: “gay or homosexual men.” 

The LGBT community in the Bay Area makes up 6.2% of the population, which is almost twice the national average of 3.6%. Homosexual men are also more numerous than homosexual women. The Castro neighborhood, the historic center of homosexual activity since the 1970s, is now one of the hubs of tourist activity. The streets are strewn with rainbow flags, and storefronts revel in double-entendres: “The Sausage Factory” is a restaurant and pizzeria, and “Hot Cookie” sells famously delicious cookies as well as--why not?--men’s underwear.

The city became a hub for homosexual activity in World War II, when men from all over the country found themselves in an all-male environment far from the families and small towns who knew and watched them closely. Facing an uncertain future and shrouded with the relative anonymity provided by a bustling urban hub, many sought to satiate previously hidden desires, finding solace in same-sex relationships. “I think the war has caused a great change,” one of the homosexual “Queens” in Gore Vidal’s 1948 novel, The City and the Pillar, mused while admiring a collection of marines and sailors at an all-male party. “Inhibitions have broken down. All sorts of young men are trying out all sorts of new things, away from home and familiar taboos.”

After the war, many men stayed in the city where they’d finally found a community that made them feel safe and welcome. When the “Summer of Love” bloomed in the Haight Ashbury district in 1967, wreathed in a haze of marijuana smoke and set to the rhythm of Jimi Hendrix, Jefferson Airplane, and the Beatles, homosexual men joined in the general celebration of “free love.” The Castro neighborhood right next door to Haight Ashbury, with its cleaner streets, its large Victorian houses, and its cheap rent, became a mecca for homosexual men seeking to build their own community and culture.

So goes the usual history of homosexual men in San Francisco, but few people know that this story goes back much farther--back to the old Gold Rush days, and back even further to the days when the Miwok, the Ohlone, and the other Native American tribes hunted and fished in the wild coastlands of the Bay, long before any foreigners arrived.

 

The Berdache

When French fur trappers, Spanish missionaries, and American explorers first encountered the Indian tribes of the Great Plains and the Pacific Coast, they were shocked to note the presence--in a wide variety of tribes--of Native American men who wore female clothing, performed female duties, and appeared to be the “wives” of prominent Native American men. 

The generic term for such individuals became “berdache,” though different tribes had their own terms. The Hidatsa, for example (the tribe with whom Sacajawea was living when Lewis and Clark met her), called them “miáti.” The Lakota (the tribe led by Crazy Horse in the Battle of Little Bighorn against General Custer) called them “winkta.” Crazy Horse himself had a berdache in his harem. Such individuals existed in tribes from the Pacific Coast to the Mississippi Valley and Great Lakes--but it was in California that the berdache were particularly ubiquitous.

Berdache were not considered “homosexual” by their tribes--Native Americans did not consider sexuality as binary as westerners came to do, nor did they consider it something biological. Instead, gender was considered an aspect of a person’s spirit, and berdache possessed both male and female spirits. They were not, however, intersex--or, to use the 19th century term, “hermaphrodites,” who possess both male and female biological characteristics. Berdache were biologically male, but often performed the roles of both men and women--for example, dressing as women but joining male war parties--in their daily lives, and generally had sexual relations and marriages with men. 

Berdache were generally greatly respected by their tribes, as they were considered to be endowed with immense spiritual power: many were healers, medicine men, seers, and priests. But to the western missionaries and federal agents who encountered them, they were an abomination: something to be prayed over, forcefully dressed in men’s clothing, put to men’s work, and strictly punished. 

 

The California Gold Rush

Such a severe crack-down was somewhat ironic: the same Europeans and Americans who exacted harsh punishments on the Native American berdache were quite blind to similar activities among their own people during the gold fever of the 1850s. As men of all ages, all races, and all nationalities flooded San Francisco’s harbor in their head-long rush for the gold fields of California, they found the city--and the newly minted state in general--a hotbed of homosexual activity.

Few men came to San Francisco specifically to seek out other men as sexual partners: the journey was long and arduous, fortunes were fickle, and the city itself was a hastily built, slapdash affair that burned down every few years and featured, as one young man wrote in his diary, “Far too many drunken men lying in gutters.”

It also featured far too few women: gold digging was a male sport, something to be undertaken by the sex considered more adventurous, hardy, courageous and aggressive. Most men headed to San Francisco for the sole purpose of using it as a gateway to the gold fields: a place to grab some mining equipment, hitch a ride to the gold, strike it rich as soon as possible, and bring the fortune home to lure a lovely bride.

But the dearth of women-- the U.S. Census of 1850 set the population of non-Native women in the entire state of California at just 4.5%-- also provided opportunities for those who did have homosexual inclinations, who struggled with secret desires and new opportunities, or who claimed simple loneliness and the desire for any kind of company. Men who spent their days with their feet in the ice-cold waters of the American River and their backs bent double in the scorching sun sometimes spent their nights in camp with other men, sharing food, tents, and blankets. Starved for some fun and entertainment after long days in a stark, empty landscape, many headed to San Francisco in their free time, carefully hoarding the few flakes of gold they had managed to sift from the churning river waters. 

Because there were never enough women to partner with all the men at dances, it was common practice for a man to tie a handkerchief to his upper arm to signify that he was willing to take the women’s role. Visitors to San Francisco remarked in bemusement on the spinning couples on dance floors--shaggy, bearded men with faces scrubbed for the occasion, holding each other daintily about the waists, swaying gracefully, and dancing cheek to cheek. Some men even went so far as to don full gowns--often lent by amused prostitutes, who made up the majority of the city’s female population--and the practice was generally accepted.

The West, after all, was a space without the usual constraints of civilization: without the laws, taboos, and enforcement agencies that kept Victorian society tightly laced on the East Coast. It was a trans space in many ways--a space for crossing boundaries, lands, identities and sexualities. One Western gentleman (the records just name him “M”) who took several bullets fighting in the Indian Wars, explained that he dressed in women’s clothing because the petticoats covered the holes: “Then I forget all about them, as well as all other troubles,” he explained serenely.

 

The Third Sex

M, like other men of the West in general and San Francisco in particular, was tolerated in part because of the relative dearth of enforcement agencies, and in part because the people of that era understood homosexuality differently than people of the 20th or 21st century. The binary of “homosexuality” and “heterosexuality” did not exist until the 20th century, emerging particularly with the rise of Freud and his psychosexual theory of development. Freud claimed that humans are born innately bisexual, and that influences in early childhood determined whether they would follow a “straight” course of sexual development (heterosexual) or a “perverted” course (homosexual). Prior to the popularization of these ideas in the early twentieth century, society generally accepted the existence of a third group of people, commonly termed the “third sex.” This “third sex” was made up of men who identified themselves as women: they dressed as women, did women’s work, and had sexual relations with men. Colloquially, they were often termed “fairies.”

Although not fully accepted by society--and certainly not by religious institutions or law enforcement agencies--members of the “third sex” were usually tolerated, and even the police tended to leave them alone as long as they didn’t create too much trouble. This was due, in part, to the fact that they played an important role: in Victorian society, white women were considered to be chaste and even asexual, frightened and even repulsed by sex. In many urban centers such as New York City or San Francisco, then, women were not only sparse--as urban centers were considered public spaces where men congregated to work and play, keeping the women at home--but also sexually unavailable. This is one of the primary reasons prostitution was so rampant in the 19th century: prostitutes, themselves either perversions of womanhood or tragic “fallen” women, provided sexual outlets for men whose wives or girlfriends could not satiate their appetites.

Yet in some cases, such as in predominantly male working-class New York neighborhoods, or in Gold Rush San Francisco, even prostitutes were hard to find or too expensive. In those cases, many men sought out members of the “third sex.” Because men in this era were thought to be inherently sexual (the constant production of sperm was cited as biological proof), a man who sought out sex was often considered “manly” no matter his choice in sexual partner. Manliness was therefore defined in part by sexual appetite--if no woman existed to slake it, a man might indeed seek out another man, particularly a member of the “third sex,” and no one would question either his manliness or his sexual orientation.

San Francisco, then, has almost always been a haven for homosexual men, in one way or another--and, contrary to popular belief, homosexuality has been historically tolerated across racial and geographic divides. The history of San Francisco’s homosexual community has often been presented as one of recent visibility and recent triumph--and it is true that the gains made since the Gay Liberation Movement of the 1960s-1980s have been momentous. Yet much of this history forgets the long tradition of men whose identities transcended the tenuous binary that 20th and 21st century society has imposed on both contemporary and past societies of different races. Understanding that the standards and codes by which we measure modern society are often of modern, western invention will help us to better understand the actions and experiences of historical subjects.

 

What do you think of the article? Let us know below.

Sources

U.S. Seventh Census 1850: California. [Accessed November 2018]: https://www2.census.gov/prod2/decennial/documents/1850a-01.pdf

Gore Vidal, The City and the Pillar (E.P. Dutton & Co: New York, 1948).

Charles Callender and Lee M. Kochems, “The North American Berdache,” Current Anthropology, Vol. 24, No. 4 (August-October 1983).

David Wishart, Encyclopedia of the Great Plains, University of Nebraska, Lincoln, 2011 [Accessed November 2018]: http://plainshumanities.unl.edu/encyclopedia/doc/egp.gen.004

Timothy C Osborne’s Diary, Bancroft Library, UC Berkeley.

Albert Hurtado, Intimate Frontiers: Sex, Gender, and Culture in Old California (University of New Mexico Press: Albuquerque, 1999).

Susan Johnson, Roaring Camp: The Social World of the California Gold Rush (W. W. Norton & Company: New York, 2000).

George Chauncey, Gay New York: Gender, Urban Culture, and the Making of the Gay Male World, 1890-1940 (Basic Books: New York, 1995).

Barbara Welter, “The Cult of True Womanhood,” American Quarterly, Vol. 18, No. 2, Part 1 (Summer 1966).

Timothy Gilfoyle, City of Eros: New York City, Prostitution, and the Commercialization of Sex, 1790-1920 (W.W. Norton & Company: New York, 1994).

Frank Newport and Gary Gates, “San Francisco Metro Area Ranks Highest in LGBT Percentage,” Gallup, March 20, 2015 [Accessed November 2018]: https://news.gallup.com/poll/182051/san-francisco-metro-area-ranks-highest-lgbt-percentage.aspx.

Peter Boag, Re-Dressing America’s Frontier Past (University of California Press: Berkeley, 2011).

Brandon Ambrosino, “The Invention of Heterosexuality,” BBC, March 16, 2017 [Accessed November 2018]: http://www.bbc.com/future/story/20170315-the-invention-of-heterosexuality.

The Falkland Islands are some 300 miles (or about 480 kilometers) off the coast of Argentina and have been a British-owned territory since the nineteenth century; in 1982 Argentina and Britain fought a war over ownership of the islands. Here, Matt Austin considers civilian casualties during the Falklands War in the wider context of the decline of the British Empire.

Argentine prisoners of war during the 1982 Falklands War. Source: Ken Griffiths, available here.


Introduction

Beginning on the second of April and lasting until the fourteenth of June 1982, Britain fought a seventy-four-day war to retain one of its few remaining overseas territories. The Argentine writer Jorge Luis Borges referred to the Falklands War as “two bald men fighting over a comb,” a comparison that captures the sheer needlessness of the conflict in the eyes of many historians and writers.[1] It is therefore possible to suggest that the casualties endured during the Falklands War, an estimated eight hundred and seventy-eight dead in total, along with the more than eleven thousand Argentines taken prisoner, were themselves needless.[2] Ultimately, the motivations behind the Falklands War and the nature of how it was fought have made it one of the most unusual conflicts in British military history.

 

The Decline of the British Empire

Following the Second World War, Britain underwent a period of decline. Due to the heavy economic losses endured during the conflict, the nation was unable to effectively fund its Empire and granted independence to a number of its colonies from the 1940s onwards. The first of the major colonies to gain independence after the Second World War was India. With warring political groups and a lack of ‘safeguards’ for British business and trade interests, UK Prime Minister Clement Attlee decided Britain was to ‘abandon control’ of India in 1947.[3]

This was followed by the loss of numerous territories in the following decades, such as Ghana in 1957, Uganda in 1962, and Kenya in 1963. Southern Rhodesia, or Zimbabwe, as the newly independent state became known, was the last of the British territories in Africa when it gained its independence in 1980. Its loss represented the end of an era for the British Empire, following the Empire’s inevitable decline in the decades after the Second World War.[4] This left the former international powerhouse with a severely reduced, sparsely scattered group of overseas territories, threatening the nation’s global influence. Facing the threat of the Empire being lost completely, a prospect that had become gradually apparent over the preceding decades, Britain would rigorously attempt to retain and protect any of its remaining territories against invasion.

 

The Falklands War

The origins of the Falklands War can be attributed to the militant Argentine government’s decision to invade and occupy the neighboring islands in an attempt to encourage positive public opinion. Despite a severely weakened economy and increasing demands for the introduction of a democratic voting system, the government, under the military dictator Leopoldo Galtieri, received an outpouring of public support in favor of the invasion of the islands, as Argentine feelings of nationalism surged.[5] This reinforced the decision to defend the newly captured territory against the prospect of a British invasion.

Following news of the Argentine invasion and takeover of the Falkland Islands, Britain responded by sending a naval taskforce on April 5, 1982 to retake the islands from the invading forces. Ultimately, the conflict was short-lived, as Britain succeeded in regaining the Falkland Islands through more advanced military technology and superior combat training. US President Ronald Reagan was initially skeptical of Britain’s decision to win back the Falklands, suggesting that the islands were not worth an invasion. However, in an attempt to avoid political tension between the United States and the United Kingdom under Prime Minister Margaret Thatcher, he eventually decided to support the effort, providing Britain with weaponry and munitions, which aided the victory and shortened the conflict.

 

Military Casualties

Argentine casualties during the Falklands War numbered up to six hundred and forty-nine, around four hundred more than those of the British. The majority of the casualties occurred during attacks on naval ships carrying large numbers of troops. The British attack on the Argentine ship the General Belgrano alone resulted in almost half of all Argentine casualties, with three hundred and twenty-one of the ship’s one thousand one hundred crew being killed.[6] This has since been considered a highly controversial moment of the war, sparking debate over a possible war crime, as the Belgrano was attacked thirty-six miles outside the British exclusion zone that had been set up around the islands.[7]

Nevertheless, although the vast majority of casualties came from naval attacks, friendly fire was a larger issue for British troops in the Falklands than in most of Britain’s other twentieth-century conflicts, relative to the scale and nature of the war. Most incidents of British friendly fire occurred at night, largely because troops were misidentified amid the ‘monotonous, featureless terrain’ of the Falkland Islands.[8] Nor was it only British troops who fell victim to friendly fire: the war’s only civilian casualties are attributed to it as well.

 

Civilian Casualties

The decisive British victory, however, was underpinned by the regularly overlooked deaths of three civilians.[9] Whilst civilian casualties are sadly common in wartime, the deaths of these three Falkland Islanders are a rare occurrence in themselves, as they were caused by friendly fire. The three civilian deaths of the Falklands War hold great significance, as they demonstrate the contradictory nature and moral considerations that embodied this conflict. As the islands had been under British rule for centuries, those living there were British citizens and, being predominantly farmers, had little to no means of preventing the unexpected Argentine invasion. Consequently, there must have been a sense of relief when news reached the islanders that the British would launch an operation to take back the islands.[10] This was not to be the case, however, for three Falkland Islanders living in the capital, Port Stanley: Susan Whitley, Doreen Bonner, and Mary Goodwin lost their lives during the British bombardment of the capital.[11] Whilst these deaths are often overlooked in what is itself a considerably neglected conflict, they have come to represent something of British international relations in the latter half of the twentieth century.

What is so intriguing about these deaths is the wider moral implications that surround them. Britain, in an attempt to recapture the islands, supposedly for the safety of the Falklanders and their right to retain their British identity, caused the only civilian casualties of the war. This represents the contradictory nature of the conflict and raises a wider moral question: whether the unrealistic perception of the ‘Empire’, and the lengths Britain would go to ensure its survival, mattered more to the government and foreign policy makers than the people they were trying to protect.

 

Conclusion

These deaths highlight the imperial undertones of the Falklands War. This article therefore concludes by posing the question of British morality: was this conflict simply an overreaction, built up over post-war decades defined by the decline of a once powerful Empire, that bubbled over into one of the most unnecessary and frustrating conflicts in the nation’s history?

 

What do you think of the author’s arguments? Let us know below.


[1] Miles Kington, “What did you do in the Falklands War, Daddy?” The Independent, October 28, 1998, https://www.independent.co.uk/arts-entertainment/what-did-you-do-in-the-falklands-war-daddy-1181032.html.

[2] “Falkland Islands War. Cost and Consequences,” Britannica, accessed November 17, 2018, https://www.britannica.com/event/Falkland-Islands-War#ref302171.

[3] Nicholas Owen, “The Conservative Party and Indian Independence, 1945-1947,” The Historical Journal 46, no. 2 (June 2003): 404.

[4] Hevina S. Dashwood, “Inequality, Leadership and the Crisis in Zimbabwe,” International Journal 57, no. 2 (Spring 2002): 209.

[5] Paola Ehrmantraut, “Aftermath of Violence: Coming to Terms with the Legacy of the Malvinas/Falklands War (1982),” Arizona Journal of Hispanic Cultural Studies 15 (2011): 95-96.

[6] “Is Maggie Thatcher a War Criminal?” Belgrano Enquiry, accessed December 10, 2018, http://belgranoinquiry.com/.

[7] “Is Maggie Thatcher a War Criminal?”

[8] Beck, “How Are You Enjoying the Day?”

[9] Lucy Beck, “How Are You Enjoying the Day? Remembering the victims of the Falklands War,” April 2007, http://archive.ppu.org.uk/falklands/falklands3.html.

[10] David Saunders, Hugh Ward, David Marsh and Tony Fletcher, “Government Popularity and the Falklands War: A Reassessment,” British Journal of Political Science 17, no. 3 (July 1987): 281-282.

[11] Beck, “How Are You Enjoying the Day?”

Jesters were a key part of many Medieval courts. But are Jesters still among us? Here Daniel Smith (following his previous article on California in the US Civil War here) tells us about Medieval European Jesters and suggests parallels for Jesters in modern-day America.

A 16th century painting of a jester Source: here.


The Jester was common in the times of castles, villages, chain-mail, and treachery. In Medieval Europe, the elites and nobility would hire jesters, whom the aristocratic family would regard as “mascots”. These characters were well-educated individuals who came from a variety of upbringings, and they are known for their crazy styles and abstract apparel.[1] All of this was for the attention of the court, of course, and sometimes for its humiliation. These people were hired to amuse the lord and the lord’s guests. At times, Jesters, oddly enough, were paid to criticize them too!

These people of humor and talent had a privilege given to them by their master: freedom of speech. Interestingly enough, Jesters were among the few people in their lord’s presence who could speak their minds freely without risk of punishment. They typically used humor and parody to joke around and “razz” the nobles and elites.[2] Delivering bad news to the master was another of the Jester’s jobs - when paid appropriately.

 

Types of Jester

Excessive misbehavior, though, would result in some form of harsh punishment. There were two primary types of Jester in Medieval Europe – the natural fool and the licensed fool. The natural fool was seen as moronic in social settings, whereas the licensed fool had legal privileges granted to them to avoid the court punishments mentioned above for bad behavior.

The most apparent description of the Jester is a person who worked under the employment of a European noble, telling jokes and providing entertainment. Bright colors with eccentric hats and bells were a calling card for Jesters. A couple of surprising details pertain to the symbolism of the hat and scepter that Jesters often wore and carried. A head was usually carved into the top of the scepter, representing the actor. The scepter was more or less ornamental and was called a “marotte”.[3] This staff was symbolic in representing the authority of the royal court.

Overall, many of these actors held small roles in the courts they worked for (or were pressed into) and livened up most social events. It was a serious responsibility, and even an obligation, for the Jester to bring a smile to a sick or often angry King or Monarch. The position was held purely for the amusement and humor of the master. Preventing state affairs from becoming too serious was a main priority for the Jester, as was bringing excitement to courtly meals - apparently to aid digestion.

 

How The Jester Is Portrayed Today

Most of the Jester’s entertainment in the courts or within the master’s domain would likely include music (vocal and instrumental), prop and physical comedy, storytelling, and myth-making. Some historians also suggest that some Jesters juggled and were acrobatic. Basic tools, props, and instruments were all that was necessary for performances in the court.[4]

Jesters are comparable to today’s clowns. They also parallel Hollywood actors and musical artists. Today’s artists essentially do the same job as the Jester of the feudal courts. The only difference is that artists are able to connect to a mass audience, whereas the Jester could only reach the royal courts and social gatherings. I mean – the printing press didn’t even arrive until the 15th century!

Indeed, there are a number of similarities between Jesters and modern-day entertainment. For example, soap opera characters are sometimes corporate people scheming against one another’s rivals and family members. This could be because some Hollywood entertainment is produced and designed to appeal to the elites and higher classes of American society. Again, this mimics the Jester entertaining the master and getting paid for it.[5]

It is the same in modern America, just as it was for the elites and nobility behind castle walls. Today, actors are hired in Hollywood, and some powerful people could consider them “mascots”.

These actors, actresses, and musicians today are typically well-educated individuals who come from a variation of diverse upbringings – just as the Jesters of the past.

Actors, actresses, and musicians today are known for their crazy styles and abstract apparel – just as the Jesters of the past. All of this feeds the constant need for attention from the audience at home – just as the Jesters sought attention in court, of course.

Actors, actresses, and musicians today are people who are hired to amuse the master (the elite) and the master’s guests (the voters) – just as the Jesters of the past. 

And finally, at times, Jesters were paid to criticize the nobility at court…[6]

Just like the actors, actresses, and musicians today in Hollywood, who use their platforms of television and radio for political and social arguments… though broadcasting such criticism beyond the court is something you would not find from Jesters of the Royal Courts, for fear of cruel punishment.

Jesters are not a thing of the past. They are here today.

 

What do you think of the arguments in the article? Let us know below.

Finally, Daniel Smith writes at complexamerica.weebly.com.

References

1.     Billington, Sandra. “A Social History of the Fool,” The Harvester Press, 1984. ISBN 0-7108-0610-8

2.     Doran, John. “A History of Court Fools,” 1858.

3.     Hyers, M. Conrad, “The Spirituality of Comedy: Comic Heroism in a Tragic World,” Transaction Publishers, 1996. ISBN 1-56000-218-2

4.     Otto, Beatrice K., “Fools Are Everywhere: The Court Jester Around the World,” Chicago University Press, 2001

5.     Southworth, John, Fools and Jesters at the English Court, Sutton Publishing, 1998. ISBN 0-7509-1773-3

6.     Welsford, Enid, “The Fool: His Social and Literary History” (out of print) (1935 and subsequent reprints). ISBN 1-299-14274-5

Posted by George Levrier-Jones

The pre-Enlightenment world was simultaneously fascinating and frightening. People often had no choice but to rely on their imaginations to make sense of the myriad phenomena around them. The result was a world where everything seemed magical; a place teeming with angels, demons, fairies, and witches. Only through uncanny and sometimes ‘ridiculous’ superstitions did many people of the Dark Ages (or Middle Ages or Medieval Period) in Europe try to make sense of their world. Jamil Bakhtawar explains.

The devil swapping a baby. Artist: Martino di Bartolomeo, 15th century.


The Lucky Horseshoe

There are reasons why people of the Medieval period believed that horseshoes were lucky. The first was that they were made of iron, a metal that was believed to ward off evil spirits. Another comes from the legend of Saint Dunstan in the 10th century. It was said that Dunstan worked as a blacksmith and one day the Devil came into his shop. Dunstan pretended not to recognize him and went about getting horseshoes for the Devil’s horse.

However, instead of nailing the horseshoes to the horse, Dunstan nailed them to the Devil instead. The horseshoes caused the Devil immense pain, but Dunstan said that he would only remove them if the Devil promised never to enter a home with a horseshoe on the door.

The horseshoe was also believed to ward off witches - supposedly the reason they rode brooms rather than horses - and so it was said that a witch would be reluctant to enter any home with a horseshoe over the door.

There were also rules regarding the horseshoe. The first was that it had to be iron; the second was that it had to come off the horse on its own and not be taken off by any man. The horseshoe then needed to be nailed over the door with iron nails. There is some debate about the orientation of the horseshoe: some believe it should point up, so as to prevent the luck from spilling out; others believe it should point down, so that the luck can be poured upon those who enter the home.

 

The Royal Touch

People often accepted that kings and queens, by virtue of their divine right to rule, had the power to heal disease by their touch. One particular malady called scrofula, a tubercular inflammation of the lymph glands in the neck, was believed to be healed when touched by a sovereign. This healing was seen as validation of the monarch’s appointment from God. It was claimed that the first to practice the healing touch was Edward the Confessor, ruler of England from 1042 to 1066.

In medieval times, grand ceremonies were held in which the ruler touched hundreds of people afflicted with scrofula, or the “King’s Evil.” These people then received special gold coins called “touchpieces” that they regarded as amulets.

Edward the Confessor as shown on the Bayeux Tapestry.

God Blesses After a Sneeze

One of the best-known superstitions believed to come from the Middle Ages is the need to say “bless you” after someone sneezes. There was a belief that sneezing gave Satan the opportunity to enter the body, and the person who sneezed needed the help of God to exorcise the devil. Saying “God bless you” was believed to keep the Devil from entering the body and so save the person who had sneezed. It was a way to explain the death that sometimes followed a sneeze, and it gave people the sense that they could do something to help.

There was also the prevailing belief that a person could “sneeze out their soul.” This too was counteracted by saying “God bless you,” or by covering the face to keep the soul in. The superstition was encouraged by the spread of illness at a time when there was little means of helping people overcome devastating ailments.

 

Spilling Salt 

In the Middle Ages, salt was a precious resource, and it was believed to have medicinal properties. Spilled salt could no longer be used as medicine, so it was gathered up and thrown over the left shoulder in order to blind the evil spirits said to constantly follow people around.

There is an even older reasoning behind the superstition: salt was known to make soil barren for an extended time, and this is the basis for the belief that spilling salt is akin to cursing the land.

 

Changelings

One prevalent superstition in Medieval Britain was the fear that a child could be taken and replaced with a changeling. One of the stories of the changeling comes from the tale of a blacksmith who noticed one day that his son suddenly became lethargic and was wasting away.

The blacksmith was told that his son had been taken and replaced with a changeling. To prove it, he was told to pour water into empty eggshells and place them around the fire. The child then sat up and spoke in the voice of the changeling, stating that he had lived for centuries and had never seen such a thing. The blacksmith threw the changeling into the fire, then journeyed into the land of the fairies with his Bible, and the fairies, unable to harm him because of the Bible, returned his son.

There were a number of unusual tests that people performed to try to discover whether their child was a changeling. They typically involved doing something so strange that it would draw the changeling out in surprise. One test was to place a shoe in a bowl of soup; if the baby laughed, it was a changeling. Making bread inside eggshells was likewise said to be so amusing to changelings that it would cause them to expose themselves. Some scholars have suggested that the changeling legend may have served to explain autistic children, especially since such changes can come on quickly: when a child’s behavior and verbal skills rapidly declined or changed, it was blamed on the doings of a changeling.

 

Magic and Witchcraft

Throughout Europe during the Middle Ages, belief in magic and sorcery was pervasive. Magic involved attempts to harness ‘supernatural’ powers for personal benefit. A fascinating combination of magic and religion ran through the lives of ordinary people, who at times turned to spiritual practitioners specializing in a multitude of beneficial magical services. Charms, prayers and rituals, which also incorporated aspects of Christianity, were routinely employed in attempts to provide a diverse array of benefits. They also acted as a protective barrier against harmful magic, or maleficium, which was blamed for many of the hardships that plagued the European population during this period. For the most part, attempting to manipulate these powers was deemed superstitious by the religious establishment.

Consequently, by the beginning of the 13th century, witchcraft in the Middle Ages began to be considered ‘demon worship’ and was feared throughout Europe. People believed that magic represented Satan and was associated with devil worship. The types of magic said to be practiced during the Dark Ages were:

1. Black Magic

Black Magic was known as the ‘harmful’ type of magic, with a closer association with the devil and satanic worship. If someone fell ill of unknown causes, this was frequently blamed on witches practicing black magic. Other harms to society, such as accidents or deaths, were also attributed to Black Magic.

 

2. White Magic

The basis of White Magic was Christian symbolism, and it focused on the power of nature and herbs. It was considered the ‘good’ type of magic, used for good luck, love spells, wealth and good health. Astrology constituted another significant part of White Magic, as did alchemy, the practice of making potions.

 

The 1486 book Malleus Maleficarum (often translated as The Hammer of Witches) decreed that witchcraft was heresy. It asserted that witches were mostly women, whose lust led them to form pacts with the Devil. Midwives were especially singled out for their alleged ability to prevent conception and terminate pregnancies, and were accused of eating infants and offering live children to the Devil. But the real heinousness of the Malleus Maleficarum lay in the procedures it drew up to identify and exterminate witches.

The accused were stripped and searched for the ‘devil’s marks’, then dunked in water or burned. Using the Malleus Maleficarum as a guide, torture was liberally employed to extract confessions or implicate other people throughout the period of witch hysteria. Gruesome torture devices were developed that could crush or dislocate bones, mangle bodily orifices and tear out fingernails; red-hot pincers were applied to tear out pieces of flesh as well. Those found guilty of witchcraft were burned at the stake. All in all, there is no more damning testament to the dangers of superstition than the Malleus Maleficarum.

The cover of a 1520 edition of the Malleus Maleficarum.

The Bride’s Garter

Bridal garments were considered blessed. The bride would have all her clothes ripped from her by the guests on the wedding night as everyone tried to snatch a piece. Gradually attention focused on the bride’s garter-ribbon – a symbol of sexuality and fertility.

In Medieval times, unmarried men fought for the bride’s garter to ensure they would be the next to find a beautiful and fertile wife. Bachelors even mobbed the bride as she stood at the altar, throwing her to the ground and ripping the garter from her during the wedding ceremony. When the church protested, the custom evolved: the groom removed the lucky garter from his new wife in the bridal chamber and tossed it down to the waiting men.

 

Number 13: A Troublesome Phobia

The belief that the number 13 is cursed had primarily religious reasoning in the Middle Ages. Thirteen people attended the Last Supper, and so a gathering of 13 people was believed to be a bad omen. Many believed that if a party was held for 13 people, whoever was the first to get up would be dead within the year.

With this superstition, people of the Dark Ages ensured there would never be 13 people gathered together. In fact, by the 16th century, it was claimed that anyone who gathered 13 people together was a witch.

 

Conclusion

When one thinks of magic and superstitions of the Dark Ages, a number of assumptions and sometimes misguided beliefs come to mind. Witch hunts and superstitions caused many deaths and worried the minds of countless people. The world in which most Europeans lived involved belief in supernatural and mystical elements, over which humans could exert some influence with not only evil intentions and outcomes, but with good results as well. Although doing so was forbidden by the church, magical activities and beliefs were an integral part of the ordinary life of the people.

 

What Middle Ages superstitions do you know about? Let us know below.

Sources

http://www.thefinertimes.com/Middle-Ages/witches-and-witchcraft-in-the-middle-ages.html

Stuart Clark, “Witchcraft and Magic in Early Modern Culture,” in Witchcraft and Magic in Europe, ed. Bengt Ankarloo and Stuart Clark (Philadelphia: University of Pennsylvania Press, 2002).

Bailey, Magic and Superstition, 193.

https://hchroniclesblog.wordpress.com/2013/08/01/superstitions-of-medieval-england/

https://prezi.com/6caqxk-lpxhh/superstitions-of-the-medieval-times/

https://www.historyextra.com/period/medieval/historical-superstitions-why-friday-13th-unlucky-kiss-under-mistletoe/

The US Supreme Court has played a key role at times in US history. One such occasion was when a decision was required on segregation in the 1890s. Here, Jonathan Hennika continues his look at the history of the US Supreme Court (following his articles on Marbury v Madison here and Dred Scott here), and focuses on the 1896 case of Plessy v Ferguson.

Justice Henry Billings Brown, who wrote the majority opinion in the Plessy v Ferguson case.

Recently, President Trump criticized a federal judge who ruled against his administration’s asylum policy, calling him an “Obama judge.” While it is customary for judges to avoid commentary on this type of political remark, Supreme Court Chief Justice John Roberts thought little of Mr. Trump’s characterization. In repudiating the President’s comment, the Chief Justice relied on the conventional wisdom of an apolitical judiciary. In a rare display of judicial independence, Chief Justice Roberts declared, “We do not have Obama judges or Trump judges or Bush judges or Clinton judges, what we have is an extraordinary group of dedicated judges doing their level best to do equal right to those appearing before them.”[i]

In discussing the debate between President Trump and Chief Justice Roberts, The New Republic’s Jonathan Zimmerman invokes the “Martin-Quinn” measure of the judiciary to point out that “we have a Supreme Court where every Republican on the court is more conservative than every Democrat.” What this means for the nation is that “gone are figures like John Paul Stevens, a Republican judge nominated by a Republican president, who usually sided with Democrat-nominated justices on the court.”[ii] The Martin-Quinn measure was developed by two University of Michigan academics, funded in part by the National Science Foundation, to provide a statistical analysis of the Court’s ideological leanings since 1937.[iii]

While helpful, one does not need a statistical analysis of ideological leanings to examine the Court’s history. The Court’s Dred Scott decision ill defined what it meant to be a citizen. In another infamous ruling, the Court ratified de jure discrimination when it ruled in support of the de facto discrimination of the Jim Crow South.

 

Plessy versus Ferguson: A Test Case Like No Other

The American landscape changed dramatically in the decades after the Supreme Court’s Dred Scott decision. The nation fractured with the Civil War, followed by a period of Reconstruction. Concurrently, it was a period of rapid industrialization in the North and Midwest, while the agricultural South fell into a system of tenant farming and sharecropping. This system flourished in part because, in the states of the old Confederacy, societal divisions continued along racial lines. Historically referred to as the Jim Crow era, it was the period of segregation of the races.

One of the popular benchmarks historians use to measure a nation’s growth is transportation. Rail united America: by 1890, approximately 163,597 miles of railroad track crisscrossed the land.[iv] To govern rail passage, the state of Louisiana enacted the Separate Car Act, legislation requiring railroad companies to maintain a second set of train cars for African American passengers. In New Orleans, a group of like-minded residents formed the “Comité des Citoyens,” or Committee of Citizens, to fight the law. The Committee asked Homer Plessy, with his one-eighth African heritage, to participate in a test case, and he agreed. After purchasing a first-class ticket and boarding the train’s whites-only car, Plessy was arrested by a private detective hired for that sole purpose.[v]

The case made its way to the Supreme Court, which heard arguments and issued its ruling on May 18, 1896. Unlike the Court that ruled on Dred Scott, the Plessy court was regionally diverse. Three of the justices hailed from the Northeast, two from the South, three from the Midwest, and one from the West. Regionalism, once the rallying cry of the nation, gave way to other considerations. “Most of the judges were conservatives who favored protection of property rights vis-à-vis state regulation of private property….Justice Stephen J. Field, a California Democrat…was by far the most influential member of the Plessy court….he was an early champion of minority rights, he later became an advocate of laissez-faire economics and championed the revolution in due process of law…when the Court recognized substantive due process as a limitation of state legislative power.”[vi] The Court became economic activists, issuing rulings limiting the effectiveness of governmental regulations on private enterprises. The railroad company, the East Louisiana Railroad, was a voluntary participant in the lawsuit; it objected to the Separate Car Act on the economic grounds of the added expense of the African American-only cars. When the case was decided, however, the laissez-faire, hands-off attitude espoused by Field could not stand up to the institutionalized racism predominant in the American South. In a 7 to 1 decision (Justice Brewer did not participate), the Court ruled that the Separate Car Act did not violate the 14th Amendment’s Equal Protection Clause.

 

Separate but Equal?

Justice Henry Brown wrote for the majority: “The object of the [14th] amendment was undoubtedly to enforce the absolute equality of the two races before the law, but, in the nature of things, it could not have been intended to abolish distinctions based upon color, or to enforce social, as distinguished from political, equality, or a commingling of the two races upon terms unsatisfactory to either. Laws permitting, and even requiring, their separation in places where they are liable to be brought into contact do not necessarily imply the inferiority of either race to the other and have been…recognized as within the competency of the state legislatures in the exercise of their police power.”[vii]

The irony of the Plessy decision is that neither the attorneys involved nor the Court ever discussed the premise of equal accommodations. The legal question revolved around the constitutionality of the Louisiana statute when measured against the Constitution. In arguing that the state followed the 14th Amendment in creating the Separate Car Act, Brown turned to inherently racist logic:

It is claimed by the plaintiff in error that, in any mixed community, the reputation of belonging to the dominant race, in this instance the white race, is property in the same sense that a right of action or inheritance is property. Conceding this to be so for the purpose of this case, we are unable to see how this statute deprives him of, or in any way affects his right to, such property. If he be a white man and assigned to a colored coach, he may have his action for damages against the company for being deprived of his so-called property. Upon the other hand, if he be a colored man and be so assigned, he has been deprived of no property, since he is not lawfully entitled to the reputation of being a white man.[viii]

 

University of Michigan historian Rebecca Scott summarized the Court’s decision as embracing the “white-supremacist formulation—which reinterpreted the claim to equal treatment as a matter of forcing oneself where one was not wanted—that carried the day…the damage thus done was both practical and doctrinal, formalizing the sleight of hand that portrayed an aggressive program of state-imposed caste distinctions as the mere ratification of custom.”[ix]

Segregation in the American South continued unabashedly throughout the twentieth century. Proponents of segregation embraced the Plessy decision as validation that separate was equal. They argued that classification was not discrimination, and that if all members of a class received the same treatment there was no disparity. The Supreme Court heard other cases regarding racial bias, but it was not until its Brown versus Board of Education decision in 1954 that separate but equal was declared unconstitutional. In the intervening years, there were additional rulings that belie the supposedly apolitical nature of the federal judicial system’s top court.

 

What do you think of the article? Let us know below.


[i] Jonathan Zimmerman, “Who is John Roberts Kidding,” The New Republic, November 26, 2018. https://newrepublic.com/article/152399/john-roberts-kidding

[ii] Ibid.

[iii] Martin-Quinn Score Project Description, http://mqscores.lsa.umich.edu/

[iv] Central Pacific Railroad Photographic History Museum, “Maps Showing the Progressive Development of U.S. Railroads - 1830 to 1950,” http://cprr.org/Museum/RR_Development.html

[v] Plessy v Ferguson, 163 U.S. 537 (1896)

[vi] David W. Bishop, “Plessy v Ferguson: A Reinterpretation,” The Journal of Negro History, 62 (Apr. 1977), 126.

[vii] Plessy v Ferguson, 163 US 537; 544

[viii] Ibid, 549.

[ix] Rebecca Scott, “The Atlantic World and the Road to Plessy v Ferguson,” The Journal of American History, 94 (Dec. 2007), 731.


Following the finding of the spies who shared American atomic secrets with the Soviet Union (read more here), the “Red Scare” was sweeping over 1950s Cold War America. And Cold War espionage was not going away. Here Scott Rose explains how Rudolf Abel’s New York-based Soviet spy ring was discovered in 1957.

A Soviet stamp from 1990 commemorating Rudolf Abel.

The United States broke the Soviet atomic spy ring in the early 1950s, after the USSR had already accomplished its goal of acquiring the American information its scientists needed to build an atomic weapon. However, this was not the end of Cold War espionage between the two superpowers; in fact, it was barely the beginning. Both countries used every available method to find out each other’s plans and secrets, and in the process, many participants in this game either died, were sent to prison, or were ruined personally and politically.

When atomic spies Julius and Ethel Rosenberg were tried, convicted and executed, they never admitted guilt or gave up any of their contacts. One of their contemporaries in New York was running a Soviet spy ring of his own, which he continued to build for seven years after the Rosenbergs were arrested.

 

An Espionage Artist

In 1948, an artist and photographer named Emil Goldfus rented a small studio in Brooklyn. While his artistic talents were average at best, Goldfus had a talent for espionage that was anything but average. Mr. Goldfus was actually a Soviet KGB colonel named Rudolf Abel, and he had been one of the Soviet Union’s greatest spies during World War II. Proficient in Russian, English, Polish, German, and Yiddish, Abel was a uniquely versatile spy. He had perfectly impersonated a German military officer and was able to give the Red Army valuable information on German troop movements.

Abel came to New York in 1948, and he quickly built a spy network in America. In addition to his new espionage contacts, he made friends among other artists, who never had any reason to suspect him as being anyone other than who he said he was. Abel would sometimes leave town for weeks at a time, which his friends attributed to his eccentric, bohemian personality. 

By 1954, Abel had built a large spying operation, and his methods of transmitting coded messages included placing microfilm inside hollowed-out bolts, coins, and pencils. The Soviets decided Abel needed an assistant, to which he objected. Nevertheless, the Soviets sent an agent named Reino Hayhanen to help Abel in New York. Abel quickly found his assistant to be completely incompetent, but tried his best to make an effective spy out of Hayhanen. A year later, the KGB, concerned that Abel was becoming exhausted, recalled him to the USSR for six months of vacation.

When Abel returned to Brooklyn, he found his operation in shambles. Hayhanen had been extremely careless, and had spent much of the network’s finances on alcohol and prostitutes. By 1957, Abel had had enough, and demanded that his assistant be recalled to the Soviet Union. Hayhanen received his recall orders and panicked, fearing he would be executed upon arriving in Moscow. He made it as far as Paris, where he walked into the American embassy, telling his story and pleading for asylum. At first, the CIA suspected Hayhanen was drunk, and he may very well have been. However, they decided to verify the information he had given them, and realized he was telling the truth.

When Hayhanen didn’t arrive in Moscow, the Soviets knew right away that he had defected. Abel was recalled, but didn’t make it out of the United States. Just before he was scheduled to leave, the FBI arrested him at a hotel in New York. He knew he was caught when an FBI agent addressed him as “Colonel.” Ever the professional, Abel didn’t say a word when he was arrested, simply staring ahead. However, Abel had not disposed of the evidence in his studio before he attempted to leave the United States, a surprising error for a spy as seasoned as Abel. When the studio was raided, the FBI realized it had found a goldmine of information. All sorts of spying and transmission equipment were found in the studio, but most importantly, there were photos of Soviet agents in the USA, along with lists of their names. Within weeks, Abel’s entire network of spies was shut down.

 

The Client Nobody Wanted to Represent

Abel was charged with espionage, and his next predicament was that hardly any defense attorneys in America wanted to represent him. In the late 1950s, the United States was dealing with the lingering effects of the Rosenberg case as well as the “Red Scare” that had been whipped up by Senator Joseph McCarthy, who had ruined many careers by accusing people in all parts of American government and culture of having communist leanings. Not many lawyers, especially ones with political ambitions, could afford to be seen defending a KGB colonel in court. Eventually, the US government found an attorney willing to take on the case: James Donovan, who had previously worked for the American OSS (the precursor to the CIA). It might seem ironic, but Donovan did everything he could to defend his client.

When the case went to trial, it probably would not have mattered who his lawyer was, as the evidence against Rudolf Abel was massive and undeniable. In essence, Donovan knew Abel was probably going to be convicted. His main objective at this point was to keep Abel from getting the death penalty, as the Rosenbergs had. He succeeded in this; when the court found Abel guilty, his life was spared in favor of a 30-year prison sentence. Donovan was not finished though, appealing the case to the US Supreme Court. He argued that the evidence from the studio, by which Abel was convicted, had been obtained in violation of the Fourth Amendment. In a 5-4 vote, the Supreme Court upheld Abel’s conviction, and he was sent to the federal penitentiary in Atlanta to begin his sentence.

While in prison, Abel kept himself busy with intellectual activities, such as painting and playing chess. Some days, he even passed the time by writing out tables of mathematical logarithms. Abel befriended several other convicted spies, including the Rosenbergs’ former accomplice, Morton Sobell. A couple of years after beginning his sentence, events on the other side of the globe would begin to work in Abel’s favor.

The FBI mugshot of Rudolf Abel after his arrest in 1957.

Gary Powers and the U-2 Incident

In 1960, the Soviets claimed to have shot down an American U-2 spy plane that was performing reconnaissance over the USSR. The pilot, Gary Powers, had ejected and survived, but was captured and brought to trial. The trial was designed to be a major propaganda victory, but it turned into an embarrassment for the Soviets. Powers admitted piloting a spy plane, adding that he had been flying reconnaissance missions over the Soviet Union for the past four years. He also told the court that his plane was not shot down at all; the U-2 had suffered a flame-out that had forced him to eject.  When the trial ended in August 1960, Powers was sentenced to ten years in a Soviet prison.

James Donovan, who had represented Rudolf Abel at his trial, recognized the opportunity to free both Abel and Powers. He orchestrated a prisoner exchange with the Soviets, who were eager to get Abel back. In February of 1962, Abel was released to the Soviet Union after serving only four years of his sentence. Likewise, Gary Powers was returned to the United States, where after retiring from the Air Force, he became a test pilot, as well as a helicopter traffic reporter for a Los Angeles television station. The prisoner exchange took place on the Glienicke Bridge in Berlin. As part of the swap, an American student named Frederic Pryor was released from the custody of the East Berlin police. In August of 1961, Pryor had been arrested and held by the East Germans, on the false suspicion that he was a spy for the CIA.

The Soviets treated Abel well when he returned, as he had been a valuable Cold War operative before being brought down by a bumbling assistant. He continued working for the KGB, even giving speeches to Soviet schoolchildren about intelligence operations. Just as Morris and Lona Cohen (two of the American atomic spies) were commemorated on Soviet postage stamps, Abel was honored on a stamp in 1990, one year before the fall of the Soviet Union. However, Abel’s luck had run out long before; after a lifetime of chain smoking, he died of lung cancer in 1971.

The story of the prisoner exchange was portrayed in the 2015 film Bridge of Spies. Frederic Pryor, who is still alive, went to see the film and claimed to have enjoyed it, while considering it over-dramatized. Pryor told a fellow moviegoer that the film had many inaccuracies, and the other person asked, “How do you know that?” Pryor answered, “I’m Frederic Pryor.”

 

What do you think of the article? Let us know below.

REFERENCES

Chester B. Hearn, Spies & Espionage: A Directory, Thunder Bay Press, 2006

James B. Donovan, Strangers On A Bridge: The Case of Colonel Abel, Atheneum House, 2015

Ryan Dougherty, “Economist Frederic Pryor Recounts Life as a ‘Spy’”, Swarthmore College News & Events, October 21, 2015

Giles Whittell, Bridge of Spies: A True Story of the Cold War, Broadway Books, 2010

King Henry VII of England’s eldest son and first in line to the English throne was Arthur, Prince of Wales (1486-1502). However, he suffered an untimely death at the age of 15, which made Henry first in line to the throne - and later King Henry VIII of England. But what would have happened had Arthur survived and become King of England? We explain here (this follows the author’s past Tudor article on King Edward VI here).

Arthur, Prince of Wales. Painting c. 1500.

The Tudors are one of the most renowned and notorious English royal families in history with countless books, movies, articles, and research devoted to understanding them. No doubt King Henry VIII is the center of historical interest in the Tudors, with particular emphasis on his six wives and the reigns of his three children. Henry VIII of England presided over sweeping political and religious changes that brought the nation into the Protestant Reformation and radically altered the fabric of English life.

But Henry was only second in line to the English throne, after his elder brother Arthur, Prince of Wales, who died in April 1502, most likely of tuberculosis. The short life of Prince Arthur Tudor is overshadowed and largely forgotten in Tudor history, only to be recounted in “The King’s Great Matter” nearly thirty years after his death. Had Arthur lived and ascended the English throne instead of Henry VIII, what course would English history have taken?

 

The Birth of Arthur

When Arthur was born in Saint Swithun’s Priory (now Winchester Cathedral Priory) on September 19/20, 1486, he was not only heir to the English throne but also the product of two unified royal houses, York and Lancaster. His place of birth was believed to have been the capital of the legendary Camelot and the site of King Arthur’s castle. Hence, the infant boy was given the name Arthur, to evoke memories of the legendary King Arthur, who led the defense of Britain against Saxon invaders in the late 5th and early 6th centuries.

With the Tudor dynasty off to a successful start, Henry VII was convinced his son’s birth would bring about a golden age. Arthur was given a magnificent christening on September 24th, noted by David Starkey as “the first of many spectacular ceremonies that Henry used to mark each stage of the advance and consolidation of the Tudor dynasty.” At two years of age, Arthur was betrothed to Katherine of Aragon, the youngest daughter of the joint Spanish monarchs, Ferdinand of Aragon and Isabella of Castile. The following year, in November 1489, Arthur became Prince of Wales. In 1492, following a precedent set by his grandfather, Edward IV, the heir to England was sent to reside at Ludlow Castle in the Welsh Marches to begin his education as the future king.

Starkey writes of Arthur growing up to be “a model prince” who “displayed the exaggerated sense of responsibility of the eldest child.” In personality he was “intellectually precocious,” though he presented a stiff public manner. Historians Steve Gunn and Linda Monckton describe Arthur as “amiable and gentle” and a “delicate lad.”

 

Meeting his future wife – And Tragedy…

In the autumn of 1501, Katherine of Aragon landed in England and met her husband-to-be at Dogmersfield in Hampshire. They were married on November 14th, 1501; Arthur’s 10-year-old brother Henry escorted the bride to the cathedral. Arthur wrote to Katherine’s parents that he would be “a true and loving husband.” We do not know exactly what followed the traditional bedding ceremony, the only public bedding of a royal couple recorded in Britain in the 16th century. Yet the next morning Arthur boasted: “bring me a cup of ale for I have been this night in the midst of Spain!”

His sincere affection and longing for Katherine are evident in a letter from October 1499 in which Arthur refers to her as “my dearest spouse” and writes:

I cannot tell you what an earnest desire I feel to see your Highness, and how vexatious to me is this procrastination about your coming. Let [it] be hastened, [that] the love conceived between us and the wished-for joys may reap their proper fruit.

 

After living at Tickenhill Manor for a month, Arthur and his new bride traveled to the Welsh Marches, where they established their household at Ludlow Castle. Plague and illness had been lingering in the area, but the young prince disregarded the danger and carried on with his duties. In late March 1502, he and Katherine were suddenly struck by “a malign vapour which proceeded from the air.” Katherine recovered, but not before her husband, the heir to the English throne, died on April 2nd, just six months short of his sixteenth birthday. Fifty-one years later, Arthur’s nephew Edward VI, the last male heir of the Tudor dynasty, would die at the same age.

Theories on the cause of Arthur’s death range from cancer to consumption. A commonly suggested cause, consistent with Katherine of Aragon’s simultaneous illness, is the deadly sweating sickness. The disease first appeared in England in 1485, when Henry VII took the throne, and recurred sporadically thereafter, with one of the worst epidemics coming in 1528.

The heavy responsibility of heir to the throne then fell on the young Henry VIII, who married his brother’s widow in 1509. When his marriage to Katherine of Aragon failed to produce a surviving male heir, King Henry sought to have it annulled on the grounds that Katherine had previously been married to his brother, a union forbidden by Scripture. Katherine argued in her defense that her marriage to Arthur had never been consummated. Henry took matters into his own hands: he broke from the Roman Catholic Church to marry his mistress, Anne Boleyn, established the Church of England, and propelled the Protestant Reformation in England.

 

What if… Arthur had become King?

But had Arthur survived and remained married to Katherine, how would history be different? Specifically, what role would the Reformation have played in England, and would he have lived up to the great legend and “golden age” his parents hoped for?

By all accounts, Arthur’s nature most resembled his father’s. Italian visitors in 1497 reported that Henry VII “evidently has a most quiet spirit.” In 1504, a Spanish visitor reported back to the Catholic Monarchs that “… He is so wise and attentive to everything; nothing escapes his attention.” It seems the late prince would have been, like his father and unlike his brother, less argumentative and more faithful to his wife. Given his understanding of his duty to his country and the likely happy marriage between Arthur and his Spanish bride, he would have had little reason or temptation to abandon the alliance with Spain. He would also have had far less reason to break away from Rome and drive forward an English Reformation, especially if he and Katherine had managed to produce male heirs, even though the Reformation had already sprouted in the German states.

Unlike Henry, who as the younger son had been trained in the workings of the church, Arthur would not have been involved or interested in the English church, and Katherine’s strict devotion to Catholicism would have further deterred him from risking excommunication. If, by any far-stretched chance, Arthur had faced the same succession crisis as Henry, would he have divorced Katherine and remarried? Being the staid boy he was, Arthur would likely have made a foreign alliance with another European power through a second marriage.

All of this is simply speculation, though. If Arthur had indeed fulfilled his parents’ hopes, it would likely have been in the image of his father: a more careful and consolidated reign that avoided war and replaced medieval rule with a centralized and united Tudor state. The court would have remained much the same, under a distant and occasionally absent king.

 

And the Tudor Dynasty?

On the question of the continuation of the Tudor dynasty, Arthur and Katherine’s surviving children could have carried it on for multiple generations. But where would that have left the union of the crowns of England and Scotland? Upon the death of Queen Elizabeth I without an heir, the English throne passed to her cousin, King James VI of Scotland, uniting the crowns of both countries. It is unlikely this would have taken place had the Tudor dynasty continued.

Especially under Queen Elizabeth I’s reign, England saw a golden age of literature, music, and visual arts amid the English Renaissance. Would the same have occurred under King Arthur and his descendants? As a child, he was a skilled pupil, educated in poetry and ethics, who studied the works of Cicero, Homer, Ovid, and Virgil. By 1501, he had even learned to dance “right pleasant and honourably.”

As for the economic might of England, Henry VII’s hopes for his eldest son would undoubtedly have included lessons in wisdom and parsimony. Combined with Arthur’s temperate nature and disinterest in fighting wars with other countries, this could have produced a more flourishing economy during his reign.

At the time of Arthur’s birth, the fate and hopes of England, and of his father, rested on him, with the expectation that he would usher in a new era. Now, five centuries after his untimely death, he has been largely forgotten, overshadowed by his younger brother, the infamous Henry VIII, and by his nephews and nieces. Had the 15-year-old prince survived his deadly affliction in 1502, history would no doubt have been drastically different.

 

What do you think? How would English history have been different if Arthur had become King instead of Henry VIII?

Sources

“Arthur, Prince of Wales.” Wikipedia, Wikimedia Foundation, 27 Nov. 2018, en.wikipedia.org/wiki/Arthur,_Prince_of_Wales.

Buckingham, Maddie. “What If Arthur Had Become King of England?” W.U Hstry, 16 July 2016, wuhstry.wordpress.com/2012/02/27/what-if-arthur-had-become-king-of-england/.

Crowther, David. “The History of England.” The History of England, 2016, thehistoryofengland.co.uk/resource/henry-vii-character-and-portraits/.

NikitaBlogger. “A Real King Arthur: How Would English History Have Been Different If Arthur Tudor Had Lived?” Royal Central, 17 May 2017, royalcentral.co.uk/blogs/a-real-king-arthur-how-would-english-history-have-been-different-if-arthur-tudor-had-lived-82381.

Ridgway, Claire. “Arthur, Prince of Wales.” The Anne Boleyn Files, 20 Sept. 2010, www.theanneboleynfiles.com/arthur-prince-of-wales/.

Ridgway, Claire. “The Death of Arthur Tudor by Sarah Bryson.” The Tudor Society, 16 July 2018, www.tudorsociety.com/arthur-tudor-sarah-bryson/.

World War Two nurse Reba Z. Whittle is a unique and all-too-often forgotten figure. As well as helping many troops with their injuries, she became a prisoner of war in Germany. Here, Matt Goolsby continues his series on nurses in war and tells her extraordinary story (following articles on US Civil War nurses Clara Barton (here) and Cornelia Hancock (here), and World War One nurse Julia Catherine Stimson (here)).

American World War Two nurse Reba Z. Whittle. Source: US Air Force, available here.

The World of the 1930s

Leading up to World War II, the world had already seen significant conflict.

World War I, the ‘Great War’, along with the stock market crash of 1929, had plunged the global economy into the ‘Great Depression’. These events were dubbed ‘Great’ because of their worldwide impact.

Alongside the ‘Great Depression’, China had been deluged by massive flooding and America was suffering a suffocating drought. China was also becoming a hotspot of war on the Asian continent as the Japanese expanded their territorial conquests with the invasion of Manchuria in 1931.

There was trouble brewing again in Europe, with fascism on the rise. Francisco Franco in Spain, Benito Mussolini in Italy, and Adolf Hitler in Germany were all making overtures of war or already prosecuting it in their respective nations.

By this time, air travel, like the automobile, had become more commonplace throughout the world. The first intercontinental flights had taken place, and travel between continents was quicker and easier than ever before.

Telecommunications were now truly transcontinental, and many enjoyed daily radio and early television entertainment from studios such as NBC, CBS, and the Blue Network.

US Army nurses were professionals in their own right by this time and had become essential participants, dubbed ‘Angels’ by those they cared for.

 

Nurses as captives

After the bombing of Pearl Harbor on December 7th, 1941, five Navy nurses were taken captive by the Japanese on Guam on December 10th, 1941 - the first time nurses had ever been captured. These five were fortunate: they were repatriated via Mozambique, Portuguese East Africa, in August of 1942.

Nurses had been casualties of war since the Spanish-American conflict, but none had ever been taken captive. All told, 68 Army and 16 Navy nurses were taken captive during World War II.

Other Army and Navy nurses taken captive in the Philippines were not as fortunate as those repatriated via Mozambique.

The Japanese began their assault on the Philippines the day after the Pearl Harbor attack, and by April of 1942 had captured Bataan, the prelude to the ‘Bataan Death March’.

When Corregidor fell to Japanese forces on May 6th, 1942, 11 Navy and 66 Army nurses were captured; they were later interned that summer at Santo Tomas Internment Camp, which had previously been the University of Santo Tomas.

The nurses were fortunate to have been moved off Bataan on April 8th, 1942, just before the infamous Bataan Death March began, a march that drove American and Filipino prisoners close to seventy miles on foot and was the scene of numerous atrocities.

On a personal note, one of my uncles was taken captive and forced on the march from Bataan. He was never the same after liberation, but he was buried with full military honors at Arlington National Cemetery in Washington D.C.

The ‘Angels of Bataan’, as the men called them (also the ‘Battling Belles of Bataan’), survived three years in captivity without losing a single nurse and were liberated by American forces on February 2nd, 1945.

 

Battle Tested

In the European theater, a brave young woman named Reba Z. Whittle made history that is little known to this day.

Reba Zitella Whittle was born in Rocksprings, Texas on August 19, 1919.  Destined to have a lasting impact, she applied for an appointment as a Reserve Army nurse after graduating from the Medical and Surgical Memorial Hospital School of Nursing in San Antonio, Texas in June 1941.

Having spent a year in college as a Home Economics major at North Texas State College, Reba was better educated and a little older than other applicants of the Army Reserve Nurse Corps.

She was initially denied an appointment: at 5’7” she weighed only 117 pounds, seven pounds below the minimum requirement. The requirement was waived due to her educational qualifications, and she was advised to diet and rest to bring her weight up.

Her initial appointment to the Army Reserves came with orders in June of 1941 that stated: “Assigned to active duty with the Army of the United States for a minimum period of one-year, effective June 17, 1941, and will continue on this status until relieved for the convenience of the Government or otherwise."

An important fact to remember about this time period in world history is that many people saw the looming threats that were developing and felt a compelling desire to be involved. Their sense of duty to their countries still rings true to this day and is a tribute as well as a lasting legacy to the many who sacrificed so much for our freedom from tyranny. Regardless of the allied power that they represented, they banded together during this worldwide conflict that cost so many lives.

After taking the oath to defend her country, Reba was given the rank of 2nd Lieutenant as a Reserve Nurse in the Army Nurse Corps. She was ordered to report to the Albuquerque Air Base in Albuquerque, New Mexico, then assigned to the Station Hospital at Kirtland Field, New Mexico, and then to the Station Hospital at Mather Field, California, where she would spend the next 27 months as a general duty nurse.

Her life changed forever in January of 1943 when she volunteered for the Army Air Forces School of Air Evacuation. All of the school’s applicants were volunteers, as the potential for casualties ran high.

The Army utilized the C-47 cargo transport aircraft, which served a dual purpose. Its first role was to transport soldiers and their cargo to the battle lines. Having delivered that initial load, the planes would then switch roles to become evacuation air ambulances for the wounded. Since the planes were not identified as hospital transports and carried no Geneva Red Cross insignia, the assignment was very risky for the crew.

Lieutenant Whittle started training at the Army Air Forces School of Air Evacuation at Bowman Field, Kentucky on September 23, 1943. Prior to her arrival, the training had expanded from four to six weeks. 

According to the book, The Forgotten POW, the following details were highlighted as part of her training: “The intent of the program was to make the nurse largely self-sufficient on the flight. The nurse was required to use the equipment and medical supplies provided on the plane for treatment to relieve pain, to prevent hemorrhage, to treat shock, to administer oxygen, and in every way to meet any circumstances that might be encountered. In contrast to the hospital ward situation, all of this was to be done in the absence of a physician. Only in rare instances did a flight surgeon accompany a patient on a flight.”

Having completed her training, Lieutenant Whittle was assigned to the 813th Medical Aeronautical Transportation Squadron in Great Britain. The unit was initially stationed in the city of Nottingham, but later moved to Brighton and then to Grove. During her time there, from January through September of 1944, she flew 40 missions and logged some 500 flying hours, 80 of them in combat. She quickly became a veteran combat nurse who, coincidentally, had read nurse Juanita Redmond’s 1943 book, I Served on Bataan. This is where she learned what had happened to her fellow nurses captured in war.

Little did she know she would be next.

 

Lone POW

The day of September 27th, 1944 was not a good one for 2nd Lieutenant Reba Z. Whittle.

In the diary that she began not long after being captured, she wrote that on Wednesday, September 27th, she left England intending to return and then go to London the next day, her day off.

As it turned out, this was not to be: “Was sleeping quite soundly in the back of our hospital plane until suddenly awakened by terrific sounds of guns and cracklings of the plane as if it had gone into bits. For a few moments I hardly knew what to think. Can assure anyone a more than startled expression and sensation. Suddenly looked at my Surgical Tech opposite me with blood flowing from his left leg. The noise by this time seemed to be much worse. But to see the left engine blazing away - is simply more than I can express - But never thought I would land on the ground in one peace [sic]. My prayers were used and quick.”

After the plane had hit the ground and all but one of the crew had got out, there were more surprises to come: “Immediately we saw soldiers not many yards away. At first we thought they were British soldiers. Second glance we recognized they were German GIs. This feeling is one never dreamed of having. But thought - we've had it chum. The first thought in my mind - my boyfriend and he would be waiting back at my quarters that evening. But how thankful and grateful to be alive.”

War has a tendency to bring out the best and worst in humanity, but the ground soldiers or ‘grunts’ who served in the German Army were surprisingly gentle and concerned: “They took a glance their guns pointing and immediately one took out a bandage and put around my head as it was bleeding. The surprised look on their faces when they saw a woman was amazing. But they bandaged us and away we marched our ship still burning.”

The unfortunate mistake the navigator made during the crew’s mission was to drift off course. Instead of flying to France, the mis-navigation took them over Aachen, Germany, where they were shot down.

During the crash, Reba sustained a concussion and a severe laceration that were later treated by a German doctor. Her ordeal would continue as the Germans who’d captured them drove them deeper into Germany.

Traveling on weathered roads with injured POWs inside a cold and dirty truck, the German Army would periodically stop at different towns to take meal breaks or to discuss what to do with the prisoners. One such encounter between a German doctor and Lieutenant Whittle is indicative of the difficulties of war: “Next stop was a German hospital where they unloaded the wood. A German officer takes us in. Where more questions asked. And just what I was - a Dr. came in and looked all over and asked me questions of being a nurse. Shook his head saying, ‘Too bad having a woman as you are the first one and no one knows exactly what to do.’"

Her travels continued throughout the German countryside as she was assigned to hospitals along a circuitous route that the German command determined on a seemingly random basis. At each new location, her nursing training met practical experience as she helped the wounded American and British POWs.

Along the way, those she met continued to be astonished that a woman had been captured, and intrigued by how it had happened.

 

Free at last

As Reba’s internment continued, her writings demonstrated the perseverance and dedication that nurses in war exhibit and especially the concern they show for those in their care.

At the end of November 1944, Lieutenant Whittle’s diary entries suddenly stopped. Since she never recorded a reason or told her husband why, it has been surmised that she was simply too busy with preparations for the holiday season.

Throughout her writings she described crying spontaneously when she felt desperate, though she never seemed to waver in her hope that she would be freed.

Towards the end of her journaling she learned that the International Red Cross had heard of her plight and was trying to repatriate her through official bureaucratic channels. Undoubtedly, this provided hope where there had been none.

Reba Z. Whittle was finally repatriated on January 25th, 1945, having spent four months as a German prisoner of war. She was the only American female nurse to have been captured in Europe during World War II.

After being freed, Lieutenant Whittle was sent home and was able to spend one night with her fiancé, Lieutenant Colonel Stanley Tobiason, who would later return to duty in England. She also received a telegram from then President Franklin Delano Roosevelt thanking her for her service:

“As your Commander in Chief, I take pride in your past achievements and express the thanks of a grateful Nation for your services in combat and your steadfastness while a prisoner of war. May God grant each of you happiness and an early return to health.”

 

Not long after coming home, Reba was awarded the Purple Heart for injuries sustained in action and also the Air Medal for serving in unarmed and unarmored aircraft.

A month later, 2nd Lieutenant Whittle was promoted to 1st Lieutenant. Unable to return to flight duty due to recurring headaches from her injuries and other maladies, she remained on active duty until August 1945, when she married Lieutenant Colonel Tobiason at Hamilton Field, California and resigned her commission.

Reba had never been officially declared a POW. She spent the next 10 years petitioning the Department of the Army for a disability retirement, only to be frustrated by bureaucracy. She finally settled for a meager amount in 1955, then tried to reapply in 1960.

By then the case had been closed. She never pursued anything further regarding the medical and psychiatric disorders she’d suffered as a prisoner of war. However, her legacy lived on in her two sons, one of whom graduated from the US Naval Academy and became a pilot who flew in the Vietnam conflict.

Reba passed away from cancer in 1981. Her husband petitioned the Department of the Army once more, and in 1983 Reba was finally officially recognized as a POW of World War II.

None of us can truly know the struggles and difficulties that a POW must go through, especially when they return. Reba exemplifies the best in humanity and serves as a lasting role model for the Army Nurse Corps. 

She is vindicated at last.

 

What do you think of Reba Z. Whittle? Let us know below.

References

Lieutenant Colonel E.V. Frank, AN, “The Forgotten POW: Second Lieutenant Reba Z. Whittle, AN”, US Army War College, February 1990.

“The 1930s”, https://www.history.com/topics/great-depression/1930s

“Angels of Bataan”, https://en.wikipedia.org/wiki/Angels_of_Bataan

In the years immediately after World War One, a Red Scare swept the US. Following the Russian Revolution, there were fears that the Bolsheviks would seek to undermine America and democracy, leading to various laws being enacted. Jonathan Hennika (site here) continues his Scared America series below (following articles on strained 19th century politics here, Chinese immigration here, and anti-German propaganda during World War One here).

A drawing depicting the Steel Strike of 1919. From New York World, October 11, 1919.

In a Forbes Magazine editorial, political science Professor Donald Brand wrote, “Donald Trump’s nativism is a fundamental corruption of the founding principles of the Republican Party. Nativists champion the purported interests of American citizens over those of immigrants, justifying their hostility to immigrants by the use of derogatory stereotypes: Mexicans are rapists; Muslims are terrorists.” The United States is not the only nation affected by nativism. Great Britain faced its own nativist fight in the referendum on the nation’s membership of the European Union. “Nativism … is prejudice in favor of natives against strangers, which in present-day terms means a policy that will protect and promote the interests of indigenous or established inhabitants over those of immigrants. This usage has recently found favor among Brexiters anxious to distance themselves from accusations of racism and xenophobia,” journalist Ian Jack wrote in The Guardian.[i]

In 2018, voters in both nations faced the consequences of choices made in 2016. Great Britain struggled to formalize an exit from the European Union; the United States grappled with a President who calls himself a “nationalist.” In the lead-up to the midterm elections, President Trump demonized a caravan of Latin Americans seeking asylum in the United States, proposed ending birthright citizenship, and threatened to shut down the border between the United States and Mexico. President Trump's last-minute anti-immigrant rhetoric did not yield him noticeable benefit in the mid-terms; his party retained a precarious hold on the United States Senate and lost its majority in the House of Representatives. It is possible the American electorate understood the tactic as one of fear; it is possible President Trump pushed the issue too far and crossed a line. Perhaps President Trump is a man out of time, as he said at a campaign rally in Houston, Texas: "You know, they have a word. It sort of became old-fashioned. It’s called a nationalist," he continued. "And I say, 'Really, we’re not supposed to use that word?' You know what I am? I'm a nationalist. ... Use that word."[ii]

 

America at a Crossroads 

The United States at the end of World War One was a nation in turmoil. After running a re-election campaign touting that he had kept America out of the European war, Wilson became a War President in April 1917. The President postulated that if Americans went to war “they’ll forget there ever was such a thing as tolerance.”[iii] American culture changed overnight as war fervor gripped the nation: “Every element of American public opinion was mobilized behind `my country, right or wrong,’ dissent was virtually forbidden, democracy at home was drastically curtailed so that it could be made safe abroad, while impressionable children were `educated’ in Hun atrocities, or their time was employed in liberty loan, Red Cross, war saving stamp or YMCA campaigns.” Soon, this Americanism became codified in the Espionage Act of 1917 and was further enforced through the Sedition Act of 1918. Taken together, the Espionage and Sedition Acts were an early 20th century version of the post-September 11, 2001 Patriot Act, in that they curbed criticism of America’s involvement in the First World War. “There was clear implication that people who utilized free speech as a means of gaining improper ends had to be restricted.”[iv]

Once the war ended, those encouraged by the implementation of the Espionage and Sedition Acts wanted similar peacetime laws. Therefore, a new enemy was required. Political leaders pointed to the radical revolutions sweeping Eastern Europe and Russia. Their tactics in creating this Red Scare included propaganda, which had proved politically useful to President Wilson and his public relations man, George Creel. Inciters of the Red Scare painted a picture of Eastern European immigrants as non-conformists and declared that they and “their Socialist `cousins’ rejected the premises upon which the American system rested, namely that rights and privileges were open in a free society to anyone who was willing to work up patiently within the system. Or if the individual were incapable of utilizing this technique, he would eventually be taken care of in a spirit of paternalism by the affluent class, as long as he stood with his hat in his hand and patiently waited.” The fear of the Socialists and Bolsheviks was so great that “by 1920 thirty-five states had enacted some form of restrictive, precautionary legislation enabling the rapid crackdown on speech that might by its expression produce unlawful actions geared toward stimulating improper political or economic change.”[v]

These politicians cast themselves as defenders of the United States and all things American. A partial list of these defenders includes Attorney General A. Mitchell Palmer; former First Army General Leonard Wood; Postmaster General Albert Burleson; William J. Flynn, director of the Bureau of Investigation; and his head of intelligence, J. Edgar Hoover. Both Palmer and Wood were contenders for their parties’ nominations in the 1920 election (Wood for the Republicans, Palmer for the Democrats). Flynn, Hoover, and later Flynn’s successor, William Burns, used the Bolshevik threat to enhance the power and prestige of the Bureau of Investigation, as well as their own reputations.

 

Attorney General A. Mitchell Palmer and the Red Scare of 1919

The end of the war, which had been an economic boon for the United States, brought with it a depression due in part to the cancellation of no-longer-needed war orders. Even as the economy slumped, retail prices climbed, more than doubling by 1920, with the worst increases occurring in the spring and summer of 1919. Workers who had prospered during the boom years of the war now complained about low wages: over 4,000,000 workers participated in 3,600 strikes in 1919. Veterans returning to civilian life added to high unemployment. Speaking at a public event, Secretary of Labor William B. Wilson claimed that two significant strikes, in Seattle and Butte, Montana, were “instituted by the Bolsheviks…for the sole purpose of bringing about a nationwide revolution in the United States.”[vi]

American soldiers returning to civilian life, having performed their patriotic duty, found their jobs had been taken by African-Americans and others who had not served in the war. The resulting sharp rise in unemployment increased nativist sentiment. Soldiers returned to a country where Socialists and other radicals were striking and threatening violence against the government and the democracy they had defended in the trench warfare of France. The threats of violence became real in April and June 1919, when bombing campaigns targeted various cities and public officials. Speaking of the June 1919 bombings, William Flynn, Director of the Bureau of Investigation, declared that the “bombers were connected with Russian Bolshevism aided by Hun money,” placing the blame directly at the feet of the old enemy, Germany, and the new enemy, Russia.[vii]

Attorney General A. Mitchell Palmer was one of the public officials targeted by the April bombings. Between 1919 and 1920 Palmer waged a swift campaign of retaliation. His Justice Department rounded up and deported over six thousand aliens and arrested thousands more on suspicion of belonging to subversive or radical groups. With President Wilson incapacitated by a stroke, Palmer went unchecked. Of the thousands arrested, most were taken without a warrant and detained for inexcusably long periods; those arrested were later released.[viii] In a precursor of what was to come in the 1950s, Attorney General Palmer presented a report to Congress in November 1919. In it, Palmer stated that “the Department of Justice discovered upwards of 60,000 of these organized agitators of the Trotsky doctrine in the US… confidential information upon which the government is now sweeping the nation clean of such alien filth…. The sharp tongues of the Revolution’s head were licking the altars of the churches, leaping into the belfry of the school bell, crawling into the sacred corners of American homes and seeking to replace marriage with libertine laws.”[ix]

As a result of the Red Scare of 1919-1920, New York State expelled five Assemblymen as Socialists. Despite being the nation’s top crusader against the Red menace, Palmer was unable to turn his campaign of fear and red-baiting into higher political office. Unlike Trump in 2016, Palmer lost his party’s bid for the nomination. However, as an intended consequence of the Red Scare, in 1921 Congress enacted the Quota Law, the first of many bills passed during the Roaring Twenties to sharply curtail the influx of immigration into a country founded by immigrants. These immigration and naturalization laws accelerated the United States’ return to pre-war isolation and harbored disastrous consequences as fascists began to seize power in 1920s and 1930s Europe.

 

What do you think of the post-World War One Red Scare in the US? Let us know below.


[i] Donald Brand, “How Donald Trump’s Nativism Ruined the GOP,” Forbes, June 26, 2016. http://fortune.com/2016/06/21/donald-trump-nativism-gop/; Ian Jack, “We Called it Racism, Now it’s Nativism,” The Guardian, November 12, 2016. https://www.theguardian.com/commentisfree/2016/nov/12/nativism-racism-anti-migrant-sentiment

 

[ii] Brett Samuels, “Trump: ‘You Know What I Am? I’m a Nationalist’,” The Hill, October 28, 2018.

[iii] Ray Stannard Baker, Woodrow Wilson: Life and Letters (New York: Scribners and Son, 1927), 506-07.

[iv] Paul L. Murphy, “Sources and Nature of Intolerance in the 1920s,” The Journal of American History, 51 (June 1964), 63.

[v] Ibid., 62, 65.

[vi] Stanley Coben, “A Study in Nativism: The American Red Scare of 1919-20,” Political Science Quarterly, 79 (March 1964), 66-8.

[vii] Ibid., 60.

[viii] Ibid., 72-3.

[ix] Paul Johnson, Modern Times: From the Twenties to the Nineties (New York: Harper Collins, 1991), 205.

If we think about the food of the Middle Ages, what do we imagine? Maybe pigs and cows, large fireplaces, and tankards of ale. But this picture is only part of the story. Of course, the dishes in those times were far less varied than those of today, and peasants’ food was far less diverse than that of the upper classes. But still, we can find much of interest when looking into what people ate 1,000 years ago. Alex Moren explains.

John, Duke of Berry, enjoying a great meal in France. Painted by the Limbourg Brothers, 15th century.

The Dark Ages or the Middle Ages (commonly seen to have been the 5th to the 15th century in Europe) are often viewed as a time of darkness, hunger and pestilence. It was not a particularly refined time – it was an age of superstition. The period has inspired many legends and tales and still attracts children and adults alike. But just what did people eat during the Middle Ages in Europe?

 

Important Things To Remember: Seasons, Salt And Storage

Medieval life was ruled by the temperatures and the seasons. Food was plentiful in the early summer, with berries and early grains, and in autumn, when the harvest was finished. But even with a good harvest and lots of cattle, food was hard to keep. Cold cellars, deep dungeons and ice were the best ways to preserve it. Curing, smoking and salting meat and fish were also ways to stock up before winter. People even salted butter so that it would keep longer. Because of that, some of the most precious items were salt and simple spices like garlic. Remember the old fairy tale about the princess who told her father the king that she loved him more than salt? A kingdom without salt could starve, so it was indeed very precious. Spices were rare, although during the Crusades there was greater awareness of cardamom, cinnamon, coriander and pepper. For most people the taste of food was much simpler: food could be salty, a little tinged with vinegar or garlic, or sweetened with honey.

But seasons were not the only thing that determined what a person could eat; social position determined it, too. The diet of a peasant was very different from the diet of a monk. It may seem better to be someone who grows their own food, but if we look at life in the Middle Ages more closely, we can see that the “down-to-earth” life was also the most dangerous one.

 

Peasants: Barely Any Rights At All

In our century, we often idealize rural life. But actually, being a medieval peasant meant living a very hungry and dangerous life. In many countries, peasants were not really free. They were obligated to work for the higher classes - for the monasteries or nobles who owned the lands they resided on - and they still had to care for their own land as well. Besides, they had to pay taxes, and these obligations remained even in times of drought and poor harvest.

Peasants had to depend mainly on what they could grow for food. Wheat was an expensive grain, and usually it was sent to the nobles or the tax collectors. What was left for the villagers was barley and vegetables like turnips and cabbage.

Raising animals for meat, such as cows, goats or sheep, required pastures, and hogs were also very expensive to keep. Hunting was forbidden, as forests belonged to the king or his followers (remember Robin Hood saving peasants because they had killed a royal deer?). Having a cow was a sign of wealth - with a cow, one could get milk, produce cheese and even sell both.

All of these things lead to a simple conclusion: peasants ate very little, and their diet was far from diverse. Usually, in the village, it was porridge, vegetable or chicken stew, fish on religious days (there were many!), and cheese and different kinds of preserved meat, if they were lucky. And they ate twice a day.

So, peasants used to eat little. Did rich people eat too much, then?

 

Rich: We Have Wheat and We Have Meat! 

When people had money, pastures and servants, they could have variety in their food, too! All the dishes we usually associate with medieval times - a pig roasted on a spit, stuffed pheasants, pigeons and deer - could be found on the tables of various nobles, sheriffs, and, of course, kings. They could afford spices as well, so their dishes included sauces and vinegar. No barley bread for the knights and barons - they ate wheat bread, and probably pastries as well.

Though monks were considered servants of God, they could be considered rich as well. Monasteries had vast lands assigned to them, so, like the nobles, they had access to wheat. They also grew their own herbs, kept their own cattle, and even ran fisheries. The latter was especially important, as monks had to observe Lent and other religious fasts very strictly. Meat was often forbidden; only fish was allowed. However, in many countries, birds that swim on water, such as ducks, geese and swans, were considered fish, too! As a result, in many monasteries the diet was not particularly strict. That’s why fat, overeating monks are so common in medieval legends.

 

Last, But Not Least: Be Careful What You Drink! 

Water was filthy. There were no filtration systems, and water was often not boiled before use. Drinking water was often a main vehicle for spreading disease. That is why people preferred other beverages - diluted wine, mead, or other forms of alcohol, which were relatively safe. Everybody drank, even children.

 

A medieval knight would be shocked to see our regular meals. Now, we depend on potatoes as a staple food instead of turnips, and we have a much larger variety of grains. Wild meat is now rare, and we eat a lot of pork and beef instead. And water is safe in our time!

 

Who do you think eats healthier: the people of the twenty-first century or the tenth century? Let us know below.

This article was brought to you by the authors of PapersOwl Australia writing service.

Editor’s note: This site is not affiliated in any way with that service. Please see the link here for more information about external links.