Tour Guides’ Tricks vs Historians’ Hang-ups: Lessons for Teaching History

Recently, I’ve started working as a tour guide on Edinburgh’s beautiful Royal Mile. For two hours, come rain or shine, I escort random tourists down narrow alleyways, onto cobblestoned streets, across graveyards, and into medieval courtyards, regaling them with (hi)stories about Edinburgh’s colourful past. It’s entertaining and challenging work. Guides must be on the ball with funny jokes, vibrant vocabulary, and accurate answers to a wide variety of questions for the whole two hours. It’s draining, but super fun. The time flies by.

Being a tour guide is exciting because of my great love of history, local knowledge of Edinburgh, and previous experience helping tourists. Also, after spending 10+ years in admin office roles in both Canada and Scotland, I am crystal clear that I’d rather be walking the streets of one of the prettiest cities in Europe than sitting in a stale office staring at spreadsheets. No more spreadsheets, I say! No more! But my education (i.e. a PhD in modern history) is dismally underutilised in a position that caters to “just tourists.” Surely after so many years of study, I can “do better” than tour guiding?

Tour guiding might be a “fall from grace” for ambitious academics, but other employment opportunities – especially at universities or in research – are notoriously competitive and outrageously difficult to attain. So, while the rejection letters pour into my inbox (and until I reluctantly decide to give up on my dream of being a university lecturer), I’ve decided to apply my skills elsewhere. Also, those bills don’t pay themselves.

But I’ve learned there is a vast difference between a tour guide and a historian. Like, wow. Some of the differences are totally refreshing. Others are a little disconcerting. And some are just downright hilarious. Despite starting this job with a somewhat cocky attitude (“Surely it can’t be that much harder than when I taught at a university…”), tour guiding has re-opened my eyes to history and history-telling. And, of course, sufficiently humbled my approach.

Historians and tour guides both earn money from the same skill set: the ability to teach history. One researches and lectures in a university; the other guides on the streets, often in the exact place where that history took place. While teaching history unites these professions, their approaches differ greatly. This is why I decided to write this blog. Hopefully, my observations will debunk some myths (or prejudices) we may have about both trades.

1) Story-telling vs (Hi)story-telling

Professional historians will vehemently say that teaching history as a story (or narrative) is not good history. But tour guides rely heavily on stories because they are entertaining. And, because one of the chief goals of tour guiding is to entertain, historical events are often conveyed in a narrative structure (setup, conflict, resolution), as it makes history accessible, engaging and much more memorable.

But stories and histories are not interchangeable. History should not be moralised, narrativised, pushed into little boxes of “good” versus “bad.” As we all know, history is often written by the victors (i.e. white, old, wealthy MEN), and any publication about historiography – the study of writing history – is plagued with lengthy analyses of the inherent bias in historical sources. Today’s students of history undertake meticulous and often painstaking training in how to identify and overcome such biases, so that most contemporary historical research endeavours to be objective, evidence-based and (hopefully) self-aware and self-reflective.

And yet, it is worthwhile noting that some students of history, like Michael Conway, perceptively argue that it is not until a student engages in historiography that they begin to realise that history is not a single overarching description, but instead a conflict-ridden zone of historians/scholars bickering endlessly about all aspects of history. This, Conway argues, is actually more compelling: “History is not indoctrination. It is a wrestling match. For too long, the emphasis has been on pinning the opponent. It is time to shift the focus to the struggle itself”. But let’s get back to tour guiding…

When history is told as a story, the goal is often to promote consumption; it negates the academic battleground that Conway writes about and instead allows the reader or audience to absorb the information easily, without any moral dilemma, ambiguity, or guesswork. But this does not mean that the story is not meaningful or stimulating. In fact, some of my favourite British public historians (such as Neil Oliver or Lucy Worsley) often present history as narratives. Fortunately, they often simultaneously question whether we should accept such interpretations as accurate. By doing so, audiences are given a choice: they can blindly accept such portrayals as conveniently memorable stories, or they can wrestle with the ambiguity and interpret the (hi)story in their own way.

Neil Oliver is a prominent archaeologist and television presenter in Scotland. Though some have criticised him as too Anglo-centric, this did not stop him from being appointed President of the National Trust for Scotland in 2017. (Photo credit: BBC)

But not all history can be put into a narrative structure. If you disagree, think of genocide, or slavery, or war. There’s no moral to be learned from the existence of concentration camps. There’s no overarching narrative of tragedy, comedy or redemption within slaves’ experiences. There are no “good” or “bad” soldiers in war. And, because these topics cannot be easily reduced or moralised, they continue to attract revision, debate and controversy among multiple stakeholders: historians, legal and legislative bodies, policy-makers, international organisations, humanitarians, educators, artists, authors, filmmakers, and so many, many more.

On Edinburgh’s Cowgate, I tell the story of Joseph Smith (“Bowed Joseph”), a poor, disabled 18th-century cobbler who notoriously roused Edinburgh’s poorest segments of society into a frenzied mob (up to 10,000 people) whenever it suited him. The town officials were so wary of Bowed Joseph that they would often consult him before enacting local policies (such as increasing the price of ale). The “story” goes that when Joseph heard that a poor father-of-six had committed suicide after being evicted by an unconcerned landlord, Joseph beat his drum down the street to provoke thousands to storm the landlord’s house, stripping it of all possessions and piling the furniture into a nearby park. As the helpless town guard looked on, Bowed Joseph himself lit the match. The pyre reportedly burnt for hours.

Bowed Joseph’s malformed skeleton (on display at the University of Edinburgh’s Anatomical Museum) shows us the devastating effects of childhood malnutrition in the 18th century.

Historically speaking, we know very little about Joseph Smith. Born into abject poverty sometime in the mid-1700s on Edinburgh’s Cowgate, he developed rickets at a young age. He had strong, massive arms and short, bowed legs. We know this because upon his death in 1780 (falling from a coach after gambling at the race track in Leith), the University of Edinburgh’s prestigious Medical School acquired his deformed skeleton. Today, it’s displayed at the Anatomical Museum, which, coincidentally, is the only reason I knew that Joseph Smith existed at all!

What’s the point of telling you all this? This is bad history, but a great story: the tale of an underdog who used his power for social justice. But – I would argue – stories like these teach elements of history without the tourist even realising it. The audience learns about Edinburgh’s brutal poverty, childhood diseases of the 1700s, the strength (and fear) of the mob to provoke political change, rioting as a commonplace practice in 18th-century Scottish culture, and the class tensions between landowners and tenants – all historical themes that this story illustrates.

But I’ve practically sold my academic soul. I have traded my formal, objective, evidence-based study of history for an engaging narrative chiefly devoid of tangible facts, in order to achieve my manager’s goal: to entertain tourists with Edinburgh’s “history.”

2) Telling Entertaining (Hi)stories

Training for this job was mostly self-directed. I was not given a script, but simply told the appointed “audition date” on which I would give a two-hour tour of the Old Town to my manager. I was allowed to speak about anything, and was invited onto other guides’ tours to see their routes and topics. Considering this company is one of the highest rated on TripAdvisor (the chief reason I applied to them), I was surprised at this somewhat laissez-faire, trusting approach.

Although studying “all” of Scottish and Edinburgh’s history was slightly daunting, this worked perfectly for me. I have learned that both historians and tour guides are only as good as the knowledge they possess. The breadth and nuance of a person’s historical insight are entirely dependent upon their work ethic and willingness to learn. Taaa-daaa – maybe tour guides and historians aren’t so different after all!

After my audition, my manager’s feedback was hilarious (…well, in retrospect): “Chelsea, you must use less dates. Tourists do not care if it happened in 1861, just say ‘mid-1800s.’ Of course, always know your dates in case anyone asks, but stop being so precise! Your groups don’t want a history lesson!”

My inner pedantic academic and my pride as a “proper educated historian” shrivelled into a little lifeless ball of death. I laughed but was disconcerted. How can years of pushing for flawless historical accuracy (including memorising dates) be considered a weakness? I walked away from my audition a little bruised, my ego a little weakened. But then, it dawned on me. Could it be that imprecise teaching (the greatest faux pas of any educator) is considered a strength here?! Could it be that extraneous pedantic detail was actually unnecessary in guiding? Could it be that just the interesting, engaging, enjoyable parts of history are actually the focus?

A great relief settled inside me, and that lifeless ball of death sprang back to life. Of course, tour guiding is about entertainment, and formal history is about education. The two can go hand-in-hand but are not identical. While History with a big H is important, we all know that aspects of it are tediously boring, even to historians: The economic history of immediate post-Confederation Canada? Not interested. Technical capabilities of British naval vessels in the Napoleonic Wars? Sorry, don’t care. Another biography of Winston Churchill? Please dear God, no.

Thus, I’ve learned that tour guiding isn’t just about using stories to tell history, but about telling entertaining history. It’s like being given cream for your coffee instead of weak milk. It’s like getting a filet mignon instead of rump steak. It’s like eating the centre of the cinnamon bun first, rather than the crusty outer edges. (PS: I like food analogies.)

3) Questioning Impact and Legacy Without All the Pedantic Detail

As you’ve noticed, telling Edinburgh’s and Scotland’s history in entertaining, bite-size pieces is the trick of the two-hour tour guiding trade. But when I asked my manager if I could end with a question, instead of an amusing anecdote, he considerately listened and nodded his head. He simply said that so long as it was concise and clear, there was no reason it wouldn’t work.

The need to understand and interrogate the legacy or impact of historical details, facts, events and persons is a cornerstone of every good historical study. History conferences are full of scholars squabbling over the minutiae of history. Even very technical details (e.g. the Spitfire carried only 14–18 seconds of ammunition) can have a significant impact upon larger events (pilot performance, casualty rates, future combat airplane design, and so on). That’s why details are so very important to historians, even if it makes them look like pedantic, over-obsessed nerds. If those details are inaccurate, misinterpreted, or false, then the larger context and the enduring legacy can also be questioned.

But would your average holidaying tourist be interested in such details?

No, let’s get real. Tour guides do not have the time to meticulously analyse every detail of Scottish history. For tourists, this would be the opposite of entertaining. Their holidays would be ruined by my tedious obsession with overwhelming empirical details.

Instead, I discovered that tour guides, similar to public historians, can approach it backwards – by deconstructing the legacy as a way to question the details. For example, I deliberately end my tour beside the Writers’ Museum (with a view of the Royal Mile and the Scott Monument).

The largest monument in the world dedicated to an author, the Scott Monument, was built in 1844. (Photo Credit)

There I discuss Sir Walter Scott, arguably the Scottish citizen with the greatest influence on modern Scottish identity. But because Scott’s writings showcased Scottish identity in a certain way – Highlands, stags, romance, the wilds of the north – they often failed to include other portions of Scottish society and culture.

This came to a climax (notice my narrative structure!) in 1822, when King George IV visited Scotland – the first official state visit in almost 200 years. Scott, a national celebrity, had been commissioned to plan the festivities, and he did not fail to deliver. Notably, two conmen styling themselves the “Sobieski Stuarts” sought to benefit from the celebrations, publishing a famous book that claimed that a specific tartan denoted a specific Highland clan. Scottish nobles raced to find their Highland ancestry so they could purchase their kilts in time for the King’s visit. And Scott’s prolific writings (and explicit instructions for the festivities) had so impressed locals and foreigners that when the King arrived, only a certain type of Scottish person was showcased – the “Highlander”. Bedecked in colourful tartan, this robust, whisky-swilling, haggis-eating, masculine, bearded and kilted “Highlander” came to represent all Scots.

Various cultural representations of Scotland have perpetuated the identity (myth?) of the “Highlander,” including the films Highlander (1986) and Braveheart (1995) and the TV series Outlander (2014–present). Perhaps it’s no wonder this image of Scotland still prevails today.

I thus conclude my tour with a question: Was Scott’s interpretation of the Scots actually an accurate reflection of Scotland’s identity? Or is he responsible for creating a reductive, overused, exploited image of the Highlands? I then humorously remind them to think twice about purchasing tartan scarves on the Royal Mile. Or watching Braveheart.

4) The Irrefutable Power of Location

The most formidable tool in the tour guide’s arsenal is not actually her/his ability to research history (the realm of professors), or to seamlessly present history in a convenient package (the realm of television programs), or even to repurpose history (the realm of public historians). Instead, it is the power of the physical location of historical events and legacies that allows tour guides to instil, showcase, mobilise, present, investigate, question, and explore history. By walking the same street where Bowed Joseph roused his mob, or by seeing the same views that JK Rowling saw when she wrote the first Harry Potter book, or by tasting haggis as Robert Burns would have tasted it when he wrote “Address to a Haggis,” the tourist is imprinted with so much more than just a history lesson. They themselves smell, taste, see, hear and thus participate in history in a way that no book, no TV show and no lecture can equal. It’s exponentially more powerful, more visceral and more resonant.

To my surprise, tour guides are often the only educational resource for tourists following a tight travel itinerary (I’ve checked with the tourists on my tours!). This means that tour guides are as vital to teaching history to the public as any formally trained historian, curator or television educator. Although I travel a great deal, and have experienced amazing tours in places where history unfolded (the rise of the Third Reich in Munich, or discussing the Battle of Berlin steps from the Reichstag), I’m not sure I fully appreciated the role of tour guides in orchestrating and teaching history until now. Tour guides have an invaluable role in researching, selecting and presenting the physical locations of historical events, legacies and people to retell history. And by refocusing the audience’s attention upon the location, tour guides revive history more authentically than even the most competent lecture, or the most vibrant imagination, can.

In 1774, one of my favourite Enlightenment authors, Voltaire, was dissatisfied with how scholars studied and wrote history:

“People are very careful to report what day a certain battle took place… They print treaties, they describe the pomp of a coronation, the ceremony of receiving the Cardinal’s hat, and even the entrance of an ambassador, forgetting neither his Swiss soldiers nor his lackeys. It is a good thing to have archives on everything so that one might consult them when necessary… But after I have read three or four thousand descriptions of battles, and the terms of some hundreds of treaties, I have found that fundamentally I am scarcely better instructed than I was before.”

Voltaire proposed a solution that was rather innovative, especially for his time: he suggested that we should focus on location, artefacts, art and theatre to learn history:

“A lock on the canal that joins two seas, a painting by Poussin, a fine tragedy, are things a thousand times more precious than all the court annals and all the campaign reports put together.”


Six Great Legacies of the Nuremberg Trials that Still Impact the World Today

Did you know that before the Second World War, there were no international laws to protect civilians in war?

Yes, really. And let me explain.

Modern rules of war are only 150 years in the making. Some claim they started with Abraham Lincoln’s enactment of the Lieber Code in 1863, which tried to limit the military actions of his Union forces by permitting certain humanitarian measures (on the condition that they did not contradict military objectives, of course). But I would argue that it began in 1864, on the other side of the ocean, when the red cross on a white background – the inverse of the Swiss flag – came to symbolize a neutral party protecting and helping victims of conflict: the Red Cross.

Today, the international community has developed a fairly robust series of international laws that explicitly aim to limit the effects of armed conflict for humanitarian reasons. (Although enforcing these laws is an entirely different matter…)

The critical problem before the Second World War was that individuals were subject to the laws of their nations, but could not claim rights under international law, since they were not subjects of international law. (For an excellent lecture on this topic by Thomas Buergenthal, graduate of Harvard Law, a Holocaust survivor and former judge of the UN’s International Court of Justice, see here.) Looking at it the other way around, as objects of international law, individuals’ status did not differ from a State’s territory or its other sovereign possessions; individuals thus belonged to the State. And let’s take a moment to consider those poor stateless individuals, such as refugees, or Jews stripped of their citizenship (as was legal in pre-war Germany), who lost their rights under any national laws…

Thus, before 1939, individuals were subject to the laws of their nations, but not to international law. A rather subtle but critical difference, you might say.

In both the First and Second World Wars, this meant entirely different treatment for civilians and other groups, such as prisoners of war. For example, the right to send and receive post – a cornerstone of human rights today – was not granted to interned, displaced or detained civilians, but it did exist for POWs. Because the 1907 Hague Convention explicitly stipulated that POWs would “enjoy the privilege of free postage. Letters, money orders, and valuables, as well as parcels by post, intended for prisoners of war, or dispatched by them, shall be exempt from all postal duties,” you could send knitted socks or a food parcel to your husband in a POW camp, or ask the relevant authorities about your loved one’s whereabouts.

The International Committee of the Red Cross’s massive system of indexes in Switzerland handled the two to three thousand letters of inquiry per day into the whereabouts of lost, fallen or captured soldiers. By the end of the First World War, the ICRC had compiled 4,895,000 index cards and forwarded 1,884,914 individual parcels and 1,813 wagonloads of collective relief supplies. Today, these files have been digitised and are searchable online.

The Third Geneva Convention in 1929 expanded the rights of POWs even further, to include the establishment of official information bureaux by all belligerent nations and the coordinated relief of prisoners, whereby properly accredited professionals would both monitor camp operations and distribute relief. In practical terms, this meant that POW camps in WWII were regularly inspected by Red Cross officials, whereas concentration camps were not subject to humanitarian inspections.

But the Third Geneva Convention again neglected to define civilian rights.

In 1934, the world came very close to giving civilians international rights at the 15th International Conference of the Red Cross, held in Tokyo. The international community vigorously tried to define clearly the rights of civilians, as those people within the territory of a belligerent, or as individuals in the power of the enemy in occupied territories. But the resulting Tokyo Draft was neither signed nor implemented with any legal authority; it was shelved at the outbreak of the Second World War, reconsidered afterwards, and finally given legal force in the Fourth Geneva Convention of 1949.

Thus, by the outbreak of the Second World War, there were no international laws to protect civilians in war.

But then, the Nuremberg Trials.

In light of the mass atrocities and tremendous violations of human rights that occurred throughout wartime Europe, the Nuremberg Trials sought to administer justice to the Nazi politicians, administrators and bureaucrats who allowed such murderous policies to flourish.

The beautiful town of Nuremberg (Nürnberg) was chosen by the Allies as it had once been considered the spiritual centre of the Third Reich and had played host to massive annual Nazi rallies. Ironically, it was also the city chosen by Hitler for enacting the 1935 Racial Laws (also known as the Nuremberg Laws), which stripped thousands of German citizens of their rights. In more practical terms, Nuremberg was chosen because it had functioning infrastructure, a serviceable airstrip and a working prison.

The International Military Tribunal (IMT) – the agreement between France, Britain, the US and Russia to prosecute and punish war criminals in a court of law – decided upon four categories of crimes:

  1. Conspiracy to commit charges 2, 3 and 4, listed below;
  2. Crimes Against Peace, “defined as participation in the planning and waging of a war of aggression in violation of numerous international treaties”;
  3. War Crimes, “defined as violations of the internationally agreed upon rules for waging war”; and
  4. Crimes Against Humanity, “namely, murder, extermination, enslavement, deportation, and other inhumane acts committed against any civilian population, before or during the war; or persecution on political, racial, or religious grounds in execution of or in connection with any crime within the jurisdiction of the Tribunal, whether or not in violation of domestic law of the country where perpetrated.”

In 1946, judges from the Allied nations of France, Britain, America and Russia presided over the legal hearings of 22 of Germany’s highest-ranking Nazis in the first and most publicized trial. Each defendant was tried on one or more of the four counts, based upon the available evidence, often gathered from captured German records. Fortunately for justice, the Nazis were pedantic record-keepers.

Wannsee, Berlin. One outstanding discovery of evidence for the trials was made by German lawyer Robert Kempner, who had scoured the German Foreign Office files and uncovered (in the papers of Martin Luther, Undersecretary of the Ministry of Foreign Affairs) the record of one 90-minute meeting at the Wannsee house in south Berlin in 1942 (above). This lakeside villa became infamous for hosting the “Wannsee Conference,” at which high-ranking Nazi officials formally decided on the genocidal policies of the “Final Solution” to the Jewish question – otherwise known as the Holocaust.

Twelve subsequent Nuremberg trials prosecuted other Nazi groups. These included the “Doctors’ Trial” against Nazi medical researchers who conducted experiments on concentration camp victims, the “IG Farben Trial” and “Krupp Trial” against businessmen and industrialists who profited from slave labour, and the “Judges’ Trial” against Nazi judges who enforced racial laws and eugenics.

But given that Germany’s national legal system had created laws to support its discriminatory policies (i.e. the Nuremberg Race Laws of 1935), and given that civilians had no international rights but were only subject to national laws, how could the international community enforce international law?

The IMT judges ultimately rejected the defendants’ argument that they had been following official policy and that their actions were therefore permissible under national law. Instead, they returned to a small but powerful introduction from the Hague Conventions (referred to as the Martens Clause), which states:

Until a more complete code of the laws of war has been issued, the High Contracting Parties deem it expedient to declare that, in cases not included in the Regulations adopted by them, the inhabitants and the belligerents remain under the protection and rule of the principles of the law of nations, as they result from the usages established among civilized peoples, from the laws of humanity, and the dictates of the public conscience (Hague Convention, 1899).

This rather vague and imprecise sentence was sufficient grounds for these judges (admittedly, the victors of the war) to argue that Germany had contravened the basic human rights afforded to all civilians by the laws of humanity.

And this, ladies and gentlemen, is when it all changed.

While the Nuremberg Trials sought to punish these criminals for their cruel treatment of civilians, they actually achieved so much more – a dramatic legal and conceptual transformation that internationalized human rights. Importantly, they eliminated that subtle but critical difference discussed earlier; individuals were no longer subject only to the laws of their nations, but also to the higher authority of international law, which guarantees their human rights.

What legacies did the Nuremberg Trials create?

1) Defined an international concept of universal human rights.

See above discussion.

2) Granted civilians in war basic human rights.

Today, civilians who experience war are guaranteed basic human rights that all belligerents must abide by, or else face accusations of war crimes. These are further defined by additional humanitarian laws that provide different protections depending on whether the civilian is, for example, a child, disabled, or a migrant.

For simplicity’s sake, these are some of the basic rights granted to civilians in war, which seem revolutionary compared to the pre-WWII period:

  • civilians have a right to receive relief and aid from any party, government or non-state actor;
  • when detained or imprisoned, civilians must be given food and water, and allowed to communicate with loved ones, in order to preserve their dignity and physical health;
  • the sick or wounded have a right to receive medical assistance, regardless of whether they are belligerents;
  • medical workers must always be allowed to provide life-saving assistance to the wounded or injured, and must never be attacked;
  • belligerents are prohibited from inflicting the following upon civilians: violence to the life, health or physical or mental well-being of persons (including murder, torture, corporal punishment and mutilation); outrages upon personal dignity, in particular humiliating and degrading treatment, enforced prostitution and any form of indecent assault; the taking of hostages; collective punishments; and threats to commit any of the above.

If you’re a POW, you’re also afforded significant rights. If you’re a belligerent yourself, you have certain obligations to provide for civilians under occupation, and you have rights to legal process and representation.

While the Nuremberg Trials may have closed the pre-WWII loophole regarding civilians’ rights in war, war crimes against civilians still occur today. Shocking examples include the increasing number of male rape victims in Libyan detention centres, last week’s revelations of the sexual exploitation of Syrian women in return for relief, and the recent abduction of 110 Nigerian schoolgirls by Boko Haram.

On 5 March 2018, Human Rights Watch revealed that seven boys under the age of 14 living in a Russian orphanage for children with disabilities claim they were raped by staff and visitors. Although this will be handled by domestic courts, HRW uses press releases like this to bring attention to systematic failures by national systems, like Russia’s institutions for disabled children, which are currently being monitored for widespread human rights abuses.

3) Genocide became a crime.

It might sound ridiculous that genocide was not outlawed until after WWII, but when one pauses and considers the multiple genocides that occurred before the Holocaust, explicit laws were evidently required to deter governments or political groups from undertaking acts “with the intent to destroy, in whole or in part, a national, ethnical, racial or religious group” (1948 Genocide Convention). And, if one pauses to consider this meaning of genocide, then it can be evidenced in other major historical contexts or themes, including colonialism, imperialism, slavery, nationalism, etc.

After the collapse of communism in the 1990s, when the suppressed nationalism of multiple ethnic groups was unleashed in places like the former Yugoslavia, the UN investigated evidence of war crimes for the first time since Nuremberg. Notably, in the aftermath of the Bosnian War, rape was recognized as a crime against humanity for the first time.

Although genocide has been criminalized, this has not deterred governments and political groups from committing mass atrocities. Since then, genocides have occurred in Uganda, Cambodia, Rwanda, Somalia, Bosnia, and many other nations. Moreover, the legal process to administer justice can extend over decades. For example, on 24 March 2016, Radovan Karadzic, the former Bosnian Serb leader nicknamed “the Butcher of Bosnia,” by then aged 70, was found guilty of genocide, war crimes and crimes against humanity by a United Nations tribunal for his actions in 1995 in Srebrenica. He was sentenced to 40 years’ imprisonment – more than many Nazi war criminals received at Nuremberg.

4) Introduced explicit ethical rules for research upon human subjects.

The subsequent Nuremberg trials also charged the scientists and doctors with crimes against humanity for their extensive medical experiments upon concentration camp inmates. The Nazi doctors’ defence was that they had been ordered by their government to investigate how to overcome common ailments of German pilots and soldiers (such as hypothermia). However, the prosecution team argued, forcing concentration camp victims to die in frigid and icy baths for “medical research” failed to honour doctors’ Hippocratic Oath and the underlying ethic to do no harm to patients.

Ultimately, the trial exposed that there was no single blueprint for ethical medical research and, ironically, it forced the American prosecution team to find a respected and ethical doctor who could testify about research physiology and whose wartime scientific interests corresponded to Nazi research interests. Dr. Andrew Ivy was called as a witness for the prosecution, and his testimony lasted four days, the longest of the trial.

Dr. Ivy claimed he personally followed three common-sense rules when experimenting on human subjects, such as avoiding all unnecessary physical and mental suffering and injury to patients, and conducting trials on the basis of animal experimentation first. While such guidelines were evidence that medical experiments could be undertaken ethically, the trial revealed that there were no written principles of research in the United States or elsewhere before December 1946. In fact, the legal defence at the time argued that there was no difference between the actions of the Nazi doctors and those of U.S. doctors who experimented with a malaria vaccine on prisoners at Stateville Prison in Joliet, Illinois.

Again, as victors of the war, the Nuremberg judges had the final say. They created a 10-point research ethics code, known today as the “Nuremberg Code.” Although it was not formally adopted by any nation, the irrefutable importance of informed consent was adopted into the UN’s international laws in 1966. Informed consent was, and remains, the core pillar of any research upon human subjects.

5) Created a permanent international court for war crimes.

The International Criminal Tribunal for the former Yugoslavia, or ICTY, was the first war crimes tribunal held after Nuremberg. In many ways, the ICTY was similar: it held four categories of crime, it had a panel of judges, and it pursued justice according to international laws and conventions. But, of course, it was modified from its predecessor; according to Bernard D. Meltzer, the ICTY sat at The Hague to signify its neutrality and internationality, it had a smaller team to collect evidence and thus relied heavily on oral testimony, and it utilised new methods of forensic evidence.

After the International Criminal Tribunals for the former Yugoslavia and then Rwanda, the international community created the International Criminal Court (ICC) under the Rome Statute, in force since 2002. According to the Robert H. Jackson Center, the ICC “is the first ever permanent, treaty-based, international criminal court established to promote the rule of law and ensure that the gravest international crimes do not go unpunished.”

The ICC is currently investigating war crimes in Uganda, Darfur (Sudan), Democratic Republic of Congo, Kenya, Libya, Mali, Georgia, Central African Republic, Côte d’Ivoire, and Burundi.

6) Created an irrefutable historical record of war crimes.

At a time when German society, and the international community, wanted to move on from war and embrace happier peacetime activities, the Nuremberg Trials created an invaluable historical record for future generations. No one today can claim ignorance of the atrocities or the scale of the Holocaust, and a great part of that is due to the extensive research, legal proceedings and publicity of the Nuremberg Trials.

Today’s discussions about Nuremberg – and essentially about international justice – now include the great disparity between Nazi Germany’s culpability and the Allies’ culpability for war crimes. For example, no Nazi at Nuremberg was charged with terror bombardment, since the use of strategic bombardment against civilians had been a pillar of the British and US war efforts (the controversy surrounding Dresden still rages today). Or the failure of Nuremberg to address the brutal mass rape of German women by Soviet AND American forces in immediate post-war Germany. And, of course, many others that may not be fully explored until the victors’ narrative of the Second World War, and the generation who experienced and created it, passes away.

Switzerland’s Not-So-Secret Wartime Weapon: The Case of the Swiss-led Child Evacuations

Last month, the BBC published an article, “Is this Switzerland’s Schindler?”, about a Swiss man named Carl Lutz who used his position as an envoy for neutral Switzerland stationed in Budapest to issue protective letters to thousands of Hungarian Jews during the Second World War. These special letters extended Lutz’s diplomatic protection to those targeted for deportation. Lutz saved an astounding 62,000 Jews from being sent to the concentration camps.

Crowds expecting to be deported gather outside Carl Lutz’s office in Budapest to get protective letters in late 1944 (photo credit). Notably, Carl Lutz issued letters not only to individuals and families, but also to 76 buildings that housed these groups. The Glass House survives today as a result of Lutz’s intervention.

It’s a remarkable story. Not only does it demonstrate the extent to which people in positions of power could sacrifice their own safety for the survival of total strangers, but it also exemplifies how Swiss citizens could mobilise their government’s neutral status in WWII to help victims of persecution.

Shortly after this article was published, a friend contacted me and, knowing that I studied Switzerland during the Second World War, asked me about Switzerland’s wartime humanitarian efforts: But Chelsea, didn’t the Swiss create the Red Cross? And weren’t they neutral during the war? If so, did they help protect Jews during the war through the Red Cross? And what about refugees fleeing the Nazis? Honestly, why didn’t every single person just pack their bags and move to Switzerland during the war?

These are all excellent questions. Switzerland’s neutrality certainly meant that it had a unique position in wartime Europe. Combined with its history of humanitarianism (yes, it did create the International Committee of the Red Cross) and its convenient geography in central Europe (bordering Austria, Germany, France, Italy and Liechtenstein), Switzerland appears to be the perfect hiding spot from the Nazis, and a country that could manoeuvre through tense wartime diplomacy to help victims of war. Well spotted, my friend.

Added to all this was (and remains) Switzerland’s strong legacy of banking (supported by valuable privacy laws). Foreign investors, including foreign governments, still flock to Swiss banks because of the country’s centuries of neutrality (and thus financial stability during war). In fact, some scholars argue that Switzerland’s ability to financially shelter governments’ investments was the single reason it was not invaded during the war – Swiss banks were just too valuable to both the Allied governments and Nazi Germany’s financial health to risk sending even one platoon into its little alpine territory.

So really, we have three non-negotiable factors that influenced (and continue to influence) Switzerland’s political actions: neutrality, humanitarianism and banking. Remarkably, Switzerland protected its borders from invasion in both World Wars thanks to its ability to maintain amicable relationships with belligerent nations. It provided them with a neutral trading centre (i.e. banks and foreign currency), as well as becoming an intermediary for international organizations, such as the League of Nations. This tradition still stands today.

Today, if you’re lucky enough to be able to afford a trip to Geneva, you can walk past the United Nations (above), the headquarters of the World Health Organisation, the International Committee of the Red Cross, the International Labour Office and the World Trade Organisation – all a stone’s throw from the same street! (And you’ll have to walk because you won’t be able to afford anything else).

Although Switzerland’s neutrality, humanitarianism and banking can be seen as massive opportunities to help others, they were often used by Swiss authorities as excuses to limit, evade, or reject multiple initiatives that would have saved countless lives during the Second World War.

However, in keeping with the optimism and sacrifice that Carl Lutz showed the world, I will write about one extraordinary example in which Swiss citizens overcame these limitations to provide refuge and relief to one of the most vulnerable groups suffering under Nazi rule – children.

Why would the Swiss government reject humanitarian initiatives?

Ultimately, Switzerland feared being overrun by refugees. As Switzerland depended on warring countries for its imports (about 55%) and exports (about 60%), there were simply not enough resources to ensure its national survival if thousands of foreigners (even refugees) came to stay. Over half of the coal in Switzerland, for example, originated from Nazi Germany’s monthly shipments. Thus, Switzerland had to balance national survival with shrewd financial decisions. (For more on the Swiss wartime economy, see Herbert Reginbogin’s Faces of Neutrality [2009] and Georges-André Chevallaz’s The Challenge of Neutrality: Diplomacy and the Defense of Switzerland [2001].)

Similar to today, Europe was overwhelmed with refugees still displaced by the First World War, the Turkish-Armenian War, the Russian Civil War, and the impact of famines gripping eastern Europe. Similar to today, refugees were not simply a passing trend.

Multiple charities helped refugees in the wake of the First World War. By 1921, when the Russian Civil War had produced countless refugees and starving children, the Save the Children Fund had found its stride. It campaigned on the big screen, showing British audiences films of the conditions children faced. For a brief history, see here.

By the end of 1933, the first year of Nazi power, some 37,000 Jews had emigrated from Germany as a direct result of increasing violence and persecution (RJ Evans, The Third Reich in Power). With Germany’s introduction of the Nuremberg Laws in 1935 – stripping Jews in Germany (and, later, annexed Austria) of their citizenship and thus making them stateless refugees in their own country – the world began to realise it had a growing refugee crisis on its hands, especially if Hitler’s militarisation of Germany continued to go unchallenged. Despite this, countries like France and Britain were apathetic to the plight of these refugees, being more concerned with unemployment and other domestic issues (Claudena Skran, Refugees in Inter-war Europe). Sounds like the recent situation in Calais, no?

But refugees did have some protected rights. In 1933, refugees gained internationally recognised rights (to passports, for example) for the first time, granted by the League of Nations (which, notably, Germany withdrew from that same year). But this did not equate to decent treatment or immediate asylum for refugees worldwide. In fact, it still doesn’t. (See how refugees today are treated in Libyan detention centres.)

In 1938, President Roosevelt’s administration organized the Evian Conference in France to help coordinate efforts to facilitate the emigration of refugees from Germany and Austria. But the conference was unsuccessful, because most participating nations seemed more concerned with turning refugees away from their own borders or, in the case of Britain, simply refused to contribute at all (Skran, Refugees in Inter-war Europe, 280).

Lord Winterton, the English representative at the Evian Conference, gives a speech to attendees (photo credit). TIME reported on 18 July 1938, “Britain, France, Belgium pleaded that they had already absorbed their capacity, Australia turned in a flat ‘No’ to Jews, and the U. S. announced that she would combine her former annual Austrian immigration quota with her German to admit 27,370 persons (who can support themselves) from Greater Germany next year.”

Switzerland’s delegate, Heinrich Rothmund (the Chief of Police, responsible for Swiss borders and immigration), argued that larger countries, such as the US, should absorb large numbers of refugees so that European nations could operate merely as transit countries. Seems logical, eh? However, this line of policy was not accepted. By the time the Second World War broke out, very few legal stipulations existed to govern the admission and rejection of refugees; instead, refugees had to rely upon the decisions made by individual countries. The League of Nations, and the international community, had ultimately failed to protect refugees in time for war.

By late 1938, Rothmund’s idea of treating Switzerland as a transit country had failed. Escalating Nazi persecution (and the annexation of Austria) caused more fleeing Jews to congregate at Swiss borders. At this point, Rothmund decided that all refugees without visas, especially Jews, would be turned away at the border. Switzerland then implemented a new, discriminatory method of stamping all Jewish passports and visas with a large J (for “Jude”, meaning “Jew”). This “J-stamp” method of clearly distinguishing Jews from other refugees was recommended to Nazi officials by a Swiss legation in 1938. Unfortunately, the Nazis adopted it into their own immigration and deportation protocols. (For a collector’s example, see here.)

Amidst public outcry, Switzerland closed its borders in August 1942, a decision justified by Swiss authorities on the grounds of an alleged lack of resources. The border closures remain one of the darkest chapters of Swiss history, as Swiss actions directly impacted refugees, forcing many to face persecution and death. (This was a major finding of a large, 25-volume, Swiss government-commissioned study in the 1990s; see here.) And in November 1942, when Germany invaded southern unoccupied France, fresh waves of refugees fled to Switzerland’s strictly controlled borders; most were turned away, resulting, for some, in eventual deportation to mass extermination camps. From late 1942, Swiss refugee policies slowly changed, but it was not until July 1944 that the border opened again fully to Jewish refugees.

Switzerland’s Wartime Dilemma: How to Help Refugees When Limited by (an Anti-Semitic and Anti-Refugee) Government?

Similar to so many countries today, private citizens vehemently disagreed with their government’s restrictive border controls limiting the intake of refugees. This friction prompted Swiss civilians to turn to non-governmental organizations to help the victims of war they deemed worthy of their donations, relief and aid.

One key example is the “Swiss Coalition for Relief to Child War Victims” (Schweizerische Arbeitsgemeinschaft für kriegsgeschädigte Kinder, or Le cartel Suisse de secours aux enfants victimes de la guerre). A mouthful, I know, but let’s call this group the “Swiss Coalition.”

The Swiss Coalition was an alliance of seventeen Swiss charities that sought to evacuate children from war-torn Europe to Switzerland. Its predecessor had operated successfully during the Spanish Civil War (evacuating over 34,000 child refugees to multiple host nations), so this “new” Swiss Coalition was bigger, prepared and well practised. Importantly, the remaining funds from its Spanish operations were liquidated and added to the new coalition’s purse.

In 1940, the Swiss Coalition began its remarkable work. Raising over 700,000 Swiss francs in one year alone, it appealed to the humanitarian spirit of the Swiss people. One initiative encouraged thousands of Swiss families to voluntarily host children from southern France (then unoccupied by Nazi forces) for three months in their own homes. This ingenious method bypassed Switzerland’s strict immigration controls, as the children would not become a perpetual national burden, and it appeared more attractive to Swiss hosts, as the children would not be a permanent family commitment.

When children arrived, they gave their information to Red Cross workers who then compared it to the transport manifest and reported it to immigration authorities. After medical screening and delousing at Swiss train stations, they received their first warm meal in Switzerland. (Photographer Hans Staub. Basel train station, circa 1942. CH-BAR J2.15 1969/7 BD116, Belgische Kinder kommen (nach Basel), circa 1942).

The measure was extremely popular with the public, and by November 1940, when the first evacuations from unoccupied France began, the families volunteering to host children actually outnumbered the children selected for evacuation. Thousands of families offered spots for French children; over 2,000 spots were offered in Geneva alone. By December 1941, the Swiss Coalition hosted more than 7,000 children in Switzerland, the majority of them French (Swiss Federal Archives, CH-BAR E2001D 1968/74 BD 16 D.009 14; and Antonie Schmidlin, Eine andere Schweiz, 137).

Notice the fatigue of this little Belgian boy. The caption reads: “Arrival of Belgian child convoys in a Swiss train station. The children have travelled all night, have slept little and are now hungry and tired.” (Photographer Kling-Jenny. CH-BAR J2.15 1969/7 BD116, Belgische Kinder kommen (nach Basel), circa 1942).

The success continued and operations expanded. Surprisingly, Nazi authorities agreed to temporary evacuations from their occupied zone, as it was hardly an inconvenience for them; the Swiss operated and funded the evacuations and – crucially – Switzerland was neutral. In February 1941, child evacuations from German-occupied northern France began, and the Swiss Coalition became the first foreign agency allowed into blocked areas such as Dunkirk, Calais and Boulogne.

Medical assessment was the chief criterion for selection. Due to the typhoid epidemics in late 1940 and summer 1943 in northern France and rampant diphtheria during the winter of 1942-43, it was necessary to protect the children, and the Swiss hosts, from such diseases. (CH-BAR J2.15 1969/7 BD 114, Kindertransporte aus Frankreich, March 1942).

In 1942, Belgian children suffering under Nazi rule were also evacuated. Generous donations from Swiss citizens continued to pour in, and the Swiss Red Cross joined the operations. This was an important moment because it meant that the national Red Cross infrastructure (and its doctors) could be utilised. This was certainly a formidable humanitarian operation.

Strict immigration controls still existed, though. By mid-1942, Kinderzüge, or special children’s trains, were only allowed to travel one day per week. It had to be the same day every week. Maximum 830 children per train. Only 1 adult per 30 children. According to Heinrich Rothmund’s office, there was to be absolutely no deviation from the following criteria:

  • Only children with appropriate identity papers (passports) that allowed them to return to France or Belgium could be selected. This was difficult for stateless groups, such as Jewish families who had fled Germany or Austria for France. Importantly, this meant that no German-Jewish children could be evacuated. It also ensured that no child became a responsibility of the Swiss government.
  • Poor health was the chief criterion for selecting children (secondary to having the correct identity papers, of course).
  • Children had to be selected by Swiss Coalition doctors and medically screened upon arrival in Switzerland.
  • Children had to be between 4 and 14 years old.
  • The Swiss Federal Police had full authority to reject children upon entry, on any grounds.

Once the children arrived in Switzerland, there was a host of additional criteria they had to follow while resident in Switzerland. While you could argue that these pedantic rules prevented children from becoming lost or abused by their hosts, they also meant that no one could abuse this temporary system of asylum. No Swiss host could extend a child’s stay, for example.

Rothmund specified that the Medical Corps of the Swiss Frontier Guards (above) had to deem the children in poor physical condition in order for them to be admitted into Switzerland. If entry was refused, children were not to cross the Swiss border and were immediately returned to their home country. I’ve found no direct evidence that children were rejected. (CH-BAR J2.15 1969/7 BD116, Belgische Kinder kommen (nach Basel), circa 1942).

Despite the impressive enterprise, the Germans terminated the evacuations from Belgium in May 1942 and from France in October 1942. Their justification was the belief that children staying in Switzerland would be incited with anti-German political sentiments. (Yep, really.)

The Nazis’ termination of these three-month evacuations coincided with the Swiss border closures in late 1942. (It is important to point out that some children still gained entry into Switzerland, including those admitted due to tuberculosis and others sent through another initiative led by Pro Juventute.) It was not until July 1944 that the Swiss Coalition resumed the three-month evacuations.

In total, over 60,000 children, mostly French and Belgian (plus some from Yugoslavia), benefitted from these temporary evacuations during the Second World War. In the post-war period, the scheme was expanded to other war-stricken nations, and an additional 100,000 children were welcomed to Switzerland between 1945 and 1949.

So what?

While I discuss Switzerland at length here, the obligation of so-called “neutral” nations to help refugees is not just about Switzerland. If we put any nation under a microscope, we will discover many unwelcome truths about its immigration policies. Assigning responsibility (and culpability) for who did or did not protect refugees, including Jews, is a tricky exercise, especially when discussed on such a large, international scale.

Perhaps Swiss historians say it best. When ascribing responsibility for Switzerland’s lack of action to protect vulnerable groups, the notable Swiss historian Edgar Bonjour argued that the entire generation of Swiss made it possible for the democratic government to create such refugee policies. Historian Stefan Mächler (Hilfe und Ohnmacht, 440) pushes this further to criticise “the entire globe,” as everyone opposed welcoming refugees, especially Jews, making it nearly impossible for Switzerland to do anything but create similar refugee policies. However, as Georg Kreis argues (Switzerland and the Second World War, 113), if all are responsible, then ultimately no one is responsible.

Let’s return to our “Swiss Schindler”. As a diplomat working from a Swiss consulate in Budapest, Carl Lutz was protected by international law and granted immunity from the local conflict, as any diplomat should be. But, importantly, only neutral governments during the Second World War could act as protective powers. As Lutz was a citizen of a neutral country, his Swiss embassy in Budapest could act as an intermediary and/or protective power for warring nations without diplomatic representation in Hungary. (This system still operates today; a Canadian pastor was recently released in North Korea via the Swedish embassy, because Canada has designated Sweden as its protective power.) Carl Lutz’s Swiss citizenship therefore played an incredibly critical role in the lives of 62,000 Jews.

Remarkable initiatives like the Swiss Coalition, and the actions of Swiss citizens like Carl Lutz, Paul Grüninger, Hans Schaffert, Rösli Näf, and so many others, deserve great attention. These people not only sacrificed their own personal comfort, safety and even careers, but also discovered cunning ways to capitalise on their Swiss neutrality for the protection of thousands of people. In this sense, their humanitarianism (and courage) seems amplified. Neutrality was not a limitation or an excuse not to intervene, but an influential weapon that could be wielded, in the right hands.

Big Opportunities for Big Improvement: Changing the History of Social Security in Scotland

History is being made in Scotland right now – although many haven’t noticed.

Westminster is currently devolving a number of powers to the Scottish Parliament under the Scotland Act 2016. This includes a portion of the social security budget, accounting for £2.9 billion or 15% of the total £17.5 billion spent every year. The new social security system will deliver 10 of 11 key benefits to over 1.4 million people in Scotland, including Carer’s Allowance, Disability Living Allowance and Sure Start maternity grants (Discretionary Housing Payments will continue to be paid by Local Authorities).

If you’re not on benefits, or don’t live in Scotland, then perhaps this is of little interest to you. But from a historical standpoint, and a humanitarian perspective, remarkable things are happening at Holyrood that will have a massive impact on the most vulnerable portions of Scottish society.

As a welfare state, Scotland (and Britain) is committed to the collective welfare of its people, so that no citizen falls below the minimum standards in income, health, housing and education. In other words, it’s like a social safety net.

Collective welfare in Britain began in the 1830s. Although Victorians distrusted the poor, believing poverty was their own fault due to wasteful habits, laziness, and poor moral character, England introduced the New Poor Law in 1834. However, it only offered assistance to able-bodied persons if they entered a workhouse, were put to work and thus “submitted to harshness” (I’m not even kidding, that exact phrase comes from this textbook: Baldock, Mitton, Manning and Vickerstaff, Social Policy, 4th ed, 2007, p. 29). Workhouses were not happy institutions, we must remember. Inmates were deliberately made to experience a lower standard of living than even the poorest labourer outside. The rationale was that this would deter all but the truly destitute – those whose only choice was the workhouse or nothing – from turning to the Poor Law. Pretty grim!


Abject poverty of Glaswegian children in the 1850s, photographed by Thomas Annan at No. 46 Saltmarket, an old close in Glasgow. Image from the National Galleries of Scotland via www.sath.org.uk

Over a hundred years later, during the Second World War, the economist William Beveridge wrote a ground-breaking report on social policy (published in 1942). After surveying Britain’s existing social insurance schemes, Beveridge famously declared that Britain must commit itself to attacking five giant evils: want, disease, ignorance, squalor and idleness. Some would argue that this was the pivotal moment when modern welfare measures were introduced. (For a marvellous documentary on how the bombings during the Blitz revealed the conditions of Britain’s poorest class, and inspired the lesser-known journalist Ritchie Calder to confront the long-term housing and poverty crisis in Britain, see BBC’s Blitz: The Bombs that Changed Britain.)

The creation of the National Health Service (NHS), legislated in 1946 and launched in 1948, was one of the largest reforms of modern British society. The new scheme amalgamated local authority hospitals with voluntary hospitals (many of which had already been organised together during the war) and was promoted as a service based on clinical need rather than ability to pay; the British public warmly welcomed it. Despite this, social security in Britain faltered. In the 1960s, critics such as Peter Townsend brought attention to the fact that many pensioners lived in poverty because of inadequate pensions. Meanwhile, National Insurance (NI) was paid out on the basis of contributions, which often left unemployed women (ie. homemakers) excluded from the system. By the 1970s, means-tested systems were introduced to rectify these shortcomings, which meant that low pensions, for example, were increased in line with prices or earnings, whichever was greater.

By the 1980s, Prime Minister Margaret Thatcher declared that excessive public expenditure was the root of Britain’s economic issues, as the delivery of public services was “paternalistic, inefficient and generally unsatisfactory” (Baldock, Mitton, Manning and Vickerstaff, Social Policy, 4th ed, 2007, p. 39).


The ‘Tell Sid’ campaign from 1986, which encouraged people to buy shares in British Gas. ‘If you see Sid, tell him’ ran the slogan, and around 1.5 million people did, at a cost of 135p per share, in a £9 billion share offer, the largest ever at the time.

The real aim, of course, was to prevent welfare recipients from becoming too dependent on state benefits. Echoing her Victorian predecessors, Thatcher and her advisers thought that “generous collective provision for unemployment and sickness was sapping some working-class people’s drive to work.” Measures were introduced to lower taxes, decrease state intervention and instead increase market forces through private investment. The major utilities (gas, electricity, telephones) along with British Airways and British Rail were all privatised. The assumption was that this new system would use competition to promote efficiency and respond to public demand. This, it can be argued, was when the welfare state in Britain changed substantially. Or, less charitably, when it went downhill.

An Example: Thatcher’s “Right To Buy”

Affordable housing, for example, was undermined by unprecedented cuts in maintenance and subsidisation under Thatcher. The “Right to Buy” scheme was introduced in 1980 so that long-term council tenants could purchase their council home at a discounted rate. As over 55% of Scottish people lived in council homes in 1981, this was a useful scheme to help many families become more independent.

But, crucially, this scheme removed thousands of homes from local councils’ housing stock. With no new affordable housing being built, and a large reduction in subsidies from central government in 1981, Right to Buy led to higher rents, longer waiting lists, and a major housing crisis that lasted for decades.

By November 2012, a Scottish government consultation revealed that the majority of councils, and many tenants and landlords, wanted the Right to Buy scheme abolished. In 2013, the Scottish government announced it would end the scheme in order to “safeguard social housing stock for future generations.” By 2016, Right to Buy was terminated in Scotland.

Education, healthcare and social security all experienced cuts under Thatcher. And, with New Labour in the late 1990s, further changes sought to eradicate the “postcode lottery” effect in NHS services by introducing national standards, centralised audits and performance reviews. Focus was also placed on employment; “welfare-to-work” epitomised the belief that work was the surest way out of poverty. As Chancellor, Gordon Brown promised to eradicate child poverty through a system of tax credits (a mission he still fights for, especially in Edinburgh, where child poverty stands at 35% in some areas).

When Death Comes Knocking… Is Devolution the Solution?

In addition to Scotland’s current housing crisis and child poverty, policy researchers are now drawing attention to another impending crisis on the British horizon: death.

In 2017, a comprehensive 110+ page report called “Death, Dying and Devolution” (hereafter DDD) by the University of Bath’s Institute for Policy Research outlined the impact of death in Britain. Its findings were unsettling: over 500,000 people die in Britain each year, and over 2 million people deal with death’s emotional, financial and practical consequences; that is, every death directly affects roughly four to six Britons. A further 1 million people provide care for someone with a terminal illness every year, but only one in six employers has policies in place to support this population.

This means that family members (an estimated 58% of them women) must assume the caring role with very little compensation (as you will see shortly). Disabled or injured people receive small sums to support themselves, forcing their family and support network, or local authorities, to pay for their housing and basic utilities. And, shockingly, unregulated funeral services plunge many families into something called “funeral poverty”! Read on!

The findings of the DDD report are meant to be a radical wake-up call to policy makers about Britain’s approach to death. The alarming deficit in policy response and legislation, the report argues, is compounded by poor infrastructure and strategising, resulting in fragmented care, escalating and unregulated funeral costs, and massive inequalities experienced by dying and bereaved people because of their geographic location. However, the DDD report singles out Scotland as the only nation to have developed innovative, progressive policies in respect of end-of-life care. Notably, Scotland’s goal is to provide palliative care to all who need it by 2021, making it the only nation to set an actual deadline.

Fortunately, it is not all doom and gloom. The process of devolution is the perfect opportunity to tackle many of these problems. The report claims that “In light of the projected rise in the UK death rate over the next 20 years, with devolution comes a once-in-a-lifetime opportunity to (re)address the neglect of death as a public policy issue, repositioning death as a central concern of the welfare state” (p. 6).


The Social Security Bill has just passed through stage one of a three-stage parliamentary process. Find the latest updates on Twitter @SP_SocialSecur

Let’s now return to the Social Security Bill being discussed in the Scottish Parliament…

As part of the legislative process, the Social Security Committee, made up of Members of the Scottish Parliament, has been tasked with overseeing the new social security system. This committee decided to draft a Social Security charter (which would be quicker to implement than legislation) and submitted it for public scrutiny in the summer of 2017. You can see the charter here.

This is where it gets interesting.

Numerous charities responded to the public consultation with their concerns, criticisms and viewpoints. Now that the Social Security Committee has received these responses, it will amend the charter, seek more evidence from private citizens and charity directors as required, and, in general, improve the legislation. (“Stage one” of the three-stage legislative process is due for completion by December 2017; “stage two” will begin in January 2018.)

To the credit of the Scottish Ministers who drafted this charter, it is somewhat vague and simplified, which is “normal” for such legislation in its infancy. For example, it is impossible at this stage to say precisely how much a carer should be paid; the charter can merely state whether a carer should be paid more or less than the current benefit.

And to the credit of the charities’ policy researchers who drafted these perceptive responses: they have sunk their teeth into this charter and ripped it apart…

Carer’s Allowance. This is currently less than jobseeker’s allowance (£73.10/week) and comes with a list of restrictive conditions. To qualify for Carer’s Allowance under the current system, you must provide care to someone who already receives Disability Living Allowance (this can be a major problem due to delays in assessment or confusing terminology about “terminal illness,” for example). Assuming the person under your care successfully receives DLA, then to receive your small weekly “compensation”:

  • you must provide 35+ hours/week of care,
  • you must NOT earn over £116/week,
  • you must NOT receive a pension (45% of all carers are aged 65+!),
  • you must NOT study 21+ hours/week or be under age 16,
  • and if you care for two people (say, an elderly parent and your child), you cannot double your benefits.

Although the new Social Security Bill suggests Carer’s Allowance should be raised to the level of jobseeker’s allowance, Motor Neurone Disease Scotland argues that “this rise does not go far enough: Many carers are forced to give up work to care for their loved one on a full time basis – they are not looking for work.” And even if Carer’s Allowance were increased to the equivalent of jobseeker’s allowance, this “new rate would only recompense carers at the rate of £2.00 per hour based on a 35 hour caring week” (Carers Scotland Consultation Response, 2017). And, remember, if carers work part-time to supplement this egregiously low allowance, they cannot earn more than £116/week, or their Carer’s Allowance is terminated.
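For the number-crunchers among us, here is a minimal sketch in Python of how these conditions stack together and where that £2-per-hour figure comes from. It is purely illustrative, built only from the figures quoted in this post, and emphatically not an official benefits calculator:

```python
# Back-of-the-envelope sketch of the Carer's Allowance rules quoted above.
# Purely illustrative (2017 figures from this post), NOT an official calculator.

JSA_WEEKLY = 73.10      # jobseeker's allowance, GBP per week (quoted above)
EARNINGS_CAP = 116.00   # maximum permitted weekly earnings (quoted above)
MIN_CARE_HOURS = 35     # minimum hours of care per week (quoted above)

def eligible_for_carers_allowance(care_hours: float, weekly_earnings: float,
                                  receives_pension: bool, study_hours: float,
                                  age: int, cared_for_gets_dla: bool) -> bool:
    """Rough eligibility check mirroring the bullet-point conditions above."""
    return (cared_for_gets_dla            # the cared-for person must get DLA
            and care_hours >= MIN_CARE_HOURS
            and weekly_earnings <= EARNINGS_CAP
            and not receives_pension
            and study_hours < 21
            and age >= 16)

# Where the "£2.00 per hour" figure comes from: even at the proposed
# higher (JSA-level) rate, a 35-hour caring week pays about £2 an hour.
print(f"Implied hourly rate: £{JSA_WEEKLY / MIN_CARE_HOURS:.2f}")   # ~£2.09
```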

Disability Living Allowance. The actual amount provided to people with disabilities is surprisingly small. The lowest DLA rate (for those requiring some care) is £22.10/week. The highest DLA one can receive in Scotland is £82/week (assuming the recipient requires constant 24/7 care). If mobility is challenged or non-existent, then the highest mobility allowance is £58/week. But is that enough to cover most disabled people’s expenses (transportation costs to medical appointments, rent, food and other daily expenditures)? Really?

Funeral Poverty. The average cost of a funeral in Britain is £3,897 (DDD Report, p. 82). Applications for assistance take four weeks on average to process, and the rules regarding the eligibility of the applicant often do not “take into consideration the nature of contemporary family relationships,” which may not be straightforward nuclear families (p. 81). But Scotland has admittedly taken great strides towards regulating the funeral industry, including provisions for licensing under the Burial and Cremation (Scotland) Act 2016.

Let’s imagine a desperate situation to illustrate funeral poverty: a young family mourning the loss of a terminally-ill child. Currently in Scotland, independent funeral directors provide funerals free of charge for children up to the age of 16 (while the Co-operative Funeral Care extends this to the age of 18). These include embalming, a coffin, a hearse to transport the child, personnel to conduct the funeral and the use of a funeral home (where available), in addition to overhead administrative requirements. However, this does not include cremation (or burial), the headstone, or the burial plot. As cremation is currently a less expensive option than a traditional burial in Scotland, some families (according to a leading Scottish children’s charity) are forced to choose the cheaper cremation despite their religious beliefs or the wishes of the terminally ill child.

So what?

Exposing the bare bones of Scotland’s current social security policies might seem like an insensitive and rude awakening. But we currently live in a world where gaps in care are filled by families, friends or charities. Often, these carers and support networks are rewarded with heartlessly small compensation. And although those amazing charities help where they can, they should not be responsible for filling the gaps in care. Even large government donations to charities sadly fuel the “postcode lottery” effect in the services these charities can provide across the country, contributing to inconsistent care and, ultimately, health inequality.

The welfare state exists so that the poorest and most challenged citizens in a community can be supported. Huge strides have been taken in the last 100 years towards administering social security to the British people – from Lloyd George’s National Insurance Act (1911) to Lord Beveridge’s Five Giant Evils (1942), the founding of the NHS (1948), Thatcher’s sweeping reforms in the 1980s and then New Labour’s welfare-to-work – and we hope these strides are taking us towards a better, fairer and more equal society.

The new Social Security Bill in Scotland is one step in a longer national journey towards social justice and a healthier society. Devolution might seem like complex, tricky business, but one can hope the outcome will be transparent and simple to access. For all we know, devolution could irrevocably change the history of social security in Scotland! Ultimately, this new legislation offers big opportunities for big improvement. According to the Social Security Secretary, MSP Angela Constance, “once all the benefits are devolved, we will make more payments each week than the Scottish government currently makes each year.” With the first payments planned to roll out by mid-2018, this ambitious and complex infrastructure could (potentially) improve the desperate circumstances of Scotland’s most vulnerable citizens.

To keep up to date on the latest meetings or voice your opinion, see the Social Security Committee website or Tweet them @SP_SocialSecur 

Rocking Around the Nazi Christmas Tree: The Invention of National Community

Nazi Germany was not the first radical regime to revolutionise its holidays. Russian Bolsheviks believed church bells represented the “old way of life” and actively sought to destroy them from the late 1920s (and many didn’t ring again until the collapse of communism in the 1990s!). Even the French Revolutionaries changed the entire calendar to reflect their commitment to the separation of church and state: seven-day weeks were replaced with ten-day weeks; saints’ days were replaced with the names of animals, plants, and farm implements; months were renamed after their seasonal activity (germination, flowering, meadow). It is astonishing that such a calendar lasted a full 12 years.

As in any dictatorship, Nazi Germany’s control and influence filtered down into all aspects of social and cultural life. But Christmas was a bit tricky. How does an anti-Semitic political party celebrate the birth of a Jew? How does that same violent political party celebrate Christian values of charity, love and forgiveness? And, how does a despot like Hitler share his power with baby Jesus?

But the Nazis were cunning, resourceful and, above all, ambitious. Their Christmas celebrations morphed good ole Christian traditions into a mystifying quagmire of cultish obsession with “Nordic” nationalism. German women became “priestesses” of the home, while rituals like lighting the candles on a Christmas tree came to symbolise the birth of “Germanness.” It must have been effective, too: some Nazi-written carols were still being sung into the 1950s (yes, really).

Of course, in the post-war period, Christmas was sanitised and distanced from whatever it had become under Hitler’s reign. But as one Westphalian resident complained in the 1950s, “family celebration has been degraded into the simple giving of presents, and the mother has been dethroned” (see a fabulous article on this complex topic by Joe Perry, “Nazifying Christmas: Political Culture and Popular Celebration in the Third Reich,” Central European History, Vol. 38, No. 4 (2005), pp. 572-605).

But let us not assume that every resident of post-war Germany was longing for the days of “All I Want for Christmas is Hitler,” because that’s simply not true. Instead of writing a superficial blog about Christmas trees adorned with swastikas, I shall attempt to delve deeper and do justice to the confusing and desperate Christmastimes that average Germans experienced under Nazi rule.

“Have a Holly, Jolly, Invented Norse/Pagan/Viking/German Christmas”

Before the Nazis came to power, Christmas was considered a rather uniquely “German” holiday – an attitude that persists even in today’s Germany. In the mid-1800s, German scholars (Paulus Cassel, Johannes Marbach, W. Mannhardt) wrote at length that German-speaking territories celebrated Christmas not only as a Christian holiday, but also as a pre-Christian tribal ritual incorporating popular folk superstitions. What the hell does that mean? Well, think Norse. Think Pagan. Think Viking. While these are not interchangeable words (or cultures, or histories), by the 1900s Germany had embraced a mish-mash of holiday traditions and fused them under the terms “Weihnachten” or “Christfest”.

For example, the Advent wreath, which is adorned with four candles lit on the Sundays before Christmas, derives from the “ring of light” that existed among Germanic tribes before the celebration of Advent. Apparently, these tribes lit lights to represent the shortening of the days until the solstice, at which time the Julfest celebrated the return of light. (Incidentally, the English word yule derives from the Germanic Jul.) Other traditions, such as Santa Claus (Weihnachtsmann), Christmas markets (Weihnachtsmärkte) and Christmas trees (Tannenbaum), trace their roots to these pre-Christian, “Germanic” traditions.

As Germany was still trying to find its national identity in the wake of unification in 1871, Christmas traditions – whether invented or repurposed – became essential to the national celebrations the Nazis would manipulate when they came to power.

Pre-1933: “It’s the Most Anti-Semitic Time of the Year”

Before they came to power in 1933, the Nazis used Christmas as an opportunity to launch attacks against those they perceived to be internal enemies (communists, socialists, Jews, liberals, etc.). Rather predictably, they blamed the erosion of so-called “real” Christmas on these groups. They even justified attacks on Jewish shops as a way to promote Christian harmony and “good will to all.”


Hitler addressing a crowd at the Hofbräuhaus in Munich in November 1921, just weeks before his “German Christmas Celebration” speech. (Photo credit)

In 1921, Hitler gave a “German Christmas Celebration” speech at his favourite beer hall in Munich. Four thousand guests applauded as Hitler criticised the materialism that degraded the holiday. He also condemned the Jews for, as he claimed, nailing the world’s liberator to the cross (no mention of the Romans…). By focusing on ideas of “authentic” German community and old pagan traditions (like lighting Christmas tree candles), Hitler and his Nazis pitched Christmas as a German rather than a Christian holiday. While it might seem extraordinary that such hateful language could tarnish such a holiday, historian Joe Perry argues that it was “relatively easy for the National Socialists to cast the holiday as an exclusionary celebration of pagan, Volk nationalism, since these ideas had a lengthy popular and scholarly pedigree” (p. 579).

Post-1933: “Have Yourself a Merry People’s Christmas”

After the Nazis came to power, their approach to Christmas changed completely. While this shift was strategically beneficial to propaganda efforts and gained mass appeal, it also signified a new wariness towards the Protestant and Catholic churches in Germany.

Religious belief in Nazi Germany was not encouraged. The Nazis would not tolerate being subordinate or accountable to any religious institution, despite the fact that over 95% of Germans in 1939 identified as Protestant or Catholic (Evans, The Third Reich at War, p. 546). For academic studies and longer discussion, check out Guenter Lewy’s The Catholic Church and Nazi Germany, Hubert G. Locke and Marcia Sachs Littell’s Holocaust and Church Struggle, Donald J. Dietrich’s Catholic Citizens in the Third Reich, Leo Stein’s Hitler Came for Niemoeller: The Nazi War against Religion, and Franz G. M. Feige’s The Varieties of Protestantism in Nazi Germany.

Instead of outright condemning Christmas’ religious connotations, the Nazis simply redefined the holidays as annual events of “national rebirth.” Christmas thus became a superlative opportunity to ritualise and revive the German community in a way that benefitted Nazi politics. This rather clever strategy became another method of politically indoctrinating the masses.

In 1934, the first “People’s Christmas” was celebrated throughout Germany. In Cologne, Hitler Youth brigades held night rallies modelled after pagan solstice rituals. In Hamburg, storm troopers gathered around bonfires on Christmas Eve and sang Nazi marching songs. In Berlin, Propaganda Minister Joseph Goebbels broadcast a speech by radio after a torch-lit parade, declaring that “the socialism of the deed has become reality. Peace on Earth to mankind.” And, of course, nothing said Christmas in Nazi Germany like the “People’s Christmas Trees” set up in town squares and public parks.


This photo from 1937 shows Joseph Goebbels with his daughters, Helga and Hilde, beside a People’s Christmas tree in Friedrichshain. (Goebbels’ wife would later kill their children in the Führerbunker in May 1945.) (Photo credit)

Other initiatives also reinforced the Nazis’ politicisation of Christmas. Official Nazi holiday greeting cards pictured blue-eyed, blond-haired families to signify racial purity. Christmas entertainment was also revamped and kicked up a notch. On the radio, broadcasts began in late November and seamlessly blended classical carols, radio plays and children’s shows with party propaganda. On Christmas Eve, a special “Christmas Message” from Rudolf Hess was broadcast at 8pm, while carols sung by children’s choirs were followed by Christmas shows about the army, navy and air force.

Even the cinema did not escape the Nazis’ Christmas propaganda. Annual Christmas newsreels featured reports from Christmas markets, state-sponsored events, speeches from political leaders – literally anything that would “colonise and exalt traditional sacred practices” (Perry, p. 582). As with any mass media campaign, these Christmas campaigns aimed to create cohesion among the nationwide audiences who consumed their messages.

Christmas markets, which had been operating in Germany since the 14th century, were also invaded by pro-Nazi booths and “brown” trade fairs. School teachers were given a specifically nazified Christmas curriculum, with texts that emphasised Germanic culture as the epitome of Christmas tradition. Children and Hitler Youth members were recruited to help with the Winterhilfswerk campaigns for those “less fortunate.” No German, whether pro-Nazi or vehemently opposed, could escape the Nazis’ reinvention of Christmas.


Hitler addresses a crowd of Nazis at a Christmas Party in Munich, 1941. (Photo credit)

“I saw Mommy kissing Nazi Claus….” 

Women’s Christmastime roles were also reconstructed by the Nazis as absolutely crucial to holiday celebrations. According to Auguste Reber-Gruber, director of the women’s division of the National Socialist Teachers’ Union, the German mother was the “protector of house and hearth.” As a “priestess” of the home, the traditional family holidays benefitted from her moral and physical direction. Broadcasts and Nazi pamphlets provided mothers with instructions on how to create home-made decorations shaped like “Odin’s Sun Wheel,” or how to bake loop-shaped cookies imitating fertility symbols. As historian Joe Perry states, “Traditional women’s tasks… like wrapping presents, decorating the home, baking foods…. now had ‘eternally German’ meanings that linked special, everyday acts of holiday preparation and celebration to a cult of sentimentalised ‘Nordic’ nationalism” (p. 597).

“I Won’t Be Home For Christmas”

Once the Second World War began, Christmas changed once again. It even received a name popularised during the First World War: Kriegsweihnachten (literally, “war Christmas”). With millions of men fighting away from home, new initiatives created Christmas books, magazine articles and holiday newsreels that celebrated the “spirit” of German war Christmas. Public drives for food, clothes and necessities also helped in December 1941, when the German army began its retreat from Moscow.


1944 Nazi Christmas Card (Photo credit)

Radio broadcasts from the front lines reported to families at home how their fathers, sons and brothers were celebrating Christmas in the field.

Christmas cards were once again revamped, now to show the unity of the home front with the battlefront. The card to the left shows a woman and child (notice the Madonna-and-child symbolism) above three soldiers trudging through snow in the East. The image faces a poem by Herbert Menzel entitled “Soldier’s Christmas.” Circulated in 1944, cards like these were meant to reinforce the need for ultimate personal sacrifice to ensure national victory. For more examples, see Randall L. Bytwerk’s excellent online German Propaganda Archive (Calvin College).

But Christmas gift-giving became increasingly desperate as the war continued and necessities became scarce. Books, interestingly, were not rationed, and they became a popular present in the last years of the war. People scrambled to buy books by weight or by attractive cover, rather than by title or content. But as historians point out, reports from Christmas 1944 were riddled with tales of German housewives fighting over meagre portions of eggs (she got five and I only got two!), and emergency holiday distributions of food and coal were critical to survival. As the war dragged on, Christmas celebrations became ever more irrelevant to the overwhelming crisis of total war. Instead, most used the occasion to remember fallen soldiers. As one survivor put it, “By then, nobody felt like celebrating.”

So what? 

When I think of the average German Protestant or Catholic family in 1930s Hamburg or Berlin, going to the Christmas markets or singing carols with friends while sipping delicious Glühwein, I can only imagine that many must have felt bombarded by Nazi stalls, Nazi lyrics, Nazi Christmas trees. In some ways, this reminds me of how overwhelmed I’ve personally felt at many a Christmas by the commercialisation and, frankly, the tacky ways society today celebrates the holiday. Advertisements on the radio and TV harass you by mid-November, and it’s nearly impossible to escape any form of Christmas music once 1st December passes.

While today’s robust commercialisation of Christmas is obviously not equivalent to the violent and highly politicised nature of Nazi Germany, the two periods do share one similarity: the original Christian connotations of Christmas were diluted, and sometimes entirely replaced, by other messages. Today, it’s about consuming the materialism of the season, which reinforces capitalist ideologies. But in Hitler’s Germany, the Nazis’ ability to smoothly refocus Christmas on its Germanic rather than religious derivations forced average Germans into unavoidable celebrations of “national community.” This gave the Nazis a remarkable and intimate route into the private and familial traditions of millions of Germans on an annual basis. By extracting the Christian meaning from the holidays, the Nazis could supplant it with a cultish definition of national identity that was exclusionary, racist and violent.

While many Germans believed in Hitler’s doctrine and supported Nazi initiatives, and although “People’s Christmas” drew large crowds, I do not believe this necessarily means that those Germans were outright Nazis. Instead, they were engaging in a tradition they already wished to celebrate, and would continue to celebrate, regardless of the politics that surrounded or infused the occasion. The Nazis saturated every fibre of German life, and Christmas was no exception.

Of course, I write this as I peel a mandarin “Christmas” orange and search on Amazon for a Christmas gift for my one-year-old nephew (who is neither Christian nor old enough to understand the holidays).

Merry Christmas and Happy Holidays everyone!