Switzerland’s Not-So-Secret Wartime Weapon: The Case of the Swiss-led Child Evacuations

Last month, the BBC published an article “Is this Switzerland’s Schindler?” about a Swiss man named Carl Lutz who used his position as an envoy for neutral Switzerland stationed in Budapest to issue letters to thousands of Hungarian Jews during the Second World War. These special letters extended Lutz’s diplomatic protection to those targeted for deportation. Lutz saved an astounding 62,000 Jews from being sent to the concentration camps.


Crowds expecting to be deported gather outside Carl Lutz’s office in Budapest to get protective letters in late 1944 (photo credit). Notably, Carl Lutz issued letters not only to individuals and families, but also to 76 buildings that housed these groups. The Glass House survives today as a result of Lutz’s intervention.

It’s a remarkable story. Not only does it demonstrate the extent to which people in positions of power could sacrifice their own safety for the survival of total strangers, but it also exemplifies how Swiss citizens could mobilise their government’s neutral status in WWII to help victims of persecution.

Shortly after this article was published, a friend contacted me and, knowing that I studied Switzerland during the Second World War, asked me about Switzerland’s wartime humanitarian efforts: But Chelsea, didn’t the Swiss create the Red Cross? And weren’t they neutral during the war? If so, did they help protect Jews during the war through the Red Cross? And what about refugees fleeing the Nazis? Honestly, why didn’t every single person just pack their bags and move to Switzerland during the war?

These are all excellent questions. Switzerland’s neutrality certainly meant that it occupied a unique position in wartime Europe. Combined with its history of humanitarianism (yes, it did create the International Committee of the Red Cross) and its convenient geography in central Europe (bordering Austria, Germany, France, Italy and Liechtenstein), Switzerland appears to be the perfect hiding spot from the Nazis, and a country that could manoeuvre through tense wartime diplomacy to help victims of war. Well spotted, my friend.

Added to all of this was (and remains) Switzerland’s strong legacy of banking (supported by valuable privacy laws). Foreign investors, including foreign governments, still flock to Swiss banks because of the country’s centuries of neutrality (and thus financial stability during war). In fact, some scholars argue that Switzerland’s ability to financially shelter governments’ investments was the single reason it was not invaded during the war – Swiss banks were simply too valuable to both the Allied governments and Nazi Germany’s financial health for either side to consider sending even one platoon into its little alpine territory.

So really, we have three non-negotiable factors that influenced (and continue to influence) Switzerland’s political actions: neutrality, humanitarianism and banking. Remarkably, Switzerland protected its borders from invasion in both World Wars thanks to its ability to maintain amicable relationships with belligerent nations. It provided them with a neutral trading centre (i.e. banks and foreign currency), and it served as an intermediary for international organizations, such as the League of Nations. This tradition still stands today.


Today, if you’re lucky enough to be able to afford a trip to Geneva, you can walk past the United Nations (above), the headquarters of the World Health Organisation, the International Committee of the Red Cross, the International Labour Office and the World Trade Organisation – all a stone’s throw from the same street! (And you’ll have to walk because you won’t be able to afford anything else).

Although Switzerland’s neutrality, humanitarianism and banking can be seen as massive opportunities and methods to help others, they were often used as excuses by Swiss authorities to limit, evade, or reject multiple initiatives that would have saved countless lives during the Second World War.

However, in keeping with the optimism and sacrifice that Carl Lutz has shown the world, I will write about one extraordinary example where Swiss citizens overcame these limitations to provide refuge and relief to one of the most vulnerable groups suffering under Nazi rule – children.

Why would the Swiss government reject humanitarian initiatives?

Ultimately, Switzerland feared being overrun by refugees. As Switzerland depended on warring countries for its imports (about 55%) and exports (about 60%), there were simply not enough resources to ensure its national survival if thousands of foreigners (even refugees) came to stay. Over half of the coal in Switzerland, for example, arrived via Nazi Germany’s monthly shipments. Thus, Switzerland had to balance national survival with shrewd financial decisions. (For more on the Swiss wartime economy, see Herbert Reginbogin’s [2009] Faces of Neutrality, and Georges-André Chevallaz’s [2001] The Challenge of Neutrality, Diplomacy and the Defense of Switzerland).

Similar to today, Europe was overwhelmed with refugees still displaced by the First World War, the Turkish-Armenian War, the Russian Civil War, and the impact of famines gripping eastern Europe. Similar to today, refugees were not simply a passing trend.


Multiple charities helped refugees in the wake of the First World War. By 1921, when the Russian Civil War had produced countless refugees and starving children, the Save the Children Fund had found its stride. It campaigned on the big screen by showing British audiences films of the conditions children faced. For a brief history, see here.

By the end of 1933, the first year of power for the Nazis, some 37,000 Jews had voluntarily emigrated from Germany as a direct result of increasing violence and persecution (RJ Evans, Third Reich in Power). With Germany’s introduction of the Nuremberg Laws in 1935 – stripping Jews in Germany (and, after the 1938 annexation, Austria) of their citizenship and thus making them stateless in their own country – the world began to realise it had a growing refugee crisis on its hands, especially if Hitler’s militarisation of Germany continued to go unchallenged. Despite this, countries like France and Britain were apathetic to the plight of these refugees, being more concerned with unemployment and other domestic issues (Claudena Skran, Refugees in Inter-war Europe). Sounds like the recent situation in Calais, no?

But refugees had protected rights. In 1933, refugees gained internationally recognised rights (to passports, for example) for the first time, granted by the League of Nations (which, notably, Germany withdrew from in 1933). But this did not equate to decent treatment or immediate asylum for refugees worldwide. In fact, it still doesn’t. (See how refugees today are treated in Libyan detention centres).

In 1938, President Roosevelt’s administration organized the Evian Conference in France to help coordinate efforts to facilitate the emigration of refugees from Germany and Austria. But the conference was unsuccessful: most participating nations seemed more concerned with keeping refugees away from their own borders or, in the case of Britain, simply refused to contribute at all (Skran, Refugees in Inter-war Europe, 280).

Evian-les-Bains, Lake Geneva: international conference on Jewish immigrants from Germany

Lord Winterton, the English representative at the Evian Conference, gives a speech to attendees (photo credit). TIME reported on 18 July 1938, “Britain, France, Belgium pleaded that they had already absorbed their capacity, Australia turned in a flat ‘No’ to Jews, and the U. S. announced that she would combine her former annual Austrian immigration quota with her German to admit 27,370 persons (who can support themselves) from Greater Germany next year.”

Switzerland’s delegate, Heinrich Rothmund (the Chief of Police responsible for Swiss borders and immigration), argued that larger countries, such as the US, should absorb large numbers of refugees so that European nations could operate merely as transit countries. Seems logical, eh? However, this line of policy was not accepted. By the time the Second World War broke out, very few legal stipulations existed to govern the admission and rejection of refugees; instead, refugees had to rely upon the decisions made by individual countries. The League of Nations, and the international community, had ultimately failed to protect refugees in time for war.

By late 1938, Rothmund’s idea to treat Switzerland as a transit country had failed. Escalating Nazi persecution (and the annexation of Austria) caused more fleeing Jews to congregate at Swiss borders. At this point, Rothmund decided that all refugees without visas, especially Jews, would be turned away at the Swiss border. Switzerland then implemented a new, discriminatory method of stamping all Jewish passports and visas with a large J (for “Jude”, meaning “Jew”). This “J-stamp” method of clearly distinguishing Jews from other refugees had been recommended to Nazi officials by a Swiss legation in 1938, and, unfortunately, the Nazis adopted it into their own immigration and deportation protocols. (For a collector’s example, see here).

Amidst public outcry, Switzerland closed its borders in August 1942, a decision Swiss authorities justified by an alleged lack of resources. The border closures remain one of the darkest chapters of Swiss history, as Swiss actions directly affected refugees, forcing many to face persecution and death. (This was a major finding of a large 25-volume Swiss government-commissioned study in the 1990s; see here). And, in November 1942, when Germany invaded southern unoccupied France, fresh waves of refugees fled to Switzerland’s strictly controlled borders; most were turned away, resulting, for some, in eventual deportation to extermination camps. By late 1942, Swiss refugee policies slowly changed, but it was not until July 1944 that the border fully opened again to Jewish refugees.

Switzerland’s Wartime Dilemma: How to Help Refugees When Limited by an (Anti-Semitic and Anti-Refugee) Government?

As in so many countries today, private citizens vehemently disagreed with their government’s restrictive border controls limiting the intake of refugees. This friction prompted Swiss civilians to turn to non-governmental organizations to help the victims of war they deemed worthy of their donations, relief and aid.

One key example is the “Swiss Coalition for Relief to Child War Victims” (Schweizerische Arbeitsgemeinschaft für kriegsgeschädigte Kinder, or Le cartel Suisse de secours aux enfants victimes de la guerre). A mouthful, I know, but let’s call this group the “Swiss Coalition.”

The Swiss Coalition was an alliance of seventeen Swiss charities that sought to evacuate children from war-torn Europe to Switzerland. Although it had operated successfully during the Spanish Civil War (evacuating over 34,000 child refugees to multiple host nations), this “new” Swiss Coalition was bigger, better prepared and well practised. Importantly, the remaining funds from its Spanish operations were liquidated and added to the new coalition’s purse.

In 1940, the Swiss Coalition began its remarkable work. Raising over 700,000 Swiss francs in one year alone, the Swiss Coalition appealed to the humanitarian spirit of the Swiss people. One initiative encouraged thousands of Swiss families to voluntarily host children from southern France (then unoccupied by Nazi forces) for three months in their own homes. This ingenious method bypassed Switzerland’s strict immigration controls, since the children would never become a permanent national burden, and it also appealed to Swiss hosts, who were not taking on a permanent family commitment.


When children arrived, they gave their information to Red Cross workers who then compared it to the transport manifest and reported it to immigration authorities. After medical screening and delousing at Swiss train stations, they received their first warm meal in Switzerland. (Photographer Hans Staub. Basel train station, circa 1942. CH-BAR J2.15 1969/7 BD116, Belgische Kinder kommen (nach Basel), circa 1942).

The measure was extremely popular among the public, and by November 1940, when the first evacuations from unoccupied France began, the number of families volunteering to host children actually outnumbered the children selected for evacuation. Thousands of families offered spots for French children; over 2,000 were offered in Geneva alone. By December 1941, the Swiss Coalition hosted more than 7,000 children in Switzerland, the majority of them French (Swiss Federal Archives, CH-BAR E2001D 1968/74 BD 16 D.009 14 and Antonie Schmidlin, Eine andere Schweiz, 137).


Notice the fatigue of this little Belgian boy. The caption reads “Arrival of Belgian child convoys in a Swiss train station. The children have travelled all night, have slept little and are now hungry and tired.” (Photographer Kling-Jenny. CH-BAR J2.15 1969/7 BD116, Belgische Kinder kommen (nach Basel), circa 1942).

The success continued and operations enlarged. Surprisingly, Nazi authorities agreed to temporary evacuations from their occupied zone, as it was hardly an inconvenience for them; the Swiss operated and funded the evacuations and – crucially – Switzerland was neutral. In February 1941, child evacuations from German-occupied northern France began, and the Swiss Coalition was the first foreign agency allowed into blocked areas, such as Dunkirk, Calais and Boulogne.


Medical assessment was the chief criterion for selection. Due to the typhoid epidemics in late 1940 and summer 1943 in northern France and rampant diphtheria during the winter of 1942-43, it was necessary to protect the children, and the Swiss hosts, from such diseases. (CH-BAR J2.15 1969/7 BD 114, Kindertransporte aus Frankreich, March 1942).

In 1942, Belgian children suffering under Nazi rule were also evacuated. Generous donations from Swiss citizens continued to pour in and the Swiss Red Cross joined the operations. This was an important moment because it meant that the national Red Cross infrastructure (and its doctors) could be utilised. This was certainly a formidable humanitarian operation.

Strict immigration controls still existed, though. By mid-1942, Kinderzüge, or special Children’s Trains, were only allowed to travel one day per week. It had to be the same day every week. A maximum of 830 children per train. Only 1 adult per 30 children. According to Heinrich Rothmund’s office, there was to be absolutely no deviation from the following criteria:

  • Only children with appropriate identity papers (passports) that allowed them to return to France or Belgium could be selected. This was difficult for stateless groups, such as Jewish families who had fled Germany or Austria for France. Importantly, this meant that no German Jews could be evacuated. It also ensured that no child became the responsibility of the Swiss government.
  • Poor health was the chief criterion for selecting children (second only to having the correct identity papers, of course).
  • Children had to be selected by Swiss Coalition doctors and medically screened upon arrival in Switzerland.
  • Children had to be between 4 and 14 years old.
  • The Swiss Federal Police had full authority to reject children upon entry, on any grounds and for any reason.

Once the children arrived in Switzerland, there was a host of additional rules they had to follow while resident there. While you could argue that these pedantic rules prevented children from becoming lost or abused by their hosts, they also meant that no one could abuse this temporary system of asylum. No Swiss host could extend a child’s stay, for example.


Rothmund specified that the Medical Corps of the Swiss Frontier Guards (above) had to deem the children in poor physical health for them to be admitted into Switzerland. If entry was refused, the children were not to cross the Swiss border and were immediately returned to their home country. I have found no direct evidence that children were rejected. (CH-BAR J2.15 1969/7 BD116, Belgische Kinder kommen (nach Basel), circa 1942).

Despite the impressive enterprise, the Germans terminated the evacuations from Belgium in May 1942 and from France in October 1942. Their justification was the belief that children in Switzerland would be incited with anti-German political sentiments. (Yep, really).

The Nazis’ termination of these three-month evacuations coincided with the Swiss border closures in late 1942. (But it is important to point out that some children still gained entry into Switzerland, including those admitted due to tuberculosis and others sent through another initiative led by Pro Juventute). It was not until July 1944 that the Swiss Coalition resumed the three-month evacuations.

In total, over 60,000 children, mostly French and Belgian (along with some from Yugoslavia), benefitted from these temporary evacuations during the Second World War. In the post-war period, the scheme was expanded to other war-stricken nations, and an additional 100,000 children were welcomed to Switzerland from 1945 to 1949.

So what?

While I discuss Switzerland at length here, the obligation of so-called “neutral” nations to help refugees is not just about Switzerland. If we put any nation under a microscope, we will discover many unwelcome truths about its immigration policies. Assigning responsibility (and culpability) for who did or did not protect refugees, including Jews, is a tricky exercise, especially when discussed on such a large, international scale.

Perhaps Swiss historians say it best. When ascribing responsibility for Switzerland’s lack of action to protect vulnerable groups, the notable Swiss historian Edgar Bonjour argued that the entire generation of Swiss made it possible for the democratic government to create such refugee policies. Historian Stefan Mächler (Hilfe und Ohnmacht, 440) pushes this further to criticize “the entire globe,” as everyone opposed welcoming refugees, especially Jews, making it nearly impossible for Switzerland to do anything but create similar refugee policies. However, as Georg Kreis argues (Switzerland and the Second World War, 113), if all are responsible, then ultimately no one is responsible.

Let’s return to our “Swiss Schindler”. As a diplomat working from a Swiss consulate in Budapest, Carl Lutz was protected by international law and granted immunity from the local conflict, as any diplomat should be. But, importantly, only neutral governments during the Second World War could act as protective powers. As Lutz was the citizen of a neutral country, his Swiss mission in Budapest could act as an intermediary and/or protective power for warring nations without diplomatic representation in Hungary. (This system still operates today; a Canadian pastor was recently released from North Korea via the Swedish embassy because Canada designated Sweden as its protective power). Therefore, Carl Lutz’s citizenship of neutral Switzerland played an incredibly critical role in the lives of 62,000 Jews.

Remarkable initiatives like the Swiss Coalition, and the actions of Swiss citizens like Carl Lutz, Paul Grüninger, Hans Schaffert, Rösli Näf, and so many others, deserve great attention. They not only sacrificed their own personal comfort, safety and even careers, but they also discovered cunning ways to capitalise on Swiss neutrality for the protection of thousands of people. In this sense, their humanitarianism (and courage) seems amplified. Neutrality was not a limitation or an excuse not to intervene, but an influential weapon that could be wielded, if in the right hands.

Big Opportunities for Big Improvement: Changing the History of Social Security in Scotland

History is being made in Scotland right now. Although many haven’t noticed.

Westminster is currently devolving a number of powers to the Scottish Parliament under the Scotland Act 2016. This includes a portion of the social security budget, accounting for £2.9 billion or 15% of the total £17.5 billion spent every year. The new social security system will deliver 10 of 11 key benefits to over 1.4 million people in Scotland, including Carer’s Allowance, Disability Living Allowance and Sure Start maternity grants (Discretionary Housing Payments will continue to be paid by Local Authorities).

If you’re not on benefits, or don’t live in Scotland, then perhaps this is of little interest to you. But from a historical standpoint, and a humanitarian perspective, remarkable things are happening at Holyrood that will have a massive impact on the most vulnerable portions of Scottish society.

As a welfare state, Scotland (and Britain) is committed to the collective welfare of its people, so that no citizen falls below the minimum standards in income, health, housing and education. In other words, it’s like a social safety net.

Collective welfare in Britain began in the 1830s. Victorians distrusted the poor, believing poverty was their own fault due to wasteful habits, laziness and poor moral character, and in 1834 England introduced the New Poor Law. It only offered assistance to able-bodied persons if they entered a workhouse, were put to work and thus “submitted to harshness” (I’m not even kidding, that exact phrase came from this textbook: Baldock, Mitton, Manning and Vickerstaff, Social Policy, 4th ed, 2007, p. 29). Workhouses were not happy institutions, we must remember. These able-bodied inmates were meant to experience a lower standard of living than even the poorest labourer. The rationale was that this would discourage all but the truly destitute, whose only choice in life was the workhouse or nothing, from turning to the Poor Law. Pretty grim!


Abject poverty of Glaswegian children in the 1850s, photographed by Thomas Annan, and taken from No. 46 Saltmarket, an old close in Glasgow.  Image from the National Galleries of Scotland via www.sath.org.uk

Over a hundred years later, during the Second World War, the economist William Beveridge wrote a ground-breaking report on social policy (published in 1942). After surveying wartime housing schemes, Beveridge famously declared that Britain should commit itself to attacking five giant evils: want, disease, ignorance, squalor and idleness. Some would argue that this was the pivotal moment when modern welfare measures were introduced. (For a marvellous documentary on how the bombings during the Blitz revealed the conditions of Britain’s poorest class, and inspired the lesser-known journalist Ritchie Calder to confront the long-term housing and poverty crisis in Britain, see BBC’s Blitz: The Bombs that Changed Britain).

The creation of the National Health Service (NHS), legislated in 1946 and launched in 1948, was one of the largest reforms of modern British society. The scheme amalgamated local authority hospitals with voluntary hospitals (many of which had already been organised during the war) and promoted the NHS as a service based on clinical need rather than ability to pay, and the British public warmly welcomed it. Despite this, social security in Britain faltered. In the 1960s, critics such as Peter Townsend brought attention to the fact that many pensioners were in poverty because of inadequate pensions. Meanwhile, National Insurance (NI) was paid out based on contributions, which often left unemployed women (i.e. homemakers) excluded from the system. By the 1970s, means-tested systems were introduced to rectify social security shortcomings, which meant that low pensions, for example, were increased in line with prices or earnings, whichever was greater.

By the 1980s, Prime Minister Margaret Thatcher declared that excessive public expenditure was the root of Britain’s economic issues, as the delivery of public services was “paternalistic, inefficient and generally unsatisfactory” (Baldock, Mitton, Manning and Vickerstaff, Social Policy, 4th ed, 2007, p. 39).


The ‘Tell Sid’ campaign from 1986 encouraged people to buy shares in British Gas. ‘If you see Sid, tell him’ ran the slogan, and around 1.5 million did, at a cost of 135p per share, in a £9 billion share offer, the largest ever at the time.

The real issue, of course, was to prevent welfare recipients from becoming too dependent on state benefits. Echoing her Victorian ancestors, Thatcher and her advisers thought that “generous collective provision for unemployment and sickness was sapping some working-class people’s drive to work.” Measures were introduced to lower taxes and decrease state intervention, and instead to strengthen market forces through private investment. Major utilities such as gas, electricity and telephones, along with British Airways and British Rail, were all privatized. The assumption was that this new system would use competition to promote efficiency and be driven by public demand. This, it can be argued, was when the welfare state in Britain changed substantially. Or, this is when it went downhill.

An Example: Thatcher’s “Right To Buy”

Affordable housing, for example, was undermined by the unprecedented cuts in maintenance and subsidization under Thatcher. The “Right to Buy” scheme was introduced in 1980 so that long-term council tenants could purchase their council home at a discounted rate. As over 55% of Scottish people lived in council homes in 1981, this was a useful scheme to help many families become more independent.

But, crucially, this scheme removed thousands of homes from local councils’ resources. With little new affordable housing being built, and with a large reduction in subsidies from central government in 1981, Right to Buy led to higher rents and longer waiting lists, and created a major housing crisis that lasted for decades.

By November 2012, a Scottish government consultation revealed that the majority of councils, and many tenants and landlords wanted the Right to Buy scheme abolished. In 2013, the Scottish government announced it would end this scheme in order to “safeguard social housing stock for future generations.” By 2016, the Right to Buy scheme was terminated in Scotland.

Education, healthcare and social security all experienced cuts under Thatcher. And, with New Labour in the late 1990s, further changes sought to eradicate the “postcode lottery” effect of NHS services by introducing national standards, centralized audits and performance reviews. Focus was also placed on employment; “welfare-to-work” epitomized the belief that work was the surest way out of poverty. As Chancellor, Gordon Brown promised to eradicate child poverty through a system of tax credits (a mission he still fights for, especially in Edinburgh, where child poverty stands at 35% in some areas).

When Death Comes Knocking… Is Devolution the Solution?

In addition to Scotland’s current housing crisis and child poverty, policy researchers are now drawing attention to another impending crisis on the British horizon: death.

In 2017, a comprehensive 110+ page report called “Death, Dying and Devolution” (hereafter called DDD) by the University of Bath’s Institute for Policy Research outlined the impact of death in Britain. Its findings were unsettling: over 500,000 people die in Britain each year, and over 2 million deal with death’s emotional, financial and practical consequences every year. That is one in four or six Britons every year. A further 1 million people provide care for someone with a terminal illness every year, but only one in six employers have policies in place to support this population.

This means that family members (an estimated 58% of them women) must assume the caring role with very little compensation (as you will see shortly). Disabled or injured people receive small sums to support themselves, forcing their family and support network, or local authorities, to pay for their housing and basic utilities. And, shockingly, unregulated funeral services plunge many families into something called “funeral poverty”!! Read on!

The findings in the DDD report are meant to be a radical wake-up call to policy makers about Britain’s approach to death. The alarming deficit in policy response and legislation, the report argues, is compounded by poor infrastructure and strategising, resulting in fragmented care, escalating and unregulated funeral costs, and massive inequalities experienced by dying and bereaved people due to their geographic location. However, the DDD report singles out Scotland as the only nation to have developed innovative, progressive policies in respect of end-of-life care. Notably, Scotland’s goal is to provide palliative care to all who need it by 2021 – the only nation to actually set a deadline.

Fortunately, it is not all doom and gloom. The process of devolution is the perfect opportunity to tackle many of these problems. The report claims that “In light of the projected rise in the UK death rate over the next 20 years, with devolution comes a once-in-a-lifetime opportunity to (re)address the neglect of death as a public policy issue, repositioning death as a central concern of the welfare state,” (p. 6).


The Social Security Bill has just passed through stage one of a three-stage parliamentary process. Find the latest updates on Twitter @SP_SocialSecur 

Let’s now return to the Social Security Bill being discussed in the Scottish Parliament…

As part of the legislative process, the Social Security Committee, made up of multiple Members of the Scottish Parliament, has been tasked with overseeing the new social security system. This committee decided to draft a Social Security charter (which would be quicker to implement than legislation) and submitted it for public scrutiny in the summer of 2017. You can see the charter here.

This is where it gets interesting.

Numerous charities responded to the public consultation with their concerns, criticisms and viewpoints. Now that the Social Security Committee has received these reports, it will amend the charter, seek more evidence from private citizens and charity directors as required and, in general, improve the legislation. (“Part one” of the three-stage legislative process is due for completion by December 2017; “part two” will begin in January 2018).

To the credit of the Scottish Ministers who drafted this charter, it’s somewhat vague and simplified, which is “normal” for such legislation in its infancy. For example, it’s impossible at this stage to say precisely how much a Carer should be paid; the charter merely states whether a Carer should be paid more or less than the current benefit.

And to the credit of these charities’ policy researchers who drafted these perceptive responses, they have sunk their teeth into this charter and have ripped it apart…

Carer’s Allowance. This is currently less than jobseeker’s allowance (£73.10/week) and comes with a list of restrictive conditions. To qualify for Carer’s Allowance under the current system, you must provide care to someone who already receives Disability Living Allowance (this can be a major problem due to delays in assessment or confusing terminology about “terminal illness,” for example). Assuming the person under your care successfully receives DLA, then to receive your small £73/week “compensation”:

  • you must provide 35+ hours/week caring,
  • you must NOT earn over £116/week,
  • you must NOT receive a pension (45% of all carers are aged 65+!)
  • you must NOT study 21+hrs/week or be under age 16.
  • If you care for two people (say an elderly parent and your child), you cannot double your benefits.

Although the new Social Security Bill suggests Carer’s Allowance should be raised to the level of jobseeker’s allowance, Motor Neurone Disease Scotland argues “this rise does not go far enough: Many carers are forced to give up work to care for their loved one on a full time basis – they are not looking for work.” And even if Carer’s Allowance were increased to the equivalent of jobseeker’s allowance, this “new rate would only recompense carers at the rate of £2.00 per hour based on a 35 hour caring week” (Carer’s Scotland Consultation Response, 2017). And, remember, if they work part-time to compensate for this egregiously low allowance, they cannot make more than £116/week, or else their Carer’s Allowance is terminated.
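To make that quoted rate concrete, using only the figures already cited above: the proposed new rate of £73.10 per week, spread over a 35-hour caring week, works out to roughly £73.10 ÷ 35 ≈ £2.09 per hour, and the effective hourly rate falls even lower for the many carers who provide far more than 35 hours of care.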

Disability Living Allowance. The actual amount provided to people with disabilities is surprisingly little. The lowest DLA rate (for those requiring some care) is £22.10/week. The highest DLA one can receive in Scotland is £82/week (assuming the recipient requires constant 24/7 care). If mobility is impaired or non-existent, the highest mobility allowance is £58/week. But is that enough to cover most disabled people’s expenses (transportation costs to medical appointments, rent, food and other daily expenditures)? Really?

Funeral Poverty. The average cost of a funeral in Britain was £3,897 (DDD Report, p. 82). Applications for assistance take four weeks on average to process, and the rules regarding the applicant’s eligibility often do not “take into consideration the nature of contemporary family relationships” that may not be straightforward nuclear families (p. 81). But Scotland has admittedly taken great strides towards regulating the funeral industry by pushing for licensing through the Burial and Cremation (Scotland) Act 2016.

Let’s just imagine a desperate situation to illustrate funeral poverty: a young family mourning the loss of a terminally-ill child. Currently in Scotland, independent funeral directors provide their services free of charge for children up to the age of 16 (while the Co-operative Funeral Care extends this to the age of 18). These services include embalming, a coffin, a hearse to transport the child, personnel to conduct the funeral and the use of a funeral home (where available), in addition to overhead administrative requirements. However, this does not include cremation (or burial), the headstone, or the burial plot. As cremation is currently a less expensive option than a traditional burial in Scotland, some families (according to a leading Scottish children’s charity) are forced to choose cremation despite their religious beliefs or the wishes of the terminally ill child.

So what?

Exposing the bare bones of Scotland’s current social security policies might seem like an insensitive and rude awakening. But we currently live in a world where gaps in care are filled by families, friends or charities. Often, these carers and support networks are rewarded with heartlessly small compensation. And although those amazing charities help where they can, they should not be responsible for filling the gaps in care. Even large government donations to charities sadly fuel the “postcode lottery” effect of what services these charities can provide across the country, contributing to inconsistent care and, ultimately, health inequality.

The welfare state exists so that the poorest and most challenged citizens in a community can be supported. Huge strides have been taken in the last 100 years towards administering social security to British people – from Lloyd George’s National Insurance Act (1911) to Lord Beveridge’s Five Giant Evils (1942), the founding of the NHS (1948), Thatcher’s sweeping reforms in the 1980s and then New Labour’s Welfare-to-Work – and we hope these strides are taking us towards a better, fairer and more equal society.

The new Social Security Bill in Scotland is one step in a longer national history of working towards social justice and a healthier society. Devolution might seem like a complex, tricky business, but one can hope the outcome will be transparent and simple to access. For all we know, devolution could irrevocably change the history of social security in Scotland! Ultimately, this new legislation offers big opportunities for big improvement. According to Social Security Secretary Angela Constance MSP, “once all the benefits are devolved, we will make more payments each week than the Scottish government currently makes each year.” With the first payments planned to roll out by mid-2018, this ambitious and complex infrastructure could (potentially) improve the desperate circumstances of Scotland’s most vulnerable citizens.

To keep up to date on the latest meetings or voice your opinion, see the Social Security Committee website or Tweet them @SP_SocialSecur 

Rocking Around the Nazi Christmas Tree: The Invention of National Community

Nazi Germany was not the first radical regime to revolutionise its holidays. Russian Bolsheviks believed church bells represented the “old way of life” and actively sought to destroy them from the late 1920s (and many didn’t ring again until the collapse of communism in the 1990s!). Even the French Revolutionaries changed the entire calendar to reflect their commitment to the separation of church and state: 7-day weeks were replaced with 10-day weeks; saints’ days were replaced with the names of animals, plants and farm implements; months were renamed after their seasonal activity (germination, flowering, meadow). It is astonishing that such a calendar lasted a full 12 years.

As in any dictatorship, Nazi Germany’s control and influence filtered down into all aspects of social and cultural life. But Christmas was a bit tricky. How does an anti-Semitic political party celebrate the birth of a Jew? How does that same violent political party celebrate Christian values of charity, love and forgiveness? And, how does a despot like Hitler share his power with baby Jesus?

But the Nazis were cunning, resourceful and, above all, ambitious. Their Christmas celebrations morphed good ole Christian traditions into a mystifying quagmire of cultish obsession with “Nordic” nationalism. German women became “priestesses” of the home, while rituals like lighting the candles on a Christmas tree came to symbolise the birth of “Germanness.” It must have been effective though, as some Nazi-written carols were still sung until the 1950s (yes, really).

Of course, in the post-war period, Christmas became sanitised and distanced from whatever it had become under Hitler’s reign. But as one resident of Westphalia commented in the 1950s, “family celebration has been degraded into the simple giving of presents, and the mother has been dethroned” (see a fabulous article on this complex topic by Joe Perry, “Nazifying Christmas: Political Culture and Popular Celebration in the Third Reich,” Central European History, Vol. 38, No. 4 (2005), pp. 572-605).

But let us not assume that every resident of post-war Germany was longing for the days of “All I Want for Christmas is Hitler.” Because that’s simply not true. But instead of writing a superficial blog about Christmas trees adorned with swastikas, I shall attempt to delve deeper and do justice to the confusing and desperate Christmastimes that average Germans experienced under Nazi rule.

 “Have a Holly, Jolly, Invented Norse/Pagan/Viking/German Christmas”

Before the Nazis came to power, Christmas could be considered a rather uniquely “German” holiday. This attitude persists even in today’s Germany. In the mid-1800s, German scholars (Paulus Cassel, Johannes Marbach, W. Mannhardt) wrote at length that German-speaking territories celebrated Christmas not only as a Christian holiday, but also as a pre-Christian tribal ritual incorporating popular folk superstitions. What the hell does that mean? Well, think Norse. Think Pagan. Think Viking. While these are not interchangeable words (or cultures, or histories), by the 1900s Germany had embraced a mish-mash of holiday traditions and fused them under the term “Weihnachten” or “Christfest”.

For example, the Advent wreath, which is adorned with four candles lit on each of the Sundays before Christmas, derives from the “ring of light” that existed among Germanic tribes before the celebration of Advent. Apparently, these tribes lit lights to represent the shortening of the days until the solstice, at which time the Julfest celebrated the return of light. (Incidentally, the English word yule derives from the Germanic Jul). Other traditions, such as Santa Claus (Weihnachtsmann), Christmas markets (Weihnachtsmärkte) and Christmas trees (Tannenbaum), trace their roots to these pre-Christian and “Germanic” traditions.

As Germany itself was still trying to find its national identity in the wake of unification in 1871, Christmas traditions – whether invented or repurposed – became essential to the national celebrations the Nazis would manipulate when they came to power.

Pre-1933: “It’s the Most Anti-Semitic Time of the Year”

Before the Nazis came to power in 1933, Christmas was an opportunity to launch attacks against those they perceived to be internal enemies (communists, socialists, Jews, liberals, etc.). Rather predictably, they blamed the erosion of so-called “real” Christmas on these groups. They even justified attacks on Jewish stores as a way to promote Christian harmony and a “good will to all.”


Hitler addressing a crowd at a Hofbräuhaus in Munich in November 1921, just weeks before his “German Christmas Celebration” speech. (Photo credit)

In 1921, Hitler gave a “German Christmas Celebration” speech at his favourite beer hall in Munich. Four thousand guests applauded when Hitler criticized the materialism that had degraded the holiday. He also condemned the Jews as the people who, he claimed, had nailed the world’s liberator to the cross (making no mention of the Romans…). By focusing on ideas of “authentic” German community and old pagan traditions (like lighting Christmas tree candles), Hitler and his Nazis pitched Christmas as a German rather than Christian holiday. While it might seem extraordinary that such hateful language could tarnish such a holiday, historian Joe Perry argues that it was “relatively easy for the National Socialists to cast the holiday as an exclusionary celebration of pagan, Volk nationalism, since these ideas had a lengthy popular and scholarly pedigree” (p. 579).

Post-1933: “Have Yourself a Merry People’s Christmas”

After the Nazis came to power, their approach to Christmas totally changed. While this was strategically beneficial to propaganda efforts and gained mass appeal, it also signified a new wariness towards the Protestant and Catholic churches in Germany.

Religious belief in Nazi Germany was not encouraged. The Nazis would not tolerate being subordinate or accountable to any religious institution, despite the fact that over 95% of Germans in 1939 identified as Protestant or Catholic (Evans, Third Reich at War, p. 546).  For the academic studies and longer discussion, check out Guenter Lewy’s The Catholic Church and Nazi Germany, Hubert G. Locke and Marcia Sachs Littell’s Holocaust and Church Struggle, Donald J. Dietrich’s Catholic Citizens in the Third Reich, Leo Stein’s Hitler Came for Niemoeller: The Nazi War against Religion, and Franz G. M. Feige’s The Varieties of Protestantism in Nazi Germany.

Instead of outright condemning Christmas’ religious connotations, the Nazis simply redefined the holidays as annual events of “national rebirth.” Christmas was thus viewed as a superlative opportunity to ritualize and revive the German community in a way that benefitted Nazi politics. This rather clever strategy became another method to politically indoctrinate the masses.

In 1934, the first “People’s Christmas” was celebrated throughout Germany. In Cologne, Hitler Youth brigades held night rallies modelled after pagan solstice rituals. In Hamburg, storm troopers gathered around bonfires on Christmas Eve and sang Nazi marching songs. In Berlin, Propaganda Minister Joseph Goebbels gave a radio broadcast after a torch-lit parade, declaring that “the socialism of the deed has become reality. Peace on Earth to mankind.” And, of course, nothing said Christmas in Nazi Germany like the “People’s Christmas Trees” set up in town squares and public parks.


This photo from 1937 shows Joseph Goebbels with his daughters, Helga and Hilda, beside a People’s Christmas tree in Friedrichshain. (Goebbels’ wife would later kill their children in the Führerbunker in May 1945). (Photo credit)

Other initiatives also reinforced the Nazis’ politicization of Christmas. Official Nazi holiday greeting cards pictured blue-eyed, blond-haired families to signify racial purity. Christmas entertainment was also revamped and kicked up a notch. On the radio, broadcasts began in late November and seamlessly blended classical carols, radio plays and children’s shows with party propaganda. On Christmas Eve, a special “Christmas Message” from Rudolf Hess was broadcast at 8pm, while carols sung by children’s choirs were followed by Christmas shows about the army, navy and air force.

Even the cinema did not escape the Nazis’ Christmas propaganda. Annual Christmas newsreels featured reports from Christmas markets, state-sponsored events, speeches from political leaders – literally anything that would “colonise and exalt traditional sacred practices” (Perry, p. 582). As with any mass media campaign, these Christmas campaigns aimed to create cohesion among the nationwide audiences who consumed their messages.

Christmas markets, which had been operating in Germany since the 14th century, were also invaded by pro-Nazi booths and “brown” trade fairs. School teachers were given a specifically nazified Christmas curriculum, with texts that emphasised Germanic culture as the epitome of Christmas tradition. Children and Hitler Youth members were recruited to help with the Winterhilfswerk campaigns for those “less fortunate.” No German, whether pro-Nazi or vehemently opposed, could escape the Nazis’ reinvention of Christmas.


Hitler addresses a crowd of Nazis at a Christmas Party in Munich, 1941. (Photo credit)

“I saw Mommy kissing Nazi Claus….” 

Women’s Christmastime roles were also reconstructed by the Nazis as absolutely crucial to holiday celebrations. According to the director of the women’s division of the National Socialist Teacher’s Union, Auguste Reber-Grüber, the German mother was the “protector of house and hearth.” As a “priestess” of the home, the traditional family holidays benefitted from her moral and physical direction. Broadcasts and Nazi pamphlets provided mothers with directions on how to create home-made decorations shaped like “Odin’s Sun Wheel” or bake cookies in the shape of a loop to imitate fertility symbols.  As historian Joe Perry states, “Traditional women’s tasks… like wrapping presents, decorating the home, baking foods…. now had ‘eternally German’ meanings that linked special, everyday acts of holiday preparation and celebration to a cult of sentimentalised ‘Nordic’ nationalism” (p. 597).

“I Won’t Be Home For Christmas”

Once the Second World War began, Christmas changed once again. It even received a name made popular during the First World War: Kriegsweihnachten (literally, “war Christmas”). With millions of men fighting away from home, new initiatives created Christmas books, magazine articles and holiday newsreels that celebrated the “spirit” of German war Christmas. Public drives for food, clothes and other necessities also helped in December 1941, when the German army began its retreat from Moscow.


1944 Nazi Christmas Card (Photo credit)

Radio broadcasts from the front lines reported to families at home how their fathers, sons and brothers were celebrating Christmas in the field.

Christmas cards were once again revamped, this time to show the unity of the home front with the battlefront. The card to the left shows a woman and child (notice the Madonna-and-child symbolism) above three soldiers trudging through snow in the East. The image faces a poem by Herbert Menzel entitled “Soldier’s Christmas.” Circulated in 1944, cards like these were meant to reinforce the need for ultimate personal sacrifice to ensure national victory. For more examples, see Randall L. Bytwerk’s excellent online German Propaganda Archive (Calvin College).

But Christmas gift-giving became increasingly desperate as the war continued and necessities became scarce. Books, interestingly, were not rationed, and they became a popular present in the last years of the war. People scrambled to buy books by weight or attractive covers, rather than by title or content. But as historians point out, reports from Christmas 1944 were riddled with tales of German housewives fighting over meagre portions of eggs (she got five and I only got two!), and emergency holiday distributions of food and coal were critical to survival. As the war dragged on, Christmas celebrations became ever more irrelevant to the overwhelming crisis of total war. Instead, most used the occasion to remember fallen soldiers. As one survivor put it, “By then, nobody felt like celebrating.”

So what? 

When I think of the average German Protestant or Catholic family in 1930s Hamburg or Berlin, going to the Christmas markets or singing carols with friends while sipping delicious Glühwein, I can only imagine that many must have felt bombarded by Nazi stalls, Nazi lyrics, Nazi Christmas trees. In some ways, this reminds me of how many Christmases I’ve personally felt overwhelmed by the commercialisation and, frankly, the tacky ways society today celebrates Christmas. Advertisements on the radio and TV harass you by mid-November, and it’s nearly impossible to escape any form of Christmas music once 1st December passes.

While today’s robust commercialisation of Christmas is obviously not equivalent to the violent and highly politicised nature of Nazi Germany, the two periods do share one similarity: the original Christian connotations of Christmas have been diluted and sometimes even entirely replaced by other messages. Today, it’s about consuming the materialism of the season, which reinforces capitalist ideologies. But in Hitler’s Germany, the Nazis’ ability to smoothly refocus Christmas on its Germanic rather than religious derivations forced average Germans into unavoidable celebrations of “national community.” This gave the Nazis a remarkable and intimate route into the private and familial traditions of millions of Germans on an annual basis. By extracting the Christian meaning from the holidays, the Nazis could then supplant it with a cultish definition of national identity that was exclusionary, racist and violent.

While many Germans believed in Hitler’s doctrine and supported Nazi initiatives, and although “People’s Christmas” drew large crowds, I do not believe that this necessarily means those Germans were outright Nazis. Instead, they were engaging in a tradition they already wished to celebrate, and would continue to celebrate, regardless of the politics that surrounded or infused the occasion. The Nazis saturated every fibre of German life, and Christmas was no exception.

Of course, I write this as I peel a mandarin “Christmas” orange and search on Amazon for a Christmas gift for my one-year-old nephew (who is neither Christian nor old enough to understand the holidays).

Merry Christmas and Happy Holidays everyone!

 

Hidden Edinburgh: The World’s First School for the Deaf is in my Back Garden

A few times a week, I walk down a quiet path through Edinburgh’s residential Southside. This well-used and well-maintained path follows the foot of the Salisbury Crags and offers a magnificent, up-close view of Holyrood Park. It’s also a stone’s throw from my flat.

This path borders one of the largest council-built estates in southern Edinburgh, called “Dumbiedykes” (pronounced dumm-ee-dykes). While this is a strange name, I was once told that it derived from the fact that there was once an old school for the “deaf and dumb” nearby. (Of course, in today’s more PC language, no one would ever call it that nowadays).

A tall stone wall separates this walking path from Queen’s Drive, the road that skirts the Salisbury Crags and is annoyingly closed to all road traffic every Sunday and during major events. (Often due to royal events at nearby Holyrood Palace, an inconvenient reality for us plebs who live so close to royalty). Thus, a walker such as myself must walk alongside this wall when using the path. Here’s a crude representation:

Dumbiedykes Chelsea

Upon closer inspection of this marvellous stone wall, you begin to notice the remnants of fireplaces, walls, and numerous inexplicable nooks and crannies that have no logical order.


Can you spot the fireplace? Can you spot the commemorative plaque?

Every time I pass this wall, I try to imagine the stone cottages or stables that might have been attached to it so long ago. Of course, just as with so many other parts of Edinburgh’s dark, grimy history and confusing urban landscape, I merely shrug my shoulders and continue my walk.

 

Not today, I vowed to myself. Not today! 

After an afternoon of researching my local area, learning about this deaf school and its founder, and discovering British Sign Language’s status in Scottish society, I thought a blog post would be a perfect forum for my findings.

Who founded this “deaf and dumb” school? When? And, why? 

Born in 1715 in South Lancashire, Thomas Braidwood studied at the University of Edinburgh and began a career educating the children of wealthy families. At his home in Edinburgh’s Canongate, Braidwood privately instructed local students and especially enjoyed teaching mathematics. However, this changed in 1760 when a wealthy Leith wine merchant, Alexander Shirreff, asked Braidwood to teach his 10-year-old deaf son, Charles, how to write.

Evidently, Braidwood was eager for the challenge. In 1764, he founded Braidwood Academy just south of the Royal Mile along a street called St. Leonards. The building, which came to be known as “Dumbie House,” was the very first (private) school for deaf children in Britain and, some claim, in the world.


This drawing (1885) depicts Dumbie House (later named Craigside House), and it can also be seen on 1820s maps from Historic Environment Scotland.

Despite Braidwood’s good connections and enthusiasm for this untapped educational market, Charles Shirreff (who would become a celebrated painter and portrait miniaturist) was at first his only pupil. Soon enough, however, Dumbie House welcomed other wealthy pupils, including the astronomer John Goodricke (1764-1786), Governor of Barbados Francis McKenzie (1754-1815, also Clan Chief of Highland Clan McKenzie, British MP, and a botanist with the Royal Society of London), and the Scottish biographer John Philip Wood (1762-1838).

Remarkably, Braidwood became a pioneer of sign language. During the mid-18th century, deaf education mostly consisted of teaching pupils to speak clearly enough to be understood. But Braidwood’s new technique was unusual: he combined the vocal exercises of articulation with lip-reading and, for the first time, hand gestures that we recognise today as sign language. This combined system became the forerunner of British Sign Language (BSL).


Dr. Samuel Johnson (1709-1784) suffered from poor health himself, having contracted scrofula (a form of tuberculosis) as a child. Despite hearing loss and bad eyesight, Johnson had a remarkable career as a writer. According to the Oxford Dictionary of Quotations, he is the second-most quoted Englishman in history.

Braidwood Academy also received attention from famous contemporaries. Sir Walter Scott mentioned it in The Heart of Midlothian (1818), and the famous author Dr. Samuel Johnson described the school after a short visit en route to the Western Isles: “There is one subject of philosophical curiosity in Edinburgh which no other city has to show; a College for the Deaf and Dumb, who are taught to speak, to read and to write, and to practise arithmetic, by a gentleman whose name is Braidwood. It was pleasing to see one of the most desperate of human calamities capable of so much help: whatever enlarges hope will exalt courage. After having seen the deaf taught arithmetic, who would be afraid to cultivate the Hebrides.”

Dumbie House eventually boasted 20 students, including women (Jane Poole, for example, set a major legal precedent when a court accepted her last will as valid, even though she had communicated her wishes to the drafter exclusively by fingerspelling, as she was both deaf and blind – a massive victory for the legal rights of disabled people in Britain). By 1780, Thomas Braidwood had moved to London to begin a new school in Hackney. Notably, his three daughters also became teachers of the deaf and continued to apply his combined approach with new generations of pupils. Dumbie House continued to operate as a school until it was shut in 1873, and the building itself was demolished in 1939.

But Braidwood’s Influence Spreads Across the Seas….

Another very important Braidwood Academy pupil was Charles Green. Born deaf, Charles was the son of the fourth-generation American and Harvard graduate Francis Green (1742-1809). Just prior to the American Revolution, the Green family moved to England, and in 1780 Charles was enrolled at Braidwood Academy in Edinburgh.

Francis watched his son learn how to communicate orally, but was astonished at the speed with which sign language allowed his son to communicate with other students. Indeed, he was so impressed that in 1783 he anonymously published a book praising Braidwood’s work, called “Vox Oculis Subjecta: A Dissertation on the most curious and important art of imparting speech, and the knowledge of language, to the naturally deaf, and (consequently) dumb; With a particular account of the Academy of Messrs. Braidwood of Edinburgh.” “Vox Oculis Subjecta” translates to “voice subjected to the eyes.” Francis wrote in the introduction that:

 “Man as a social being has an irresistible propensity to communicate with his species, to receive the ideas of others, and to impart his own conceptions.”

The first half of Vox Oculis Subjecta surveys the natural capacity of humans for language (quoting various famous authors extensively) and then describes Braidwood’s methods. As Braidwood himself never wrote about his own teaching practices, Green’s Vox Oculis Subjecta (1783) is an invaluable record of deaf education.

Although Charles tragically drowned at age 15 while fishing, Francis continued to take an interest in deaf education. According to historians Melvia M. Nomeland and Ronald E. Nomeland, in the 1790s Francis visited Paris and London to see how other institutions taught deaf students to communicate (The Deaf Community in America: History in the Making, p. 31). He eventually returned to the US. Before his death, he not only advocated through his writings for free education for all deaf children in America, but by 1809 had collected the names of 75 deaf individuals in Massachusetts (the first ever census of the deaf) with plans to start a school.

In 1812, just three years after Francis Green’s death, Col. William Bolling, a sibling of Braidwood Academy pupils who had studied alongside Charles Green and himself the father of two deaf children, attempted to establish the first US school for the deaf. Even Thomas Braidwood’s grandson, John Braidwood II, who had moved to America by then, assisted with the school. Although the school closed in 1815, it was just two years later that another educationalist, Thomas Hopkins Gallaudet, successfully started what is considered today to be America’s first school for the deaf in West Hartford, Connecticut.

IMG_7393

This road sign, which serves history more than any practical purpose and stands just metres from the remnants of Dumbie House, has a great deal more meaning for me now.

So What? 

It seems rather odd that such an instrumental school for deaf education and British Sign Language is so little acknowledged. When I stride past its demolished foundations and the fireplace in the stone wall, I note that its commemorative plaque was not installed there until 2015 (admittedly with a great turnout by the Lord Provost, British Deaf History Society, Deaf History Scotland, and multiple Scottish officials). But this delayed promotion of deaf history is inconsistent with the remarkable work of Edinburgh’s numerous charities and societies (Historic Scotland, National Trust, Old Edinburgh Club, Lost Edinburgh, etc.) that are outstanding in their ability to conserve, protect and promote local history…

So perhaps my perception of Braidwood Academy’s neglect speaks to some of the larger attitudes towards disability. Of course, during the 18th century, deafness – like all disabilities – was poorly understood, and it wasn’t until institutions like Braidwood Academy that some began to realise that intellect was not affected by disability. Being deaf was certainly not synonymous with being “dumb.” Compounding this ignorance was the issue of class. Initially, only the rich could afford to educate their children at Braidwood Academy. Fortunately, in 1792 the London Asylum for the Education of the Deaf and Dumb Poor at Bermondsey became the first public Deaf school in Britain. Again, Braidwood’s influence was felt there too – it was one of his former assistants, Joseph Watson, who became its first headmaster.

British Sign Language (BSL) was not recognised as an official language by Westminster until 2003. Ironically, Braidwood Academy’s Dumbie House is just half a mile from where the British Sign Language (Scotland) Act was passed unanimously by MSPs in the Scottish Parliament in 2015 – giving BSL the same legal protection as languages such as Gaelic. In Scotland today, an estimated 12,500 people use BSL. However, in Wales and Northern Ireland, BSL has no legal status or protection.

While I’m pleased that BSL is legally protected and that a commemorative plaque was mounted on the original foundations of Braidwood Academy, I do not believe that deafness, or disability in general, is given the recognition it deserves. But Braidwood’s remarkable influence on language, teaching and disability rights is at least an excellent starting point for repositioning deafness as a critical aspect of Scotland’s broader history. As Ella Leith, secretary of Deaf History Scotland, said at the unveiling of the plaque in 2015:

IMG_7389

 

“It’s partly about pride for the deaf community in seeing their history recognised, but also about raising awareness among hearing people that Scotland’s heritage should include deaf people too. Their heritage is as much part of Scotland as general heritage.”


Opportunities in Oral History Research: Guest Blog with Dr. Jane Judge

Have you ever asked a friend about “what happened!” on his/her latest date? Or listened to an interview with your favourite actor about their upcoming movie? Or asked your mother how on earth she baked her Yorkshire puddings so golden, puffy and gorgeous, while yours simply collapse in on themselves?

Believe it or not, so long as these events occurred in the past, you have just conducted the impressive method of “oral history,” albeit very informally.

Oral history includes both the process of collecting testimony from living, breathing human beings and the product itself: the narrative of past events.

And although oral history, as both method and output, is the latest trend among historians, it can’t actually be confined to the study of history alone. Key witness testimonies in high-profile murder cases rely enormously on oral history.  Medical practitioners exploring the effects of new drugs, treatments and therapies rely enormously on oral history. Social workers and psychologists helping survivors of traumatic events often rely on the memories produced through oral history. As oral historian Lynn Abrams argues, “oral history has become a crossover methodology, an octopus with tentacles reaching into a wide range of disciplinary, practise-led and community enterprises” (Oral History Theory, 2010, p.2).

Although oral history is a vastly rewarding and highly deployable tool for nearly any discipline or purpose, it also comes at a cost. Professional scholars must often submit enormous ethics approval applications to their institutions or governments before even approaching a potential human subject for interview. Many aspects of interviewing can be volatile, emotional, and even dangerous (for example, Dr. Erin Jessee’s fieldwork included gathering testimonies from Rwandans convicted of genocide while they were detained in Rwandan prisons!). And what happens to the interviewee if researchers ask unsettling questions – is there post-interview psychological support for the subjects (or even the interviewer), for example? These calculations of risk are absolutely essential to the ethical responsibility of any oral history project. And, of course, the goal is to cause minimal harm, which is usually the outcome. (For Dr. Jessee’s helpful tips about managing risk, see her advice here.)

Despite some risks, oral history remains an invaluable tool. Findings can influence new policies and initiatives, while researchers can harness its power as a versatile method to record history in action, bolster an organisation or government’s ethos and contribute to an initiative’s influence. In this sense, oral history can be one of the most dynamic instruments in a researcher’s arsenal, and one that can be put to profound use by multiple interdisciplinary stakeholders.

Dr. Jane Judge, a postdoctoral researcher in early modern history at KU Leuven in Belgium, recently experienced the exhilarating power of oral history. Although the majority of Jane’s historical research has taken her into fabulous dusty old libraries and national archives housing original sources with elaborate 18th-century handwriting, she had never been required to conduct interviews with real, living humans – until now!

IMG_3926

Dr. Jane Judge in her natural habitat of Leuven, Belgium.

Jane currently volunteers at the Fulbright Commission in Brussels, which is an independent body that, along with the US Embassies to Belgium and Luxembourg, as well as the Belgian and Luxembourg governments, administers the US State Department’s  Fulbright Scholarship Programs for Belgians and Luxembourgers going to the US, as well as Americans coming to these two countries. Since 1948, the Fulbright Commission in Brussels has connected and supported over 4,000 students, researchers, and teachers, while promoting international educational exchange and mutual understanding. 

Screen Shot 2017-11-14 at 13.00.15

The Fulbright Program awards approximately 8,000 grants annually.  Approximately 370,000 “Fulbrighters” have participated in the Program (over 4,000 through the Belgium Commission) since its inception in 1946. For more information, visit www.fulbright.be 

Recently, Jane has been tasked with gathering stories from alumni of the Fulbright Commission in order to record and promote the program’s overall mission for the 70th anniversary celebrations that will take place next year. The idea is to highlight that it is the people who make Fulbright what it is as they engage in immersive experiences abroad and make human connections. This means that she has met over 15 very interesting people – and plans to meet at least 15 or 20 more – who, at some point in their lives, benefited from a Fulbright grant and the program’s international networks, financial support and scholarly community.

In Jane’s quest to gather data from real human beings, various unanticipated surprises allowed her to discover a few crucial things about oral history, interviewing techniques and the value of human input. After musing over some of her most interesting findings, we both thought it would be highly appropriate to share some of these gems in a guest blog! Here are Jane’s discerning observations about this often-tricky but fruitful research method:

Before you began interviewing, what perceptions did you have about oral history in general? 

Jane: My perception of oral history generally was that it was messy, fraught with ethics issues and required intensive training to do well. As far as the Fulbright project itself, I didn’t have much choice in doing interviews. Because of the many stakeholders in this Fulbright Commission (there are commissions around the world implementing the Fulbright program in their locations), the office here already knew they wanted to focus on alumni and not the nuts-and-bolts history of the program. So, I didn’t really choose oral history, it chose me. That being said, the archives here are very rich, containing midterm and final reports from every grantee, as well as commentary from the offices here on American grantees that came to Belgium and Luxembourg until the 1980s. The project could have been done by just going through these and piecing together stories, pulling out interesting anecdotes. Given my background, that was much more in my wheelhouse, and so I was a little apprehensive about doing interviews, especially with my rather negative preconceived notions about oral history. But I decided to see it as an opportunity rather than a challenge – an opportunity to learn and enact a new methodology, to travel throughout Belgium, and to meet lots of new people in new fields!

Can you comment about the interview process – who, where, when, how?

Jane: Sure. The who should be fairly obvious at this point—Fulbright alumni! (Haha.) The first thing I did was go through the archive of somewhere between 3,000 and 4,000 alumni that past interns have digitized and picked out people who had dynamic profiles, represented diverse backgrounds, fields of study, and programs (undergraduates, graduates, research scholars, teachers, visiting professors, and newer summer programs). To the list that I compiled, we also added some notable alumni and some who had volunteered or been quite active as alums in the past. These included people from walks of life as diverse as being Deputy Prime Minister for Belgium or a Spanish Linguistics teacher. We have been in touch with alumni from every decade of Fulbright’s 70 years so far, so that’s quite exciting.  

As for the interviewing itself, I started with in-person interviews with people here in Belgium. Funnily enough, the first interview was actually an American alumna and her husband who happened to be here on holiday, but the others have all been Belgians that had Fulbrights to the States at one time or another. I go to them, meeting them either at their homes, offices, or a quiet comfortable cafe they know, at a time that’s mutually convenient. I record the audio of our conversations with my phone, so that part’s pretty straightforward, easy, and compact! We set the interviews up by first having either the Executive Director or the Program Director for Students get in touch via email, explaining the anniversary and the project, and then I follow up with an email about logistics. If they are up for being interviewed, I take it from there as far as setting up a time and place. For the Americans and Luxembourgers (and one very busy Belgian), I will and have done the interviews by phone or internet call. The interviews themselves are pretty organic. We want to cover their personal experience, how Fulbright has impacted their lives, and what they think the program can continue to offer. So I start by just asking them to introduce themselves and explain their relationship to the program (how are they “a Fulbrighter”?) and then I really let them go, guiding them if there’s dips or when we need to get back on track.

Were there any challenges in the interview itself that you had not predicted? If yes, how did you overcome them? 

Jane: I wouldn’t say there were challenges in the interview, as such. Everyone’s pretty enthusiastic and already very willing to talk about their experiences. The only things I could think of would be technical. One of the interviews took place over lunch, for example, so I worried that it wouldn’t record clearly in the cafe–this didn’t end up being a problem though, and the recording is crystal clear. I have had some trouble with the recordings of Skype interviews, but that’s, again, technical. With those interviews it’s also harder to have an official start and end of the recorded interview, since people feel like they’re chatting with me and so sometimes they start asking me questions about my experiences!

In your opinion, what was the best thing about interviewing your subjects?

Jane: Oh, by far hearing first-hand stories. I love the narrative that comes out of it. In much of my past work, I’ve had to piece together the story from snippets I’ve found in the archives. Here, I get to ask a question and then sit back and listen to a whole answer.

What would your top tips be to anyone about to conduct an interview?

Jane: Definitely get in touch with a modern historian (if you’re not one yourself), preferably someone who is already a trained oral historian. Check out the wonderful (credible!) resources available online, especially the Oral History Association and the Southern Oral History Program at UNC Chapel Hill. You were my first port of call, Chelsea, as a trained historian who was a member of the OHA, and you came through with aplomb. Definitely the best decision I made before embarking on this research adventure.

Would you ever volunteer to do it again?

Jane: Absolutely. I’ve had a complete blast doing these interviews. Even the transcriptions, though sometimes tedious and always time consuming, are fun. Since these people have fascinating stories to tell about travel, research, and all kinds of different experiences, it’s a pleasure to interview them and even to then relive that through transcription.

Finally, as a historian, what do you think that oral history achieves that archival research cannot? 

Jane: Follow up questions! This is by far my favorite part of oral history to this point. When you’re working in an archive, you can pose pointed questions, go searching through piles of papers people might never have wanted to see the light of day, and uncover secrets unabashedly. However, you cannot ask a single follow-up question or check with your subjects/sources that you are interpreting them correctly. In my own research into 18th-century revolutionaries, this means that there’s never any certainty that the way I interpret how some reacted to a given decree, for example, is the way they actually felt about it. With oral history, I can follow up when someone says or writes something that’s not entirely clear. I can ask them to connect dots and even answer an explicit question, rather than trying to figure out what they were implying later when I’m trying to write my analysis.

So what?

Jane touches on a great many qualities of oral history research that traditional archival research does not possess: listening to the “whole answer” rather than piecing together small fragments of history from a dusty archive, understanding some of the emotional reactions behind people’s experiences, verifying your own analyses of history by asking follow-up questions, and even anticipating and minimising risks when interviewing in a café. These diverse observations demonstrate what we can gain from oral history and the multiple opportunities it presents to those wanting to learn from people who experienced the past.

How about a round of applause for Dr. Jane Judge’s perceptive analysis of her oral history experience? Many thanks, Jane!

IMG_3996

Surely Belgium’s world-famous frites (or frieten in Flemish) are one of the best reasons for Fulbrighters to study in Belgium?

 

(Disclaimer: The views, thoughts, and opinions expressed in the text belong solely to the authors, and do not reflect any official opinion of Fulbright, EdUSA, or the US State Department or other groups or individuals.)