Why Scotland’s Treatment of Refugees is a Cut Above the Rest

This past week, I’ve been doing some freelance research for a British human rights charity. This experience has dramatically opened my eyes to the absolute chaos that is the British “immigration” system.

Of course, we’ve all heard about the British government’s fumbling inability to handle current refugees, let alone integrate migrants. Windrush Scandal, anyone? Or Theresa May’s explicit wish since 2012 “to create, here in Britain, a really hostile environment for illegal immigrants”? Well, looks like it’s worked there, Prime Minister! How about the refusal to grant visas to 100+ Indian doctors, who had been specifically recruited to ease critical staff shortages in the NHS? Or the ever-growing detention of asylum seekers in British “detention facilities”? Or the fact that the Home Secretary herself had no idea that deportation targets were used in her own department? So long, Amber Rudd.

But while I was scrutinising the absolute disorder and contradictory measures that the Home Office currently applies to the world’s most vulnerable people – refugees, asylum seekers and victims of trafficking – I was delighted to discover that at least one part of Great Britain is taking deliberate, long-term steps towards helping these groups: Scotland.

Did you know that, of all the UK nations, Scotland is the only one that wants to give refugees the right to vote?

Hurrah! In May 2018, it was announced that plans would be proposed to the Scottish Parliament to give EU citizens, non-EU residents, refugees and asylum seekers the right to vote. Let’s hope it passes!

But why is this a good thing? Giving migrants the right to vote is a cornerstone of nations with a history of immigration and diversity. Australia, the United States and Canada, for example, have benefitted immensely from putting refugees, asylum seekers and other landed migrants on a path to voting citizenship. Admittedly, this didn’t happen overnight. (Japanese Canadians and Indigenous Canadians, for instance, were not given the right to vote until 1949 and 1960, respectively.) But in the 1970s, Canadian PM Pierre Trudeau opened Canada’s doors to a flood of migrants and, by extension, new voters. Although it seems wonderfully inclusive, the true motive was to dilute the Francophone–Anglophone tensions that were hitting a crisis point!

Trudeau’s strategy, however underhanded, achieved something remarkable. It meant that political parties had to include these new immigrants in their broader policy objectives. It meant that migrants were courted with initiatives that appealed directly to them. This forced politics to become dynamic, progressive and inclusive. Instead of pushing migrants to the fringes of society, it ensured that Canadians, whether new or native-born, were included in the most high-level decisions in Ottawa. In fact, in 2011 and 2015, the Canadian Conservative Party won a higher share of votes among immigrants than it did among native-born Canadians. Go figure.

Of course, if migrants in Scotland are given the right to vote, this gives the Scottish National Party (SNP), which currently governs Scotland as a minority administration, an opportunity to expand its voter base. And, you know what? I don’t care. It doesn’t matter if you’re an SNP, Tory, Labour, Lib Dem or Green supporter. If migrants can vote, including EU and non-EU residents, then this only benefits greater Scottish society. Inclusivity and diversity will become ingrained in Scottish politics which, in turn, will shape Scottish voters, Scottish attitudes and broader long-term Scottish aims. This will irrevocably enrich Scottish society.

Also, increasing the rights of people who live here does not nullify or diminish the existing voting rights of born-and-bred Scots. Equal rights for others does not mean fewer rights for them. It’s not pie, right?


Green MSP Ross Greer said in May 2018, “What better way could we show refugees and asylum seekers that they truly are welcome and that Scotland is their home than by giving them the right to vote?” These Syrian refugees arrived in December 2017 to be settled on the Isle of Bute (Photo by Christopher Furlong/Getty Images).

Did you know that, of all the UK nations, Scotland is the only one to have an explicit strategy in place to integrate newly-arrived refugees?

To my absolute astonishment, England, Wales and Northern Ireland do not have any broader strategy to integrate their thousands of refugees and asylum seekers. Stupid, no? Fortunately, after mounting pressure, England announced in March 2018 that it will invest £50m to support an Integrated Communities Strategy, initially targeting five local authorities in England to improve English language skills, increase economic opportunities (particularly for women) and ensure that every child receives an education. So, I guess late is better than never, eh?

But the lack of an “integration strategy” (however bureaucratic and boring that sounds) has a massive impact on migrants. For example, asylum seekers in the UK face massive problems once they’re granted refugee status. After waiting six months or more for a decision (while surviving on just £37.75 per person per week, with no right to work or access mainstream benefits, and living in shady Home Office accommodation outsourced to companies with a history of poor-quality compliance, like G4S), you are given just 28 days to find work and a new home, apply for benefits, and “move on” towards integration.

This “move on” period is often the worst moment for refugees in the UK. Suicide rates spike, mental health problems increase, people are forced into destitution and exploitation simply due to a lack of support and, critically, not enough time.

In fact, the Home Office often does not send critical documentation to new refugees within this 28-day period. A National Insurance Number (NINo) and a Biometric Residence Permit, for example, are vital to a refugee’s survival in the UK because people fleeing war and persecution rarely leave their homeland with their passports, right? So, one or both of these documents are required for gaining employment, opening a bank account, applying for a Home Office “integration loan” (£100+), accessing mainstream benefits and securing private or public accommodation. However, the Home Office often does not send these documents until well after an asylum seeker has been granted refugee status. Seems counterintuitive, no?

For example, the All Party Parliamentary Group on Refugees wrote in their report about Sami from Iraq. He was not sent his NINo until the day after he was evicted from his Home Office accommodation (at the end of the 28-day “move on” period). Because Sami could not claim benefits or obtain employment to secure accommodation without his NINo, he was forced into homelessness. Charity reports are riddled with stories like these, where it’s obvious that the UK’s current system is failing those it is most meant to help. Instead, homelessness, destitution and exploitation become synonymous with the refugee experience.

After the “move on” period, refugees face the long-term task of integrating. Learning English is, obviously, the biggest hurdle. Without being able to communicate, migrants cannot access NHS services, higher education or training, the job market, or even just simple things like community events! So, English for Speakers of Other Languages (ESOL) classes are vital, right? Yet in England, government funding for ESOL classes was drastically cut – by 55% – between 2008 and 2015. Fortunately, Scotland, Wales and Northern Ireland all currently have ESOL strategies in place. Nearly 5% of all Scots (over the age of three) actually speak a language other than English at home. Although Brits are generally notorious for not speaking other languages, at least the Scottish government is wise enough to support refugees learning English. This, sadly, is something they’re failing to do south of Hadrian’s Wall.

Did you know that Scotland currently hosts the largest urban population of refugees in the UK? Yep, Glasgow.

The local authorities that currently host the largest numbers of asylum seekers (waiting on refugee status) are Glasgow (3,799), Liverpool (1,622), Birmingham (1,575) and Cardiff (1,317). But the largest regional asylum seeker populations are actually in the North West (10,111), the West Midlands (5,431), Yorkshire and the Humber (5,258) and London (5,084).


By September 2016, asylum seekers were ten times more likely to live in Glasgow than anywhere else in the UK. Seems the ‘Weegies were okay with this! (Photo credit)

Although asylum seekers are allocated Home Office accommodation in Glasgow, decisions on their applications are not within the remit of the Scottish authorities. Everything is decided through a centralised, UK-wide system. And while waiting on an application, an asylum seeker can be “dispersed” anywhere within the system, with no choice in the matter. This means that local authorities and NGOs must compensate for shortfalls in financial support, documentation and allocated housing.

Fortunately, there are multiple Scottish and Glaswegian charities willing to help: the Scottish Refugee Council, the Refugee Survival Trust, Positive Action in Housing, and Scottish Faiths Action for Refugees, among others.

Out of curiosity, I googled “Glasgow, refugees, asylum seekers, bad, 2018” to find recent negative news stories about asylum seekers in Scotland. To my great surprise, I found nothing suggesting a systemic problem between asylum seekers and the local population. So instead, I googled “Glasgow, refugees, asylum seekers, 2018” and found headlines within the last six months like this:


Damascene Street Food really does look delicious, eh?

Of course, there must be bad news stories… and they appear to be coming from Scottish charities. One independent news source, called The Ferret, reported that charities had doled out record amounts in emergency grants to asylum seekers in 2017 – over £110,000 to be precise. At roughly £80 per grant, that works out to well over 1,300 grants – a huge number of asylum seekers deemed to be in crisis in Scotland.

Why is the number so high? The Ferret points to delays in documentation, poor housing, and no access to work or benefits while waiting on an application – basically everything I’ve already written about. In fact, The Ferret calls refugee destitution an “invisible epidemic.” Evidently, Scottish charities are facing the same challenges as their brothers and sisters south of the border.

Did you know that Scotland allows asylum seekers and refugees full access to the NHS?

All refugees in the UK have immediate free access to healthcare provided by the NHS. Asylum seekers are also entitled to free urgent and primary care while in the UK. But “secondary care” – seeing a specialist about that never-ending ear infection, receiving mental health support, or getting chemotherapy if you have cancer – is not provided to everyone.

In England, those refused asylum are required to pay for secondary health services. However, in Scotland, refugees and even refused asylum seekers (those deemed to have no recourse to public funds, or “NRPF”) have full access and treatment on the same basis as any other UK national. Also, all prescriptions are free! Sensational.

So what?

The broader immigration system in the UK is flawed, to put it mildly. Asylum seekers like Nesrîn, an Iraqi Kurd with two children, survive on just £37.75 a week. She comments:

They give us asylum benefit so we will not beg, but actually we are begging. Sometimes I cry for myself; everything is secondhand, everything is help. I can never do something for myself… When you become a mum you have everything dreamed for your daughter, and I can’t do anything. I’ve given up, actually.

I can’t imagine just how powerless an asylum seeker must feel in this country. After fleeing violence, war and persecution in their homeland, they arrive on British shores only to find a hostile and monstrous bureaucracy awaiting them.

But, fortunately, Scotland’s treatment of refugees is a cut above the rest. By giving asylum seekers the right to vote, you are giving them a voice. By giving asylum seekers access to full healthcare, you are giving them a chance to live. By creating national strategies for local governments, communities and charities, you are giving refugees a chance to learn English, get a job, find a home, receive an education and integrate into Scottish society. These are remarkable steps in a direction that is supportive, inclusive and diverse. As Sabir Zazai, Chief Executive of the Scottish Refugee Council, said eloquently in May 2018:

“Refugees often flee their homes because their human rights are denied. For people from the refugee community to then have access to all their rights including the right to vote in Scotland is a hugely significant point in their journey towards integration, citizenship and the ability to play an active role in society.”

Hidden Edinburgh: The World’s First School for the Deaf is in my Back Garden

A few times a week, I walk down a quiet path through Edinburgh’s residential Southside. This well-used and well-maintained path follows the foot of the Salisbury Crags and offers a magnificent, up-close view of Holyrood Park. It’s also a stone’s throw from my flat.

This path borders one of the largest council-built estates in southern Edinburgh, called “Dumbiedykes” (pronounced dumm-ee-dykes). It’s a strange name, but I had once been told that it derived from an old school for the “deaf and dumb” that used to stand nearby. (Of course, no one would ever use that term nowadays.)

A tall stone wall separates this walking path from Queen’s Drive, the road that skirts the Salisbury Crags and is annoyingly closed to all road traffic every Sunday and during major events. (Often due to royal events at nearby Holyrood Palace, an inconvenient reality for us plebs who live so close to royalty.) Thus, a walker such as myself must pass along this wall when using the path. Here’s a crude representation:

[Image: a crude sketch of the path, the wall and Queen’s Drive]

Upon closer inspection of this marvellous stone wall, you begin to notice the remnants of fireplaces, walls, and numerous inexplicable nooks and crannies that have no logical order.


Can you spot the fireplace? Can you spot the commemorative plaque?

Every time I pass this wall, I try to imagine the stone cottages or stables that might have been attached to it so long ago. Of course, just like with so many other parts of Edinburgh’s dark, grimy history and confusing urban landscape, I merely shrug my shoulders and continue my walk.

Not today, I vowed to myself. Not today!

After an afternoon of researching my local area – learning about this deaf school and its founder, and discovering British Sign Language’s status in Scottish society – I thought a blog post would be the perfect forum for my findings.

Who founded this “deaf and dumb” school? When? And, why? 

Born in 1715 in South Lanarkshire, Thomas Braidwood studied at the University of Edinburgh and began a career educating the children of wealthy families. At his home in Edinburgh’s Canongate, Braidwood privately instructed local students and especially enjoyed teaching mathematics. However, this changed in 1760 when a wealthy Leith wine merchant, Alexander Shirreff, asked Braidwood to teach his 10-year-old deaf son, Charles, how to write.

Evidently, Braidwood was eager for the challenge. In 1764, he founded Braidwood Academy just south of the Royal Mile along a street called St. Leonards. The building, which came to be known as “Dumbie House,” was the very first (private) school for deaf children in Britain and, some claim, in the world.


This drawing (1885) depicts Dumbie House (later named Craigside House), and it can also be seen on 1820s maps from Historic Environment Scotland.

At first, despite Braidwood’s good connections and enthusiasm for this untapped educational market, Charles Shirreff (who would become a celebrated painter and portrait miniaturist) was his only pupil. Soon enough, however, Dumbie House welcomed other wealthy pupils, including astronomer John Goodricke (1764-1786), Governor of Barbados Francis McKenzie (1754-1815, also Clan Chief of Highland Clan McKenzie, British MP, and a botanist with the Royal Society of London) and Scottish biographer John Philip Wood (1762-1838).

Remarkably, Braidwood became a pioneer in sign language. During the mid-18th century, deaf education mostly consisted of teaching pupils to speak clearly enough to be understood. But Braidwood’s new technique was unusual; he combined the vocal exercises of articulation with lip-reading and, for the first time, the hand gestures that we recognise today as sign language. This combined system became the forerunner of British Sign Language (BSL).


Dr. Samuel Johnson (1709-1784) suffered from poor health himself, having contracted scrofula (a form of tuberculosis) as a child. Despite hearing loss and bad eyesight, Johnson had a remarkable career as a writer. According to the Oxford Dictionary of Quotations, he is the second-most quoted Englishman in history.

Braidwood Academy also received attention from famous contemporaries. Sir Walter Scott mentioned Braidwood Academy in The Heart of Midlothian (1818), and even the famous author Dr. Samuel Johnson described the school after a short visit en route to the Western Isles: “There is one subject of philosophical curiosity in Edinburgh which no other city has to show; a College for the Deaf and Dumb, who are taught to speak, to read and to write, and to practise arithmetic, by a gentleman whose name is Braidwood. It was pleasing to see one of the most desperate of human calamities capable of so much help: whatever enlarges hope will exalt courage. After having seen the deaf taught arithmetic, who would be afraid to cultivate the Hebrides.”

Dumbie House eventually boasted 20 students, including women. (Jane Poole, for example, set a major legal precedent when a court accepted her last will as valid even though, being both deaf and blind, she had communicated her wishes to the drafter exclusively by fingerspelling – a massive victory for the legal rights of disabled people in Britain.) In 1783, Thomas Braidwood moved to London to begin a new school in Hackney. Notably, his three daughters also became teachers of the deaf and passed on his combined approach to new generations of pupils. Dumbie House continued to operate as a school until it was shut in 1873; the building itself was demolished in 1939.

But Braidwood’s Influence Spreads Across the Seas….

Another very important Braidwood Academy pupil was Charles Green. Born deaf, Charles was the son of the fourth-generation American and Harvard graduate Francis Green (1742-1809). Just prior to the American Revolution, the Green family moved to England, and in 1780 Charles was enrolled at Braidwood Academy in Edinburgh.

Francis watched his son learn how to communicate orally, but was astonished at the speed with which sign language allowed his son to communicate with other students. So impressed was he that, in 1783, Francis anonymously published a book praising Braidwood’s work: “Vox Oculis Subjecta: A Dissertation on the most curious and important art of imparting speech, and the knowledge of language, to the naturally deaf, and (consequently) dumb; With a particular account of the Academy of Messrs. Braidwood of Edinburgh.” “Vox Oculis Subjecta” translates to “voice subjected to the eyes.” Francis wrote in the introduction that:

“Man as a social being has an irresistible propensity to communicate with his species, to receive the ideas of others, and to impart his own conceptions.”

The first half of Vox Oculis Subjecta surveys the natural human capacity for language (quoting various famous authors extensively), and then describes Braidwood’s methods. As Braidwood himself never wrote about his own teaching practices, Green’s Vox Oculis Subjecta (1783) is an invaluable record of deaf education.

Although Charles tragically drowned at age 15 while fishing, Francis continued to take an interest in deaf education. According to historians Melvia M. Nomeland and Ronald E. Nomeland, in the 1790s Francis visited Paris and London to see how other institutions taught deaf students to communicate (The Deaf Community in America: History in the Making, p. 31). He eventually returned to the US. Before his death, he not only advocated through his writings for free education for all deaf children in America, but by 1809 had collected the names of 75 deaf individuals in Massachusetts (the first ever census of the deaf) with plans to start a school.

In 1812, just three years after Francis Green’s death, Col. William Bolling – brother to Braidwood Academy pupils who had studied alongside Charles Green, and himself the father of two deaf children – attempted to establish the first US school for the deaf. Even Thomas Braidwood’s grandson, John Braidwood II, who had moved to America by then, assisted with the school. Although the school closed in 1815, just two years later another educationalist, Thomas Hopkins Gallaudet, successfully started what is considered today to be America’s first school for the deaf in West Hartford, Connecticut.


This road sign, which serves history more than any practical purpose and stands just metres from the remnants of Dumbie House, has a great deal more meaning for me now.

So What? 

It seems rather odd that such an instrumental school for deaf education and British Sign Language is so little acknowledged. When I stride past its demolished foundations and the fireplace in the stone wall, I note that its commemorative plaque was not installed until 2015 (admittedly with a great turnout by the Lord Provost, the British Deaf History Society, Deaf History Scotland and multiple Scottish officials). But this delayed promotion of deaf history is inconsistent with the remarkable work of Edinburgh’s numerous charities and societies (Historic Scotland, the National Trust, the Old Edinburgh Club, Lost Edinburgh, etc.) that are outstanding in their ability to conserve, protect and promote local history…

So perhaps my perception of Braidwood Academy’s neglect speaks to some of the larger attitudes towards disability. Of course, during the 18th century, deafness – like all disabilities – was poorly understood, and it wasn’t until institutions like Braidwood Academy emerged that some began to realise that intellect was not affected by disability. Being deaf was certainly not synonymous with being “dumb.” Compounding this ignorance was the issue of class. Initially, only the rich could afford to educate their children at Braidwood Academy. Fortunately, in 1792, the London Asylum for the Education of the Deaf and Dumb Poor at Bermondsey became the first public deaf school in Britain. Braidwood’s influence was felt there too – it was one of his previous employees, Joseph Watson, who founded it.

British Sign Language (BSL) was not recognised as an official language by Westminster until 2003. Fittingly, Braidwood Academy’s Dumbie House is just half a mile from where the British Sign Language (Scotland) Act was passed unanimously by MSPs in the Scottish Parliament in 2015 – giving BSL the same legal protection as languages such as Gaelic. In Scotland today, an estimated 12,500 people use BSL. However, in Wales and Northern Ireland, BSL has no legal status or protection.

While I’m pleased that BSL is legally protected and that a commemorative plaque was mounted on the original foundations of Braidwood Academy, I do not believe that deafness, or disability in general, is given the recognition it deserves. But Braidwood’s remarkable influence on language, teaching and disabled rights is at least an excellent starting point for repositioning deafness as a critical aspect of Scotland’s broader history. As Ella Leith, secretary of Deaf History Scotland, said at the unveiling of the plaque in 2015:


“It’s partly about pride for the deaf community in seeing their history recognised, but also about raising awareness among hearing people that Scotland’s heritage should include deaf people too. Their heritage is as much part of Scotland as general heritage.”

A Historian’s Quest in the Archives: How to Study Controversy around a Controversial President

Currently, I am in Hyde Park, New York, combing through the archives of President Franklin Delano Roosevelt. Have you heard of him? Of course you have! He was a pretty big deal. Not only was he elected when over a quarter of Americans were unemployed during the Great Depression – pulling them out of their collective misery through massive public works projects and reviving America’s trust in the economy through regular radio broadcasts called “Fireside Chats” – but he also held office during one of the deadliest wars in American history. Oh, and having contracted polio in his 30s, he was the only physically disabled president to be elected to office. Ever.


FDR served as US President from 1933 to 1945. Here’s a flattering photo, courtesy of Densho Encyclopaedia

Considered the most influential president of the 20th century, FDR left an impact that has been felt ever since. Under his watch, unions were given the right to form. His government was the first to provide old-age security, unemployment benefits, and disability and single-parent allowances. He introduced the American public to a new relationship with its government by calmly discussing the issues of the day over the radio while they sat comfortably in their homes. He declared that the role of the central government was to secure the material well-being of the American people. Having enacted the Reorganization Act of 1939, he broadened and increased the presidency’s overall responsibilities. He supported the United Nations and ensured the US had a key role in the UN’s Relief and Rehabilitation Administration (and thus a key role in reshaping post-war Europe). FDR substantially changed America, and its position in the world.

But FDR also had major flaws. Politically, he broke the no-third-term convention in 1940 and sought to centralise the power of the presidency, leading some to question whether he would become a dictator. He authorised the harsh internment of Japanese-Americans on the west coast. After his death, many questioned why Roosevelt never took a leading role in helping the Jews of Europe, leaving their welfare instead to private organisations and charities. Some claimed he was a racist. Others said he was just a narcissist.

FDR also had an unusual personal life. He was a proper Mama’s boy. His closeness to his mother created a toxic atmosphere, leaving little emotional room for anyone else. Despite his mother’s fierce opposition, he married his rather remarkable wife, Eleanor. They were cousins, albeit distant. But Eleanor was unusual too; she was an independent thinker and terribly clever, likely a lesbian, and eventually became a political figure in her own right in the 1950s.


Sarah Roosevelt was a clever and educated woman who apparently doted on her son. At age 26, she married FDR’s father, James, who was 52. FDR’s birth was difficult and she bore no other children. After James’ death in 1900, she held the majority of the Roosevelt fortune. Sarah died in 1941.

The Roosevelts had an odd relationship which, historians have commented, served political ends rather than being a romantic union. But they did produce six children! While Eleanor advocated for women’s rights and various social reforms, FDR pushed his own career towards the vice presidency and, eventually, the presidency. He dealt with a painful disability daily, yet largely “overcame” the public perception of it (apparently, people didn’t realise the extent of his immobility because he was so excellent at hiding it in public). He even created a foundation and rehabilitation park for other polio victims in Warm Springs, Georgia.


FDR had contracted polio while at his summer home in Campobello, Canada, in July 1921. He was just 39 years old. He eventually founded a home for other polio victims in Warm Springs, Georgia, depicted here in 1924.

Affairs were rampant in the Roosevelt household. FDR kept close company with Eleanor’s secretary, Lucy Mercer and, later, his own personal secretary, Marguerite “Missy” LeHand.  Meanwhile Eleanor formed “close” relationships to like-minded women, going on holidays with them regularly, all with FDR’s blessing. He even built a small cottage for Eleanor and her friends to have sleepovers just two miles from the family home in Hyde Park. After her husband’s death, Eleanor became a chief philanthropist in post-war Europe, advocating for human rights (and especially children’s rights) in the new United Nations. Evidently, the Roosevelts lived remarkable and unusual lives, both together and apart.


Eleanor Roosevelt is celebrated as one of the most influential women of the 20th century. She pushed for social reforms, women’s rights and human rights, offering her help and influence to marginalised groups and fringe societies. Notably, Eleanor was chair of the United Nations’ Human Rights Commission and, in 1948, was the chief proponent of the Universal Declaration of Human Rights.

In 1941, just a few years before he died, FDR oversaw the construction of the FDR Library and Archives on his family estate in Hyde Park. Not long after his death, his immediate family relinquished their rights to the estate and, at FDR’s request, it became a national historic site. Today it houses multiple series of the Roosevelts’ papers, with over 20,000 boxes of documents.


The FDR Library. Critics claimed FDR’s library was a shameful display of self-promotion, but he countered that it accomplished two goals: preserving documents and providing transparency of all his actions to the American people.

This brings me back to why I’m here. Considering the complexity of these multi-faceted pillars of American history, it is important to approach the Roosevelts with caution and respect, right?

But it’s tricky. As a historian, it’s hard to remain objective when you want certain things to be true. Or, when your research subject is just as controversial as the Roosevelts.

While Eleanor intrigues me, I am actually here for FDR alone. I want to discover why FDR was an obstinate and obstructive SCHMUCK to his closest ally, the British, during the Second World War. Let me explain…

During the Second World War, one of the most powerful weapons the Allies held against Nazi Germany was the economic blockade of Nazi-controlled continental Europe. ALL trade, including relief, sent by the Allies to Germany OR German-occupied countries was strictly forbidden during the war. This prevented Germany from plundering relief, while also forcing Germany to take full responsibility for the territories it conquered. Over time, the blockade would apply considerable pressure upon Germany and strain its resources and, thus, its ability to win the war. Seems logical, right?

The blockade policy was one of those items constantly discussed at all levels of multiple governments. I’ve witnessed this in the German, British, Swiss and now American archives. It’s incredible. And yet, surprisingly, it is very rarely discussed by historians in any great detail (see Meredith Hindley or Joan Beaumont’s “Starving for Democracy”).

Public pressure from various groups (for example, thousands of letters written by concerned Yorkshire women’s groups, Pennsylvanian farmers, Belgian mothers or various Red Cross branches) meant that governments were constantly rejecting pleas for relief from well-meaning citizens, large reputable charities, or governments in exile. And, due to Germany’s considerable exploitation of its conquered territories, the list of governments begging for relief was very long: Polish, Belgian, Norwegian, Dutch, French, Yugoslav…

But blockade policy remained practically unyielding. (The single exception during the entire war was Greece, because of a massive famine, but you can read about that here.) So long as the Allies could hold together and maintain unity on this key war policy, the blockade would have the strongest possible effect on the enemy.

But humanity is cunning. Swiss charities sought to overcome the blockade by relocating children to Switzerland instead. Massive child evacuations, which are the core of my research, successfully relocated Belgian, French and Yugoslav children to Switzerland for three-month periods of recuperation. And the Germans allowed it! It was no great inconvenience to them, because it removed many mouths to feed and pacified parents. Over 60,000 children were successfully evacuated in this way during the war, and another 100,000 in the post-war period. Impressive, eh?


Swiss Red Cross nurses prepare to receive thousands of French and Belgian children at a train station in 1942 in Basel, Switzerland.

However, this changed in 1942. That August, large round-ups of Jews began in the southern, unoccupied zone of France, and in November Hitler’s armies invaded the zone outright. Initially, Jewish children were not included in the deportations to the East, which meant that thousands of children were left abandoned and parentless. (A few weeks later, though, they were rounded up too.)


A Belgian child (4 years old) with severe malnutrition at a Swiss train station, 1942.

Due to this invasion and these deportations, thousands of French children now needed immediate relief. Swiss charities grappled with how to help. They approached the Allied governments, suggesting that these Swiss-run evacuations could be increased – possibly to over 100,000 children! But, crucially, Switzerland too was experiencing war shortages – it could not adequately provide for all the children of Europe. So perhaps the Allies would send relief (food, medicines, vitamins) directly to Switzerland for all these children? Of course, the Swiss emphasised, they were neutral, not Nazi-controlled, so they were excluded from the Allies’ blockade policy.

It all sounds very logical. A clever and elegant solution to a major humanitarian crisis. But while memos shot excitedly across the Atlantic between the US State Department and the British Foreign Office, dear President Roosevelt was having informal meetings with Norway’s Ambassador to the US, Wilhelm Thorleif von Munthe af Morgenstierne. According to strongly-worded and angry British documents, in late October 1942, FDR promised the Ambassador – without consulting the British – that the US would send relief to Norway!

When the British heard of FDR’s assurances, they insisted that there was no way relief could be sent to Norway without also allowing it to Belgium, France, Poland and the rest, thus breaking the blockade! FDR’s promise also complicated the possibility of sending relief to children evacuated to Switzerland, which would have relieved children while still respecting the blockade. Therefore, the British emphatically conveyed their absolute rejection of FDR’s promises to Norway in November 1942 and keenly awaited the American reply.

However, no reply was given. British documents reveal acute frustration and abhorrence that the US would ignore the British regarding such an important subject, to such an extent that Churchill himself was lobbied to become involved. And although Churchill and FDR met at the Casablanca Conference in January 1943, the British government still received no official reply. WHY?


FDR and Churchill were close allies during the war. Their complete (and overwhelmingly detailed) correspondence was compiled by Warren F. Kimball in THREE volumes. Notably, when writing informally, Churchill referred to himself as “Former Naval Person.” Only in official correspondence was his title “Prime Minister” – an indication of the intimacy of their relationship.

By August 1943, the Americans had finally given a half-hearted, vague and conditional reply that they might support extra provisions to Switzerland, but the British, Swiss and Americans took no action. Allied support for child evacuations was not raised again between the British and the US until May 1944, just one month before the Allied invasion of Europe on D-Day. Of course, by that point, a humanitarian mission for children was hardly as important as the rapid liberation of oppressed nations by the largest invasion in history.

BUT.

Why did FDR promise such a thing to Norway? Was it during a schmoozy, drunken lunch or a formal high-level meeting? Was the promise conditional or was it a blank cheque? Did the Norwegian Ambassador perhaps misunderstand FDR’s “promise” and in fact, no promise was made? Or was FDR’s “promise” actually hollow – perhaps a vain attempt to get the insistent Norwegian off his back – and the British were just overreacting? But then, if that was the case, why would FDR not reply immediately to his ally? Why ignore their determined attempts to find out what happened? WHY? Why, oh President Roosevelt, why?

Meanwhile, let’s all remember: children are starving, being rounded up and sent to concentration camps, experiencing violence and bombings and general oppression. This makes any bureaucratic error or deliberate avoidance all the more inexcusable.

This is the purpose of my research visit. To discover the answer to these questions. My current historical opinion of FDR is not too complimentary. But even I know it’s not fair to FDR, his legacy, or the study of history to jump to conclusions. Which brings me back to my original assessment of FDR…

President Roosevelt was obviously a brilliant politician and, in many ways, a great strategist. His lasting legacy is a testament to his commendable, practical approach and determination to improve American lives. But he also prioritised certain lives over others, and was a blatant narcissist. FDR liked being in control – to such an extent that some classified him as a dictator – and sought personal validation from various audiences. Some legitimate, and some behind closed doors.

A large part of good historical research is accurately determining the motives, personalities, and fears of major historical figures. Both the problem and beauty of studying FDR as a historical topic is that he was just as flawed as he was extraordinary. Throughout his remarkable but challenged life, he engaged with a broader spectrum of victory (and failure!) than others, so predicting his motivations will be exceptionally difficult. He is an infinitely complex character.

My hunch about the whole promising-relief-to-Norway thing? Based upon all the research, documentaries, articles and books I’ve had to read about the man, FDR was NOT impulsive. FDR was deliberate and purposeful. Everything he did was meaningful and goal-oriented. He was an impeccable strategist. Therefore, I truly think that President Roosevelt had a reason behind his promise to Norway. Now, I just need to figure it out…

Wish me luck!

“Wars Are Not Won by Evacuation”: Untangling the Truth from the Evolving Dunkirk Myth

This month, Christopher Nolan’s long-awaited war epic “Dunkirk” hit screens worldwide. Critics have praised it as Nolan’s best film yet: a “powerful, superbly crafted film” and “a visceral, suspenseful, at times jaw-dropping historical war movie.” With a formidable British cast, a massive budget, the largest marine unit in movie history (60+ boats), and filming actually taking place in the English Channel, “Dunkirk” will invariably be added to the list of war epics that includes Saving Private Ryan, Schindler’s List, The Great Escape and Das Boot.


Dunkirk (2017) already has a spot in the top 30 war movies ever made

My thoughts just moments after watching the film? You get a real sense of urgency. The unwavering, intense anticipation is steadily heightened in every scene by a soft but perpetual tick-tock in the background. Every action sequence becomes a catharsis from the tick-tock, only for it to return, bringing with it a heavy feeling of apprehension that Britain’s brief, hopeful window to escape from Dunkirk is closing. Time is truly ‘of the essence’ in this film.

Nolan’s Dunkirk is perhaps better appreciated by clarifying what it is not. This is not a comedy (there’s not a single joke to lighten the mood, even briefly). It is not a commentary on the highest-level political decisions of the period (no scene shows Churchill furrowing his brow, or naval, army and air force commanders bickering in Westminster). It is not a romance (in fact, other than a few nurses, there is no female cast, nor any romantic subplot). It is not a transnational film that attempts to bond enemies (no scene shows German soldiers, except a rare glimpse of a Messerschmitt 109E and a few bombers, and even that is from a distance). And it is not a documentary (despite a small amount of text after the opening credits, the film provides no historical facts, nor interviews with survivors).

So what is it?

Perhaps this is best answered by you, the audience. For me, it was a story of survival. Well, a story of British survival. I really enjoyed it. I cringed, I cried, I squirmed, I begged, and I felt the greatest sense of hope when I saw the RAF Spitfires doing their intricate dances in the sky. (Which, coincidentally, is an excellent foreshadowing of what would follow the Dunkirk evacuations – the Battle of Britain.)

Walking out of the theatre on a Tuesday afternoon in July in Scotland, I followed a mother with her teenage sons. They were enthralled by the movie, but bursting with questions: “Did Granddad fight in that? How come there weren’t more fighter planes to help the lads on the beaches? I’d shoot every German plane. The RAF were pretty incredible, can you imagine landing a plane like that on the water? Too bad they ran out of fuel. God, I’d be proper scared landing like that.”

Nolan’s film provides a fresh starting point for discussing the war, and Dunkirk particularly. Films are some of our greatest resources for accessing history. Of course, they must be taken with a grain of salt. According to a study by Dr. Peter Seixas, Professor of Education at the University of British Columbia, the more engaging the film, the less likely audiences were to criticise its historical merit.


Instead, filmic devices, such as realistic violence and use of blood, boosted the perceived authenticity of the historical event. Older films depicting the same event, despite being limited by 1950s or 1960s censorship, were seen as less historically genuine. Interesting, no?

But if Dr. Seixas’ observation is true – that the more engaging the film, the less likely audiences are to question its historical accuracy – then Nolan is stuck between a rock and hard place. Is it possible for Nolan (or any director) to create a film that is both highly entertaining and historically accurate?

No. Let’s get real. It’s impossible to mirror history exactly in any medium, film included. But we can give Nolan credit for pursuing authenticity in other ways. Nolan wanted to make his Dunkirk epic as British as he could, despite the need for American-sized film budgets to achieve his vision. After all, Dunkirk was a British failure. And, depending on your perspective, a British success. Nolan chose only British actors and emphasised the Britishness of the endeavour. Ironically, the film is expected to be more lucrative with American audiences than British ones. But whereas the British are educated about the failure of Dunkirk from a young age, many Americans will be introduced to Dunkirk for the very first time through this blockbuster film.

But, importantly, Nolan’s Dunkirk is also contributing to Dunkirk’s ongoing cultural legacy.

The “Dunkirk Myth” might be defined as Britain’s ability to embrace defeat as a platform for eventual victory; the humanity and compassion of the British people in helping one another created the perception of a strong community and an enduring nation. It was the marriage of the home front with the battle front, of the defeat with the victory. Since 1940, the Dunkirk Myth has been shaped by various novels, speeches, poetry and, importantly, films.

This is why it is so very important not to lose sight of the historical facts within this national myth – now reintroduced to new generations through a super-visceral, action-packed, CGI-enhanced, American-budget British war movie, right?

So what was Dunkirk?

Simply put, it was the evacuation of 338,000 Allied soldiers (chiefly from the British Expeditionary Force and the French Army) from the beaches of Dunkirk, France, from 26 May to 4 June 1940.

A few weeks earlier, Germany had launched a surprise Blitzkrieg (lightning war) on the Allied forces in western Europe. A similar German manoeuvre had failed epically in the First World War (resulting in stagnant trench warfare). But in May 1940, Germany was incredibly successful thanks to the element of surprise, wireless communications, anti-aircraft guns (called flak), the tight coordination of land and air forces, and stronger tanks.


This German motorised column secretly advanced through the Ardennes in May 1940. This was no easy feat, with 134,000 soldiers, 1,222 tanks, and nearly 40,000 lorries and cars having to narrowly navigate heavily wooded areas. “Traffic Managers” even flew up and down the columns to alleviate any deadlock. But it was a stunning success. Historian Richard J. Evans claims that Germany achieved the greatest encirclement in history, with 1.5 million prisoners taken for fewer than 50,000 German casualties.

Over 66,000 British soldiers were killed, wounded or captured from mid-May until the end of the evacuations on 4 June. A combined total of 360,000 British, French, Belgian and other Allied troops were lost during the Battle of France, which ended with France’s surrender on 22 June 1940.

(For further reading, see Richard J. Evans’ (2009) The Third Reich at War, Julian Jackson’s (2003) The Fall of France: The Nazi Invasion of 1940, Ian Kershaw’s (2008) Fateful Choices: Ten Decisions That Changed the World 1940–1941 or Ronald Atkin’s (1990) Pillar of Fire: Dunkirk 1940).


This map from Time Magazine, 10 June 1940, shows the “Nazi Trap” closing in on the British and French forces.

Fleeing the German advance, nearly 400,000 Allied soldiers were pushed as far west as possible, until they reached the English Channel at the beaches of Dunkirk. Churchill called it the greatest military disaster in British history. This was also the last time Allied forces would hold ground in France, Belgium, the Netherlands or Luxembourg until the famous D-Day landings nearly four years later. The evacuation also meant that all of western Europe was left alone to suffer German occupation for four long years.

Why is this disaster considered a success?

Due to the mobilization of over 800 boats, ships, yachts and other private holiday vessels, 338,000 men who were standing helplessly on the beaches of Dunkirk (as many naval ships could not dock to collect them) were successfully evacuated within just 10 days. From a humanitarian perspective, this is obviously impressive.

But it also meant that commanders made impossible choices, such as leaving behind the sick and wounded, and destroying Allied vehicles, equipment and resources, lest they fall into enemy hands. It was truly a fight for survival against overwhelming enemy forces, low morale, and very few resources. It was also a fight against time. Tick-tock, indeed.


Kenneth Branagh’s role as a naval commander (renamed from his historical counterpart) reflects the impossible choices that British commanders faced. All army materials and vehicles were destroyed. The BEF was also the top priority for evacuation: although some 140,000 French soldiers were evacuated, nearly 40,000 were left behind.

What happened after Dunkirk? (And yes, there is a point to asking this)

The German offensive that culminated at Dunkirk ended the “Phoney War,” the eight-month lull on the Western Front that followed Hitler’s invasion of Poland in September 1939. It shocked the world and brought international attention to the fact that Germany was a formidable force. Hitler’s fiery promises to conquer Europe were not just hot air.


The conquest of France marked the highest point in Hitler’s popularity for the entire war. As the Battle of Britain began raging overhead, Hitler called for peace on 19 July 1940: “A great world empire will be destroyed […] In this hour I feel compelled, standing before my conscience, to direct yet another appeal to reason in England. I believe I can do this as I am not asking for something as the vanquished, but rather, as the victor, I am speaking in the name of reason. I see no compelling reason which could force the continuation of this war.”

Immediately after Dunkirk, the war took to the skies in a fierce contest for air superiority called the “Battle of Britain.” Why? So that Hitler’s forces could invade Britain in the autumn of 1940 – before winter made invasion impossible – without constant aerial bombardment. German Luftwaffe planes initially attacked British air bases in southern England. Royal Air Force pilots (including Commonwealth and Polish pilots) were vicious competition for the numerically superior and better-equipped Luftwaffe. Daily “dogfights” were witnessed by civilians. RAF planes and pilots dropped like flies. Churchill’s famous observation that “Never in the field of human conflict was so much owed by so many to so few” reflected the fact that these tireless pilots had become the last line of defence.


The average age of an RAF pilot in the Battle of Britain, such as these handsome men above, was just 20. The average life expectancy of a Spitfire pilot was just four weeks. Over 20% of pilots were from Commonwealth nations, or were Polish or Czech. Despite having a much better-equipped air force, Germany suffered 2,600 pilot casualties; Britain lost just over 500 RAF pilots.

In late August 1940, a small bomb was dropped on London (German command alleged it was an error). Error or not, this expanded the range of targets to include civilian centres. The RAF then bombed Berlin. The Luftwaffe again bombed London. The “Blitz” of British cities echoed the same quick, surprise tactics that the Luftwaffe had recently used so successfully against ground forces in western Europe. Night bombings and devastating daily air raids on homes, factories and ports lasted until May 1941, killing an estimated 40,000 Britons and making hundreds of thousands homeless.


One of the most iconic photos from the Blitz shows St. Paul’s Cathedral standing intact after a raid on 29/30 December 1940.

The Blitz, as it came to be called, meant that British urbanites had to persist through the most difficult circumstances simply to survive. Londoners especially “kept going” with daily tasks that came to epitomise the archetype of endurance. If ever there was a time in British history when the national character was so well tested, and so well defined, this was it. (For more reading on this very interesting topic, check out Angus Calder’s (1969) The People’s War: Britain 1939-1945 and (1991) The Myth of the Blitz, and Jeremy Crang and Paul Addison’s (2011) Listening to Britain: Home Intelligence Reports on Britain’s Finest Hour, May-September 1940.)


Photos such as this one, of a London milkman continuing his deliveries while firefighters douse a blaze in the background, came to typify the resilience and endurance of Londoners “keeping on” despite the war unfolding around them.

What about the Dunkirk Myth?

Although the “Dunkirk Myth” preceded the Blitz, it also developed alongside the Blitz spirit through various culturally important products (for those who want the pure academic stuff, see Nicholas Harman’s 1980 Dunkirk: The Necessary Myth or an excellent review by my old supervisor, Prof. Paul Addison):

Broadcasts from J.B. Priestley in June 1940 reporting on the flotilla of “little ships” in the English Channel. Priestley’s depictions of ordinary Englishmen coming to the rescue of the helpless troops transformed this war from a military affair into one that required the entire mobilisation of the home front. (To be historically accurate on this point, though, Englishmen weren’t voluntarily throwing themselves into the fray; the Royal Navy normally took charge of their vessels and then used them as required to save the troops.)


Priestley became a formidable voice of calm reporting (and propaganda) for Britain, though he faced criticism in later life.

Churchill’s famous “We shall fight on the beaches” speech to the House of Commons. Everyone has heard this speech. It’s epic. But what most people don’t know is that the speech was not broadcast at the time. British newspapers printed excerpts of it, but it was not until 1949 that Churchill recorded it.


This comic from Reddit uses Churchill’s historic rhetoric to satirise reactions by today’s British authorities to threats against modern Britain.

Paul Gallico’s The Snow Goose, a short story (and eventually a novella) first published in the Saturday Evening Post in 1940. This tearjerker traces a growing friendship between a disabled artist and lighthouse keeper and a young woman who discovers a wounded snow goose. Loads of symbolism paints a picture of innocence and loyal love dismantled by the tragedies of war, and the evacuation of Dunkirk becomes a kind of self-sacrifice for humanity, art and first loves. The Snow Goose had a strong impact on British society. It was a favourite of young readers due to its short but eloquent length, and even Michael Morpurgo cites it as an influence on his much-loved War Horse. People saw Dunkirk not for what it was in strict military terms – a colossal disaster – but as a sort of coming-of-age story about the enduring spirit of British compassion and humanity.


Fantasy Book Review calls The Snow Goose “a tribute to the indomitable human spirit”

Dunkirk (1958). Starring Richard Attenborough, John Mills and Bernard Lee, it became the second-highest-grossing film in Britain that year. By following an English civilian and a British soldier, the film unfolds from two key perspectives, again cementing the myth that Dunkirk united the home and battle fronts in one great national rescue mission.


Richard Attenborough starred in Dunkirk (1958) but did not receive an Oscar nod

Ian McEwan’s novel Atonement (2001) and its film adaptation (2007). Atonement follows the blossoming love of a young couple, interrupted by a younger sister’s shocking criminal accusation. Soon enough, the war unfolds: both sisters become nurses in London while the protagonist is sent to France to fight. Dunkirk (again) is used as a historical event that binds together the home front and battle front, becoming both a barrier and a vehicle for unity. Director Joe Wright’s incredible scene on the Dunkirk beaches is praised as “one of the most extraordinary shots in the history of British film – a merciless ten minutes, panning across an army of bedraggled and bleeding British troops huddled on the beach at Dunkirk, with ruined ships smouldering in the shallows beyond.”


This five-and-a-half-minute unbroken sequence in Atonement (2007) was filmed by director Joe Wright with 1,000 extras to emphasise the chaos and disaster of the Dunkirk beaches. See it here.

Finally, Nolan’s Dunkirk (2017). An epic war film that refuses to fit any of the genres we normally assign. I suspect it will haunt and challenge both critics and audiences for many years to come. But perhaps we should also be wary of a film that is so very one-sided? So singular in its storytelling?


One Direction’s Harry Styles may have taken all the limelight, but Tom Hardy’s performance as a sharpshooting RAF pilot definitely won my vote. Swoon.

So what?

Historically, Dunkirk was the rude awakening that shocked not only the British Expeditionary Force, but also the British home front. As Churchill said solemnly, “Wars are not won by evacuation.” People began to fear for their sovereignty, their homes and their country in a way they never had before. How they reacted was a real testament to their national character, and how they survived was a real testament to their national legacy.

Culturally, Dunkirk planted the seeds of a national myth that developed, grew and transformed as the war unfolded. Initially the myth was highly propagandistic, fuelled by Priestley’s broadcasts and Snow Goose love stories, but as time has passed, Dunkirk’s legacy still enthrals the imaginations of a new generation. It was a paradox that such a defeat could be transformed into a stunning success. Now, 77 years later, we are still discussing Dunkirk’s historical relevance and its cultural impact on British national identity in the face of overwhelming odds and great uncertainty.

Which raises the question – the same question others have already asked: what about Brexit?


Should the Youth Be Given the Vote? Historical Reasons Why Age Is Arbitrary

I made a rather startling discovery. Those who suffer from dementia can still vote in the UK and Canada. “Really?” you may ask. “Really,” I reply.


Voting is an inalienable right in democratic nations. Once you gain the right to vote, it is extremely difficult to lose.

Prisoners are among the only disenfranchised groups. In Britain, a prisoner’s right to vote is suspended while they serve their sentence. The same applies in Australia, except that prisoners serving sentences of less than three years can still vote. In Canada, prisoners retain the right to vote regardless of the length of their sentence. The United States has some of the most punitive measures against prisoner voting, and because these vary drastically between states, I excluded the USA from this article. (Apologies to my American friends, but you can read more about the almost 6 million felons, or nearly 2.5% of Americans of voting age, who could not vote in the 2012 federal election here.)

Voting is a pillar of equality among citizens, and casting a ballot is a benchmark in a person’s life.

What about the Youth Vote?

Historically speaking, the argument that youths aged 16 and over should get the right to vote is a very recent phenomenon. Before the Second World War, only people aged 21 and older could vote in most major western democracies. In the 1970s, this was lowered to 18 in the UK, Canada, Germany, and France, because 18 was the age of military conscription. However, some nations retain 20 or 21 as the age of suffrage. Only since the 1990s have some nations successfully lowered the voting age to include 16-year-olds. Scotland is one of them.


Over 75% of Scottish youths voted in the 2014 Scottish Independence Referendum

After years of campaigning, the Scottish National Party was able to give 16- and 17-year-olds the right to vote in the September 2014 Scottish Independence Referendum. Impressively, over 75% of those registered 16- and 17-year-olds turned out, compared with 54% of 18- to 24-year-olds. This turnout was considered hugely successful and resulted in Westminster granting new electoral powers to the Scottish Parliament in December 2014. Now, all youths aged 16 and over can vote in both Scottish Parliament and municipal elections in Scotland.


Nicola Sturgeon and the SNP campaigned successfully for years to secure the youth vote (Photo from BBC Article)

Across the UK, youths cannot vote in general elections until age 18. And although youth turnout figures should not be accepted entirely at face value, one statistic from the recent general election claimed that 72% of all 18- to 24-year-olds turned out to vote – a remarkably high rate for young British voters.


Molly Scott Cato said that denying the youth the right to vote because they aren’t responsible enough was “elitist rubbish” (Photo from BBC Article)

British politicians hotly debate the voting age. The Tories believe it should remain at 18, while Labour proposes lowering it to 16. The Liberal Democrats sit somewhere in the middle, suggesting it should be lowered only for local elections. The Scottish National Party, who are very popular with Scottish youth, believe it should be lowered to 16 for general elections. My favourite response, perhaps, came when the Green Party’s Molly Scott Cato dismissed claims that 16-year-olds aren’t responsible enough to vote as “elitist rubbish.”

Age is Arbitrary: “Childhood” is a Young Concept

Age as a marker is quite arbitrary, especially when viewed historically. In the wake of the Second World War, when over 15 million children were left homeless and resettlement was a huge crisis, the United Nations defined anyone under the age of 17 as a child. Today, childhood ends at age 18 in the majority of sovereign states.


These Polish war orphans at a Catholic orphanage in Lublin, photographed on September 11, 1946, were among the 15 million children displaced by the war. To expedite the resettlement process, the UN defined anyone under age 17 as a “child”

But childhood as a historical concept has only been closely examined in the last few decades. That is not to say that children or childhood were never discussed in historical sources. But, similar to race and gender, age was often overlooked, understudied or poorly represented within historical accounts.

In the 1970s, a revival of the historiography of childhood occurred as the result of L’Enfant et la vie familiale sous l’Ancien Régime (published in English as Centuries of Childhood, 1962) by the French medievalist Philippe Ariès. He argued that childhood is actually a recent, modern invention, a concept that only gradually emerged after the medieval period. Importantly, the concept of childhood was not static but heavily underpinned by the culture of the time. This revolutionised social history and led many scholars to investigate how Europeans transitioned from a world without a consciousness of children to one which had ‘invented’ childhood. (For an excellent overview, see Nicholas Stargardt, “German Childhoods: The Making of a Historiography,” German History, 16 (1998): 1-15.)

With state intervention in education in the 19th century, and the child labour laws that followed the Industrial Revolution, children’s ages became both legally and economically relevant. How old must a child be to work? Can a child be charged with a crime? Records of child delinquency are often the first historical evidence that children existed within certain cultural contexts. For example, historians are aware of the punishments of child delinquents in 19th-century Irish workhouses, but we know little else about the experiences of non-delinquent Irish children in those workhouses.


Illustration of children in a 19th-century workhouse, courtesy of workhouses.org.uk

Even biological markers of age are debatable. In the USA, lawyers have used science to argue that grey matter is still developing well into our 20s in the area of the brain that regulates self-control; this has led to numerous cases where juveniles charged with murder have had their prison sentences reduced. The use of puberty as a reproductive “line in the sand” has also shifted over the last few hundred years: the age of puberty today (10-12 years for girls, 12 for boys) is lower than it was centuries ago (15-16 for girls). And unlike a few centuries ago, “Puberty today marks the transition from childhood to adolescence more than it does childhood to adulthood.” Meanwhile, in the animal kingdom, biologists define animals as adults upon sexual maturity. It seems the historian and the biologist cannot even agree about childhood.

And, to make it even more complicated, children vary greatly as individuals, as do their experiences and what they are subjected to. When in doubt, think of Joan of Arc, Malala Yousafzai, or Anne Frank.


Anne Frank was just 15 years old when she died in Bergen-Belsen Concentration Camp

So what does this have to do with voting?  

If our definitions and beliefs about childhood are culturally dependent, then the ages we assign it, or the assumptions we have about it, are a product of our culture, and not necessarily an authentic reflection of “childhood.” (If such a thing actually exists).

During the medieval era, children were considered “little adults” who needed to be civilised: the presumption was that children are born with rationality and intelligence but lack social graces. A medieval parent therefore viewed childhood as a rigorous lesson in civility.


During the medieval era, children were viewed as “little adults” and, as Bucks-Retinue points out, even their clothing was just “smaller versions of adult clothes.”

But today’s parent does not view it quite like that. Due to the legality of certain social freedoms – driving a car or drinking alcohol – the state has defined a child in contradictory ways. You can join the military at age 16 in the UK, but you are not legally entitled to toast the Queen’s health until age 18. The predictable argument is that if you can join the military, drive a car, leave school for full-time work, pay taxes, and marry (and thus have the state’s endorsement to become a parent), then you should have the right to vote. I see no fault in this argument.

So why did I start this conversation by talking about people with dementia?

Dementia is an umbrella term for various progressive neurological disorders whose symptoms include memory loss, anxiety and depression, personality and mood changes, and problems communicating. We most often associate dementia with Alzheimer’s disease, which has no cure. Some 46 million people suffer from dementia worldwide, a figure expected to double every 20 years.

In Britain, 1 in 6 people over the age of 80 have dementia, and around 850,000 people live with it in total. But having dementia, like having learning difficulties or other mental health problems, does not preclude you from voting. According to the Electoral Commission’s Guidance:

“A lack of mental capacity is not a legal incapacity to vote: persons who meet the other registration qualifications are eligible for registration regardless of their mental capacity.”

If someone suffers from mental incapacity or physical disability, they may appoint a close relative as a proxy to vote for them. (Proxy arrangements are generally meant to help those serving overseas or otherwise temporarily unable to attend, so that they can still exercise their democratic rights; in some cases they must be approved by a health practitioner or social worker.) If a proxy is authorised, the Electoral Commission makes it absolutely clear that no one – whether relative, doctor, nurse or lawyer – can decide how to cast that ballot. The choice lies with the voter alone. Period.

In Britain, you cannot vote if you reside in a mental institution due to criminal activity, or if you are so severely mentally incapacitated that you cannot understand the voting procedure. Those with dementia are still legally entitled to vote because the condition (especially in its early stages) is not considered legally incapacitating or grounds for disenfranchisement. Usually it is only when a doctor is asked to authorise a proxy vote that someone may become disenfranchised, depending on the doctor’s judgement.

In Canada, 1.5% of the population (around 480,000 people) have dementia, most of whom develop it after the age of 75. The Canadian Human Rights Act makes it illegal to discriminate against persons based on age or (dis)ability.


Age is the number one risk factor for dementia.

Canada is one of only four countries (alongside Italy, Ireland and Sweden) that impose no mental capacity requirement (dementia included) upon the right to vote. This was not always the case: after a legal challenge in 1992, the federal minimum mental capacity requirement was repealed, and by 2008 only Nunavut could still disqualify someone from voting based upon mental incapacity. Thus, similar to Britain, Canadians with dementia retain the right to vote.

What does this tell us about our society?

It is impressive that people suffering from dementia (often elderly) retain this right. It demonstrates that nations like Britain and Canada strongly respect equality among citizens, irrespective of (dis)ability, mental (in)capacity, or age. And, importantly, it demonstrates that these nations honour the incontrovertible democratic rights of their ageing and sick citizens. Discrimination is fundamentally not tolerated.

BUT to deny the youth vote while granting it to someone with a progressive neurological condition seems inconsistent. Should a 16-year-old “child” be considered less politically capable than someone with dementia? Is that fair?

Youth Vote vs. “Elderly” Vote

In my frustration at this quandary, I read a provocative and humorous commentary in Time Magazine calling for the disenfranchisement of the elderly. Joel Stein put it simply: “Old people aren’t good at voting.” Although Stein avoided getting his hands dirty with dementia, he highlighted the “out of touch” policies endorsed by “old people”: they are twice as likely to be against gay marriage, twice as likely to be pro-Brexit and nearly 50% more likely to say immigrants have a negative impact on society. Funny as the piece is, I am a staunch supporter of democracy and believe we should enfranchise people even when we disagree with them. That is the point of democracy: to find consensus among disparate voices. Young, old, sick, healthy, rich, poor: all should be allowed to vote.


In June 2017, Justin Trudeau and Barack Obama had an intimate dinner

Justin Trudeau and Barack Obama recently enjoyed their enviable bromance over a candlelit dinner in a little Montreal seafood restaurant. They spoke of a great many things, but one question stood out: “How do we get young leaders to take action in their communities?”

Such conversations among politicians reflect a growing interest in including youths’ voices and agency within our political processes and communities. If what the medievalist Philippe Ariès said is true – that our concept of childhood is culturally dependent – then how our culture interprets youth needs to change. Historically speaking, it appears that change is already beginning. And although Scotland has taken remarkable strides towards giving political agency to Scottish youths, this can be taken even further.

If we engage youths in the political process, support their agency and action in national bodies and networks, and listen to their needs and incorporate their voices into politics, our cultural assumptions will shift. In the same way that we honour our elders and our sick, let us honour our youths.