The Echoes of Nazism in Today’s America: History or Hype?

Is anyone else heartbroken and equally fascinated by what’s happening in America right now? The planned removal of a statue of Confederate general Robert E. Lee from a public park in Charlottesville, Virginia, sparked demonstrations on 12 August which culminated in the tragic death of a 32-year-old woman, Heather Heyer.

Had it not been for Trump’s subsequent statement, perhaps Heather’s death and these demonstrations would have been recorded as a local (though tragic) event. Heather might have been seen as “just one more casualty” in a long, violent history between right-wing nationalists nostalgic for the old South and everyone else who emphatically excludes themselves from that label.


President Trump threw fuel on the metaphorical fire by saying, “We condemn in the strongest possible terms this egregious display of hatred, bigotry and violence on many sides, on many sides.”

“On many sides”? Huh?

It’s been remarkable to watch the visceral uproar following Trump’s trivializing remarks. Scheduled “Free Speech” demonstrations in Boston last weekend were overwhelmed by counter-demonstrators numbering in the tens of thousands. At the base of Trump Tower in NYC, another 1,000 protesters shouted “Black Lives Matter” while sand-filled white dump trucks were positioned as barricades. And comedian Tina Fey devoured an entire sheet cake:

Tina Fey

Tina Fey reminded white supremacists: “It’s not our country – we stole it. We stole it from the Native Americans. And when they have a peaceful protest at Standing Rock, we shoot at them with rubber bullets. But we let you chinless turds march through the streets with semi-automatic weapons.”

Even other world leaders unequivocally condemned the white supremacists. In Germany, where Nazi salutes, gatherings and symbols are illegal (even tourists who flout the law have been arrested), politicians tweeted their dismay.

Martin Schulz, leader of Germany’s Social Democratic Party, tweeted: “One must denounce Nazis definitively. What Trump is doing is inflammatory. Whoever trivializes violence & hate betrays western values.” German Chancellor Angela Merkel called the events in Charlottesville “absolutely repulsive.” And former Israeli Foreign Minister, Tzipi Livni, tweeted: “In Nazism, anti-Semitism and racism there are never two equal sides — only one side is evil. Period.”

And some of the strongest voices of all are those of Holocaust survivors and veterans of the Second World War. These brave people are standing up against any revival of the same racist intolerance they experienced in 1930s Germany.

Rubin Holocaust Survivor

Marianne Rubin’s granddaughter, Lena Schnall, captured a photo of her grandmother at a rally in NYC after Charlottesville. Rubin was a Jewish child in 1930s Germany and managed to escape with her parents, first to Italy, then France and finally the US.

But are today’s racial clashes in the US actually echoing the sentiments of Nazi Germany? Is Trump the new Hitler? Are Muslims and refugees the new Jews? Is World War III around the corner? Or is this all just hype that will soon disappear?

Professor Richard J. Evans (my favourite historian and eminent Cambridge scholar) was interviewed by Slate magazine shortly after Trump’s inauguration. Well before the events of Charlottesville, Prof. Evans suggested that some parallels can be drawn between Trump and Hitler’s early days in power. I will now elaborate on three points my biggest intellectual crush raised:

1. The stigmatization of minorities.

Hitler despised those he deemed non-German. Although Hitler was a notorious anti-Semite, his prejudice extended beyond Jews to Bolshevists and the progressive liberals who endorsed the Weimar Republic. Importantly, Hitler was supported by a strong right-wing community which blamed Germany’s defeat in WWI on the socialists, communists and Jews who had supposedly stabbed the nation in the back on the home front. Anti-Semitic newspapers and anti-Jewish clubs were commonplace in prewar Germany.

And, once you combine this common hatred with a socially acceptable belief in “scientifically measurable” human progress and eugenics (a field studied in numerous universities after Francis Galton coined the term in the wake of Darwin’s theory of evolution), it’s not difficult to see how even the most civilized of societies – which Germany was in the early 20th century – could believe that one race was superior to another. By 1935, the Nuremberg Laws had solidified the legal persecution of Jews on the basis of “race”; during the war, Jews were forced to wear yellow stars in public, moved into ghettos and eventually murdered in the camps. Notably, this stigmatization was also experienced by other minorities, such as the disabled, homosexuals, political opponents, criminals, Roma/Gypsies, black people, religious groups…

Eugenics

A class in Paris, between 1910 and 1915, studies the Bertillon method of criminal identification, based on measuring body parts. Eugenics was practised throughout the world and was considered a ground-breaking field of scientific research which would improve society as a whole.

Today’s America is obviously a far cry from outright genocide, but stigmatization of minorities still exists. For example, Trump’s travel ban imposed on people from Iran, Libya, Syria, Somalia, Sudan and Yemen, as well as all refugees, caused huge controversy earlier this year. Trump claimed he was protecting American borders from Muslim extremists, despite the fact that the only refugees ever to carry out deadly terrorist attacks on US soil were three Cubans in the 1970s.

Muslims in protest

Muslims pray in protest at Trump’s travel ban in Dallas earlier in 2017. Although judges have overturned the order, aspects of the ban were implemented and will be reviewed by the US Supreme Court in October.

Trump’s attempts to reverse Obamacare could also be perceived as stigmatizing the poorer segments of American society who cannot afford private healthcare. The #BlackLivesMatter movement, born of the black community’s ongoing struggle against state violence and discrimination, continues to speak out against various Trump policies (although the 2012 killing of Trayvon Martin, which sparked the movement, occurred before Trump took office). Also, the Trump administration announced last month that transgender individuals would be banned from serving in the US military. Even some of the most conservative Republicans and retired military generals have denounced the policy.

Transgender

In August 2017, civil rights groups announced their intent to file suit against Trump over the intended ban on transgender people serving in the military.

2. Not adhering to conventions of normal political life.

Although the Nazi party won 37% of the vote through an entirely democratic and legitimate political process, Hitler almost immediately changed German laws to create a one-party state. Historians cannot agree over who exactly burnt down the Reichstag – Germany’s parliament – but afterwards Hitler enacted emergency measures which allowed him to suspend German citizens’ rights, including habeas corpus, freedom of expression, freedom of the press, the right of free association and public assembly, and the secrecy of the post and telephone. Hitler never reinstated these rights during the Third Reich. This was, again, in the name of protecting German citizens from domestic terrorism. In June 1934, Hitler also eliminated about 100 rivals in a violent purge called the “Night of the Long Knives.” Then, after President Hindenburg’s death in August 1934, Hitler merged the presidency with the chancellorship. Incredibly, Hitler consolidated his power and created a violent dictatorship in less than two years.

Hitler and Hindenburg

President Paul von Hindenburg and newly appointed Chancellor Adolf Hitler at a parade in Berlin in 1933. By consolidating the roles of president and chancellor shortly after Hindenburg’s death in 1934, Hitler was able to create a secure dictatorship.

What about Trump’s adherence to normal political life? Trump’s administration has signed more executive orders in its first months than any other since Roosevelt’s, but many of these orders are politically insignificant (such as designating buildings or naming people to certain positions). However, the controversies surrounding Trump, especially with regard to Russia hacking the US elections and the dramatic resignation (dismissal?) of FBI Director James Comey, throw doubt on Trump’s ability to “adhere to normal political life.” What about Trump barring certain journalists from the White House press room? Or the appointment of his daughter Ivanka Trump and her husband, Jared Kushner, as “special advisors,” with Ivanka even representing the US government at the latest international G20 conference? Or the fact that less than a year into the job, some seven high-ranking officials have either been sacked or resigned from the Trump administration?

Sean Spicer

After his resignation, White House Press Secretary Sean Spicer said what has been called the most Sean Spicer thing ever: “You can keep taking your selfies.” Huh?!

3. Spurning international agreements.

Within a year of coming to power, Hitler withdrew Germany from the League of Nations. He argued that the Disarmament Conference did not allow Germany military parity with the Western nations (meaning that Hitler wanted to rearm and the Allies wouldn’t let him). Hitler obviously would not agree to any international policy that would limit Germany’s autonomy. He continued to flout international agreements by annexing Austria and occupying the Sudetenland without any hassle from the French, British or Soviets. Well, not until he invaded Poland.

Goebbels at Geneva

Joseph Goebbels with the League of Nations in Geneva. Germany withdrew from the League in October 1933.

Trump’s isolationist policies (if we can call them that) echo the goal of many historic politicians (Hitler included) who wished to put the needs of the nation before any obligations towards the international community. “America First” is Trump’s slogan, which seeks to replenish exhausted American coffers by tightening federal budgets and favouring domestic investment (for example, by slashing diplomacy and development funding by 32%). Recently, Trump withdrew the US from the Paris Agreement, which nearly 200 countries signed in December 2015 in an effort to combat global warming and help poorer countries adapt to an already-changed planet. Similarly, Britain has voted to leave the European Union, so perhaps isolationism is a growing economic model in an increasingly globalized world. But by spurning such international agreements, Trump appears as the antithesis of the incredible global action taken by the US in the last three years: stopping the Ebola epidemic, rallying more than 65 partners to fight ISIS, and leading those same 200 countries to forge the historic climate change agreement in the first place! Barack is probably on a beach somewhere, shaking his head…

Trump Paris Agreement

World leaders were visibly upset that Trump would not concede on an international policy for climate change.

So what?

Although the words of caution from Holocaust survivors are obviously crucial warnings during such social upheaval, I’m not convinced that Hitler’s Germany and Trump’s America are truly comparable. But perhaps I’m being cynical. Prof. Evans claimed there is one big difference between Hitler and Trump: while Hitler’s speeches and policies were exceptionally well-rehearsed, focused and deliberate, Trump’s tweets and policies are spontaneous, erratic and unguarded. This implies that Trump is not single-mindedly pursuing some ultimate vision, but leisurely and arrogantly deciding grand policies without much foresight. As comedian Frankie Boyle quips, “Trump’s nothing like Hitler. There’s no way he could write a book.”

The good news? The violence from the neo-Nazi demonstrations in Charlottesville followed by Trump’s trivializing comments has forced Americans everywhere to defend their most basic national values, whatever they perceive them to be.

Trump may not be the voice for Americans everywhere – he certainly struggles to reflect the mood of the country – but at the very least he’s forcing Americans to have a conversation, a confrontation, about what it means to be American. He’s the proverbial trickster. A rousing ringleader. The grand master of controversy. The court jester who masquerades and provokes. The words that spill out of his mouth are not intricately prearranged, but reckless, and thus easier to destroy.

Fortunately, the greatest difference between 1930s Germany and today’s America is the lack of political violence by its leader. Whereas Hitler mobilized his Sturmabteilung brown-shirt henchmen to beat up political opponents in the streets, Trump has no such organized paramilitary wing. Instead, Prof. Evans argues, political violence has taken the form of tweets and trolling, poisoning our political discourse.

But at the same time (and with the greatest respect, Prof. Evans!), Twitter, Facebook and other social media also allow everyone – Trump included – a political voice and an online presence to confront ideas, create communities and mobilize politically. Smartphones allow people to record events like Charlottesville, so that those who promote hatred and bigotry are exposed. Modern communications certainly attract trolls and hackers, but the brutal political violence that typified Nazi Germany is not what you experience when you open your laptop, check the news and write your blog. Thankfully.

Covfefe.

 

Why Save the Children’s Graphic Photos Still Work Today

There is a massive famine and cholera outbreak in Yemen right now. The United Nations recently calculated that over 20 million Yemenis are in need of immediate assistance. To put this in perspective, Yemen is a country of only 28 million people. That means over two-thirds of an entire country are suffering to such a degree that they require international assistance. Incredible.

In the background of this massive crisis is a civil war. In January 2015, decade-long tensions erupted between a rebel group named the Houthis (a Zaidi Shia Muslim minority) and the authoritarian President Hadi. After the Houthis surrounded the presidential palace and placed the government under house arrest, Saudi Arabia intervened and is now leading another eight Sunni Arab states in a bombing campaign to restore power to the Yemeni government. The civil war continues to this day.

Taiz, Yemen

The city of Taiz has been ravaged by two years of battles between forces loyal to President Hadi, Houthi rebels and al-Qaeda.

Importantly, a major Red Sea port called Hodeidah was seized by the Houthis. This port supplies Yemen with over 80% of its food imports. The Saudis won’t let relief ships dock there because the supplies would fall into the Houthis’ hands, which has delayed life-saving supplies for months. Currently, the UN Security Council is trying to intervene to have the port declared strictly neutral. Let’s hope it succeeds.

In the last two years, hospitals and clinics have been destroyed. Government health officials have not been paid in a year. The basic necessities of life, like clean water and food, are a daily struggle to obtain. Cholera, which is spread by contaminated water, can kill within hours if untreated. By August 2017, it had infected more than 425,000 Yemenis and killed 1,900. And the situation is growing so severe that Oxfam calculates the number infected with cholera could rise to more than 600,000 (which would exceed Haiti in 2011). The situation is obviously very grim.

Yemen Cholera Water

These Yemeni women queue for clean water. Rowa Mohammed Assayaghi, a medical microbiologist at Yemen’s Sana’a University, is teaching people how to wash their hands. “Focusing on health awareness is one of the most important measures to follow,” she says.

Calls for relief from various NGOs and charities are spreading throughout the West. I’ve noticed it more recently, even on my Facebook feed. But with more than one million malnourished children under the age of 5 living in areas with high levels of cholera, charities are getting desperate. Pictures of emaciated Yemeni children are now popping up repeatedly on news websites and social media everywhere. It’s heart-breaking and uncomfortable to see (especially after I Instagram my latest foodie pic).

Hodeidah, Yemen

A mother carries her eight-year-old son, Imran Faraj, who is suffering from malnutrition, at a hospital in the port city of Hodeidah. This photo is from an Independent article in June.

These grim photos sadly echo so many previous campaigns we may remember from the past: AIDS orphans, Rwandan genocide victims, displaced children in the Sudan, starving children in Somalia, and so many others. But the approach is effective. By pushing the suffering and starvation of the world’s absolute poorest children in front of the western world, charities are using a remarkable, game-changing strategy first used by Save the Children in 1919. It changed both how we perceive children and how we perceive ourselves. But first, the history…

Immediately after the First World War began, the Allies/Entente Powers blockaded Germany and Austria, cutting off supplies, exports and all traded goods from their enemy. Much as Saudi Arabia is doing to Yemen today, blockading supplies was an effective economic weapon, especially against countries (like Germany) that depended heavily upon imports to feed their citizens.

Blockade against Germany

A Berlin butcher’s shop is looted in 1919. A combination of bad harvests and incompetent regulation of food distribution, in addition to the British blockade, made the situation far worse.

The First World War was slow-moving, hard-fought and resulted in massive casualties. An estimated 10 million people were displaced during the war. And despite the Armistice in November 1918, the food blockade against Germany and Austria continued until Germany signed the Treaty of Versailles in June 1919. That eight-month period between the “end of war” and the “start of peace” resulted in mass starvation among the children of Germany and Austria.

For example, a Swiss doctor of the International Committee of the Red Cross, Dr. Frédéric Ferrière, reported that of nearly 60,000 children examined in Vienna in 1918, only 4,637 were in good health. In other words, over 92% of children were in bad health. (For more, see André Durand’s History of the International Committee of the Red Cross: From Sarajevo to Hiroshima.)

Jebb 2

Eglantyne Jebb (1876-1928) spent many years working for charities before founding Save the Children. Despite her good education and well-to-do British background, Jebb found that she was a poor teacher and not fond of children. Ironically, she became one of their chief champions in modern history.

Meanwhile, one of the first women educated at Oxford, Eglantyne Jebb, had worked for charities for years and was growing concerned about the fate of German and Austrian children under the blockade. We must remember that Germans (“the Huns”) had been Britain’s national enemy for four long years. To overlook this and consider the suffering of German and Austrian children was thus quite remarkable. Jebb helped form the Fight the Famine Council on 1 January 1919 with the express aim of ending the British blockade.

Newspaper, Blockade 1919

The front page of the Detroit Sunday News on 29 June 1919

But Jebb soon discovered that her new council was not very effective. Numerous British charities were pleading for donations for various causes in 1919, such as veterans returning home disabled and jobless, or the countless families that had fallen into poverty after the war. By distributing leaflets dense with information and collaborating with churches and clubs to get members to donate, these charities relentlessly campaigned for vulnerable groups. Not only was Jebb’s message drowned out by the other charities, but people were not rising above their national interests, their national prejudices, their national perspectives, to care for foreign children. Children, especially foreign ones, were often the last priority.

But Jebb and her sister, Dorothy Buxton, found a remarkable solution. They took to the streets of London and circulated a graphic “Starving Baby” leaflet. Instead of using dense text to explain her campaign to readers, Jebb plastered a large photo of a starving, desperate and pitiful two-and-a-half-year-old Austrian baby on her leaflet. The image was haunting and even caught the attention of the local police. Although they were both arrested for spreading “unpatriotic propaganda,” Jebb (acting as her own attorney) argued the leaflets were not political, but humanitarian. The judge gave her a light fine of £5 and she reportedly felt victorious.

This was the beginning of a new type of campaigning. This was a new type of humanitarianism.

Starving Baby Leaflet

This leaflet was an unconventional way to provoke attention and revolutionised how charities campaigned for children. You may notice that Jebb does not identify the child as Austrian.

On 15 April 1919, Jebb founded the Save the Children Fund. This charity was the first to promote an abstract image of a “child.” It was the first charity to present children as a symbol, a universal archetype, worthy of humanitarian relief irrespective of race, nationality or creed.

Meanwhile, various noteworthy international organisations gathered in Switzerland. They adopted neutrality and impartiality as a key strategy to facilitate relief and prevent further war. Even Save the Children moved its headquarters from London to Geneva to symbolise its separation from political powers. Humanitarian historians Emily Baughan and Juliano Fiori claim that Save the Children’s apolitical approach meant that the “innate innocence and value of children [prevented] popular opposition to its humanitarian activities” (“Save the Children, the humanitarian project, and the politics of solidarity,” Disasters, 39 (S2): 132). For who, indeed, would oppose such humanitarian action for children?

Herbert Hoover’s relief programs, which had been incredibly successful in Belgium, also provided American food aid to Austrian children. However, relief was given in exchange for gold in 1919, which drained what little remained in Austria’s coffers in the aftermath of the war (see William E. Leuchtenburg’s Herbert Hoover). But Save the Children channelled its relief towards those same children without compensation or political gain.

Save the Children Russia

By 1921, when the Russian Civil War had produced countless refugees and starving children, the Save the Children Fund had found its stride. It campaigned on the big screen by showing British audiences films of the conditions children faced. It was unlike anything else seen at the time.

By depoliticizing the Save the Children charity and the concept of suffering children, the appeal for famine relief for children was considerably successful, especially in Russia. Although no humanitarian organisation can ever be entirely apolitical (!!!), Jebb and Save the Children had found a way to overcome the nationalist and prejudiced perceptions of its donors. The archetypal child had been born.

The idea of the “universal child” was also strongly defined by the Declaration of the Rights of the Child in 1924. Much like Moses descending from the mountain, the story goes that Eglantyne Jebb returned from a walk in the hills around Geneva and wrote five famous articles:

  1. The child must be given the means requisite for its normal development, both materially and spiritually.
  2. The child that is hungry must be fed; the child that is sick must be nursed; the child that is backward must be helped; the delinquent child must be reclaimed; and the orphan and the waif must be sheltered and succoured.
  3. The child must be first to receive relief in times of distress.
  4. The child must be put in a position to earn a livelihood and must be protected against every form of exploitation.
  5. The child must be brought up in the consciousness that its talents must be devoted to the service of its fellow men.

Declaration of Rights of the Child

Jebb’s Declaration (1924), pictured here, also formed the basis of the ten-article Declaration of the Rights of the Child adopted by the United Nations on 20 November 1959, some 40 years after the foundation of the Save the Children Fund.

In 1924, this Declaration was approved by the League of Nations. The members of the League were not obligated to integrate the Declaration into their own national legislation, so it did not guarantee any changes to national laws. But historian Bruno Cabanes (The Great War and the Origins of Humanitarianism, 1918-1924) argues that the 1924 Declaration singled out the protection and welfare of children as priorities for the international community and, ultimately, was more significant for its moral import than for its legal weight.

So what?

The methods of Save the Children really have saved the children. Due to Jebb’s honest, graphic but highly impartial approach, children from all over the world are valued, regardless of their race, class or religion. Although this may not guarantee that everyone generously donates to children’s charities, it does, at the very least, overcome many nationalist and racial prejudices. And, what’s incredible is that it’s still effective today! Whether it’s a starving Austrian child under a blockade, an African orphan of AIDS, a drowned Syrian child on a beach, a bombed-out boy in an ambulance in Aleppo, or now Yemeni children with cholera in the midst of civil war, we can look past the labels and prejudices to see them for what they are – children.

To a certain extent, this also changes how we perceive ourselves. Promoting the concept of the “universal child” simultaneously reinforces the concept of a “universal guardian.” Human cultures fundamentally protect and provide for society’s most vulnerable members. By reacting to these images of starving children with dismay and shock, and by feeling a sense of injustice, viewers are also imparted with a sense of responsibility. Children cannot protect or provide for themselves, so we – the guardians – must intervene.

Children’s rights today are still evolving world-wide. Over 100 million children work in hazardous conditions and have no access to education. Thousands are child soldiers. Some states imprison children as young as 12 years old. Over half of today’s 65 million refugees are children.

Although Eglantyne Jebb may have been speaking of starving German and Austrian children, her words still resonate in today’s campaigns for Yemeni children: “The only international language in the world is a child’s cry.”

“Wars Are Not Won by Evacuation”: Untangling the Truth from the Evolving Dunkirk Myth

This month, Christopher Nolan’s long-awaited war epic “Dunkirk” hit screens worldwide. Critics have praised it as Nolan’s best film yet: a “powerful, superbly crafted film” and “a visceral, suspenseful, at times jaw-dropping historical war movie.” With a formidable British cast, a massive budget, the largest marine unit in movie history (60+ boats), and authentic filming in the English Channel itself, “Dunkirk” will invariably be added to the list of war epics that includes Saving Private Ryan, Schindler’s List, The Great Escape and Das Boot.

Dunkirk movie scene

Dunkirk (2017) already has a spot in the top 30 war movies ever made

My thoughts just moments after watching the film? You get a real sense of urgency. An unwavering, intense anticipation is steadily heightened throughout every scene by a soft but perpetual tick-tock in the background. Every action sequence becomes a catharsis from the tick-tock, only for it to return again, bringing with it the heavy feeling that Britain’s brief, hopeful window to escape from Dunkirk is coming to an end. Time is truly ‘of the essence’ in this film.

Nolan’s Dunkirk is perhaps better appreciated by clarifying what it is not. This is not a comedy (there’s not a single joke made to lighten the mood, even briefly). This is not a commentary on the highest-level political decisions of the period (no scene shows Churchill furrowing his brow, or naval, army and air force commanders bickering in Westminster). This is not a romance film (in fact, other than a few nurses, there is no female cast, nor any insinuation of homosexual love). This is not a transnational film that attempts to bond enemies (no scene shows German soldiers, except rare glimpses of a Messerschmitt 109E and a few bombers, and even those are from a distance). This is not a documentary (despite a small amount of text after the opening credits, the film provides no historical facts, nor interviews with survivors).

So what is it?

Perhaps this is best answered by you, the audience. For me, it was a story of survival. Well, a story of British survival. I really enjoyed it. I cringed, I cried, I squirmed, I begged, and I felt the greatest sense of hope when I saw the RAF Spitfires doing their intricate dances in the sky. (Which, coincidentally, is an excellent foreshadowing of what would follow the Dunkirk evacuations – the Battle of Britain.)

Walking out of the theatre on a Tuesday afternoon in July in Scotland, I followed a mother with her teenage sons. They were enthralled by the movie, but bursting with questions: “Did Granddad fight in that? How come there weren’t more fighter planes to help the lads on the beaches? I’d shoot every German plane. The RAF were pretty incredible, can you imagine landing a plane like that on the water? Too bad they ran out of fuel. God, I’d be proper scared landing like that.”

Nolan’s film provides a fresh starting point for discussing the war, and Dunkirk in particular. Films are some of our greatest resources for accessing history. Of course, they must be taken with a grain of salt. According to a study by Dr. Peter Seixas, Professor of Education at the University of British Columbia, the more engaging the film, the less likely audiences were to criticize its historical merit.

Dunkirk movie posters: 2017 and 1958

Instead, filmic devices, such as realistic violence and the use of blood, boosted the perceived authenticity of the historical event. Older films depicting the same event, despite being limited by 1950s or 1960s censorship, were seen as less historically genuine. Interesting, no?

But if Dr. Seixas’ observation is true – that the more engaging the film, the less likely audiences are to question its historical accuracy – then Nolan is stuck between a rock and a hard place. Is it possible for Nolan (or any director) to create a film that is both highly entertaining and historically accurate?

No. Let’s get real. It’s impossible to mirror history exactly in any medium, film included. But we can give Nolan credit for pursuing authenticity in other ways. Nolan wanted to make his Dunkirk epic as British as he could, despite needing an American-sized film budget to achieve his vision. After all, Dunkirk was a British failure. And, depending on your perspective, a British success. Nolan chose only British actors and emphasized the Britishness of this endeavor. Ironically, the film is expected to be more lucrative with American audiences than British ones. But whereas the British are educated about the failure of Dunkirk from a young age, many Americans will be introduced to Dunkirk for the very first time through this blockbuster film.

But, importantly, Nolan’s Dunkirk is also contributing to Dunkirk’s ongoing cultural legacy.

The “Dunkirk Myth” might be defined as Britain’s ability to embrace defeat as a platform for eventual victory; the humanity and compassion of the British people helping one another created the perception of a strong community and an enduring nation. It was the marriage of the home front with the battle front, of defeat with victory. Since 1940, the Dunkirk Myth has been shaped by various novels, speeches, poems and, importantly, films.

This is why it is so very important not to lose sight of the historical facts within this national myth – now reintroduced to new generations through a super visceral, action-packed, CGI-enhanced, American-budget British war movie, right?

So what was Dunkirk?

Simply put, it was the evacuation of 338,000 Allied soldiers (chiefly from the British Expeditionary Force and the French Army) from the beaches of Dunkirk, France, from 26 May to 4 June 1940.

A few weeks earlier, Germany had launched a surprise Blitzkrieg (lightning war) on the Allied forces in western Europe. A similar German maneuver had failed epically in the First World War (resulting in stagnant trench warfare). But in May 1940, Germany was incredibly successful due to the element of surprise, wireless communications, anti-aircraft guns (called flak), the tight coordination of land and air forces, and stronger tanks.

Second World War, campaign in France 1940: German military columns

This German motorised column secretly advanced through the Ardennes in May 1940. This was no easy feat, with 134,000 soldiers, 1,222 tanks, and nearly 40,000 lorries and cars having to navigate narrow routes through heavily wooded terrain. “Traffic managers” even flew up and down the columns to alleviate any deadlock. But it was a stunning success. Historian Richard J. Evans claims that Germany achieved the greatest encirclement in history, with 1.5 million prisoners taken for fewer than 50,000 German casualties.

The British suffered over 66,000 casualties (killed, wounded, missing or captured) from mid-May until the end of the evacuations on 4 June. A combined total of 360,000 British, French, Belgian and other Allied troops were killed or wounded during the Battle of France, which ended with France’s surrender on 22 June 1940.

(For further reading, see Richard J. Evans’s (2009) The Third Reich at War, Julian Jackson’s (2003) The Fall of France: The Nazi Invasion of 1940, Ian Kershaw’s (2008) Fateful Choices: Ten Decisions That Changed the World 1940–1941 or Ronald Atkin’s (1990) Pillar of Fire: Dunkirk 1940.)

Time Map 1940

This map from Time Magazine of 10 June 1940 shows the “Nazi Trap” closing in on the British and French forces.

Fleeing the German advance, nearly 400,000 Allied soldiers were pushed back until they reached the English Channel at the beaches of Dunkirk. Churchill called it the greatest military disaster in British history. This was also the last time any Allied forces would stand in France, Belgium, the Netherlands or Luxembourg until the famous D-Day landings nearly four years later. The evacuation meant that western Europe was left to suffer German occupation for four long years.

Why is this disaster considered a success?

Due to the mobilization of over 800 boats, ships, yachts and other private holiday vessels, the 338,000 men standing helplessly on the beaches of Dunkirk (many naval ships could not dock to collect them) were successfully evacuated within just ten days. From a humanitarian perspective, this is obviously impressive.

But it also meant that commanders made impossible choices, such as leaving behind the sick and wounded, and destroying Allied vehicles, equipment and resources, lest they fall into enemy hands. It was truly a fight for survival against overwhelming enemy forces, low morale, and very few resources. It was also a fight against time. Tick-tock, indeed.

Kenneth Branagh.jpg

Kenneth Branagh’s role as a naval commander (renamed from his historical counterpart) reflects the impossible choices that British commanders faced. All army materials and vehicles were destroyed, and the BEF was the top priority for evacuation. Although some 140,000 French soldiers were evacuated, nearly 40,000 were left behind.

What happened after Dunkirk? (And yes, there is a point to asking this)

The offensive that led to Dunkirk ended the “Phoney War,” the eight-month lull on the Western Front that followed Hitler’s invasion of Poland in September 1939. It shocked the world and brought international attention to the fact that Germany was a formidable force. Hitler’s fiery promises to conquer Europe were not just hot air.

Hitler, 19 July 1940

The conquest of France marked the highest point in Hitler’s popularity for the entire war. As the Battle of Britain began raging overhead, Hitler called for peace on 19 July 1940: “A great world empire will be destroyed […] In this hour I feel compelled, standing before my conscience, to direct yet another appeal to reason in England. I believe I can do this as I am not asking for something as the vanquished, but rather, as the victor, I am speaking in the name of reason. I see no compelling reason which could force the continuation of this war.”

Immediately after Dunkirk, the war took to the skies in fierce combat for air superiority: the “Battle of Britain.” Why? So that Hitler’s forces could invade Britain in the autumn of 1940 without constant aerial bombardment – before winter made invasion impossible. German Luftwaffe planes initially attacked British air bases in southern England. Royal Air Force pilots (including Commonwealth and Polish pilots) put up vicious competition against the numerically superior and better-equipped Luftwaffe. Daily “dogfights” were witnessed by civilians. RAF planes and pilots dropped like flies. Churchill’s famous observation that “Never in the field of human conflict was so much owed by so many to so few” reflected the fact that these tireless pilots had become the last line of defense.

RAF Pilots

The average age of an RAF pilot in the Battle of Britain, such as these handsome men above, was just 20. The average life expectancy of a Spitfire pilot was just four weeks. Over 20% of the pilots were from Commonwealth nations, or were Polish or Czech. Despite having a much better equipped air force, Germany suffered 2,600 pilot casualties; Britain lost just over 500 RAF pilots.

In late August 1940, a small bomb was dropped on London (German command alleged it was an error). Error or not, this expanded the range of targets to include civilian centers. The RAF then bombed Berlin. The Luftwaffe again bombed London. The “Blitz” of British cities echoed the same quick, surprise tactics that the Luftwaffe had recently used so successfully against ground forces in western Europe. Night bombings and devastating daily air raids on homes, factories and ports lasted until May 1941, killing an estimated 40,000 Britons and making hundreds of thousands homeless.

WAR & CONFLICT BOOKERA:  WORLD WAR II/WAR IN THE WEST/BATTLE OF BRITAIN

One of the most iconic photos from the Blitz shows St Paul’s Cathedral standing intact after a raid on 29/30 December 1940.

The Blitz, as it would be called, meant that British urbanites had to persist through the most difficult circumstances. Londoners especially “kept going” with daily tasks and came to epitomize the archetype of endurance. If ever there was a time in British history when the national character was so well tested, and so well defined, this was it. (For more reading on this very interesting topic, check out Angus Calder’s (1969) The People’s War: Britain 1939-1945 and (1991) The Myth of the Blitz, and Jeremy Crang and Paul Addison’s (2011) Listening to Britain: Home Intelligence Reports on Britain’s Finest Hour, May-September 1940.)

Blitz Milkman

Photos, such as this London milkman continuing his deliveries (while firefighters douse a fire in the background), came to typify the resilience and endurance of Londoners to “keep on” despite the war unfolding around them.

What about the Dunkirk Myth?

Although “The Dunkirk Myth” preceded the Blitz, it developed alongside the Blitz spirit through various culturally important products (for those who want the pure academic stuff, see Nicholas Harman’s 1980 Dunkirk: The Necessary Myth or an excellent review by my old supervisor, Prof. Paul Addison):

Broadcasts from J.B. Priestley in June 1940 reporting on the flotilla of “little ships” in the English Channel. Priestley’s depictions of ordinary Englishmen coming to the rescue of the helpless troops transformed the war from a military affair into one which required the mobilization of the entire home front. (To be historically accurate on this point, though, Englishmen weren’t voluntarily throwing themselves into the fray; the Royal Navy normally took charge of their vessels and used them as required to save the troops.)

J.B. Priestley

Priestley became a formidable voice of calm reporting (and propaganda) for Britain, though he faced criticism in later life.

Churchill’s famous “We Shall Fight on the Beaches” speech to the House of Commons. Everyone has heard this speech. It’s epic. But what most people don’t know is that the speech was not broadcast at the time. British newspapers printed excerpts of it, but it was not recorded until 1949.

Churchill “We Shall Fight on the Beaches” comic

This comic from Reddit uses Churchill’s historic rhetoric to satirise reactions by today’s British authorities to threats against modern Britain.

Paul Gallico’s The Snow Goose, a short story (and eventually a novella) first published in the Saturday Evening Post in 1940. This tearjerker traces the growing friendship between a disabled artist living in a lighthouse and a young woman who brings him a wounded snow goose. Loads of symbolism paints a picture of innocence and loyal love dismantled by the tragedies of war, and the evacuation of Dunkirk becomes a kind of self-sacrifice for humanity, art and first loves. The Snow Goose had a strong impact on British society. It was a favourite with young readers due to its short but eloquent length, and even Michael Morpurgo cites it as an influence on his much-loved War Horse. People saw Dunkirk not for what it was in strict military terms – a colossal disaster – but as a sort of coming-of-age story about the enduring spirit of British compassion and humanity.

The Snow Goose cover

Fantasy Book Review calls The Snow Goose “a tribute to the indomitable human spirit”

Dunkirk (1958). Starring Richard Attenborough, John Mills and Bernard Lee, it became the second-highest-grossing film in Britain that year. By following an English civilian and a British soldier, the film unfolds from two key perspectives, again cementing the myth that Dunkirk united the home and battle fronts in one great national rescue mission.

Richard Attenborough Dunkirk

Richard Attenborough starred in Dunkirk (1958) but did not receive an Oscar nod

Ian McEwan’s Atonement (novel) and the Atonement (2007) film. Atonement follows the blossoming love of a young couple interrupted by a younger sister’s shocking criminal accusation. Soon enough, war unfolds; both sisters become nurses in London while the protagonist is sent to France to fight. Dunkirk (again) is used as a historical event that binds together the home front and battle front, becoming both a barrier to and a vehicle for unity. Director Joe Wright’s incredible scene on the Dunkirk beaches is praised as “one of the most extraordinary shots in the history of British film – a merciless ten minutes, panning across an army of bedraggled and bleeding British troops huddled on the beach at Dunkirk, with ruined ships smouldering in the shallows beyond.”

Atonement (2007)

This five-and-a-half-minute unbroken sequence in Atonement (2007) was filmed by director Joe Wright with 1,000 extras to emphasise the chaos and disaster of the Dunkirk beaches. See it here.

Finally, Nolan’s Dunkirk (2017). An epic war film that refuses to fit any of the genres we normally assign. I suspect it will haunt and challenge both critics and audiences for many years to come. But perhaps we should also be wary of a film that is so very one-sided? So singular in its storytelling?

Tom Hardy in Dunkirk

One Direction’s Harry Styles may have taken all the limelight, but Tom Hardy’s performance as a sharpshooting RAF pilot definitely won my vote. Swoon.

So what?

Historically, Dunkirk was the rude awakening that shocked not only the British Expeditionary Force, but also the British home front. As Churchill said solemnly, “Wars are not won by evacuation.” People began to fear for their sovereignty, their homes and their country as never before. How they reacted was a real testament to their national character, and how they survived was a real testament to their national legacy.

Culturally, Dunkirk planted the seeds of a national myth that developed, grew and transformed as the war unfolded. Initially it was highly propagandistic, with Priestley’s broadcasts or Snow Goose love stories, but as time has passed, Dunkirk’s legacy still enthrals the imaginations of each new generation. It was a paradox that such a defeat could be transformed into a stunning success. Now, 77 years later, we are still discussing Dunkirk’s historical relevance and its cultural impact on British national identity in the face of overwhelming odds and great uncertainty.

Which begs the question – the same question others have already asked: What about Brexit?

 

 

Should the Youth be given the Vote? Historical Reasons Why Age is Arbitrary

I made a rather startling discovery. Those who suffer from dementia can still vote in the UK and Canada. “Really?” you may ask. “Really,” I reply.

Man yells at cloud

Voting is an inalienable right in democratic nations. Once you gain the right to vote, it is extremely difficult to lose.

Criminals are among the only disenfranchised groups. In Britain, a criminal’s right to vote is suspended while they serve their sentence. The same applies in Australia, except that prisoners serving less than three years can still vote. In Canada, criminals retain the right to vote regardless of the prison sentence. The United States has some of the most punitive measures against voting by criminals, and because these vary drastically between states, I have excluded the USA from this article. (Apologies to my American friends, but you can read more here about the almost 6 million felons – nearly 2.5% of voting-age Americans – who could not vote in the 2012 federal election.)

Voting is a pillar of equality among citizens and the act of voting is a benchmark in a person’s life.

What about the Youth Vote?

Historically speaking, the argument that youths aged 16 and above should get the right to vote is a very recent phenomenon. Before the Second World War, only people aged 21 and older were given the right to vote in most major western democracies. In the 1970s, this was lowered to 18 in the UK, Canada, Germany and France, largely because 18 was the age of military conscription. However, some nations retain 20 or 21 as the age of suffrage. Only since the 1990s have some nations successfully lowered the voting age to include 16-year-olds. Scotland is one of them.

Youth Polling Place Scotland

Over 75% of Scottish youths voted in the 2014 Scottish Independence Referendum

After years of campaigning, the Scottish National Party was able to give youths the right to vote in the September 2014 Scottish Independence Referendum. Impressively, over 75% of those aged 16 and over (who registered) turned out, compared with 54% of 18-to-24-year-olds. This turnout was considered hugely successful and resulted in Westminster granting new electoral powers to the Scottish Parliament in December 2014. Now, all youths aged 16 and over can vote in both parliamentary and municipal elections in Scotland.

Nicola with Babies

Nicola Sturgeon and the SNP campaigned successfully for years to secure the youth vote (Photo from BBC Article)

For the rest of Britain, youths cannot vote in UK general elections until age 18. Although youth turnout figures must not be accepted entirely at face value, one statistic from the recent general election claimed that 72% of all 18-to-24-year-olds turned out to vote. This means that turnout among young British voters was remarkably high.

Molly Scott Cato MEP

Molly Scott Cato said that denying the youth the right to vote because they aren’t responsible enough was “elitist rubbish” (Photo from BBC Article)

British politicians hotly debate the voting age. The Tories believe it should remain at 18, while Labour proposes lowering it to 16. The Liberal Democrats are somewhere in the middle, suggesting it should be lowered only for local elections. The Scottish National Party, who are very popular with Scottish youth, believe it should be lowered to 16 for general elections. My favourite, perhaps, was when the Green Party’s Molly Scott Cato said that arguments claiming 16-year-olds aren’t responsible enough to vote are “elitist rubbish.”

Age is Arbitrary: “Childhood” is a Young Concept

Age as a marker is quite arbitrary, especially when you look at it historically. In the wake of the Second World War, when over 15 million children were left homeless and resettlement was a huge crisis, the United Nations defined anyone under the age of 17 as a child. Today, childhood ends at age 18 in the majority of sovereign states.

War orphans, Poland

These Polish war orphans at a Catholic orphanage in Lublin, photographed on September 11, 1946, are among the 15 million children displaced by the war. To expedite the resettlement process, the UN defined anyone under age 17 as a “child”.

But childhood as a historical concept has only been closely examined in the last few decades. That is not to say that children or childhood were never discussed in historical sources. But, similar to race and gender, age was often overlooked, understudied or poorly represented within historical accounts.

In the 1970s, a revival of the historiography of childhood occurred as the result of the book “L’Enfant et la vie familiale sous l’Ancien Régime” (published in English as “Centuries of Childhood,” 1962) by a French medievalist named Philippe Ariès. He argued that childhood was actually a recent, modern invention which had evolved since the medieval period. Importantly, the concept of childhood was not static but heavily underpinned by the culture of the time. This revolutionized social history and led many scholars to investigate how Europeans transitioned from a pre-childhood-conscious world to one which had ‘invented’ childhood. (For an excellent overview, see Nicholas Stargardt, “German Childhoods: The Making of a Historiography,” German History, 16 (1998): 1-15.)

With state intervention in education in the 19th century, and the child labour laws that followed the Industrial Revolution, children’s ages became both legally and economically relevant. How old must a child be to work? Can a child be charged with a crime? Records of child delinquency are often the first historical evidence of children’s existence in certain cultural contexts. For example, historians are aware of the punishments of child delinquents in 19th-century Irish workhouses, but we know little else about the experiences of the non-delinquent children in those workhouses.

Irish Workhouse

Illustration of children in a 19th-century workhouse, courtesy of workhouses.org.uk

Even biological markers of age are debatable. In the USA, lawyers have used science to argue that grey matter is still developing well into our 20s in the area of the brain that regulates self-control; this has led to numerous cases where juveniles charged with murder have had their prison sentences reduced. The use of puberty as a reproductive “line in the sand” has also shifted over the last few hundred years: the age of puberty today (10-12 years for girls, 12 for boys) is lower than it was centuries ago (15-16 for girls). And unlike a few centuries ago, “Puberty today marks the transition from childhood to adolescence more than it does childhood to adulthood.” Meanwhile, in the animal kingdom, biologists define animals as adults upon sexual maturity. It seems that neither the historian nor the biologist can agree about childhood.

And, to make it even more complicated, children as individuals vary greatly, as do their experiences and what they’re subjected to. When in doubt, think of Joan of Arc, Malala Yousafzai, or Anne Frank.

Anne Frank

Anne Frank was just 15 years old when she died in Bergen-Belsen Concentration Camp

So what does this have to do with voting?  

If our definitions and beliefs about childhood are culturally dependent, then the ages we assign it, or the assumptions we have about it, are a product of our culture, and not necessarily an authentic reflection of “childhood.” (If such a thing actually exists).

During the medieval era, children were considered “little adults” who needed to be civilized. This presumes that children are born with innate rationality and intelligence, but lacking social graces. A medieval parent therefore viewed childhood as a rigorous lesson in civility.

Medieval Children

During the Medieval era, children were viewed as “little adults” and, as Bucks-Retinue points out, even their clothing was just “smaller versions of adult clothes.”

But today’s parents do not view it quite like that. Due to the legality of certain social freedoms – driving a car or drinking alcohol – the state has defined a child in contradictory ways. You can join the military at age 16 in the UK, but you’re not legally entitled to toast the Queen’s health until age 18. The predictable argument is that if you can join the military, drive a car, leave school for full-time work, pay taxes and marry (and thus have the state’s endorsement to be a parent), then you should have the right to vote. I see no fault in this argument.

So why did I start this conversation by talking about people with dementia?

Dementia is an umbrella term for various progressive neurological disorders whose symptoms include memory loss, anxiety and depression, personality and mood changes, and problems communicating. We most often associate dementia with Alzheimer’s disease, which has no cure. Some 46 million people suffer from dementia worldwide, a number expected to double every 20 years.

In Britain, 850,000 people have dementia, including 1 in 6 people over the age of 80. But having dementia, like having learning difficulties or other mental health problems, does not preclude you from voting. According to the Electoral Commission’s guidance:

“A lack of mental capacity is not a legal incapacity to vote: persons who meet the other registration qualifications are eligible for registration regardless of their mental capacity.”

If someone suffers from mental incapacity or physical disability, they may assign a close relative as a proxy to vote for them. (These arrangements are generally meant to help those serving overseas, or temporarily inaccessible, exercise their democratic rights, and sometimes must be approved by a health practitioner or social worker.) If a proxy is authorised, the Electoral Commission makes it absolutely clear that no one – whether relative, doctor, nurse or lawyer – can decide how to cast that ballot. The choice lies with the voter alone. Period.

In Britain, you cannot vote if you reside in a mental institution due to criminal activity, or if you are so severely mentally incapacitated that you cannot understand the voting procedure. Those with dementia are still legally entitled to vote because the condition (especially in its early stages) is not considered legally incapacitating and thus worthy of disenfranchisement. Usually it is only when a doctor is asked to authorise a proxy vote that someone may become disenfranchised, depending on the doctor’s judgement.

In Canada, 1.5% of the population (around 480,000 people) have dementia, most of whom develop it after the age of 75. The Canadian Human Rights Act makes it illegal to discriminate against persons based on age or (dis)ability.

Dementia

Age is the number one risk factor for dementia.

Canada was one of only four countries (along with Italy, Ireland and Sweden) that did not impose any mental capacity requirement (dementia included) upon the right to vote. After a legal challenge in 1992, the call for a minimum mental capacity requirement was repealed, and as of 2008 only Nunavut disqualifies someone from voting based upon mental incapacity. Thus, as in Britain, Canadians with dementia retain the right to vote.

What does this tell us about our society?

It is impressive that people suffering from dementia (often elderly) retain this right. It demonstrates that nations like Britain and Canada strongly respect equality among citizens, irrespective of (dis)ability, mental (in)capacity, or age. And, importantly, it shows that these nations honour the incontrovertible democratic rights of their ageing and sick citizens. Discrimination is fundamentally not tolerated.

BUT to deny the vote to youths while granting it to someone with a progressive neurological condition seems inconsistent. Should a 16-year-old “child” be considered less politically capable than someone with dementia? Is that fair?

Youth Vote vs. “Elderly” Vote

In my frustration at this quandary, I read a provocative and humorous commentary in Time magazine calling for the disenfranchisement of the elderly. Joel Stein put it simply: “Old people aren’t good at voting.” Although Stein avoided getting his hands dirty with dementia, he highlighted the “out of touch” policies endorsed by “old people”: they’re twice as likely to be against gay marriage, twice as likely to be pro-Brexit, and nearly 50% more likely to say immigrants have a negative impact on society. Funny as the piece is, I am a staunch supporter of democracy and believe we should enfranchise people even if we disagree with them. That’s the point of democracy: to find consensus among disparate voices. Young, old, sick, healthy, rich, poor – all should be allowed to vote.

Trudeau Obama

In June 2017, Justin Trudeau and Barack Obama had an intimate dinner

Justin Trudeau and Barack Obama recently enjoyed their enviable bromance over a candlelit dinner in a little Montreal seafood restaurant. They spoke of a great many things, but one was “How do we get young leaders to take action in their communities?”

Such conversations among politicians reflect a growing interest in including young people’s voices and agency within our political process and communities. If what medievalist Philippe Ariès said is true – that our concept of childhood is culturally dependent – then how our culture interprets its youth needs to change. Historically speaking, it appears that change is already beginning. And although Scotland has taken remarkable strides towards giving political agency to Scottish youths, this can be taken even further.

If we engage youths in the political process, support their agency and action in national bodies and networks, and listen to their needs and incorporate their voices into politics, then our cultural assumptions will shift. In the same way that we honour our elders and our sick, let us honour our youths.

From a Land of Immigrants to a Land of Colonisers: A Lesson in Canadian Diversity for British Policymakers

This is a big year for Canada. After 150 years of explosively entertaining hockey, igloo-icy winters, and deliciously decadent Timbits, people around the world will celebrate Canada’s sesquicentennial. Happy birthday, Canada.

Canada Day.png

If you have ever travelled, studied or lived abroad, you will have begun to appreciate your homeland in an entirely new way. As American philanthropist Cliff Borgen said, “When overseas you learn more about your own country than you do the place you’re visiting.” The novelty of other cultures is endearing, and even a helpful distraction from the monotony of normal life. But it’s when we are forced into new cultures that we are confronted with the reality that our own customs, traditions and protocols are sometimes arbitrary, bizarre and inefficient.

In this sense, travelling is not just gazing through the porthole at a new place; it is a much more inverted and introspective experience. You begin to realise the ways you are fortunate, and the ways you are deprived. It even makes you think differently, apparently. According to one study, “People who have international experience or identify with more than one nationality are better problem solvers and display more creativity.” But, crucially, this depends on openness – an ability to embrace other people, cultures and ideas – which also means being happy to accept ambiguity and a lack of closure.

But what happens if you already hail from a country that values inclusivity, openness and diversity? How does that change your experience abroad?

As a Canadian, I think I am already “open” to others. It’s part of my “culture,” eh? Just under 40% of Canadians are immigrants or second-generation immigrants, and that is expected to rise to half the population by 2036. Canada is about as diverse a country as you can experience – a true land of immigrants. And Canada is not a melting pot. Unlike in the USA, newcomers to Canada are not expected to shed their cultural cloaks, assimilate, and promptly adopt the “Canadian Dream.” Instead, Canada’s strength is its diversity. We embrace others.

best-diversity-employers

Happy Photo of Canadian Diversity from candiversity.com

It wasn’t always like that. In 1970, Prime Minister Pierre Trudeau faced a major domestic crisis due to rising French nationalism in Quebec. Separatists wanted faster political change, and to expedite their demands members of the FLQ kidnapped a Quebec cabinet minister and a British diplomat, in what became known as the October Crisis (or FLQ Crisis). Trudeau invoked the War Measures Act and tanks rolled into Montreal. Such de facto martial law was controversial, and when asked by a reporter how far he would take such policing, Trudeau famously replied: “Just watch me.”

PETrudeau Just Watch Me

Watch Pierre Trudeau’s steely reaction to reporters here

In the background of this domestic upheaval, the late 1960s saw the introduction of a new points-based system for immigration. Applicants were awarded points for age, education, ability to speak English or French, and demand for their particular job skills. If an applicant scored enough points, they were granted admission together with their spouse and dependent children.
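For the technically curious, the mechanics of such a system are simple enough to sketch in a few lines of code. The following Python snippet is purely illustrative – the four factors are the ones named above, but the weights and the pass mark are my own assumptions, not the historical Canadian values.

```python
# Hypothetical sketch of a points-based immigration test, loosely modelled
# on the system described above. The weights and pass mark are illustrative
# assumptions, not the actual values used in 1960s Canada.

PASS_MARK = 50  # assumed admission threshold


def score_applicant(age, education_years, speaks_english_or_french, job_demand):
    """Total the points across the four factors mentioned in the text."""
    points = 0
    points += 10 if 18 <= age <= 35 else 0           # age (assumed prime band)
    points += min(education_years, 20)               # years of education, capped
    points += 10 if speaks_english_or_french else 0  # official-language ability
    points += max(0, min(job_demand, 15))            # demand for job skills, 0-15
    return points


def is_admitted(points):
    # Admission (together with spouse and dependent children) once the
    # threshold is met.
    return points >= PASS_MARK


if __name__ == "__main__":
    total = score_applicant(age=29, education_years=16,
                            speaks_english_or_french=True, job_demand=15)
    print(total, is_admitted(total))  # prints: 51 True
```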

These “landed immigrants” were given the same rights as Canadian-born citizens. A new sponsorship system also meant that immigrants could sponsor relatives abroad for settlement, allowing naturalized Canadians to take part in the immigration process themselves. And, importantly for Trudeau, immigrants were given the right to vote.

By opening the doors and flooding the country with immigrants, while espousing a strong multiculturalist ideology, Trudeau and his Liberals diluted the tensions between Anglophones and Francophones. The Liberals, predictably, courted the newly arrived voters and pursued policies that would appeal to them. Politically speaking, it was a superior “checkmate” move against the radical separatists. Decades later, the same manoeuvre was used by Conservative PM Stephen Harper, who needed to build a coalition in order to stay in power. The newly arrived minority voters were wined and dined, which, in turn, kept anti-immigrant groups on the edges of politics. In the 2011 and 2015 elections, the Conservatives won a higher share of the vote among immigrants than they did among native-born citizens.

If it wasn’t already clear from centuries of Canadian history, such politics firmly cemented immigrants’ place in Canada’s national identity.

Right-wing, anti-immigrant political agendas are rare in Canada. Of course, there are always exceptions: Canada still has anti-Semites, and people shooting up mosques out of fear of “the other.” One study recently claimed that anti-immigration sentiment was rising in Canada, although the same study found that over half of Canadians still support admitting immigrants from poor countries. (Sweden’s 75% approval for immigration is the highest of all nations studied.)

But let’s also remember the difference between immigrants and refugees. In 1978, Canada instituted a new Immigration Act, whereby refugees – persons fleeing armed conflict or persecution – would no longer be an exception to Canadian immigration regulations. Despite some early problems, it remains a cornerstone of Canadian immigration policy and law.

For example, the Syrian refugee crisis prompted the United Nations High Commissioner for Refugees (UNHCR) to call on western nations to resettle 130,000 refugees. Canada has carefully focused on selecting families, children and members of the LGBT community, while single men are processed only if they are accompanied by their parents or identify as LGBT. From 2013 until January 2017, Canada welcomed over 40,000 refugees – a staggering 248% of its “share.”

The United Kingdom? It has welcomed 216 Syrian refugees under the UNHCR scheme. Through a separate domestic policy, the Vulnerable Persons Resettlement Scheme, it had welcomed 5,423 Syrians by March 2017 – just 18% of its “share.”

Prime Minister David Cameron, under severe public pressure in 2015, promised to take in 20,000 Syrian refugees by 2020. Mounting pressure then led him to announce the Dubs Amendment, whereby 3,000 lone child refugees from the Middle East were to be welcomed. Under pressure from Theresa May (then Home Secretary), Cameron conceded that child refugees should come from Europe, not the Middle East, and the number was lowered to just 350 children.

Refugees Welcome.jpg

Demonstrators in Berlin in November 2015

When Calais’ Jungle camp reached breaking point in 2016, and Prime Minister May was securely in control at Downing Street, more public pressure forced her to accept another 750 lone children. (This was done reluctantly and controversially, as refugee children’s dental records were screened to “verify” their true ages. As Hugh Muir writes, “We want to do right by a handful of children, but it is really a way of shirking our duty to do the right thing.”)

Calais children

Children in Calais’ “Jungle” Refugee Camp, October 2016

Modern-day, peacetime Britain’s welcome of roughly 1,000 refugee children stands in stark contrast to the 10,000 refugee children resettled in Britain via the Kindertransport between 1938 and 1940. As a historian, I shudder to think what would have happened to those thousands of children had they remained under Nazi Germany’s control throughout the war.

Kindertransport.jpg

German-Jewish refugee children arrive at Southampton in 1939

Additionally, Theresa May proposes to lower annual net immigration from 273,000 to just 100,000. And it doesn’t stop there. Since April 2017, the Tories have required British employers to pay £1,000 per year for each skilled migrant they hire – a charge they wish to increase to £2,000 per year. That means that if your average Indian software engineer or Canadian postgraduate student successfully gets through Theresa May’s restricted immigration net, they face further fiscal penalisation in the pursuit of employment, simply for being foreign. Thanks, Britain.

As an immigrant in the UK who hails from a country where immigration is a cornerstone of the culture, I just hang my head in shame. As a historian of refugees and modern warfare, I can say that the same self-serving, nationalist ideologies that caused so many borders to close and so many refugees to flee during the Second World War are still with us today.

So, what are some solutions?  

Political inclusion of minority voters. Enfranchisement of immigrants (including EU nationals). Open (though still selective) immigration policies. Bring back the Dubs Amendment. Invest in affordable housing. Delegate to charities (where possible). Celebrate all forms of Britishness, including minority groups. Delight in globalism and mobility.

But the best solution requires a major attitude shift. 

Britain was once a colonial and imperial superpower. Although this was by no means a peaceful power dynamic for native populations or settler colonies, British rule also enabled an enormous exchange of goods, cultures and ideas. Some nations became immensely wealthy, while others were robbed of their natural and human resources. The gap in global living standards today is often a long-term result of colonialism’s exploitation.

British Empire.jpg

At its height, more than 458 million people – about 23% of the world’s population – were under British colonial rule

Gurminder K Bhambra, Professor of Sociology at the University of Warwick, argues that when thinking about today’s refugees and immigrants, we must remember that:

The economic motivation that drives poorer people to migrate has been produced and continues to be reproduced by practices emanating from richer countries and their own deficient understandings of their global dominance… The failure to properly understand and account for Europe’s colonial past, cements a political division between ‘legitimate’ citizens with recognized claims upon the state and migrants/refugees without the rights to make such claims.

It would be unfair to claim that Canadian history has been bloodless and peaceful while Britain’s has been singularly exploitative and war-ridden. But personally moving from a land of immigrants to a land of colonisers has been an eye-opening experience.

Canada, as a nation of immigrants, has attempted to confront its differences in an ongoing process of renegotiating and re-conceptualizing national identity, bringing immigrants to the fore with policies that directly value and embrace their diversity. Britain may have neglected to engage in such a process on its own soil, but the opportunity to do so is now arriving alongside the refugees and immigrants who greatly wish to be part of the British community. Myself included.

I am an immigrant and I love my new home in Britain. By learning from my new culture while sharing my own, I am participating in a “very Canadian way” of integrating into society. I hope my British friends don’t mind.

Happy Birthday, Canada.