Skeleton – that gentle pastime, wherein you throw yourself down a claustrophobic strip of ice, head first (of course), steering yourself past 80mph with intricate wiggles and taps of your feet.
At some point it became a British pastime. The ice mountain has become a modern-day medal mine, starting with Alex Coomber’s bronze at Salt Lake City in 2002, in the first ever Olympic women’s skeleton.
Coomber had started something. Shelley Rudman claimed silver four years later in Turin, before Amy Williams took the big prize, dominating the ice at Vancouver 2010. Then Lizzy Yarnold turned up, matching Williams’ achievement with gold in Sochi. Fast forward to Pyeongchang, and she was at it again, now with Laura Deas for company. Yarnold defended her title with a mighty final slide, joined on the podium by Deas, who took bronze. Britain’s skeleton women are world beaters.
Not bad for a nation that is seemingly allergic to winter, where a patch of snow shuts everything down. Yet the British sliding tradition goes back way beyond Coomber, to the very origins of the sport.
It all started with a wager, in the Swiss resort town of St. Moritz. In 1864, St. Moritz was a spa town, popular with wealthy Victorians who headed to the Alps to replenish themselves in summer. Then September came, and the British packed up and headed back to London with their riches in tow, much to the chagrin of the hotelier Johannes Badrutt. So that year, he convinced his guests to return for winter. If you are bored, cold, or unhappy, he said, I will pay for everything. It is said that the British frolicked in glee at the newfound Alpine winter, gasping in amazement at the sunshine (of all things) and the frosted scenery. Badrutt won his bet, of course, but opened a can of worms upon the Swiss hills. The British came back every winter and, like the British tourists of today, decided to take over the town and unleash mayhem.
The legend goes that one group of tourists got hold of a delivery sled and got on it, sliding through the icy, narrow, precariously steep streets of St. Moritz, terrifying all who crossed their path. Their (slightly) more sophisticated successors decided that a lowly delivery sled simply would not do – they would not travel in anything less than a carriage – and so the first bobsleds were built.
The Swiss hoteliers now had to face the monster they had created. The townspeople had grown weary of the troublesome tourists, so the hoteliers built dedicated toboggan runs to get the British off their streets. One of these, built by Major John Bulpett in 1884–85, became the Cresta Run, soon the spiritual home of the sport and the base of the St Moritz Tobogganing Club (SMTC). In 1885, the club held its first “Grand National” race and, five years later, an “erratic” member called Mr Cornish decided to slide the Cresta head first. Skeleton was born.
Sadly, like many sports, sledding grew very exclusive very quickly once codified. It’s always been a sport for the wealthy, but in those early days there were no rules or limits as to who could grab a sled atop an icy hill and throw themselves down it. In the 1920s, the SMTC saw fit to ban women from their course.
“Mrs J.M. Bagueley was the last lady to ride the Cresta in a race on 13th January 1925. Ladies rode in practice after that date, but were banned from riding on 6th January 1929.”
The ban remains. The Cresta’s terms and conditions simply state “women are not permitted to ride the Cresta run.” I wonder what changed their minds. In the early 20th century, there was a global backlash against women in sports, based on some preposterous notion of physical inferiority. In the Olympics, women were not allowed to run further than 200m until 1960. In the USA, women were banned from running the marathon for fifty years. In England, a similar ban existed for women in football, deemed “too much for a woman’s physical frame.”
Women have campaigned for decades to turn things around, and in recent years the situation has improved. The marathon ban was lifted in 1972, in part thanks to Kathrine Switzer, who ran the Boston Marathon in 1967 – entering as K.V. Switzer – and finished in good time despite attempts to remove her from the course. Each Olympics sees an increased number of women participants, and gets ever closer to event parity – the last male-only event in the Winter Games remains the Nordic Combined.
Yet the birthplace of skeleton remains closed to Shelley Rudman, Amy Williams, and Lizzy Yarnold. Perhaps the embargo endures because of the Old Boy tradition that surrounds the Cresta. Visitors, like Matt Dawson and Ian Cowie, note the mess hall atmosphere of the place and even spot a few descendants of the Nazi Joachim von Ribbentrop, but largely seem untroubled by the complete absence of women. Are they afraid of a little competition? Four years ago I thought this was a shame, but now I think it pitiful. Britain’s skeleton women are unstoppable – they are the headline acts of every Winter Games. Inspired by the “marginal gains” of British cycling, with less of the institutional misogyny, the skeleton set-up in Bath is the very model of a modern sporting powerhouse – professional, competitive and smooth as ice. If the SMTC doesn’t want to share their ice with such athletes, then that is their loss.
This is a dynasty whose queens are great symbols and great advocates of women in sport. Upon winning in Sochi, Yarnold came home determined to go into “as many schools as possible” and encourage girls to take up sport, and “not [to worry] about what the media image is of the perfect woman, it’s about being you and being proud of what you are.”
They are even inspiring the men (how could they not?). Dom Parsons followed in their footsteps on Friday, taking skeleton bronze. The Times connected his success to the Cresta Run’s “crazy aristocrat” pioneers, but he follows in different footsteps. This tradition, crowned by Lizzy Yarnold, started in 2002 with RAF officer Alex Coomber, who slid the course at Salt Lake City with a wrist she had broken in training just ten days earlier, and took the bronze.
New Year’s Eve likes to fool you into feeling cautiously optimistic.
That optimism is, of course, relative to whatever we have waded through in the twelve months previous. So on the last day of 2016, some might have thought, “maybe 2017 won’t be as bad as this one.”
Two millennia from now, the archaeologists and archivists of the hyper-intelligent irradiated cockroach people uncover evidence of 2017, and after copious analysis, come to view it as a seminal year in the coming of their Megaloblatta-Sapiens race.
From the vantage point of their compound eyes, the debates of the day, such as how many North Koreans it is morally acceptable to incinerate, or whether it is wrong to punch a Nazi (a question which presupposes that it’s ok to be a Nazi), seem moot or trivial. All must be incinerated, after all, to achieve the supremacy of the mandible.
Their museums trumpet the self-destruction of the human race. Adorning the 2017 gallery walls are their heroes of our age.
The paedophile judge almost elected to a Senate that steadfastly clung to the notion that they were once something more than the guardians of a phallic ivory hierarchy. The false cowboy’s failure allowed them to delude themselves for another, fatal year, as the floor fell from under their feet.
A cabinet commemorating the orange wigs and boot polish that decorated the year’s Halloween festivities, and yet more pretty little tricksters charming their way out of the ooze – another “change” candidate.
Ah, the Britain exhibit. Sir Nick Clegg. Knighted for services to something – lost to the ages, no doubt. Sir Ringo, the great historian of tank engines, and the days folk could afford rail. Barry Gibb got one too. I suppose in the end times, you get points just for stayin’ alive.
Loyalty is valuable, but our lives are valuable too
There is much chuckling below chitin among patrons as they look back at us steaming headlong into the ravine, distracting ourselves with royal weddings, royal babies, royal Netflix, and royal racism scandals. And how dare they blame us? But for a little escapism, what joy was left us outside of the pharmacy. And lo, the greatest distraction of all, Europe. All eyes looked to the channel trying to work out what it meant to be British, whilst all eyes looked down in Kensington, avoiding the stare of those cremated in their homes because the tower block was deemed too unsightly.
Does it look better now?
Our dear patrons may then scuttle on through to a cinematic rendition of 2017’s finest quotes, courtesy of the rape apologists and baby demagogues, now widely accessible on the vast online archive humanity’s ghost left behind. According to Prof. Blaberoid the Hisser, eminent human historian, these virtual sabre-tongues mimicked the behaviour of their all-powerful leaders, who enjoyed an unprecedented period of rubbing the vomit of their impervious corruption into the faces of those who dared challenge them.
Pol. Pot. Pol. Pot. Pol. Pot. Pol. Pot.
And the wrong words make you listen
Humanity doomed itself in this quagmire, the professor explains on the latest edition of the Gregarious FM podcast. The situation reached critical mass around these figures, who sucked in all challenges, spitting out mercury and lead into the brains of all who listened. Many ducked for cover; their already-fragile minds could not stand another hit. Others chose noble hills upon which to perish, but this was no age for martyrs, and such warriors were dragged to their doom by another barrage of fascist incomprehension (or else stabbed in the back) – they lay in graves unmark’d, with legacies stolen and diluted. With their would-be challengers now scattered and divided, it was only a matter of time before these rat-king leaders turned upon one another.
The intellectuals emerged from their fox-holes, temporarily, to look aloof upon the massacre, pondering only where all the millennial poets went.
As for me, I started the year attending the cremation of a friend, who fell because we could not break his fall, and ended it getting robbed, so you can forgive my cynical tone.
In between, I’ve been looking for something. Strength, defiance, hope. On darker days, it can feel like hopes fade into prayers or delusions. When there’s nothing else out there, that is still a great deal. At the start of the year, bereft, I looked back at the dust-covered words of Obama’s ostracised pastor Jeremiah Wright. The audacity of hope – it sometimes requires a fantastic imagination or a leap of faith. But somehow, I’ve got to stay grounded, or else I lead myself toward further disappointment.
Elsewhere, I just tried to take a moment, and make sense of it all, looking for comparisons in the past to try and understand the ostensible chaos of the present (like with Catalonia and Kosovo), or staring at the sun long enough to gain some blind understanding (“What a time to be alive”).
I’ve got to write it down, but I’m still getting educated
More often though, I’ve been trying to find strength to stand up when feeling particularly helpless or lost, in those who have done it all before. In the old punks who rocked against racism, or John Fitch, who witnessed first-hand a horrific disaster and dedicated his life to ensuring that such things could never repeat themselves.
There are two things that have kept me going these last few months. The first is through seeking meaning in the death of my friend, or more accurately, meaning in his life. I’ve been holding on tightly to what made him proud of me, and what I admired in him, and trying to keep it alive daily.
I’ve got to write it down, and it won’t be forgotten
The other, I remember well, was walking to work on the 23rd May, seeing all the bees. The Manchester bees. I have always called it the Bin Wasp, because before this I usually came across the winged mascot on the bins of Manchester’s streets.
I did not want to leave the house that day, because I was scared. Like everybody else, I felt vulnerable that a place I knew well, a place I walked by every week, had been the scene of such horror. And like many brown folk in England that day, I feared the looks, the reprisal, the armed police, the misplaced suspicion. All I saw that day in my adopted city were bees. Rejecting the lot. That’s the way I chose to look at it, that day. People feeling vulnerable together, finding strength together, in each other. I was reminded of something I read when teaching the Freedom Summer.
“When we sing ‘We are not afraid,’ we mean we are afraid. We sing ‘Ain’t gonna let my fear turn me round,’ because many of you might want to turn around now.”
Strength in the past, strength in the present. Sometimes it’s important not to seek too much solace in history, or fear too greatly a roach-infested future. And I think about the best moments of the year gone by – teaching my seminars, going to the test match with a good friend, bidding friends farewell as they set off for new jobs, new homes, or new adventures. Spending time with the people I love, be it in the cinema, a fancy dress party, or sat on a sofa in Manchester somewhere. Whatever else next year holds, I hope for more instants like this.
For in the event, that this fantastic voyage should turn to erosion, and we never get old, we can always hold close the very best moments in even the worst of years.
(Disclaimer. I actually think cockroaches are neat)
Punk nostalgia is back. This age is ripe for it, as there’s a whole host of reasons for people to be frustrated and angry, particularly for the young among us, as opportunities diminish. There’s something very punk, after all, about tearing down a goading monument to those who wished you in shackles.
There’s something very ’76 about now, and it makes a lot of sense for people to draw from a moment where people were moved by art, music and fashion to stand up for themselves. In 2017 I’ve often kept London Calling in my earphones, blasting out Clampdown for inspiration, for strength.
But there’s also something in punk that embodies the mass appeal of manifest right-wing hate, which is once again loose, having bubbled for years. There’s something “punk” about donning a swastika, for shock or for awe.
Nostalgia observed through blinders can be a dangerous thing. Young(er) punks like me came to the culture at its nadir, and knew it as a scene that welcomed anyone and everyone so long as you shared its passion, its strip-it-down catharsis, and its tolerance. Its British origin story is dominated by the memory of the Pistols and the Clash – between anarchy and socialism – between the expression of ‘70s working class feeling and those who tried to channel that anger into a revolutionary riot.
Something’s missing here. The memory of this spirit of ’76 is incomplete. Punk is (and was) polarisation writ loud, a centrifugal splattering of all things, an explosion of possibilities and frustrations. It was the creation of a new voice, but it was also in its founding moments a REACTION against the tame, the overblown, and the delusionary. Sometimes wonderful things can be produced, when the centre fails to hold. You talk to the old punks now, and their recollections lack the political romance us third-gen Clash disciples ascribe to that moment.
There always was in punk a leftist appeal, and the movement quickly developed an activist wing, exemplified by the famous Rock Against Racism carnivals of ’78. Yet so much more washed through the maelstrom of punk. No punk would have been even slightly surprised when John Lydon backed Trump and Brexit.
Many of the old guard insist that punk was not a political event. I’ve seen this often in response to accusations from Johnny-come-lately historian types that punk “didn’t do enough” to oppose racism and the rising fascist tide of the late ‘70s.
Were they supposed to?
As it turned out, due to the movement’s initial actions and the turmoil of the late-70s, punk rockers had to deal with the Nazis within, whether they wanted to or not.
So declared Strummer in the 1977 reggae-inspired track “(White Man) in Hammersmith Palais”, which railed at both the punk and pop reggae scenes for ignoring the racial inequality and bubbling white supremacy in the streets of late-70s London. Its iconoclasm is a sign that Strummer felt isolated in his views within the scene, rather than an embodiment of punk ethics. Indeed, the Clash’s political articulation was inspired as much by the activist reggae – the roots, rock, rebel – of Marley, Cliff, Tosh and Marvin as by anything else. More than anything, Strummer was moved by black activism in West London, embodied in the scorched streets of the ’76 summer and the Notting Hill carnivaliants’ response to police abuse that year. In (White Man), he took aim at his punk contemporaries –
– and particularly targeted The Jam, who famously worked to tear down the established order by encouraging their fans to vote Conservative (a call Paul Weller later regretted).
Upon whose shoulders falls the responsibility to take on the fascists?
It’s another weird quirk of history to remember that Thatcher cast herself as the “change” candidate in late-70s Britain – when the lights were going out, in millions of homes and thousands of flats, guarded by the growing garbage monuments of discontent.
Then as now, the Tory presses spammed the notion that the turmoil was caused by Jack Jones and the corrupt trade unionists, and by an overblown public sector protected by Jim Callaghan’s faltering government (ignore Callaghan’s significant and pioneering privatisation, that’s just Fake News). Thatcher would return order, trim the fat, get Britain working again (three million would soon be unemployed). It was a lucrative line of attack for the Tories, to paint Labour as luddites clinging on to an overgrown, stagnant way of life, whilst suggesting that turning the clock back to Victorian England was somehow a path to progress (wonder if they’ll try that trick again). To Weller’s comparatively moddish punk mind, the Tory claim sat neatly alongside what punk meant to many – back to basics, to chord and melody, away from the inaccessible self-indulgence of prog’s excesses. The stars aligned.
The Conservatives of course baulked at punk’s brashness. It was too unstable an element, made of too many unreliable parts, to conform to any confined ideology, let alone the sensitive prudishness of old-school Toryism. The Pistols had already taken good care of that before punk really took off.
The Pistols were delirious gunmen, shooting wildly into a crowd at anything they could take down a notch. What politics you might gauge from this manufactured nihilism depends entirely on when you freeze the frame.
But there was plenty political about the Pistols, and the Ramones and the Dolls and this strip it down approach. Punk is challenging at its very core. It insists on denying anybody authority on knowledge. It hates lecturers, soapboxers and pedants – punk’s orators are instead conduits of feeling and frustration. It’s dimed out, all in. Like reggae, punk is a small axe. Ready to cut you down.
But first up, it was about marketing shock factor. Malcolm McLaren, Bernie Rhodes and proto-punk’s executive vice-presidents saw fame and fortune in tearing it up in a blitzkrieg of taboo. Sid Vicious and Siouxsie Sioux painted themselves in swastikas – and they were not alone. At this point, Mick Jones and Tony James had formed a proto-punk band, managed by Bernie Rhodes. Their Jewish manager sat the kids round the table, and opened an envelope full of red armbands, white circles and clockwise gammadion crosses. The band were to be called the “London SS”, declared Rhodes.
Stylish indeed. Fascism was the contemporary look of innovative, chameleonic trend-catalyser and punk inspiration David Bowie, then posing as the Thin White Duke. His Duchy was cocaine-driven ramblings, “theatrical” Nazi salutes and Hitler-loving commentaries. The Duke claimed he was “clowning”, holding up a mirror to English society. There was plenty of fascism in ’76 to reflect back upon him, after all.
Nowadays, we know Bowie was not a well man during this phase. Regardless, Nazi chic was IN, and infectiously marketed. Like teenage shoplifters, punks quickly tried to see just how much they could get away with. In the North, a band called “The Moors Murderers” came together. It is said that Chrissie Hynde briefly played guitar for them. They tried to release a song called “Free Hindley” but, thankfully, punk had found its limits, and serial-killer apologism lay beyond them. They were ostracised, and soon disbanded.
Soon after the Pistols triggered the Big Bang of British punk, sub-cultures began to solidify underneath the hollow shells of the Pistols’ anarchic, edgelordic imagery. The Clash took the lead as Britain’s foremost socialist rockers, whilst bands such as The Slits and X-Ray Spex carved out a forthright women’s voice in punk, joined by Crass (who usually mixed genders on vocals), especially on their 1981 LP Penis Envy (banned by HMV, and confiscated from shops by Greater Manchester Police).
Crass became pioneers of the burgeoning anarcho-punk set, drawing from, of all things, hippie counter-culture (the traitors) to make their brand of anarchy into something more than just a chaotic veneer. Anarcho-punk then fired itself back across the Atlantic, especially finding voice in the Dead Kennedys (note the name), who mixed up Southparkian college-boy shock humour with sophisticated leftist critiques and infectious hardcore energy, resulting in classic tracks like “Holiday in Cambodia” and “Let’s Lynch the Landlord.”
There is something essentially egalitarian about punk, and it rapidly drew in left-wing spirit with every breath. “When I saw the Pistols I suddenly realised I wasn’t alone in the fact that I couldn’t play…I felt inferior, but when I saw the Pistols I thought it was great, because it just suddenly struck me that it didn’t have to matter.” Punk is participatory, punk is democratic. Punk is a leveller.
Lefty punk was in full swing, but it was not alone, and it was severely tangled up in punk’s other fan clubs. Although Weller quickly binned his braindead Thatcherism, and Jones and Rhodes threw out their swastikas, right-wing punk was, and still is, a massive thing. Longstanding punk rockers the Misfits make no secret of their conservatism, though their efforts to organise online through conservativepunk.com (in response to Fat Mike’s 2004 punkvoter.com and Rock Against Bush) were about as successful as Moggmentum – may it never zombify.
Nazi sympathisers quickly latched on to punk’s anger and energy and correctly saw it as a useful conduit of hate. Fascist punk was born quickly and grew quickly, its path eased by punk’s pioneers having already nuked any sense of the off-limits. Lefty punks were guilty too – Strummer’s early celebration of black activism in London, in which he called for an extension of its spirit across Britain, was very unfortunately given the name “White Riot” (Joe would get better). Punk’s roots in whiteness were solidified by such statements, as well as by those such as Elvis Costello and Jello Biafra who liberally used the n-word (I don’t fucking care how ironic you are, keep that word off your tongues). White punks still do it – nothing is off-limits, remember.
Fascism was in vogue in politics as well as in style. Enoch Powell’s shadow cast itself across the nation, and NF boots were marching in earnest. Thatcher herself would later tap into this mood, driven above all by the pursuit of votes. This rise was reflected in the fears and angers of leftish punk voices – Elvis Costello’s Armed Forces record was obsessed, fascinated and terrified by fascism.
Punk went centrifugal. It’s a movement that resists organisation, and embraces forceful contradiction. As the winter grew discontented, gigs increasingly saw violence overwhelmingly instigated by stormtrooper wannabes. Crass – blinded by flower power – decried the violence “on both sides” against evidence to the contrary, and with it sold much of their punk ethic down river.
The fighting at Crass gigs was happening in dive bars and music halls across the nation. The NF targeted the punk movement early on. The white working classes were finding their voice, and the fascists wished them to speak in time with the march of their boots. The way in, they decided, was Oi! – the punk of punk – a rebellion against the art school anarchists of the first wave. Oi! and its pioneers were less interested in statements, whether out of ideology or image, and more in talking about the things that went on in their daily lives. They were Saturday’s Kids, and in some ways, a punk made by fans, for fans.
Oi! was pretty masculine, and it went hand in hand with football factionalism and the resurgent skinhead movements – gigs were often a violent crossroads between punks, skins and rival firms. That’s when the NF got involved in a big way. The fascists had already started infiltrating football firms, and next stop were the skins and the punks. Oi! was their target, but they failed miserably in their attempts to recruit any bands, with the notable exception of Blackpool flops Skrewdriver, who reformed in the ‘80s under the banner of White Power. Instead, they formed their own groups (household names like The Dentists and The Ventz) with the express wish to draw punks and skins into their circle of hell.
This didn’t go too well, either, so they instead tried to recruit existing fan groups. Sham 69, for a short time Britain’s biggest punk band, were top of their list. The Hersham Boys, bursting out of the deprived part of Surrey that nobody ever talks about, were distilled male youth, rage and joy wrapped in a southern snarl. Frontman Jimmy Pursey commanded great loyalty amongst his fans, and in turn he valued them dearly. They were huge among football kids, especially Hammers fans, and Sham gigs soon found themselves swamped by Nazis, sieg-heiling in the cauldron of noise.
This was different to Siouxsie’s swastika.
The art-punks’ Nazi imagery created the initial association between punk and fascism, and the idea that the scene could be fertile terrain for NF recruitment. Now the Nazis had succeeded in wrapping Sham in far-right rhetoric never espoused by the band or their music. Pursey himself was soon tarred with the fascist brush. He was loath to condemn his fans, many adoring, for any dabbling in fascism. Not all Sham fans went this way, and he was to an extent sympathetic with those who found false solace in hate.
Pursey’s initial inaction led him to be seen by some as a poster boy of Nazi punk. In an interview, Joy Division (whose name originates from Nazism) defended the band’s etymology by pointing the finger at Pursey – “Everyone calls us Nazis…but compared to Jimmy Pursey, who is an out-and-out racist…Nobody can remember the beginning of Sham 69 and the things he said then.” (PS I can’t find any quotes that support this. @ me if you have some).
Others, such as Gareth Holder of the Shapes, hated him for failing to protect his fans from the fascists. “That idiot Pursey had his head so far up his own arse it wasn’t true. He just didn’t want to deal with it. He’d be singing “If the Kids are United” and the whole fucking place would be a war zone while he was doing it.”
Pursey’s passivity was not shared by some of the Oi! bands. The Cockney Rejects took matters into their own hands, beating the shit out of the Nazis who threatened them. “We weren’t going to have it,” remembered Stinky Turner. “We just went down and absolutely slaughtered them. We declared to them that if they ever set foot where we were again, we’d decimate them.” Pursey, like Crass, wasn’t into violence. But things were getting serious at Sham gigs. Nazis were rushing the stage.
Jimmy had a choice to make, as did the whole punk movement. Where did it stand? Decision time came when Red Saunders approached him to be the face of Rock Against Racism. RAR was formed after Eric Clapton, who made millions out of covering Bob Marley, declared that Enoch Powell “was right.” RAR’s founders could see the infiltration of far-right ideas into pop music, they could see the Thin White Duke, Sid’s gammadia, and Costello’s n-bombs, and they saw the connection between these incidents and the trouble at Sham gigs.
Saunders knew exactly what he was doing. There was no point headlining with The Clash, that wouldn’t surprise anyone. Saunders had to get to the core of the power of the anger, the essence of punk’s loud expression of strip-it-down radicalism.
It had to be Sham, it had to be Pursey. If he could get Sham to take the stage and denounce racism, it could mortally wound the NF’s entrance into the punk scene.
Pursey agreed. He took the stage at Victoria Park with the reggae band Steel Pulse, at the end of a festival preceded by a massive march through London protesting the threat of the far-right. For this, Sham got death threats, and Pursey lost his nerve. He pulled out of the RAR sequel at Brockwell – that was, until some kid approached him on the tube and told him he’d got no balls. This one finally landed. The next day, Pursey stood up to be counted.
RAR was a big success. It was so far removed from the overblown Bono-infected Live Aid shite. This was a rally with direct action at its heart – a mission to root out racism from Rock & Roll, an art form rooted in African-American expression, and to take on the far right in London and elsewhere. RAR went on tour to the provincial towns. From then on, the NF were never able to grasp hold of punk, retreating into the scene’s darkest corners. They kept a foothold among the skins, but were barely able to transmit their voice through song.
RAR could not, however, undermine the association between punk and fascism. Perhaps punk will always house fascists, doomed by its initial “artistic choices”, or its amplification of white working-class anger, too often misinterpreted as, or reduced to, fascistic. Perhaps because punk exists essentially in tension, dancing on the volcano of human emotion and expression.
For my part, escaping to (often terrible, often class) punk gigs as a kid, I felt I was going somewhere where no-one could ever hurt me, where the well of human hatred ran shallow and dry. A place that valued thinking for yourself. For me, punk remains a place of catharsis and a source of strength to hold firm against the 2017-vintage fascists. But in this place, as within any temple of inclusion, there is a line to be drawn quickly and decisively against the creeping hatred of our age. As Jimmy Pursey discovered, there’s no time to equivocate, and no valour in it either.
Better to deal with those who wish you dead the way the Cockney Rejects did.
On Friday, the Catalan Parliament unilaterally declared independence from Spain. Spanish PM Mariano Rajoy responded by dismissing Catalan premier Carles Puigdemont and imposing direct rule on the region.
It’s hard to say what will happen next, but it is very likely that Catalonia is in for a rough ride as it seeks self-determination. Rajoy has shown he isn’t afraid to pepper Las Ramblas with rubber bullets. The USA, the EU, France, Germany and the UK have all said they will not recognise an independent Catalonia. The reaction of others will likely shape what place Catalonia would take if it enters the world stage solo.
The last European country to take the plunge was Kosovo. On 17th February 2008, it announced it had split from Serbia and was now going it alone.
Catalonia is not Kosovo. The circumstances are very different. But the response of the nations of Earth to Kosovo highlights how the recognition of new nations depends on far more than the situation in the region itself.
No, Never, Not in a Million Years
Many nations have viewed Kosovo’s declaration with vanity, fear and loathing. Might Kosovo’s audacity encourage dastardly separatists within our own borders?
WE CANNOT ALLOW THIS!
Nations whose diverse regions have varied cultures, languages, religions and fortunes have been especially vulnerable to this way of thinking, as have those whose borders hide ancient tensions brushed under the rugs of history. Spain sits in both categories, and refuses to recognise Kosovo. On the record, this stance is in stern defence of international law and the UN Charter. Spain, as it showed the other week, takes such things very seriously. It is a poorly-kept secret that it is worried that Catalonia and the Basque Country might call foul on any concession to a secessionist movement outside of Spanish borders. In 2012, Rajoy even said that Spain refuses to recognise Kosovo because “it is what suits the general interest of the Spanish.” The way things are going in the Pyrenean shadows, don’t expect a birthday card from Madrid soon, Pristina.
They weren’t alone in this. China, Argentina, Israel. Moldova, Georgia, Cyprus. Many of these ghosting characters are linked by their involvement in what some call “Frozen Conflict Zones” – geographical war hangovers – unresolved political disputes permafrosted upon land, homes, societies. They might have longstanding claims upon groups that have sought to govern themselves as autonomous, or else might be part of a geographical tug-o-war with another nation-state over the sovereignty of a region, whose own ideas about which flag to pledge allegiance to are but a sub-plot. Imperial borders were often drawn on sand, and reinforced only by the bayonet. Sudanese or Yugoslav fractures exhibit what can happen when the coercive rug of the metropole is pulled out from underfoot. A conglomerate state like China views separatism with great caution.
And the Kingdom of Spain? Founded under the marriage of Castile and Aragon. Strengthened in the inquisition and expulsion of those not bound to Christ. Made rich by Cortés’ sword that cut open golden Aztec veins, but eventually chased back into Iberia by forces that left Spain in dust. Perhaps, for many a Spanish patriot, a Catalonian schism would represent the last act in this retreat, a final nail in Spain’s imperial coffin, the confirmation that their pride in a nation past is no longer present.
Why hang onto a nation, for nostalgia’s sake?
This fence I’m sitting on is really comfortable…
The region of Artzakh is currently drawn on the Azerbaijani map. It is 95% Armenian, and its flag resembles the Armenian banner, altered by a thick brush on Microsoft Paint (RIP). Formerly known as Nagorno-Karabakh, it essentially rules itself. Azerbaijan, however, has not recognised it, and it is therefore unsurprising that it has turned a blind eye to the Republic of Kosovo as well. What’s weirder is that Armenia hasn’t recognised Artzakh or Kosovo either.
Geopolitical allegiances have a huge part to play in the recognition of nations. There’s quite a club of countries that has refused to acknowledge Kosovo out of respect to its good ol’ pal, Serbia. Some of these might be expected, such as Belarus or Cyprus. But perhaps others might surprise you. Algeria, Angola, the DRC, for example.
Who knew Serbia had so many friends?
Again, we cannot forget how the very existence of Kosovo reverberates back onto so many nations with fragile pasts. Alongside, some nations may wish to tread carefully to avoid stoking Serbia’s gigantic neighbour to the northeast. Armenia, for example, publicly welcomed Kosovo’s step, but avoided granting an official welcome to the family of nations so as not to irritate Azerbaijan, Russia, or Artzakh. Better to recognise everybody at once, rather than pick and choose.
In other cases, the decision not to recognise may be guided by nowt but a noble sense of loyalty to the Serbs,
which is sweet
if perhaps not the romantic gesture one would like to see in the realm of high global politics, where blind loyalty to anything can get you up all sorts of creeks.
On the other hand, some of Kosovo’s earliest friends benefited from its independence, alongside those who sincerely believed in its self-determination. The USA and the UK, having played a key part in the Kosovan war and the defeat of Milosevic, quickly backed it, alongside much of NATO.
Legacy management 101 – look! What a lovely intervention! We made a state out of it, dontyaknow.
If Catalonia goes it alone, it might find friends in odd places, and gain unexpected enemies. A lot of this may have little to do with Catalonia itself.
I’ll do it this afternoon!
That brings us to our last group. Those who haven’t quite got round to it yet. The most recent country to recognise Kosovo was Bangladesh, in February of this year. There has been a steady trickle of recognitions after the initial flurry of ’08. As of October 2017, 111 countries consider Kosovo a sovereign nation. This tally matters. Some nations, including Spain, have in the past justified their refusal by noting how few nations had done the deed. The higher that figure goes, the less valid that argument becomes. In recent years, some nations have recognised Kosovo, declaring that they have changed their minds on the matter, or citing that the tide has turned in favour of Kosovan independence and they don’t want to miss out before all the good yoyos have been bought.
This change has allowed Kosovo to find its feet somewhat on the international stage, helped by the backing of large swathes of NATO and the EU, and of the USA. Although Serbia officially still considers Kosovo part of its territory, it has largely accepted the 2013 Brussels Agreement between the two governments and normalised relations with Kosovo. This has allowed Kosovo to participate more easily in UN activities, albeit often with an asterisk placed next to its name, giving the word “Kosovo*” a dual symbolism, representing either –
the Republic of Kosovo,
Kosovo, a really really autonomous part of Serbia
– depending on your persuasion. Even Russia has softened its stance, although this has much to do with its activities in South Ossetia, and in the Crimean Peninsula, wherein Moscow argued that if Kosovo had the right to unilaterally declare self-determination, so too did the Crimea (which was, of course, unilateral. No Russians here, what Russians, where?)
Perhaps the remaining collection of nations are merely waiting for the herd to decide when it will cross the river. Perhaps they’ve got better things to do, or more important things to worry about, than recognising a diminutive, landlocked nation in South-East Europe. The weight of numbers, however, is very important in a nation’s journey to global recognition. Certain nations, such as the USA or Russia, are triple word scores.
Lines in the dirt
It might all sound like a board game played on a 1:1 scale map. But the verbal barrages can so quickly beget rubber bullets, and rubber can soon itself turn to lead. Some may understandably point toward the fear that nations may continue to slice themselves ever smaller, along lines of ethnicity, religion, history, or economics (ever since 2008, for example, there has been talk that the Serbian-dominated North Kosovo may seek autonomy from Pristina). Is it only empire, they might ask, that forces communities to co-exist and intertwine?
Others may look at how the world is growing increasingly uncertain at the same time as it pulls itself closer together through technology and the border-sapping forces of neoliberalism. Groups may look out of the window at the gathering clouds and think, in this instance, it’s better to go it alone.
Or maybe it’s all too facile to think that another line in the dirt, this time between Barcelona and Madrid, will necessarily lead to further divisions in a fracturing Europe. Kosovo’s ongoing journey to recognition shows that there’s a great deal going on under the surface in any separatist movement, and in its burgeoning relationships with the world outside. Perhaps Kosovo shows that most nations are more concerned with what separatism might represent than what it actually causes.
Perhaps we should be less wed to the notion of the permanence of borders. Perhaps it is just as parochial to insist that the map remains the same as it looked when we were children as it is to draw new lines and erase others.
Some borders, such as along the shore of the Étang Saumâtre between Haiti and the Dominican Republic, change daily as the water levels rise and fall. Many are still disputed. Some are deliberately porous, others are guarded with lethal force, or else are to be marked by great walls. But even the most ancient of borders have crumbled many times. Human societies never sit still. Nearly every society in history has been host to travellers, migrants and traders, explorers and conquerors alike, who continually puncture or remodel the little barriers we like to draw between ourselves.
On 11th June 1955, there occurred at Le Mans the worst disaster ever to befall motor racing. Pierre Levegh collided with Lance Macklin, and Levegh’s Mercedes departed the circuit – its disintegrated remnants rained down upon a heaving grandstand. Levegh was killed, along with eighty-three spectators.
There seem to have been too many disasters recently; some preventable, some malicious. Haunting images of worst-case scenarios realised play out continually in front of us, it seems. In the empty shells and cordoned-off zones of disasters past, there lingers a feeling of pointlessness, of lives extinguished without reason, without meaning, too damn early.
What did you have left to give us? Could you still be here had we done anything differently? In some ways, it feels like a hunger, an internal amputation, a curse of circumstance and a complete loss of power. This inert pain loiters within, and its eviction requires a strength beyond comparison.
This current feeling draws me now to the story of the Le Mans disaster, and to Levegh’s teammate John Fitch, whose reaction to the accident saved thousands of lives. Motor racing has always housed those who view safety concerns and regulations with contempt, believing them to somehow dilute the thrill of the spectacle.
The early days were especially foolhardy, littered with carnage and the ghosts of dead drivers lying at the feet of reckless racers and gold-hungry promoters. However, as long as there have been fatal incidents, there have been those – often drivers themselves – who campaigned against repetitions.
Prelude and Disaster
Pierre Eugène Alfred Bouillin was old school. He was inspired by an uncle who died before Pierre was born, a victim of the earliest, deadliest days of motor racing. Pierre took on his uncle’s anagrammed surname, Levegh, both in tribute and as a statement of his pedigree and style.
When he was not behind the wheel, he could be found playing tennis, or ice hockey, and was proficient at both. In 1955, he was 49 years old, and had clocked up more Le Mans miles than anybody else still racing. In the 1952 event he spent 23 hours behind the wheel himself in a self-modified Talbot, and nearly won – denied only by a broken crank.
In 1955, Mercedes selected Levegh to partner John Cooper Fitch in the third Mercedes 300 SLR tipped to take Le Mans by storm. It was moulded from magnesium alloy and notable for its massive air brake at the rear, added because the car was too fast to be slowed by standard brakes alone.
On the 11th June, Levegh was racing well and would shortly pull into the pits to hand Fitch the vehicle. On the Tribunes Straight, Mike Hawthorn, in the leading Jaguar, turned sharply toward the pit entry, effectively brake-testing the lapped Lance Macklin.
Levegh, catching the pair, did not have enough time to evade Macklin’s Austin-Healey, but spent his last moment alive giving a “slow” signal to the following car, team-mate Juan Manuel Fangio, likely saving his life. Fangio lived for another forty years. Macklin veered toward the unprotected pit-wall, and four people were seriously injured, run down by the runaway Austin-Healey. Macklin, fortunately, escaped unharmed.
Levegh’s car launched upwards and left, careering into a concrete staircase and exploding. The Mercedes’ mighty engine, radiator and suspension sliced through the packed grandstand, massacring eighty-three onlookers, many of whom had been stood on tables and chairs just to catch a fleeting glimpse of the sportscars as they flew by.
Levegh was thrown from his vehicle, dying on impact. The rest of his mangled Mercedes ignited and its magnesium bodywork burned uncontrollably for hours. The race was forced to continue beside the SLR inferno, this white-hot monument to the agony of that afternoon, so that emergency services were not hindered by hundreds of thousands of exiting spectators.
Calm Amongst the Chaos
John Fitch was behind the pit-lane with Pierre’s wife Denise when they heard the explosion. Fitch went outside to investigate, and came across the injured left by Macklin’s car. He tended to them. He then spotted Levegh’s burned corpse lying on the track, where it lay exposed for too long until a gendarme covered it with a banner. Fitch returned to Denise.
“It was Pierre. He is dead. I know he is dead,” said Madame Bouillin, before Fitch could deliver the news.
He then rang home, to inform his family that he had not been behind the wheel of the fated No. 20 Mercedes, as some initial reports had suggested. It was now that Fitch caught word that the death toll had already climbed to 64. It was the first time he became aware of the scale of the disaster. In response he asked Rudi Uhlenhaut, the SLR’s designer, to insist to the Daimler-Benz board that they withdraw the remaining silver arrows from the race.
Within moments of the disaster, Fitch had already begun to work towards mitigating future catastrophe. The race had been billed as World War Two on the track. England’s Jaguars taking on the Deutsche Silver Arrows on French soil.
The accident had sent the unscathed SLR, driven by the legendary pairing of Fangio and Stirling Moss, into an unassailable lead. Fitch imagined the sight of a German car taking the victory in front of a grandstand stained with French blood, and knew that the Gallic anger would reverberate far beyond the racetrack. Uhlenhaut and team boss Alfred Neubauer agreed, and six hours later so too did the Mercedes bigwigs. At 1.45am, Fangio was called in to retire.
Hawthorn, with Ivor Bueb, eventually won comfortably. It was a muted celebration, but Hawthorn’s smile as he guzzled the champagne, it is said, was enough to draw the ire of the French L’Auto-Journal. “To your health, Mr Hawthorn!”
Champion of Safety
Many people realised after Le Mans that motor racing had become too quick, too fast – safety standards had not developed at the same pace. Numerous nations banned circuit racing – in Switzerland, the embargo exists to this day. The fatal kink at Le Mans that had sent Levegh on his lethal trajectory was removed, and the grandstand was demolished and rebuilt. However, until the mid-1970s and the rise of the drivers’ trade union, the GPDA, championed by Jackie Stewart, safety improvements trickled through at a dreadful pace.
Fitch came to racing from an engineering background. Back then, drivers were far-removed from today’s ultra-pro baby racers, who are often placed in karts before they can walk, identified by temper tantrums on track and Instagram fun in press conferences.
Yet, then as now, the rich kids dominated the scene, exemplified by Hawthorn, who always raced in a bow tie – the public school boy with failing kidneys, hard-driving, hard-drinking – determined to live life as quickly as possible, unburdened by responsibility.
Others grew up with oil in their blood, coming from families of mechanics, engineers or vocational drivers. There were more women involved in the highest levels of motorsport then – Lella Lombardi, the only woman to score points in F1 (Spain 1975), learned to drive in Piedmont in her butcher father’s delivery van.
John Fitch was a bit of both. He had an ancestral engineering pedigree. His great-great grandfather John had invented the steamboat. His step-father ran a motoring company, and Fitch loved engineering from a young age. He served as a fighter pilot in the Second World War, and was shot down in 1945 (his own fault, he claimed), spending the rest of the war in a PoW camp.
He loved to go fast, but not to the point of abandon. “It’s life condensed,” he once said of racing. He was never interested in what speed he was going at, but instead in remaining in control at the limit for as long as he could hang on. That was the challenge he loved. The element of danger was of less interest to him.
His problem-solving brain sparked into life as soon as Levegh crashed. Helping the injured, demanding Mercedes withdraw, maintaining control at the limit of a most testing period. After the crash, he threw himself into developing vehicle safety standards, both in motor racing and on the streets – motivated by the memory of his team mate, of those who had died having come to watch them race, and the idea that any death behind the wheel is one too many.
In the 1960s he began working on the Fitch Inertial Barrier. These are unremarkable things, unassuming little yellow barrels filled with sand, usually found on American asphalt at highway exit points. He was inspired by the anti-strafe barrels he used to protect his tent from aerial fire during the war, and began by filling old liquor crates with various amounts of sand, before driving into them at 70mph.
Fitch’s self-described “hero impulse”, alongside his selflessness, led him to crash-test his invention himself, taking the risk for any design flaws upon his own shoulders. At the end of the decade the Fitch barrels were introduced onto US roads, and are now in use in every state. They are credited with saving over 17,000 lives, and over $400m a year – largely in medical expenses.
Fitch continued to invent until his dying days. In motor racing, many drivers are killed due to basilar skull fractures, caused by a whiplash effect at the moment of impact, where the body is held in place by seatbelts, but the head is not.
Many famous drivers met their end this way, notably Roland Ratzenberger and Dale Earnhardt (who infamously refused to wear the necessary protective gear). Fitch created the Full Driver Capsule, a full body system that aimed to prevent such deaths through holding both the head and the torso in place during a rapid deceleration. A similar contraption, the HANS device, is now mandatory in most major auto racing series.
After the “Weekend from Hell” in Imola, 1994, Fitch was creating again. Responding to the deaths of Ratzenberger and Ayrton Senna, Fitch designed and patented the Fitch Displaceable Guardrail, designed to capture and cushion a car on impact and absorb the energy of the collision, which would ensure a more controlled deceleration and avoid the car bouncing back across the circuit, as Lance Macklin’s Austin Healey had done all those years before.
Fitch did not just stick to roads and circuits – his many inventions included a cervical spine traction device. His fifteen patents largely had one thing in common – they were responding to disasters, accidents and atrocities. After 9/11 he worked on the Fitch Survival Vehicle (never completed), which would theoretically enable escape from similar situations.
Incidents which disturbed him were transformed into opportunities to create something that might prevent or mitigate another occurrence. Regardless of whether a disaster was born of malice or cruel coincidence, Fitch believed that within engineering existed the tools for society to reduce the potential impact of future atrocities. Fitch’s efforts began in earnest after Le Mans 1955, but he himself saw his safety advocacy as rooted in his wartime experience.
In Band of Brothers, Stephen Ambrose notes that so many of the US 101st Airborne, upon returning to the United States, worked in teaching or construction – jobs that create and provide. Likewise Fitch was motivated by dedicating his life to safety in response to the agony of war.
“I was a wartime bomber pilot and a fighter pilot and I was involved in some fatal events…this is a payback in a way.”
John Fitch died in 2012, aged 95, 57 years after his teammate Pierre Levegh had perished. It can be said that by the end his heroism was no longer an impulse, but a fitting description of a man who refused to be derailed by dark moments that threatened to envelop in despair all who lived through those days. A man who refused to let lives lost through disasters be lost in vain.
The current Rustler’s microwavable meat advert depicts a bloke who is sat at his bland table in his bland living room, about to have dinner. He’s in the November of his years, closer to St Andrew than Guy Fawkes, and as he reaches for his snack, his life flashes before his eyes. It is a life unfulfilled and failed, where every dream is a lie, and every hope recedes from him, jading him ever further. Every cultural and political movement is but a moment in time, a spark of promise quickly extinguished. Beaten at school, waiting in line in soup kitchens, beaten by police batons in ‘60s peace marches, bored and jaded from then on. No other fad or movement would awaken him again. Until now. You see, it was ok that this struggle was pointless, because along the way somebody figured out that you could irradiate a soggy roll encasing a lump of indeterminate meat to an acceptable temperature, cutting down dinner preparation time by up to half an hour. What a time to be alive.
The advert is called “80 years of torment.” I guess it’s meant to be tongue-in-cheek but, after these last few months, for an activist soul it is trolling your effort, your ideals and your hope. It is telling you to GIVE UP, forget any such pretence of a better tomorrow, because the only progress is in rapid-ready snack food. Maybe pop a tub of Pringles open for dessert – they are right there. What even is fruit anyway. A waste of time, that’s what. You don’t have time to prepare a proper meal, what with all the working you have to do. It’s alright though, because the politicians have your back. They are working late too, and what salt of the earth they are, they are also having a burger for dinner. No Rustler’s for them though, they have sent for a gourmet, onion-ringed affair straight from the heart of Dalston, moulded from a hand-fed beast that, before it too was sacrificed for the cause, had a bigger home than you, and a better diet to boot. It was sped to the Royal Mile through the streets of London by a brave cyclist, darting through the taxis and the tourists on her less-than-zero hours contract, risking her life and others’ to make sure the burger gets to the Chancellor of the Exchequer before it loses too much heat.
(Actually, he’s not Chancellor any more is he. He’s a tabloid hack, or a professor, or something.)
He can eat what he wants at whatever job he wants, because he is free to do so. That’s what they say they are about, freedom. They have so much freedom already, and all they want is more. They have always been mercantile, freedom is the silver in the mines of Potosi, the secret porcelain recipe from Jingdezhen, grab it while it’s hot and hold on for dear life. You? You have to earn your freedom. As long as you live the right way, you too can have freedom. Put aside a penny a day for retirement, and you too can enjoy a microwavable burger when you are old. It’s triple-locked.
All this talk about liberty and they don’t even know what it is – they think it cannot be created, only taken away. So that’s what will happen next, now that they have purchased popular consent. Trickle down. The poor don’t want opportunity, they want stability, an unchanging, uncompromising dourness on the face of the Commons. A two-track broken record is comforting, when your work status is precarious, your rent fluctuates with the seasons – waiting for the eviction notice – “I’m sorry, we want families and young professionals to move into your dilapidated, ladybird-infested bedsit” – or the fire alarm. You never quite know who is waiting for you around the corner, these days. Do you feel safe? Not me. The promise of further change? Well that’s just terrifying. A microwave is reassuring, you know exactly when the beep will chime, you know exactly when your meal will reach lukewarm bliss.
It was a question of taking Britain back to the ‘70s, or maybe to the Age of Empire. Bring back the workhouse. Restart the Crusades, at a push. It would be good if everybody stopped time-travelling for a moment and looked out of the window. Capital looks after its initiates, the rest of us make their coffee and bring banquets to the door of another Junior Vice President. It’s in the nature of service. I love microwaves after long days, rip off the plastic and away we go. Technology isn’t simply there to improve our lives, but to make it possible to get more out of our bones. Afternoon tea allowed fourteen-hour shifts, smartphone order apps allow two-hour delivery. A paper-over-cracks health service allows a higher retirement age for those who haven’t spent their time living the right way, with chronic conditions as colleagues because ATOS said so. What a time to be alive.
You don’t get it, do you? Your grandparents never had a microwave. They had steady work, though. And here you have a smartphone. They aren’t for the poor, so you must be rich. Clive of India didn’t have a smartphone, and he never complained. Benefits aren’t for everybody, so they should be for nobody at all. People dressed in grey with grey countenance under miserable skies only see the world in blacks and whites, with us or against. A toasted panini costs more than a week of mobile service, wealth has always been relative, dumbass. Thank you Rustler’s for feeding the people, one coronary at a time, relieving the pressure on our darling NHS. The population is exploding, and you want to keep people alive? You monster.
This is a time to celebrate. Haul your arse to the hypermarket (it might even be on offer), or your friendly neighbourhood food bank, take your meal from the shelf, remove all packaging, and wait for the beep. Take a seat, pick up the damp luxury that greets you, and eat every last bite. Don’t you dare complain you precious little soul about your life of diminishing returns, and remember the immortal words of Harold Macmillan – “you’ve never had it so good.”
Everybody develops a relationship with the American President. They enter and influence everybody’s lives in some way or another – from aggressive acts of war, to domestic health policy, trade deals, and speeches you may have caught wind of. For the past eight years, Barack Obama has been ever-present in our lives, whether we noticed it or not. For what it’s worth, this post is about my two terms with President Obama.
In part it’s a response to the global outpouring of liberal grief following the President’s farewell address, which he delivered with vintage oratorical charm. I completely get this reaction. With a Fascist Satsuma waiting impatiently to nest in the Oval Office, I can’t help but look at the future feeling that the floor is about to fall from underneath my feet. But I struggled to truly relate to this sentiment – I have a far more ambivalent and downbeat attitude toward the outgoing President – one that is nine years in the making.
I remember when I got the Obama bug. I was nineteen – naïve, spotty and meek, living away from home (Birmingham) for the first time. In my first year at university, I was a pale brown kid in a predominantly white middle-class environment that I had not properly experienced before, even though I thought I had. I was seriously confused by the looks, the club security pat-downs, the “where are you froms” and my slow characterisation as an oversensitive chipped-shouldered teenage strop-monster.
It was Obama’s “A More Perfect Union” speech in March 2008 that perked my ears up. It’s curious, looking back on it now, as to why exactly it moved me so much. For the previous few years I’d been affected by post-9/11 racial politics, which had, among other things, made me wary of running for a bus or growing my beard too long. Bush, Blair and Brown had made all of us who were teenagers during the Iraq War desperate to hear something else – a politician who did not acquiesce to the hawkish racial profiling of the Noughties, where anyone not white was expected to behave in a certain manner (….and so it remains). In Spring 2008, as the American political classes rounded on Obama for his association with his preacher Jeremiah Wright – who’d once said “God Damn America” – it seemed as if Obama was to be subject to this policing. I think what I liked most about his response at the time was that it did not feel like an acquiescence, rather it appeared to me that here was a black politician taking command of the debate, channelling his own personal experiences into a speech on race in the USA. He did not seem afraid to take this issue on.
I now believe this speech to be more of a sleight of hand – a nuanced way to disparage Wright, his friend, whilst proclaiming an air of statesmanship. But at 19, struggling to even think about my race without imploding, I couldn’t believe my eyes. At the time, the symbol of Obama’s candidature overrode the underlying crude politics of his campaign. I feel that those who have always been cynical about Hope™ have missed that some of the fervour surrounding Obama was not of his own making. Here, in front of us, was a confident man, a smart-as-nails spine-tingling orator – in the eighth year of Dubya you can imagine how this must have seemed to desperate eyes. After four centuries (and counting) of Jim Crow and his ancestors, to see a black politician speaking passionately and earnestly about an issue from which white America has always preferred to duck and take cover, it felt so far removed from the vacuousness of Bush and the conniving of Blair. It felt so far removed from the USA.
And so I started rooting for him. I stayed up late to watch the primaries and checked the state-by-state polling obsessively. I also read his book over the summer. The first one, Dreams from My Father. I still wasn’t doing great that July, and it helped me to read of how the young Obama came to understand his multi-ethnic roots. It wasn’t particularly profound stuff and he certainly wasn’t the first person to have ever written about these issues, but his book was the one I read at the time and it made me feel slightly less alone, and slightly more comfortable in my own skin. Even now, that still means something to me. Eight years later, when everything Obama does feels very calculated and deliberate, I wonder to the service of what end he wrote Dreams. It is said he wrote it before he decided he wanted to pursue office. I don’t know what to think.
Ending the war in Iraq, closing Guantanamo Bay, preventing drilling in Alaska. Yes we can. Can we? Who’s we? Thousands of miles away, I felt a part of this, and so did many others. A wave of futurism, the audacity of hope. The Nebraska Second District, North Carolina. Indiana. SERIOUSLY? INDIANA? Even the American voters were getting behind this. The President of the USA is black.
Alongside, Lehman Brothers collapsed. Over here, Northern Rock had already fallen. The financial crisis had been bubbling away quietly all summer and now it hit hard. It hit us for years, and all of us who graduated straight into the eye of this storm had quite a bit of fun trying to get jobs. Some of us worked for the Disney Store, or the Odeon, or Solihull drinking holes, and I fixed gearboxes for my uncle for a while. Some of us were paid by the hour, zero hours a week. Some of us didn’t work at all. Obama was elected into this global mess, and he began by rescuing those whose opulence, negligence and man-childish irresponsibility had sent half the world up the creek, without asking very much in return.
But still people believed, and they believe still. There exists and remains among many an unusual amount of faith in the US President – and in politicking in general – that far exceeds the constitutional role a sitting president can take and the historical role presidents have largely played. Obama especially cultivated this belief around his own presidency, and he believes in it himself. He sees himself as a disciple of Lincoln, building behind him a Team of Rivals, believing that from debate and disagreement come good policy and statesmanship.
As for his supporters, many people I know blame The West Wing. It may sound a bit silly, but the show has had a hold over liberal political consciousness for the last two decades. It presents an ideal liberal presidency – a USA run by Jed Bartlet, a genius economist with a strong moral core, ably supported by a gang of beautiful moral geniuses – complete with noble backing music, grand speeches, and rooms full of passionate-yet-civil debates between the absolute kindest representations of DC Republicans and Democrats. The belief among liberals that this is what Washington could be coalesced with Obama and his own self-image of rigorous statesmanship. Obama was as close to the West Wing ideal of the presidency as any yet seen. This is more than mere comparison – Obama was the blueprint for Bartlet’s successor Matt Santos, and Obama-brand Democratic politics was certainly an influence on the show.
I was into this idea then. Hope™ got me good. I’ve always been left, but for a little bit I hoped that putting the right people in charge of existing institutions could provide necessary change. I thought about becoming a human rights lawyer and moving to DC, where things happened (fortunately for me, the world has too many lawyers already). I binge-watched The West Wing in ’08. Like everybody else, I was watching only the veneer of the show. I see The West Wing differently now – likewise I see US political institutions in a very different light. President Bartlet is the perfect liberal candidate, yet in three seasons he transforms from a moral idealist into a stone-faced international assassin, ordering the killing of foreign diplomats from the gallery of a theatre. In eight years he does little but keep the USA ticking over, to the point where on his final day in office he has to be consoled by his wife, telling him over and over that he has done good.
Obama, the New Democrat, was always more of a pragmatist. The American President is assigned to preside over the myriad checks and balances inscribed within the Sacred Constitution, which served to keep everybody *important* happy both in 1789 and for all eternity. On the domestic front, the President is Equivocator-in-Chief, a middle manager in an oval office. In 2008 Obama knew this and believed in it. He knew that it was better for his job security – better for the American President – to bail out the financial sector with little consequence. It was better to avert crises, he thought, than risk destabilising US political life for serious change. In healthcare policy, however, he bucked this trend. Passing the Affordable Care Act (or “Obamacare”) cost Obama a shedload of political capital, but since 2010 the number of Americans without health insurance has almost halved.
In some cases, Obama’s desire for change was handcuffed by the job, especially after the Democrats lost the House. Obama was clearly frustrated and exhausted by his inability to tighten gun control in the face of US firearm culture and a hostile political environment. The USA largely brushed Sandy Hook under the carpet, and there’s quite a company beneath that rug. It’s difficult not to grow jaded in the face of such national carelessness. However, the excuse of presidential powerlessness runs very thin in other examples.
I think often of the black people killed by law enforcement. Crimes that go unpunished. Crimes committed with impunity. When Trayvon Martin was shot dead by neighbourhood watchman George Zimmerman, Obama was able to strike a small gesture of empathy, perhaps trying to educate white America as to the daily dangers of being black in public in the USA when he noted that Trayvon could have been “my son” or “me 35 years ago.” Yet Obama’s great “conversation on race” never really proceeded past this point, as the list of names grew ever longer, the executions ever more public. When Mike Brown was killed, his body lying in the streets for hours, protests erupted in Ferguson, MO. Where were you then Mr. President? Where the fuck were you? The President could only ever muster some horseshoe centrism about “both sides,” and a plea for “non-violence” as law enforcement donned riot gear and rolled in on armoured vehicles. Obama ducked and took cover.
I moved to DC for six months in 2014, on a fellowship to continue studying Haitian history. I was there when the ruling was delivered that Brown’s killer was not to be indicted, watching the news in a diner on the Hill. They waited until the evening to tell everybody what we already knew, delaying it, seemingly in the hope of softening the response. In the days following, the President stayed true, as he always had done, to vague appeals to “peace,” dialogue and patience, following in the footsteps of some fictionalised, diluted version of Dr. King (it recalls also his speeches on gender, which were so often injected with grating “wives and daughters” rhetoric). These seemed like empty words, but in an ever-divided USA they were a cold shoulder to some of his most faithful constituents. If the President is Equivocator-in-Chief, Obama increasingly seemed willing to earnestly play this role. It no longer held true that he was simply a man hampered by his institution.
Every major city experienced a massive mobilisation of people protesting this Ferguson hatchet-job. In DC there was a huge march that night, and sporadic marches bubbled up through Washington for the next few weeks. By then, it felt increasingly like the president was not on their side.
Then there’s foreign policy. The “Obama Doctrine” fittingly seems to defy definition, opaque to the core. Riding into office on a wave of anti-Iraq fervour, Obama quickly sought a way to continue business-as-usual whilst appearing to have changed tack. The pledge to close Guantanamo – carved out of Cuban soil, that symbol of US hawkish interventionism fuelled by extraordinary rendition – simply disappeared.
Drone warfare suited the Obama Doctrine – intervention without deployment – to a tee. Throughout the world, the USA was no longer to be seen, but always to be felt, often with devastating effect. However, this misdirection, coupled with the scaling-back of US boots on the ground, was enough to convince the Nobel committee that we were back on the road to utopia. Yet there have been some changes. Obama has moved to warm relations with Iran and Cuba – moves for which he has been called a traitor and a communist. He received similar vitriol for expressing sympathy with Trayvon Martin over his racist assassin, and when he pushed through the Affordable Care Act.
The intervention with which I am most familiar was in Haiti. In 2011, Secretary of State Clinton personally intervened in the presidential elections in order to place Michel Martelly, a friend of US interests in the country, in a position where he could take power. Ever since, US policy toward Haiti has actively discouraged Haitian democracy from behind the scenes. Clinton receives much of the criticism for this policy, but it was conducted under Obama’s White House and with Obama’s consent. It was an act that continues to harm Haiti, and it upsets and infuriates me still.
By the time I made it to DC, the allure of “being there” had long evaporated. The Capitol Building outside my window held no romance in its scaffolding. Inside, its rotunda was adorned with stylised images of the conquest of the New World, and its stewards that November had moved decidedly rightward. But it was a city that captivated me regardless – DC is beautiful, cosmopolitan, and well-and-truly alive with the past and the present. It was here I saw President Obama speak.
For all of his fame as a great orator, the President gave a pretty strange speech about Jesus, and how the Messiah was basically a stand-up gent. It was a talk that would have fit in well on U Street at 2am. Michelle Obama read “’Twas the Night Before Christmas” and was, of course, brilliant. It was a weird feeling, seeing the President there in front of me. It was like going to a gig to watch a band you loved as a kid but had long since come to find a bit tacky and issue-riddled. It was underwhelming, yet there was still something about them you’d not lost – a reason you’d liked them in the first place, when you were younger and didn’t think about things so much.
The President on occasion still had the ability to pull my strings and echo the man I used to think he was. When he sang Amazing Grace in South Carolina at the memorial for the victims of Dylann Roof’s mass murder, my cynicism briefly melted away and I burst into tears. But then I think about how Roof was taken for dinner by his arresting officers, whilst police shoot down black kids with toy guns, and I realise this feeling is almost entirely window-dressing.
Of course I’ll miss him. I’ll miss the symbol of a black president in the US seat of power – the influence of which cannot be overstated. I’ll miss the days when there was not a vicious, race-baiting kleptocratic sexual predator in the White House, and I’ll certainly miss not existing under the daily threat of global annihilation. Trump is a visceral reminder that the USA – and the world it influences so greatly – can yet fall leagues below its current state.
Gary Younge argues that “judged by what was necessary, Obama was inadequate; judged by the alternatives, he was a genius.” Similarly, by the relative standards of those who have previously held the Office of the Presidency, Obama will surely rank highly. But this isn’t sufficient to give Obama a free pass and view him solely as a brief period of sanity between two destructive Republicans. These have not been eight years of national and global healing – many of the tensions that boiled in the Bush era have persisted and, in some cases, worsened.
What then, of hope? Was Obama’s dream of a New America just a lie? A simple ruse with which to take the Presidency? These last eight years I’ve definitely changed. My youthful idealism has vanished, and I view the way US Government works with fatigued scepticism. I don’t blame President Obama for this. When the promise is broken, you go on living. He was by no means the sole author of the hope that swept him to power, nor did he co-opt it completely. Any frustration I have now with Obama is not rooted in any feeling of betrayal; rather it’s the result of my concerns with the choices he made whilst in office. The hope that I had felt was a product of misunderstanding the role of the Office of the President, believing that Obama was the liberal idealist he presented himself to be, and lastly believing that liberal idealism itself was sufficient to transform the Office of the President. Obama’s presidency helped me learn this lesson. Nowadays, if most of my heroes don’t appear on a stamp, then certainly none of my heroes have been President of the United States of America.
Nowadays, I look elsewhere for my hope. At this moment, we need it to give us strength in opposing everything the 45th president will throw at the world. President Obama’s long-forgotten pastor Jeremiah Wright was right about hope’s audaciousness.
In spite of a being on a world torn by war; in spite of being on a world destroyed by hate; in spite of being on a world devastated by distrust and decimated by disease; in spite of being on a world where famine and greed were uneasy bed partners; in spite of being on a world where apartheid and apathy fed the fires of racism…her harp all but destroyed except for that one string that was left – in spite of all these things, the woman had the audacity to hope. She had the audacity to hope and to make music and to praise God on the one string she had left.
On the worst days, I still hold on to this, tightly.
Today is the first Tuesday after the first Monday of November in a leap year, which means only one thing – it’s the 1463rd day of the US presidential campaign!
Election day, it’s nearly over. Like a sacred Leap Day, or a planetary alignment, this Tuesday is the only day in four years when nobody is running for president. For Hillary Rodham Clinton, said to have been running since at least the day she last left the White House, it is likely* to be the last day she is not President of the United States of America. In turn, in January 2017, Clinton is likely* to become the first woman president.
(*based on 538’s 69% chance of a Hillary White House. no sure thing. UPDATE: 4:40am here, looks like Trump’s gonna win)
She would (figuratively) get the keys to her new presidential mansion – creatively named the “White House” after its fair complexion – sometime in the early hours of Wednesday morning, so long as at least 270 members of the electoral college pledge for her instead of her rival, Donald J. Trump.
This manner of selecting a Brand New Overlord dates back to the very first election, when 69 electors gathered in 1789 to pick the first president. Each elector was given two votes, on the understanding that all would give their first vote to George Washington, and the candidate who received a plurality of the second votes would win the prize of Vice President, which went to John Adams.
Of course, there was nothing democratic about this initial selection. Only the states that had ratified the constitution got to take part, with apologies to the indecisive North Carolina and Rhode Island. New York fell out with itself, so wasn’t allowed to play either. No matter, they’d have chosen Washington anyway. Only six of the ten participating states held a popular vote for their electors, and in those only free men with sufficient property were eligible to vote.
The Electoral College has managed to outlast many of these old ways, mainly because it has been sanctified in Article II, Section 1 of the Constitution. Today, US voters indirectly vote for Hill or Donny by voting for pledged electors, “stand-ins” for their will. Each state gets as many electors as it has Senators and Representatives in the House, and DC gets three too under the terms of the 23rd Amendment. Each state is winner-take-all (except Maine and Nebraska, but let’s ignore them today).
In the old days, there was nothing holding these electors to the vote other than a Gentleman’s Agreement. Reneging was common; it happened in every election from 1796 to 1808, and frequently after that. Such characters were known as “faithless electors.” In 1820, one generous New Hampshire elector gave his vote to his pal John Quincy Adams. How kind – Adams wasn’t even running that year. It wasn’t always intentional. In 1864, Nevada only cast two of its three votes for Lincoln, because one poor soul, on his way to vote, got snowbound in Colorado.
In 1824 John Quincy Adams actually ran, and he set a few records along the way. It was the first election where they recorded the popular vote, and he won with 30.9% of it. That may seem low – because it is. He didn’t win the popular vote. Andrew Jackson got 40 000 more votes (41.4% of the total vote), and even got 15 more electors. However, Jackson didn’t take a majority of electors, and so the decision went to the House of Reps, or more accurately, a dusty, mysterious Washington office – these days the natural habitat of Cigarette Smoking Men leaning on a filing cabinet. There, Henry Clay gave his support (he’d won 37 electors) to Ol’ Quince, handing him the presidency.
Some say Clay did it for the position of Secretary of State, which he duly received. Others point out that Clay was politically closer to Adams, and he thought little of Jackson, proclaiming that “I cannot believe that killing 2,500 Englishmen at New Orleans qualifies for the various, difficult, and complicated duties of the Chief Magistracy” (wonder what he’d make of hosting the Apprentice).
Adams remains the only person to have won the presidency through the House under the 12th Amendment, and as the first child of a former President to follow in his father’s footsteps, he founded the first Presidential Dynasty, a practice that has become increasingly popular in recent years (google Chelsea 2024, for further information).
Adams, however, was not the last president to lose the popular vote but win the White House, thanks to the wonders of the Electoral College, a system whose beauty is supposedly in its simplicity but hides unending complications. It happened twice in the post-Civil War era, when there were a series of close elections – marked by mudslinging, shady deals and assassinations, as the USA struggled to reconcile its differences. It happened in 1876, when Rutherford Hayes won the college by a single elector (more on that anon). It happened again in 1888, when Grover Cleveland was temporarily evicted from the White House by Benjamin Harrison. Most recently, Al Gore won the popular vote by 500 000 in 2000, but George W. Bush took** Florida by 537 votes and with it came the White House.
(SIDEBAR – It seems a preposterous system in these instances – but I’m not going to pretend I’m sat on some high British horse – the UK’s current government won a parliamentary majority on just 37% of votes cast, and 24% of those eligible to vote. The current Prime Minister was selected by a grand total of 199 people. That’s just the way of things.)
Elephants and Donkeys
The USA has a two-party system. It has gone through a grand total of six party systems over the years, but the last few have all involved the Republicans and the Democrats. Both were originally founded for a purpose, but have shape-shifted a few times over the years, changing bases and constituencies in an eternal quest for power. The Federalists, Anti-Federalists, Whigs, Anti-Masons, Know Nothings, Bull Moosers, Progressives, Dixiecrats and Reformists have all come and gone, but the long-standing rivalry between the Reds and the Blues has stood firm.
The Democrats, symbolised by the donkey, sprouted from Thomas Jefferson’s (now confusingly named) Democratic-Republican party. They initially saw themselves as the defenders of individual liberty against the malevolence of central power (embodied by Quince and Clay’s 1824 handshake), but as much as anything it became the very model of a modern political machine.
The Republicans (who claim the Elephant as mascot) were founded as an anti-slavery party in the 1850s, and quickly found support as the Whigs and Democrats pulled themselves apart in the slide towards civil war. Under William McKinley, the Republicans began their courting of Big Business, whilst the Democrats, retaining an element of southern populism, moved steadily towards a social democracy characterised by FDR’s “New Deal.” Things changed again in the ‘60s, when the Democrats ceded the “Solid South” after their lukewarm embrace of the Civil Rights Movement. Meanwhile, Richard Nixon formed a “Southern Strategy” where the Republicans would say thinly-veiled racist, segregationist things to court the Deep South over to their side. Then Reagan came to town in the ‘80s and turned the entire USA over to neoliberalism (twas fertile ground, some might say), before Bill Clinton and the New Democrats responded by diluting the New Deal to incorporate the spirit of the Gipper.
So that’s how the parties came to look like how they look now. Sort of. They disagree on a fair few things, such as climate change, abortion, and the name of an east coast NFL team. But on many issues the two parties aren’t too far apart, such as taxes, foreign policy, business, trade, welfare, and the USA’s self-styled status as the “Leader of the Free World.” With the exception of Hillary and Barack, they’ve also tended towards wealthy, old white male candidates.
Their similarity is in part due to the centripetal nature of the Electoral College, and the parties’ longstanding record as efficient, election-winning political machines. It sits in striking contrast to a US society that is once again ripping itself apart – a divide drawn plainly on the electoral map. The USA is growing polarised along the fault lines of race, class, gender, policy and religion, and states’ voting habits increasingly reflect it. Swing states are becoming a rare breed. This phenomenon is not unique to the States; it’s happening here in Britain, starkly illustrated by the 52-48 Brexit vote. In the UK, our party system has splintered, but across the Atlantic the hegemony of the donkey and the elephant has held firm.
Sorry, Ross Perot.
Why? Well, US politics is a big money industry. It is difficult for a third-party campaign these days to compete with the big guns. Another reason is because of our good friend the Electoral College. As with much of the USA’s structure, it was designed to ensure that no one area could dominate affairs by racking up huge majorities in specific regions, whilst simultaneously ensuring the interests of regions and individual states are heard through its winner-take-all model. It’s nifty like that.
A successful third-party candidate has to compete across the country, whilst building, somewhere, a regional base of support stronger than that of the two main parties’ candidates. You need to be flush with cash to do that. Yet the USA has a lot of love for plucky outsiders. Perot did well in ’92, gaining 19.7m votes (19% of the total), but didn’t earn a single electoral vote.
In 2000, there was still a lot of frustration with the “lesser of two evils” choice that the main parties were now serving up. 2.8m voted for Green Party candidate Ralph Nader in an election where Bush and Gore were separated by just 500 000. In Florida, Bush was given the victory after weeks of recounting – lawyers everywhere – by just 537 votes. Nader had got 97 421 in the Retiree Alligator State. So many things can cause a 0.009% gap in an election. Weather, traffic, the 562 votes cast for the Socialist Workers Party, the “Butterfly Ballot” that supposedly encouraged votes for minor parties, hanging chads, votes denied to 1% of Floridians (and 3% of black voters) on account of being a “felon” including for crimes said to have been committed after the 7th November…buuuuuut for the most part Nader got the blame for taking Gore’s votes. It could be argued that the two-party system is so rigid in the States that Nader and his voters were naïve; for my part, though, I’m uncomfortable with the notion that a candidate can take another’s votes, as if a candidate can own a vote before it is cast.
Third party candidates weren’t so popular after that. It’s easier these days to be an insurgent within one of the main political machines, thanks to their fluid ideologies and the Primary system of candidate selection, where anybody with enough cash or support can make an honest run at being a Democratic or Republican candidate for the presidency. It’s what Trump, Cruz and Sanders have tried this time around. Maybe we’ll see more of it in the future, especially on the red side. Once you’ve got the nomination, it seems the USA is so riven in two that you’ve still got a chance at the White House. No matter how openly megalomaniacal you are, no matter how abusively racist and sexist you are in public and private, no matter how much of a nuclear-fallout-after-a-trainwreck-landslide-Godzilla-attack candidate you are, you’ll still likely do better than Dukakis. That’s just the way of things.
The Immortal Jim Crow
The voter suppression tactics that swirled around discussions of Florida 2000 were no stranger to presidential elections. They are no stranger still.
Back to 1876. Rutherford Hayes won by a single electoral vote, having lost the popular vote to Samuel Tilden. Tilden had taken 184 electors, but three Southern states, Florida, Louisiana and South Carolina, were yet to officially declare, amid reports of voter fraud and suppression that particularly targeted African-American voters. Importantly, these three states had Republican governors, and together their electors would see Hayes over the line. Here the Republicans set up “returning boards” to recount the election, root out Democratic voting fraud, and maybe doctor some results of their own.
By 1876 the Democratic voter suppression racket was fully operational. The party hacks forged alliances with Southern paramilitary groups the Red Shirts and the White (Man’s) League to intimidate black voters and break up Republican organisation in the South. It was working, and for the first time since the Civil War the Democrats looked set to regain the region, sweeping even those districts with massive black majorities. Had there been no vote-mangling at all, Hayes would likely have carried much of the South.
Unsurprisingly both sides claimed victory, each accusing the other of fraud. It got incredibly heated, and there were fears that a second civil war could erupt. Eventually, it (officially) went to Congress, where a Commission voted 8-7 (along party lines) to give the states to Hayes. Secretly, however, in another smoke-filled room, Hayes met with senior Democrats, promising federal spending in the South and, importantly, the withdrawal of Federal troops from the region.
This ended Reconstruction, handing a monopoly of Southern violence to racist groups such as the Red Shirts, who would incorporate themselves into state militias. In exchange for a Republican presidency, the party ceded control of the South to their rivals, abandoning the newly enfranchised former slaves. Over the coming years, Democrats constructed a framework of laws alongside a widespread system of intimidation that locked out African-Americans from voting and running for office and denied them a whole host of civil liberties. This was the Jim Crow South, where black people lived segregated from white people in an Apartheid enshrined by the Supreme Court (Plessy v Ferguson, 1896). Although emancipated, ex-slaves in the South were not yet free.
Jim Crow was largely felled by the Civil Rights Movement, and the Civil Rights Act of 1964 (on paper) ended disenfranchisement on the basis of race. Yet, as Florida 2000 shows, it still goes on. In 2016, it appears to be making a strident comeback, alongside the white nationalist fervour of the Trump campaign. Poor, minority areas across the USA generally have fewer voting stations, with fewer staff. Voting takes place on a Tuesday, and the polls close in the early evening. Those with long, unforgiving jobs may not be able to spare enough time in the day to queue to vote. Voting bans on felons – the USA is the incarceration capital of the planet – take millions off the register and disproportionately affect black people. In North Carolina, over 6000 voters, mostly black Democrats, have been taken off the register in a process illegal under federal law. Jim Crow lives. It never really went away. That’s just the way of things.
Until Next Time…
Robert McCrum in the Guardian says that many believe the electoral system to be broken, “but it has seemed broken before and somehow staggers on.” Maybe. Maybe it’s worked fine for those it is made to serve. Maybe, like the Second Amendment, the Electoral College is so ingrained into the American fabric first woven by the Founding Fathers that to change it would be considered treasonous. Maybe, as when it was first created, the Electoral College keeps the lid on American tensions and papers over the cracks of this nation. Either way, it isn’t likely to change any time soon, but the way the USA has chosen its president over the last 200 years has had a great bearing on who ends up in the White House, affecting all of our lives from that oval office.
Soon we’ll know who that next person will be. In the meantime, relax. The next election begins in less than twenty-four hours.
Disabled athletes have, however, competed at the very top of elite sport for almost as long as the Modern Olympics. In an era when disabled people were often hidden from view, these pioneers demonstrated that paralysis, amputation or illness would not stop them reaching the peaks of their fields, and in some cases the athletes’ disability even served to unlock their potential. Some, like Lis Hartel, built a legacy that inspired future athletic and therapeutic achievements. Others, such as George Eyser, are more enigmatic. Yet all of these stories remind us that disabled people have long resisted the societal imposition of limits upon them, and they still hold the power to challenge this notion today, as new stories are told in Rio over the coming weeks.
Ray Ewry – “The Frog Man”
“Ray Ewry wasn’t even supposed to walk,” writes Eric Adelson, but this American, struck by polio as a boy, won (at least) eight gold medals, a record that stood until Michael Phelps came to town. In fact, he never lost. Ewry competed in standing jump events, which sadly have long since fallen off the Olympic roster. He leapt 1.66m in the standing high jump at Paris 1900, whilst also winning the standing long and triple jumps, leading Parisians to christen him “L’Homme Grenouille” – The Frog Man.
Aged eight, the orphan Ewry was wheelchair-bound with ascending paralysis, and in an attempt to regain proper leg function, his therapist prescribed a series of exercises that extended and contracted the leg muscles. In this way Ewry learned to walk again, and year after year his legs grew stronger. By the time he had graduated from high school he had moved to crutches, which he was able to abandon the following year. The therapy he was prescribed bears many similarities to plyometrics, the modern elite training technique for building explosive power in the legs. Ewry set out solely to walk again, but became stronger than everybody else.
George Eyser – Amputee Turner
The 1904 Olympics were held amidst the “Louisiana Purchase Exposition” World’s Fair in St. Louis. A commemoration of this great leap toward Manifest Destiny unsurprisingly became a disturbing parade of all that had driven United States expansion over the previous century. Amerindian, Filipino, African and Islander peoples were paid to “perform” their expected backwardness, exhibited like living artifacts to lend credence to white American exceptionalism.
Central to the Expo was the “Philippine village,” wherein residents of the (US) occupied territory were made to live out – on display – a day-in-the-life of a rural Filipino. The great Haitian Jean Price-Mars, attending the festival, recalled seeing “two young [Filipino] Blacks…surrounded by an excited crowd that was subjecting them to all sorts of indignities.” James E. Sullivan’s Olympic showcase mimicked these proceedings, hosting a pair of “Anthropology Days” at St. Louis 1904 that took untrained, unsuspecting participants from the fair and made them compete in a series of events; they inevitably struggled, even at so-called “savage-friendly” events like the javelin.
Sullivan proclaimed this farce to be evidence of white supremacy, whilst the massive medal haul achieved by the US (no wonder, when they provided 523 of the 630 Olympians, and were only challenged in 42 events) was heralded as proof that the USA represented the ideal of civilisation. Only six women competed, as the Olympics continued to clamp down on women in sport, and the sporting disaster dragged on for over three months. It was crowned by the marathon, in which the first man home, Frederick Lorz, had travelled eleven miles by car, and competitors were deliberately denied water on the course because the organisers wished to test the limits of the dehydrated human body.
Amidst the chaos, German-American gymnast George Eyser won three golds, two silvers and a bronze. He had one leg; legend has it that as a child, he lost it after a run-in with a train. However, Eyser, born in Danisch-Nienhof, was entrenched in the German “turnverein” culture of the 19th Century that encouraged gymnastic practice (or “turning”) as a means of achieving Germanic physical potential, and cemented itself in US society thanks to the millions of German migrants who arrived in the USA after 1848. Eyser was not a rich man – he worked as a bookkeeper for his entire life – but had acquired an advanced prosthesis that enabled him to perform his craft. He competed in these Olympics as a member of the Concordia Turnverein, run out of St. Louis, and won the Rope Climbing and Parallel Bars outright, and tied for gold in the Horse Vault.
Eyser’s achievements are often forgotten amid the trainwreck that was the “strangest” Olympics. A Wall Street Journal article even subsumed him within it, using the fact that a one-legged gymnast won three titles to suggest the entire Olympiad was as illegitimate as the Marathon. Although nearly all of Eyser’s rivals were based in the USA, the competition was not weak, and he collected his haul of medals by defeating some of the finest gymnasts of his generation. Eyser was the first amputee to compete in the Games, but his life after 1910 is barely known.
Olivér Halassy – The Greatest Halfback
Hungarian water polo halfback Olivér Halassy also ended up on the wrong side of public transport, losing his left foot in a streetcar accident. The foot is an important tool in water polo – it helps a player stay afloat, change direction quickly and launch out of the water – yet Halassy came to be considered the greatest halfback of his era, winning a silver and two golds as part of the fabled Hungarian water polo team of the 1930s, and scoring twenty goals along the way. These mighty Magyars also won three European titles, and in 1931, hours after their victory, Halassy jumped back in the pool and won the 1500m freestyle.
His final gold came in Berlin 1936, where Hungary, complete with a disabled swimmer, overcame the much-fancied, regime-backed German outfit that aimed to demonstrate able-bodied Aryan superiority. These performances posthumously earned Halassy a place in the Swimming Hall of Fame, but unfortunately his life was cut short in 1946. Late one evening, on the way back to his Budapest home, Halassy was shot dead by a Soviet soldier, leaving his heavily pregnant wife bereft.
Two years after Halassy’s death, Károly Takács followed in his countryman’s footsteps and won gold in London 1948. In the 1930s, Takács was a world-champion pistol shooter and a sergeant in the Hungarian Army. However, in 1938, a defective grenade exploded in his pistol hand, rendering it useless. Takács was hospitalised for a month, during which his hand was amputated to the middle of his forearm. Upon release, he secretly began to train his remaining left hand in the art of pistol shooting, and a year later, he unexpectedly appeared at the World Championships. Legend has it that there he proclaimed “I didn’t come to watch, I came to compete.” He won. Nine years later, at the first Olympics held after the Second World War, Takács won gold with a world record in the 25-metre rapid fire pistol, and retained the title four years later in Helsinki.
The Danish equestrian Lis Hartel came from a family of hippophiles – she was a horsey person. In the early 1940s, Hartel twice became Danish dressage champion upon her unfortunately-named steed Gigolo. In 1944, whilst pregnant with her second child, Hartel contracted polio. The child, Anna, was born healthy, but Hartel, aged 23, was left paralysed below the knee for the rest of her life, and also suffered damage to her thighs, arms and hands. After gaining enough strength to walk with arm crutches, Lis Hartel learned to ride again with the family horse, Jubilee, chosen for the task by her parents for her quiet temperament.
“They told her she would be lucky if she improves to walk on crutches again,” recalled her daughter Pernille Siesbye on Eurodressage. “She was lifted in the saddle and first guided in walk for her to get a feeling for the movement again. Step by step my Mum became more independent and finally rode on her own.” Horse riding requires strong leg and core strength for balance, and Hartel fell badly on many occasions as she struggled to adapt to her disability. Jubilee learned “that she had to react only to weight and back aids,” because Hartel now “rode with her back and by gently shifting weight, because she was unable to use her legs in any way.” Hartel commanded Jubilee with the softest, subtlest of movements. She did not have the strength for anything more forceful, but it suited the tasks of dressage and the gentle nature of Jubilee.
Soon they were competing again, but Hartel had to wait until 1952 to reach the Olympics. Before then, equestrian was only open to male military officers; a restriction that was lifted at Helsinki for dressage, but not for jumping or eventing, which the Olympic committee still deemed too dangerous for women and civilians. Hartel entered the arena as one of the first four women ever to compete in Olympic equestrian. Her routine captivated the crowd, who were unaware of Hartel’s paralysis until she finished and had to be carried off her horse by the gold medallist Henri St. Cyr. Hartel claimed the silver, becoming the first woman ever to medal in equestrian. She repeated the feat in Stockholm four years later.
Her greatest achievement (in her own eyes) was yet to come. Upon retirement, she founded the first Therapeutic Riding Centre in Europe, and through her advocacy work with the Polio Foundation, she is now “widely credited with inspiring a worldwide effort to better peoples’ lives through horses.” Hippotherapy has since been accepted as a highly effective therapeutic treatment for those with muscular afflictions such as cerebral palsy and multiple sclerosis, and has also been used for psychotherapy. The rhythm of horse riding replicates the pelvic movements of walking, strengthening posture and thighs. Hartel died in 2009, but left a legacy that includes the rehabilitation of thousands, the demolition of equestrian’s glass ceiling, and the growth of dressage as a Paralympic sport. Her efforts live on also through the actions of the Lis Hartel Foundation.
Yes, They Could
The greatest Olympic moments – Jesse Owens, Ali, Carlos and Smith, de Lima in the Marathon, Cathy Freeman – were not only sporting conquests but also triumphs over personal and societal pressures that stifled them. The wonder of the Paralympics is that every moment forms a public challenge against a world that denies the abilities of the disabled. Channel 4 calls them the Superhumans, but this article isn’t just to celebrate the remarkable individuals discussed above. Rather, the stories of Ewry, Eyser, Halassy, Takács, and Hartel demonstrate that people with physical disabilities have countered the derision of ableism for a very long time – long before the worthy events at Stoke Mandeville took place – and the Paralympic movement owes much to these pioneers.
“See this watch she gave me, it still ticks away, the days I’m claiming back for me.”
Eels, The Medication is Wearing Off
Into the Royal Albert Hall on a sticky Tuesday evening, to watch the Manic Street Preachers play Everything Must Go, 20 years after it was first released. This was their second anniversary tour in as many years. Last time out they reprised The Holy Bible, to commemorate the two decades that had passed since Richey James Edwards had disappeared, his car abandoned near the Severn Bridge.
I’m suspicious about bands playing albums live – usually it’s a money-spinning, pigheaded indulgence in their past – but the Manics, as I wrote of the Holy Bible tour, are not the sort of band to partake in such hollow reprisals of now-irrelevant back catalogues. Instead, The Holy Bible’s resurrection was a scathing reminder of how Richey’s rebuke of End of History triumphalism remains dishearteningly pertinent.
My first reaction to hearing about the EMG tour was excitement – it’s one of my favourite albums, and I have often turned to it in moments of grief. I handed my ticket in knowing I was in for a great show, but I wasn’t quite sure why the Manics were dragging 1996 kicking and screaming back into the present. It made sense, somehow, that if you bring back the Holy Bible you have to follow it up with Everything Must Go. But whereas the Holy Bible externalises anger to rip down the hypocrisy of those who benefit from a violent society, Everything Must Go is an internal struggle that finds the band wrestling with the pain of Richey’s decline and disappearance. After a twenty-year journey in which the bereft Manics have marked a new path as a trio, why had Nicky, James and Sean decided to revisit the raw emotion of this schism? Surely it wasn’t just to go through the motions, in a pale sequel to last year’s tour?
Far from it.
There was a (very) small part of me afraid that without the enduring pertinence of the Holy Bible Tour, and given the album’s association with mid-90s Britpop nostalgia, the EMG reprisal could fall short and end up an overweight, out-of-date impersonation of a moment that was no longer there. At the NEC in Birmingham (yes, I was there too – long story), there was a sense that the band themselves were concerned it would be read this way, and on more than one occasion James thanked the crowd for coming to listen, hoping that the show “hasn’t ruined the memory of the album” for anyone. This uneasiness was itself an echo of last year, when the Manics were hedging themselves, eager to explain that to play the Holy Bible was a personal (and difficult) necessity, but still far removed from self-indulgence.
However, in the Royal Albert Hall, I found this fear to be grossly misplaced. Everything Must Go was, in 1996, a journey taken by the trio downwards toward the darkness that had enveloped their friend, alongside the upward steps toward the sun within which is held the strength to live in grief. In this way, it is the thematic as well as the chronological successor to the Holy Bible.
Richey posthumously provides five lyrics to Everything Must Go. His words in the Holy Bible were fuelled by anger – railing against the violence of humanity, the music and the lyrics jettisoned his anguish. But there were warnings, noticeably in Die in the Summertime (“scratch myself with a rusty nail, sadly it heals”), that this was only a temporary solution, the last flares of a dying star before it collapses in upon itself. The EMG songs that come from Richey’s pen find the lyricist in a concerning empathy with the victims of the societal cruelty he documented in the previous album. What unites his characters on EMG is their fate – they are trapped, consumed by their own hand or by that of others. If “man kills everything,” these are its victims, the Removables, or those who are caved in by the reality that surrounds them. The Elvis Impersonator who opens the album with limited face paint can rise no further from his life as a Blackpool sideshow, existing only in the drunken eyes of Lancashire nightlife.
“It’s so fucking funny, it’s absurd.”
Small Black Flowers that Grow in the Sky, seeing through the bars of the decrepit, caged zoo, sympathises with the captive chimpanzee, who has nothing but a tyre swing to entertain the masses that rattle her cage – the Simian cousin of the Blackpool impersonator. The Girl Who Wanted to be God nods to Sylvia Plath, “spat out” by Faster’s protagonist, but now a touchstone for our tortured lyricist in his final moments. Track 3 tells the true story of the photographer Kevin Carter, who won a Pulitzer Prize for his voyeuristic photograph of a malnourished black child being stalked by a vulture, and who took his own life – unable to live with the terrible things he had sat and watched, waiting, like an ecologist, for the right moment to capture for his fortune. Richey’s characters are trapped in a cycle of pain, and death forms the only escape.
For Nicky Wire, staring at a blank piece of paper in 1995 with his friend vanished, the task was to grasp the darkness that had taken Richey, and from there forge a path for himself and his band out of their grief. How do you comprehend a suicide? To attempt to empathise with this agony is to plot a journey toward the pain that overwhelmed a loved one. In his songs Nicky is often his own protagonist. In Australia, Nicky finds himself subsumed by a rising tide of depression as he tries to make sense of the past few months – “I don’t know if I’m tired, I don’t know if I’m ill” – and he responds by fleeing, as far away from everything as he can possibly get, to recover. Live in Birmingham, he confessed that while he was writing this lyric, he only made it as far as Torquay.
Nicky’s songs are filled with determination in the face of this pain – there is a growing understanding that he cannot run forever, and an acceptance that if he is to be alive, he must live with the chronic pain of losing his friend to suicide. In Enola/Alone, he concentrates on the comfort provided by the simplest of actions, walking on the grass, and taking in every moment of being alive, finding the strength to look at an image of his lost friend and hold on to a better memory. The title track has James screaming the line – “Just need to be happy” – like a tortured mantra: if he sings it with enough feeling, it *might* just come true.
Interiors gives us the story of the artist Willem de Kooning, who continued to paint as Alzheimer’s overtook him. It is the keystone of the album, of a man who, through all his suffering, held on to the very fabric of being alive for as long as he possibly could. As the final solo of the album rang out through the hall, a ticker tape explosion fell upon us. Red, white and green. On the screen appeared the Stanley Kubrick quote “however vast the darkness, we must supply our own light.” James later appeared with his acoustic guitar, and played his cover of Raindrops Keep Falling On My Head, the first thing he recorded after Richey vanished, in a final act of defiant existence. The crowd all knew what this meant.
“What’s the point of always looking back?”
In some ways, the Manics were, this night, once again carrying an old album out of the past and into the present. “We’ve brought it back from the brink,” James explained – by which he meant from the brink of falling into the dustbin of Britpop history. Nicky recalled in the Guardian that Richey used to love the confusion of it all, and he of all people would have appreciated the contradiction of the Manics finding mainstream success with this record. It wasn’t the intention to sound like Britpop’s cousin – A Design for Life was in fact a response to songs like Parklife that hollowed out and vulgarised British working class life. Yet for a band attempting to confront a friend’s suicide head on, to be “co-opted into Britpop,” placed into the same bracket as nonsensical Noel Gallagher lyrics, must have smarted. Bringing EMG “back from the brink” is a personal necessity for the Manics, so that the grief they express in this record is not diluted or whitewashed.
Everything Must Go did not need salvaging in the same way the Holy Bible did – its social criticism is no more scathing now than it was in 1996. But it needed to be played this night, and played in full, all the same. In many respects this is the true anniversary of Richey’s passing – the moment when the band accepted his disappearance – and to wrench open the scars left by the nihilism of the Holy Bible requires Everything Must Go to restart the process of patching up old wounds.
Everything Must Go is a triumph – for Nicky, James and Sean, and for all of us who have been wracked by bereavement and ploughed on nonetheless. But this album is not a moment of happiness, and it is not only the Britpop association that threatens its meaning. To scream that “everything must go” is not to heal yourself of grief; rather it is to accept that time cannot heal old wounds, it only numbs. Further Away – “feel it fade into your childhood” – was this night adorned with images of Big Pit, eroded Welsh beaches and eroding colliery towns. “The further away I get from you…the harder it gets.”
This acceptance that grief never disappears, but only fades, adds to the tragedy of the record with each passing anniversary. This is why they bring Everything Must Go into the 21st Century. To reprise the immediacy of bereavement is to defy the numbing of time, and to recall the pain they felt in losing a friend is to recall why they loved him in the first place.
“I’ve been here for much too long. This is the past that’s mine.”
Strangely, there was comfort in this message. This was not misery tourism, but healing. Many in the audience empathised with this pain – and I don’t just mean those old enough to miss Richey too, but all those who know that you carry grief with you, wherever you go, and that in some form this is a way to keep the love you had for that person alive. There is comfort in sound, as another Welsh band – Feeder – proclaimed when they lost a band member to suicide. That night, there were lighter moments too – in the joy of Show Me the Wonder, and Nicky’s eccentric liner notes between songs. “This is not the only anniversary we are celebrating. Ten years ago, I Killed the Zeitgeist.”
In the Albert Hall, the music rolled through the floor, bounced off the roof of the auditorium, as, after the album was played and passed, we all belted out Roses in the Hospital, in tribute to Richey and to David Bowie, in a punk rock Last Night of the Proms, where the Welsh dragon was draped over the balcony in lieu of the Union flag.
That night, the chords were loud and simple, the melodies catchy yet plain, not in mimicry of Britpop, but in contrast to the nuance of the Holy Bible – it is raw, emotive music, contradictory yet plain – no surface and all feeling. Catharsis needs simplicity.