The Suicide Comedies of Jack Lemmon

We live in a weird moment for suicide humour, where it seems simultaneously omnipresent yet also impossible to find. Go on social media and you’ll find millennials making endless jokes about wanting to die. A lot has been written about this tendency and how it acts as a form of catharsis for a generation with very little to look forward to in life. It’s a way to spit up a bit of the poison that we’ve spent our whole lives ingesting, a source of relief and even community: we signal a shared anxiety about the future to other people, and their likes, shares, retweets and comments signal back that we’re not alone. Or so the theory goes, anyway.

I’ve enjoyed and participated in this kind of absurdist suicide humour plenty. I sincerely believe the change.org petition to “let people drink the red liquid from the dark sarcophagus” should be studied as a defining work of millennial neo-Dadaism. Who has spoken for their generation as succinctly as petition author and video game programmer Innes McKendrick when he wrote “we need to drink the red liquid from the cursed dark sarcophagus in the form of some sort of carbonated energy drink so we can assume its powers and finally die”?

But I’ve begun to have my doubts about “lol please kill me” as the dominant genre of suicide joke in our age. Because it’s not really about suicide, is it? It’s about suicidality, about the abstract feeling of wanting to die, not about suicide as it happens in the world. While it can gesture at a wider context – e.g. tweeting “just put a bullet in my brain now” in response to some horrible news story – there is something self-centred about it. Not selfish, but literally centred on the self, on the individual and how they feel inside. It’s always “I want to die” and “please kill me” and “every night I pray that a burst of gamma radiation from space will incinerate the atmosphere and end my suffering”. And that’s fine as part of a diversity of comic approaches to suicide, but I have to ask: where are the jokes about a hanging gone wrong? Where are the jokes about other people’s indifference to your pain? (“I told my therapist I was gonna kill myself. He said I have to start paying in advance.”) Where are the funny scenes of attempted suicide in mainstream comedies? I get a kick out of the occasional funny tweet about wanting to die, but the genre isn’t hospitable to other kinds of jokes, particularly jokes with scenarios and characters, where we’re looking at suicidal people rather than being them. When just one style of humour becomes this totalising and suffocating, it’s not enough. It’s overplayed and unsatisfying and dull.

It also dovetails unsettlingly well with the growing tendency to treat mental illness, and therefore suicide, as an issue of individual brains and their damage. Mark Fisher, the left-wing writer who took his own life in 2017, wrote in his 2009 book Capitalist Realism that treating mental illness as purely an issue of brain chemistry, or even of personal health, goes hand in hand with the depoliticisation of mental health. “It goes without saying that all mental illnesses are neurologically instantiated, but this says nothing about their causation. If it is true, for instance, that depression is constituted by low serotonin levels, what still needs to be explained is why particular individuals have low levels of serotonin.” We may agitate for more funding for mental health treatment, but if we don’t also agitate to change the social conditions that lead to such high rates of mental illness in the first place, it’s little different from fighting for medical care for the children of Flint, Michigan, but not fighting to get them lead-free water.

I’m not laying the responsibility to build a revolution at the feet of the mummy juice petition or any other similar jokes, obviously, but I am curious about the way these tendencies seem to have come of age together, and how the first generation raised to think of mental illness and suicide this way is also (1) extremely mentally ill and suicidal and (2) constantly joking about it in this particular style. I love suicide jokes, to a degree others often find unsettling, especially if they know I’ve spent a lot of my life thinking obsessively about murdering myself. I’m not here to shut down the party by any means, but Christ does it need some shaking up. We need more yucks from guns misfiring and melodramatic motivations.

We need Jack Lemmon.

Continue reading “The Suicide Comedies of Jack Lemmon”

Deconstructing Louis CK, Part 2

Read Part 1, on the fraught expectations around reexamining the artistic works of bad people, here. 


“There were some changes in how certain shows are classified this year. For example, Orange is the New Black is now technically a drama, while Louie is now technically jazz.”

– Andy Samberg, 2015 Emmys Opening Monologue

The word “innovative” is thrown around a lot in contemporary cultural criticism. It’s hard to say why, though I have some theories: a lack of historical literacy, particularly with younger critics; an increase in critics, especially reviewers and recappers, using broad language and easy shorthand due to the punishing deadlines demanded by a hectic 24/7 online publishing environment; a growing tendency towards a mindset of critic-as-advocate in a crowded pop culture marketplace, which encourages critics to overstate the virtues of works of art they want to support in the hopes it will persuade more of their audience to give them a shot. Probably there are other reasons, but I like my theories because of all the first-hand evidence I have. I’ve called movies and TV shows innovative out of ignorance, expedience and a desperate want to convince other people to like the things I like so I have someone to talk about them with. Sometimes the truth – that something is “merely” fresh, interesting or novel – can seem a bit lacklustre. But “innovative” is a word with some heft behind it: not just new, but so new it represents a major break with the old way of doing things.

But artistic innovation is rare, and only gets rarer the longer a medium is around. Every medium has its limits, and while its early days will be a flurry of invention as artists create the basic vocabulary of material, structure, form and so on, eventually most things an artist can possibly do with paint on canvas or light on film will have already been done. Irmin Roberts, an uncredited second-unit cameraman (or cinematographer, sources vary), invented the dolly-zoom in 1957 during the making of Vertigo, and that was the first and last time a dolly-zoom was innovative. People have used them in new and interesting ways since then – the reverse dolly-zoom from Goodfellas melts my face off to this day – but it was innovative once. It opened up the medium to new possibilities once.

Maybe this seems pedantic, and it would be if “innovative” were a perfect synonym for “fresh” and “new” and “original”, but the concept of innovation is an extremely loaded one. It’s no surprise the term has grown in use over the last few decades, given the valorisation of “innovation” spread by Silicon Valley and its pantheon of “visionary geniuses”, each as mythical as the last. But that source is exactly where we should see the danger in throwing it around so loosely. Technological innovations are constantly credited in the public imagination to people who did not create them, treated as the breakthroughs of singularly brilliant minds whose sole role, very often, was owning the companies where the workers who actually created the innovations were employed at the time. Even crediting those workers is usually too simplistic, because their breakthroughs are frequently just the final step in a years- or even decades-long process of inquiry, research, design and testing that likely involved dozens if not hundreds of people who deserve recognition for their contributions. But they don’t get it. Even the one who makes that final jump doesn’t get it. Irmin Roberts invented the dolly-zoom and he doesn’t even have a Wikipedia page.

The word “innovative” is thrown around a lot in contemporary cultural criticism, and it wigs me out. It’s such a bold claim to make: not just something you’ve never seen before, but something no one has ever seen before. And even when you’ve correctly identified something as innovative, if you’re not careful, you can credit it in such a way as to bury the contributions of people without whom it would not exist. It’s not a word to be used lightly, not when criticism is often where the history of an art form – or at least the dominant narrative of that history – is written.

Let’s talk about Louie.

Continue reading “Deconstructing Louis CK, Part 2”

Deconstructing Louis CK, Part 1

For the last several years, an increasing number of celebrities and other powerful figures – mostly but not exclusively men – have been exposed for sexual assault and harassment. People call it the #MeToo “moment”, and it’s fair to say the outing of Harvey Weinstein as a serial sexual predator in the pages of the New York Times and New Yorker was a kind of tipping point. But it was a tipping point in a trend that had been growing for years, and many of the people exposed since Weinstein are people whose behaviour had been the stuff of rumour long before. Sometimes people ask me why I’m so willing to believe accusers when it’s all just “he said, she said”, and there are a lot of reasons, but one is definitely that I’d been hearing things about several of the people recently exposed years before anyone came forward. I’m not some celebrity insider or anything. I’m just some guy from a small town in Ireland who’s never met a famous person I couldn’t fail to make small talk with before falling completely silent and walking away mumbling to myself, as Father Ted’s Ardal O’Hanlon could attest if our encounter in a pub in Galway had been memorable in any way whatsoever. I’m not connected. But if someone had asked me to name sexual predators in Hollywood a year before the Weinstein story broke, I could have named at least a few of the men whose crimes were about to be dragged into the light: Bryan Singer, John Lasseter, Louis CK.

These past few years have raised a lot of challenging questions about how to relate to artistic works made, at least in part, by sexual predators. I’ve written about some of these questions before, and I will probably write about them again in the future. They’re not questions with easy, straightforward or final answers, if they have answers at all. An argument that might persuade you in one case could fail in another: when people say Woody Allen’s movies are inseparable from the man and his crimes, something about it just rings truer to me than when people say the same about the songs of Brand New, whose lead singer Jesse Lacey admitted to sexually exploiting teenage girls while he was in his twenties, and it’s hard to pin down why. Why can I listen to Brand New without guilt when just the thought of listening to Lostprophets, whose lead singer Ian Watkins is a convicted child rapist, turns my stomach? Why do Lostprophets songs turn my stomach when I was recently able to watch, with minimal discomfort, multiple episodes of Glee starring Mark Salling, who pleaded guilty to possessing child pornography before hanging himself? The details differ, obviously, but all four of these men hurt children. What makes me want to take back Brand New’s music from its association with Jesse Lacey but not Lostprophets’ from Ian Watkins?

I’m not sure and may never be. Certainty may not even be the point. Perhaps constantly questioning ourselves and our judgement is the response these issues require. Not to the extent that we suspend judgement indefinitely and let ourselves off the hook from making decisions, obviously, but maybe a satisfying answer shouldn’t be the goal.

Let’s talk about Louis CK.

Continue reading “Deconstructing Louis CK, Part 1”

In Defense of the Canon

Pretty much the only time you’ll hear someone mention the canon in the year of our Lord 2019 is to explain why it’s bullshit: the canon is a bunch of stuff made by old or dead white dudes that a bunch of other old or dead white dudes decided was important, and everything outside of the canon is deemed, by implication, not important or worthwhile or particularly good. The canon is the epitome of cultural elitism; any English undergrad can tell you all about it.

The idea of a canon comes from the Bible, with the books deemed good, important and true being preserved and assembled as part of the Biblical canon, and other writings – like the gospel where the cross is a character that talks, or the ones about Jesus as a kid – getting left on the cutting room floor. The idea of a literary canon is a kind of outgrowth of this: collecting the good and important works of literature – Homer, Dante, Chaucer, Shakespeare – as the ones worthy of study, the ones any educated person should be expected to have read. The literary canon is the stuff you’re supposed to read in school or college, but probably didn’t. There are tons of very legitimate criticisms of what makes up the literary canon: it tends to be disproportionately male – Jane Austen, the Brontë sisters and Virginia Woolf would be the big exceptions when it comes to novelists – and almost exclusively white, and the people who decide what gets deemed canonical (academics and critics) have similar demographic problems. But the big difference between the Biblical canon and the literary canon is that there’s no official list of classic books, and nothing left off the list is liable to be lost or destroyed. The literary canon is necessarily in flux. When Herman Melville died, he was an obscure writer living in poverty, but a few decades later some hip literary types in New York realised no, wait, Moby-Dick is really good, actually, and now here we are.

Continue reading “In Defense of the Canon”

I Just Hope I Don’t Get More Out of This Than You Do

It’s been almost three years since one of the worst webcomic artists in the world published one of the worst webcomics of his career. The artist is Adam Ellis, formerly of Buzzfeed, whose work is likely familiar to anyone who’s ever used Facebook: it may well be mathematically impossible at this point to go a whole hour on Facebook without catching sight of his bug-eyed self-insert in a “relatable” and yet “funny” scenario. The comic in question was posted to Twitter with the caption “shhh” and depicts one of those deeply unfunny people who think not liking or knowing much about sport is a personality being silenced by an American football fan who tells him to “let people enjoy things”.

I loathe it more than most of his awful, awful work because, while I find “sportsball” types risible, it can’t mount a more thoughtful objection to their behaviour than “let people enjoy things”. It’s a nice slogan, but obviously a terrible blanket policy, because people enjoy lots of bad things – and not just aesthetically bad, but morally bad. But even when there’s arguably no significant, urgent moral dimension to something people enjoy, the “let people enjoy things” mantra makes me nervous. It’s one thing as a response to someone who’s snobby or pushy with criticisms of your likes or interests on an interpersonal level, the kind of person who comments on how unhealthy your food is or rags on the shows you like for no reason. But at any more macro level, like in online cultural discourse and, increasingly, in professional critical writing, it eventually becomes a way to deflect unflattering critiques, or is so internalised that it pre-empts criticism altogether.

Of course, Ellis and his comic aren’t responsible for the rise and spread of this attitude in online cultural discourse – how could they be, when Ellis’s work consists almost entirely of arriving three years late to observations that were already trite the first time they were verbalised? – but the comic is emblematic of it in a way little else is, and for that, I hate it.

Continue reading “I Just Hope I Don’t Get More Out of This Than You Do”

Fan Boys: The Phantom Menace

People tend not to have a great sense of scale, which isn’t the best quality when we’re so prone to making grand proclamations about entire populations of people. For example, a common refrain since the 2016 US presidential election has been variations on “we now live in a country where nearly half the people voted for Trump”. Now and then someone will point out that, with 60 percent turnout, it was more like a quarter. But that’s still not right. It was 46.1 percent (vote share) of 60.2 percent (turnout) of 71.6 percent (eligibility) of the US population in 2016, or just under 20 percent. This isn’t to minimise the horror of the election result or Trump’s presidency in any way. Every evil thing, every atrocity, that has occurred in the past two years still happened, and, if anything, it just makes it more fucked-up that it didn’t even take a majority to happen.
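For anyone who wants to check that multiplication, using the figures above:

0.461 × 0.602 × 0.716 ≈ 0.199

or roughly 19.9 percent of the total population.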

That’s why it bothers me when I hear this “we now live in a country…” thing, whether about Trump or Brexit or any of the other awful election results of the past several years. If your main political opponents actually comprise less than 20 percent of the country, but you react as if it were half, you can’t possibly be responding in the most effective way. Accuracy matters, especially with something as high-stakes as the fate of democracy, and it’s frustrating to constantly see well-intentioned people be so sloppy with reality. Not that low stakes should let people off the hook: standards of research and fact-checking in entertainment journalism are in the gutter and it drives me up the wall. And while it’s obviously not as significant as the rising tide of fascism (though it’s often presented as comorbid with it), when it comes to misrepresenting the scale of a social problem, there’s little critics and journalists have fucked up more than their coverage of “fan boys” and their allegedly toxic effects on society.

Normally, I find articles like this difficult to write, because they require me to cite specific examples of bad writing, and I don’t enjoy going off on other writers, for the most part. But this one will be super easy, because, for once, I can shit on the writing of someone whose writing I already constantly shit on.

This is a callout post. For myself.

Continue reading “Fan Boys: The Phantom Menace”

Behold the Man Who Is a Bean

Night on a deserted street in London. Saint Paul’s Cathedral shines on the horizon. A beam of light shoots down from the sky and expands into a spotlight. A man falls from above and lands smack on the ground. He wears a tweed jacket and red tie, brown slacks and a white shirt. An angelic choir begins to sing in Latin.

Ecce homo qui est faba.

“Behold the man who is a bean.”

Continue reading “Behold the Man Who Is a Bean”

Scenes from the Class Struggle in Medieval Europe

The critical reception to 2001’s A Knight’s Tale is full of terrible, lazy takes deriding it as mind-numbing trash. They’re full of a disdain for low culture that places the film’s detractors squarely on the side of its villains, an irony that seems utterly lost on the whole pompous lot. The presumed audience of the film – teenagers – gets as much scorn as the film itself. The reviewers then scorn the film all the more for its “pandering”. There are tons of complaints about its anachronistic 70s rock soundtrack, though some of the same reviewers, like Entertainment Weekly’s Lisa Schwarzbaum, would go on to name Moulin Rouge one of the best films of the year.

Admittedly, A Knight’s Tale isn’t as good as Moulin Rouge: this isn’t one of those articles where I try to convince you a largely dismissed piece of trash is actually a masterpiece. A Knight’s Tale is a pretty good popcorn flick, well-cast and competently made, with a straightforward plot and some good set-pieces. Reviewers were fond of referring to it as a “Middle Ages Rocky” or “Rocky on horseback” with exactly the tedious predictability they accuse its plot of epitomising, which is weird for two reasons: first, because Rocky is a gritty minimalist drama, and second, because, somehow, the comparison never made them consider that A Knight’s Tale, much like Rocky, is a film about class.

Continue reading “Scenes from the Class Struggle in Medieval Europe”

Against Relatability

I once had a friend question how I could possibly like Bon Iver’s debut album For Emma, Forever Ago when I’d never been through a breakup. (That isn’t strictly true, but I’ve been with the same person for my whole adult life, so it’s much of a muchness.) I can’t remember exactly how I responded, but it was something like: just because I haven’t been sad over a breakup doesn’t mean I can’t relate to being sad. He seemed sceptical but didn’t push the point.

Roughly six years later, I have a better answer.

Fuck relatability.

Continue reading “Against Relatability”

The Rashomon Effect Effect

The “Rashomon effect” describes the tendency of witnesses to or participants in the same events to give mutually contradictory accounts of what happened, due to the subjectivity and fallibility of human memory. It’s named after the 1950 film Rashomon, directed by Akira Kurosawa, in which four witnesses to a murder give contradictory accounts of what happened. The term is bandied about a lot in pop psychology (and philosophy) articles, and one of its more recent applications is, of course, to This Age of Trump, This Age of Brexit. In the aftermath of the two major political upsets of 2016, the mainstream media churned out hundreds of handwringing articles about the “post-truth world”, because it’s insufficient for cloistered political and media elites to have merely been wrong: their opinions and expertise are so important that, if they were wrong, the only explanation was that the fundamental human ability to distinguish reality from fiction had completely disintegrated.

With its emphasis on an immutable failing of human nature – a fundamental inability to ever truly recall events accurately or, in effect, to know the world at all – it was inevitable that the Rashomon effect would be trotted out as a buzzy term to explain the new reality. Michael Wolff even mentions it in his insider account of the Trump administration, Fire and Fury. The appeal of the term is easy to understand: it describes a basic and insurmountable flaw, so it absolves everyone of responsibility to think about how and why falsehoods may have played a more decisive role in recent politics than in a supposed past era where people were more honest, or at least where the public was harder to hoodwink. I’m not saying they have done – I’m sceptical of the notion of a “post-truth world” – but if they did, I could think of reasons other than the Rashomon effect. Off the top of my head, it’s possible formerly authoritative news sources destroyed their credibility with the public by, among other things, helping the Bush administration manufacture the pretext for a war that killed hundreds of thousands of people. I’m no expert, but I have to wonder if perhaps public trust in the media was damaged when literally no one lost their job over one of the most massive and systemic failures of journalism in recent times.

Of course, that’s just my interpretation, and here’s where I should be putting the obvious joke about how it’s just like in Rashomon, where everyone remembers things differently. But there’s a problem. Unlike most people who reference the Rashomon effect, I’ve seen Rashomon. And Rashomon isn’t about the subjectivity and fallibility of memory. It’s about lying.

Continue reading “The Rashomon Effect Effect”