Critics reflect on how social media, such as Facebook, Twitter and Digg, fit into the perennial debate on cultural elitism
The Observer, Sunday 30 January 2011
Miranda Sawyer, broadcaster and Observer radio critic: ‘Twitter has made it easier for critics to hear other people’s opinions. Even then, though, you tend to hear similar views to your own’
When I was writing for the Face, during the 1990s, I went to interview some boy racers: young lads who spent all their money souping up their cars in order to screech around mini roundabouts or rev their engines in supermarket car parks until their tyres smoked. The kids asked me who I was writing for. When I said the Face – a magazine that prided itself on representing all aspects of British youth interests – every single one of them replied: “Never heard of it.”
The point is that most people – especially those outside the high-culture capital of London – are involved in culture of their own choice, often of their own making. Professional critics spend their time whizzing between private screenings and secret gigs, opening nights and exclusive playbacks. Everyone else just does stuff they like, with people who like it too. We naturally gravitate to others who share our interests, whether we spend our time collecting first editions, following Stockport County, yomping up mountains or watching three series of Breaking Bad all in one go. Our interests – our personal cultural choices – are what define a good part of our identity.
And mostly, those choices are ignored by the mainstream media. It was only during the 90s that newspapers began to cover pop music in a serious way; only very recently that computer games were deemed worthy of mention. There is still a hierarchy of culture in the media. On The Culture Show or The Review Show, for instance, contemporary art will always trump standup comedy. As a radio critic, I know full well that my reviews will never get the space of those that discuss TV or film. (Sport is even worse: if you’re interested in any sport other than football, cricket, rugby or tennis, forget it.)
The reason professional critics agree so often is that they tend to be of a type. They've often had a go at what they're reviewing (they went to art school or were in a rubbish band or tried acting), they like writing and they're a product of their age. I often find myself nodding along with the Guardian's Alexis Petridis, Lyn Gardner and Grace Dent, with Laura Cumming or Kitty Empire from this paper or Caitlin Moran of the Times. But that's because we all want our culture to do the same things. We have similar taste.
The big difference Facebook and, especially, Twitter have made is that it is easier for critics to hear other people's opinions. Even then, though, you tend to hear views similar to your own; after all, if you follow someone on Twitter it's because something about them appeals to you. I tweeted about PJ Harvey's new album the other day. The excited response I got from followers was amazing. But then, what did I expect? I wasn't talking to fans of Justin Bieber. We don't really connect.
Jessa Crispin, editor-in-chief of Bookslut: ‘The tussle, the argument, the fun of criticism is now online’
Whenever people start talking about the death of the critic, the health of criticism is measured in dollars. As in, how much money did the movie that all the critics loved so much make? How many books sold? If every critic in the western world loved Jonathan Franzen’s Freedom but it has only been on the US bestseller list for 17 weeks, well, then the critic must be dead.
What we take away from this argument, then, is that the role of the critic is to sell product. She is simply an extension of the marketing department. Indeed, many of the reviews of Freedom were written in the breathless prose of an artful press release.
One of the great powers of the internet community is its ability to shame the bombast, the overblown, the unquestioning. The focus isn't merely the work of art itself but the culture that produces and lauds it. Recently there was an open letter posted on various blogs railing against the New Yorker's shortage of female feature writers, as well as a relentless campaign at The Awl to highlight the factual inaccuracies in The Social Network and explain why they matter. When Franzen's glasses were stolen during a book release party, blogs such as Moby Lives pointed out the absurdity of sending helicopters after the thief, as well as the desperate grovelling of the publishing industry at the feet of Franzen, as if he had returned to save literary fiction itself.
More than shame, though, the internet’s greatest strength is enthusiasm. The tussle, the argument, the fun of criticism has moved online. While mainstream critics have narrowed their focus to a handful of novels, movies, and television programmes, the field has never been wider. The same few dozen books might be reviewed in every print publication but meanwhile hundreds of thousands are published every year. In literary criticism there are huge gaps in what gets written about in print: books by women, translated fiction, comic books, books released by small presses, science fiction… Online, though, every niche has its community of producers, critics, and readers, and it’s fed by passion and dedication.
Criticism isn’t about units sold, it’s about the conversation. The fact that Freedom briefly dropped off the bestseller list to me isn’t a mark that criticism is dead – it’s proof it’s still alive, skewering this idea of the objective opinion and rejecting the critics’ insistence that this is a flawless work that everyone must read. If the print media isn’t having the conversation the reader wants, it’s no wonder the listeners have migrated to a place that is.
Philip French, Observer film critic: ‘It could be that bad criticism might drive out serious writing’
Neal Gabler rightly notes the continuing contest between elitists and populists for a commanding position as opinion-makers in the United States. There’s also been a competition between supporters of respectable and disreputable culture, the former traditionally dominated by the churches and middle-class women whose genteel ambitions have shaped society through an opposition to gambling, sexual freedom and drinking, and their support of censorship.
The spread of the world wide web, which is now transforming our culture, allows anyone with a computer to set himself up as a reviewer, a participant in a critical discourse and a potential legislator. This is a positive tendency as well as an inevitable one, if more a cacophony than a civilised discourse, though back in the good old days the Edinburgh Review and Leavis’s Scrutiny also had their in-house bruisers. We must be aware, however, that the decline of print journalism and the ubiquity of the web may produce a cultural Gresham’s law. Gresham declared that bad money drives good money out of circulation. It could be that bad criticism might have a similar effect on serious, considered writing.
By setting up supposedly elitist critics against what he calls "ordinary people" or "ordinary folk", Gabler does more than justice to the former (a motley crew, at least in the world of film reviewing) and less than justice to those not professionally employed in what TS Eliot (that lover of music hall and the Marx brothers) called "the common pursuit of true judgment". A harsher distinction – between the ignorant and the well informed, the insensitive and the aesthetically or morally responsive – would find adherents on both sides of this false divide. Will cyberspace produce its Samuel Johnson, its Edmund Wilson, its Lionel Trilling?
The established critics have frequently stumbled in recognising significantly innovative or original work. Michael Cimino’s flawed masterpiece Heaven’s Gate (1980) was lynched by American critics hunting as a pack, influencing the producers without giving the public a chance. It was then too late for European writers to rectify their judgment. Could bloggers have made a difference? Are they now attempting to?
Gabler goes along too readily with the anti-intellectual practice of using “critic” as a pejorative term. This isn’t new. Back in 1972 when I was devising a new arts programme for BBC radio, the then controller of Radio 4 said to me: “I don’t care what you call it as long as ‘art’ or ‘critic’ isn’t in the title.” In Waiting for Godot “critic” is the ultimate insult exchanged between Vladimir and Estragon, but Beckett intended it as a joke.
Hari Kunzru, novelist: ‘Critics praise work that doesn’t upset them. So much looks like art but just tastes of cardboard’
In America, cultural elitism has little to do with the arts. In the virulent debate between Jacksonian populists and whoever they’ve got in their gunsights/surveyor’s symbols this week, “culture” largely refers to values – belief in God, patriotism, the nuclear family and so forth. The idea of an artistic “critical elite” usually only turns up in cases such as the recent controversy at the Smithsonian over the removal of a 1987 video by artist David Wojnarowicz, after complaints by Republican congressmen that the work was offensive to Christians.
Aesthetics is very much not the issue here. The point is an attack on the social legacy of the 1960s, and an attempt to reverse the decline in the punitive moral authority of the church. “New York”, with its bankers and filthy art galleries and suspiciously European mass-transit system, often stands in for the nebulous idea of What Is Wrong With America. Or “Hollywood”, understood as a Jewish conspiracy to undermine the nation by showing pictures of American soldiers losing and girls with their tops off.
In more ideologically sophisticated regions of the American right, the notion of a cultural elite is threatening because it suggests that value may spring from something other than pure market forces. How dare you, the unelected critic, presume a specialist knowledge that can override the mystical self-unfolding of consumer choice? In this landscape, the highbrow/lowbrow divide seems like a quaint relic of a bygone age. We now live under the hybrid tyranny of middlebrow. No serious person believes the Oscars are a list of the best films, or the Grammys the best music. Charitably one could say they represent a kind of averaging out, an index of the taste of a group of informed people. At worst, critics acting en masse, with one eye on what’s popular and one eye on what’s good, end up praising work that doesn’t upset them. That’s why there’s so much stuff that looks like art, smells like art, but when you bite into it, it just tastes of cardboard.
This is why we have the internet. Social networks don’t strive for consensus. Instead they thrive on argument. A feed populated by diverse people (professionals or amateurs, paid or unpaid) whose taste you trust (and a few with whom you disagree productively) is the best way to squirm out from the tedious flubbery weight of middlebrow culture. It’s more work than getting your opinions off the TV, but once you try it, you’ll never go back.
John Naughton, professor at the Open University and Observer technology columnist: ‘The decline in critical authority began long before the net’
Well, of course the internet has something to do with it, but the decline in critical authority began a long time before the net was imagined, let alone built. What we’re looking at – at least in a British context – is the cumulative result of social and demographic changes that go back to the 1950s.
It’s mostly about the erosion of deference, and – as it happens – we know exactly when the process began. It was the evening of 8 May 1956, the first night of John Osborne’s play Look Back in Anger, at the Royal Court, when the audience gasped at the sight of an ironing board – an ironing board! – on a West End stage.
Why the astonishment? Because up to that point, the London theatrical scene had been dominated by plays about the upper-middle classes written by chaps such as the Hon William Douglas-Home. The subliminal message was that culture was a toff's preserve – a royal enclosure of the mind, as it were. For putting something approximating to real life on the stage, Osborne was called an "angry young man". But he broke the mould. Suddenly everything that had gone before seemed, somehow, absurd.
And then, with impeccable timing, the toffs really blew it. Anthony Eden, the epitome of the cut-glass, West End matinee idol, masterminded the Suez debacle and the country woke up to the fact that its governing class was a busted flush. From then on it was downhill (or uphill, depending on your point of view) all the way.
The erosion of social deference had a cultural impact because until the late 1960s professional criticism was also, if not a toffs’ preserve, certainly a highbrow, Oxbridge-dominated enclosure. The nation opened its heavyweight newspapers every Sunday to learn what Raymond Mortimer (Malvern and Balliol), Cyril Connolly (Eton and Balliol) or Philip Toynbee (Rugby and Christ Church) made of the latest books, or what John Barber (King Edward’s and Merton) and Kenneth Tynan (King Edward’s and Magdalen) thought about the new plays. In the circumstances, Geoffrey Madan’s description of the British cultural elite as “an arboreal slum of Balliol men” sounds peculiarly apt.
It couldn’t last, of course, and it didn’t. Rupert Murdoch arrived and made vulgarity respectable. Maggie Thatcher disparaged the cultural elite, questioned the worth of intangible values and legitimised greed. And all that happened long before Tim Berners-Lee first thought about the web.