ANALYSE the BBC documentary Walking with Dogs

What was most impressive about Vanessa Engle’s Wonderland: Walking with Dogs (on BBC Two last night, and still available on iPlayer) was the way in which it conveyed a message through demonstration as opposed to description.

The documentary about dog-walkers on Hampstead Heath could easily have descended into cutesy stories about puppies and bum-sniffing. Worse, it could have been clumsily steered by the heavy hand of a producer and director who – through egotism or insecurity – felt compelled to provide voice-over descriptions of “what I realised”, “what dogs mean to us”, or “what I’ve learnt from these people”.

Instead, the show gathered meaning through simple interviews with dog-owners, and the occasional montage-with-music which is obligatory in every TV show. The result was a stark juxtaposition of voices, each with its own preoccupations and situation.

It was, as James Walton says in his review for The Telegraph:

“the TV equivalent of a powerful collection of short stories – most of them melancholy, and all of them linked by firm (and to us canine sceptics, rather shaming) proof of how much emotional support dogs can provide. Like the best short stories, the ones here didn’t divulge all their secrets – but somehow suggested whole worlds lying below their surface. Of course, one reason for this restraint is Wonderland’s commitment to old-school documentary-making.”

This is a clever way of storytelling. It is a sort of collage of subjectivity. En masse, however, these single voices take on a collective wisdom, forming a consensus through no intention of their own. In documentary terms, this also validates the producer; every argument is made by an outside source, whose opinion is their own.

Most of all, it demonstrates what the best TV producers, novelists, artists and musicians know already: an audience is very capable of analysing events and arguments itself.

The benefits of letting an audience make up its own mind are momentous. This is most obvious when you read other reviews of Walking with Dogs and see quite how convergent they are.

Sam Wollaston, The Guardian, writes:

“These dogs don’t just run about and fetch sticks for their owners. They are the sticks, the crutches, that these people need to get along. They are also guard dogs; they protect their owners not so much from other people but from themselves; they fend off demons. They are substitutes too – for people who used to exist, or will never exist, or exist in a different way from how they used to exist.”

Tom Sutcliffe, The Independent, writes:

“The theme that slowly emerged was of the dog as unwitting social worker, a four-legged crutch that had helped an extraordinary range of people get through difficult times in their lives.”

Somehow, without telling her viewers what to think, Ms Engle has led them to the same conclusions. There is no need for more evidence showing that human cognitions are entirely predictable.


CONSIDER “the real truth”

The Frustrated Grammar Pedant award was granted to my office neighbour this week. There was fire in his eyes and blood in his cheeks when a client emailed him the phrase:

“I think this reveals a real truth . . .”

Pursing his lips as if to restrain an outpouring of spiteful bile, he said to me:

“Jonny, does the phrase ‘a real truth’ make sense, or is a truth by its very nature real?”

“A truth is by its very nature real, Paul,” I said. “A truth is by its very nature real.”

We encounter tautology too frequently, I thought at the time, allowing Paul’s generous resentment to infect me. I also thought that it is so much worse when the words in question are pieces of guff in the first place. The phrase “I think this reveals a truth” is just utter word-rot anyway.

Then, while rifling through the online news articles which I get paid to read, I noticed that a grocer from the USA has released a healthy own-brand range. It is called “Simple Truth”.

I nearly vomited.

Why do people feel the need to stick an adjective before vast philosophical concepts such as “truth”? And who is Kroger the Grocer to say whether “truth” is simple or complicated? Personally I think that truth must be pretty fucking difficult or I would have worked it out by now. Lies and deceit, on the other hand, now they come to me like breathing and drinking. Should I not shop at Kroger due to philosophical differences?

But, then, as our tempers simmered down, I thought:

What the hell, the concepts of “truth” and “reality” and “simplicity” are fallacies! That being the case, why not use a whole adjectival string to emphasise one’s point?

I still think that this idea reveals a real, profound, genuine, authentic, actual, simple truth.

*

Incidentally, Collins English Dictionary defines a “pedant” as “a person who relies too much on academic learning or who is concerned chiefly with insignificant detail”.

I find that insulting. One can never be too right when it comes to using language precisely. And now I have to deal with the thought that the dictionary itself is a subjective entity, itself open to accusations of untruth and unreality. Shite.


CONSIDER what the hell “absence” is anyway

In Philip Hensher’s review of Zadie Smith’s new novel, NW, he writes:

“The structural division between characters who will never meet enacts a society in deep disruption.”

First, without having read the book, I considered how much this could be true. Yes, I thought, a novel in which none of the protagonists ever meet could imply that they inhabit lonely lives in a noncohesive society. Cloud Atlas by David Mitchell sprang to my mind, and Hensher mentions William Golding’s Darkness Visible.

I thought, beyond literature, it could also be true that our lives are open to analysis by the “structural divisions” which we enact between ourselves and the world; we could be defined most by the people we don’t know, the places we never go to, the things we never think.

But then I thought: there’s a problem with this “never meeting”, this absence. The problem is that it is impossible to prove; or, if we are to accept that one absence is meaningful, then we have to accept countless other absences as meaningful (e.g. the distinct lack of rain/blood/elephants/telephones from Smith’s book).

Don’t you think it’s odd that Hensher chooses to derive meaning from the absence of conversation between Smith’s protagonists, when he could presumably gather much more evidence for what is present in her writing? Why does he put himself at a rhetorical impasse by claiming that this absence actively does something?

Similarly, why would I attempt to define myself by my unconscious non-choices when I am actually capable of judging my behaviour from the loved ones and places I do choose?

On the literature question, Gillian Beer makes an interesting point (on Virginia Woolf’s work):

“Absence gives predominance to memory and to imagination […] In one sense, everything is absent in fiction, since nothing can be physically there.”

Beer’s reading of what absence inspires in a person seems to explain Hensher’s review. When we read something and decide that there is a structural or linguistic absence, there is nothing to confront that thought. So we cogitate on, free-wheeling.

But then Beer suggests that in literature, as opposed to life, it is entirely necessary to over-interpret, because “everything is absent in fiction”.

I’m not wholly convinced by this. When we read and write, are we really dealing in absent material? This suggests that we should feel like we’re swimming about anchorless in open space, but I never feel more grounded than when I’m snuggling up with a book. I think Beer does tacitly acknowledge that she is over-egging it, because she notes: “in one sense”. Indeed there are other, more plausible, senses in which literature is very present.

I am convinced of one thing. Thinking about absence sucks in people – like Philip Hensher, Gillian Beer, David Mitchell, Virginia Woolf and myself – and words – like loneliness, noncohesion, never, “division”, “disruption” – leaving us much more tired and no less confused.

When something is impossible to prove but we continue to rummage about in it, we are left open to being convinced by pseudo-thinkers and charlatans. That is, I guess, the origin of absence’s lover and nemesis: faith. Which I might write about next time.

Finally: thanks to an online word-processing tool called writtenkitten.net, which gives me a different picture of a kitten for every 100 words I write. Amazing. 🙂


CONSIDER the role of perspective in the debate on vegetarianism

The debate on vegetarianism is often truncated by the irreconcilable nature of two stalwart perspectives: the meat-eater and the vegetarian.

The vegetarian says to the meat-eater: “I’m not going to eat meat because there is no reason to, and you shouldn’t eat it either.”

The meat-eater says to the vegetarian: “I’m going to eat meat because there’s no reason not to, and you should eat it too.”

As you can infer from these two statements, the only way to progress from this stalemate is to introduce a third perspective, just as a third player introduced during an impossible chess endgame would reinvigorate play. This is the key player in the battle between meat-eater and vegetarian: the animal.

Slavoj Žižek can provide our means to this third perspective. In his exegesis of Hegelian philosophy and the modern West, at this talk, he makes a salient point about how considering the object’s perspective invariably informs the subject’s ideology:

“The question to be raised is not what this philosopher can tell us, but the opposite one: what are we and our contemporary situation in his eyes?”

Although Žižek is making a particular point about historiography, I think that the same kind of exercise in empathy – that is, the imagining of another perspective to inform one’s own ideology – can be used for our vegetarianism debate. Instead of emancipating dead philosophers from objects to subjects, though, we are doing so with necessarily silent animals.

Here’s the fun bit.

Imagine dining out at a steakhouse with three guests: a meat-eater, a vegetarian, and a cow. While the carnivore would salivate, and the vegetarian would remonstrate, what would the cow do and think?

It would surely be an existential challenge for the cow to read a menu of its species’ organs and appendages. And then to have the desired parts delivered on a platter to the communal table as the main actor in a culinary theatre. It would be a moment of uncanny self-awareness which no human has ever felt, to consider one’s body functioning as something other than a vessel for its own promotion, as a means to another’s pleasure.

It would seem sinister that the world has created a vast infrastructure with which to feed its people on your body, delivering you from field through plant to plate. It would seem macabre to the cow that there are tens of other restaurants on this street, hundreds in this town, and millions on this planet all delivering your kind to others, better off dead than alive.

It is my suspicion that most vegetarians perpetuate their ideology through the reinforcement of this feeling of the uncanny or bizarre. It acts as an unconscious kind of empathy. Your anecdotal proof is this: a vegetarian friend of mine says sometimes that when she looks up and sees flocks of pigeons or crows flying from building to building, she realises how weird it is that we share a world with them. We have the streets; they have the rooftops.

This third perspective doesn’t settle the debate, nor freshen up the stalemate. But perhaps through giving a voice to a silent object, it can complicate what is essentially a simple difference of opinion.


COMPARE & CONTRAST Bob Dylan and the scientist

In Imagine: How Creativity Works, Jonah Lehrer strives to articulate how creative decisions are taken: how does a scientist alight upon an innovative solution, and how does Bob Dylan conceive, gestate, and birth a song? Lehrer says:

“This is the clichéd moment of insight that people know so well from stories of Archimedes in the bathtub and Isaac Newton under the apple tree. The moment of insight can seem like an impenetrable enigma. The question, of course, is how these insights happen.”

This question of creation badgers artists in the same way that a baby does its parent. “How did you get here?” is a question leading inside the artist (back to before they consciously conceived a project) as well as outside (to the stimuli which triggered the cognitions leading to the novel, painting or poem produced).

Martin Amis is eloquent about it in an interview with The Spectator:

“At which point do you realise that you have a novel springing to life? It’s a fascinating question. It’s all decided in a moment, I think. You get a funny feeling, you see something or read something and almost at once you get a kind of throb, which goes through you — a shiver. And you think: this is a novel I can write. You don’t know much about it, but you know how you’re going to begin, perhaps. It’s a situation, it’s a setting, but it’s deeply mysterious. The whole process is deeply mysterious.”

Amis’ description of the moment of creation captures neatly its physiological (“throb”, “shiver”), triggered (“see or read something”), and enigmatic (“perhaps”, “mysterious”) conditions.

It brings me back to the first comparison: between Bob Dylan and the scientist. There is a problem here, and it lies in the differences between a creative solution to a problem, and creativity.

Whereas Lehrer rightly uses the scientist to show how creative innovation can solve a problem beyond the scope of logic and algorithms, I don’t think the same can be said for Bob Dylan. The key difference is that the scientist works towards an end (answering part of an unfinished theory, finding the right chemical formula to perform a task), whereas the artist does not.

Take Lehrer’s example of Archimedes, who leapt out of the bathtub when he discovered the principle of displacement. That Lehrer recounts this clichéd tale in a clichéd fashion suggests he has thought little about the real connection between this moment of inspiration and one that a musician like Dylan would have. He says:

“Hopelessness eventually gives way to a revelation. This is another essential feature of moments of insight: the feeling of certainty that accompanies the new idea.”

This is true for a scientist, who can verify his hypothesis through testing and the testimony of his peers. But it would be inadvisable for Bob Dylan to claim that his latest song is “right”.

Why? Because art strives towards subjective perception; science, objective measurement. Hence my earlier issue with definitions: art has creativity as an end in itself, and science has creative solutions to extant problems.

This is the fundamental difference between two universal fields of human activity, but Lehrer has failed to understand it. It is only through Practical Criticism (which encourages us to interrogate the premise of everything we are told) that this shortcoming can be revealed.

Indeed, in performing this analysis we have created shortcomings of our own — words like “art” and “science” can’t be flung about so flippantly — but this is only a blog, after all.


CONSIDER the term “middlebrow”

Practical Criticism #7: Use of “middlebrow” as a derogatory description for literature has slipped out of our vocabulary, though it is still in our dictionaries. It is an inevitable fate for many adjectives and nouns that they will pass into and out of common circulation within a generation.

100 years ago, for a generation of modernists whose literary authority relied upon “highbrow” esotericism, to have a work called “middlebrow” was a curse of the highest order. Look at the avant-gardism of modernism’s seminal works: James Joyce’s Ulysses, T. S. Eliot’s ‘The Waste Land’, Djuna Barnes’ Nightwood. What we have of the period’s literature is some of the most militantly anti-“middlebrow” writing there has ever been.

In fact, Virginia Woolf wrote to the editor of The New Statesman to complain that he had omitted the word “highbrow” from a review of her latest book. To lump her in with the “middlebrow” was to call her a “petty purveyor” with more money than taste. Middlebrow men simply plundered expensive art at auctions in order to hang on their walls and stack on their shelves something that could fill their void of good taste. The middlebrow were neither thinkers like the highbrow, nor doers like the lowbrow, but pursuit-less and shallow: interested only in fame, power, money, and social standing. They used literature as a means, books as props, rather than as an end.

Leonard Woolf, Virginia’s husband, made a similar criticism in his short book Hunting the Highbrow (1927) – no doubt the two of them had some cracking chats over the dinner table! This book is an anthropologist’s account of the bizarre species of man known as the “middlebrow”. Such creatures he further categorised:

  1. Those men who don’t have the intellect to appreciate highbrow literature,
  2. Those who are too vain to accept art as anything other than a tool of social betterment,
  3. Those who could never learn “taste” even if they tried.

Ah, “taste”! You may have noticed that word appearing several times so far in this post. It is another mouldy morsel which we no longer chew upon. This was a divisive word in the early 20th century, and it was wed to the conception of “brow”.

But does it mean anything that we no longer have “middlebrow” and “taste” in our verbal arsenal? Well, we do have our own versions of them. We now use “middlebrow” in a positive sense, as “trashy”, “chick-lit”, and “summer reading”. We call art which we know requires little “taste” a “guilty pleasure”.

“Highbrow art” has, I think, now become a derogatory term. We would call it “smug” (Salman Rushdie), “self-indulgent” (Julian Barnes), and “obscure” (Tom McCarthy).

Thus we have seen a complete reversal in literary appreciation between 1922 and 2012: “highbrow” has become a criticism, not a point of pride; “middlebrow” has become those cheerier “guilty pleasures” which we know we shouldn’t like. Indeed the reading of literature has become a specialised task undertaken only by university students and scholars, whereas it had been the province of every man.

The last I heard of “taste”, too, was in 1979 when Pierre Bourdieu published his sociological study Distinction: A Social Critique of the Judgment of Taste. After that? Well, the highbrow among us read too much Foucault, Derrida, and Žižek to worry about the middlebrow. And the middlebrow took this indifference to mean free rein to dive into Dan Brown and Jilly Cooper unashamed.

I don’t mean to be as scathing as I sound. On the contrary, this switch in meaning is okay. It may suggest that our culture now is more stable than it was in the 1920s – between two world wars, indeed – and so we have no need for artistic revolution. But Practical Criticism demands that we at least note the coming and going of these trends and definitions so that we can inform the next generation, indefinitely, that ‘way back when’ we saw things very differently.


CONSIDER the impact of technology on generosity

Recently I read an article which said that today’s technology-native generation want access rather than ownership.

To information, to music, to gadgets, to every accessible entity.

I witnessed an example of this on the train yesterday. Two late-teenage girls were talking about an acquaintance of theirs who was trying to ingratiate herself with their close friendship group. To paraphrase (names have been invented):

Mary: Have you seen the emails that Sophie’s been sending around recently?

Laura: No, why?

Mary: You have to check your emails, it’s so awkward and weird.

Laura: What do you mean? What’s she been saying?

Mary: You know she really wants to go to Glastonbury and Newquay with us this summer? Well, she’s been sending us links to loads of YouTube clips and gossip articles.

Laura: Oh god that’s so embarrassing.

Mary: I know. And she says stuff like “Hi guys, have you seen the world’s cutest puppies? They’re sooo funny!” and then loads of kisses.

Laura: That’s so blatant.

This conversation should demonstrate one function of an access-focussed culture. In terms of social interaction, generosity turns from a system of monetary exchange to one of qualitative provision. That is, instead of living out the equation Money = Flowers = Wife’s temporary happiness, a man might send his wife a link to a news article he hopes she is interested in, with the vaguer aim of improving their relationship.

In the case given above, where historically Sophie may have purchased ownership of jewellery, food, music, etc. for Mary and Laura’s friendship group, now she sends them knowledge of—and access to—chosen information which is simultaneously available to others.

There is an added challenge for the giver in this new realm. The object of technological generosity is much less defined, and thus Sophie must take more care to choose the right link, clip, or article. Seeing as there is no monetary value to the gift, it becomes solely a question of appropriateness: whether the gift conveys access to highly desirable information, or not.

If successful, this provision creates a far more intimate relationship because it clarifies the parties’ mutual interests. If unsuccessful, it fails dramatically because there is no other value attached to it but the personal.

Sadly for Sophie, her information is not desirable to the beneficiary because it too blatantly assumes an antiquated system of exchange: Sending links = Invitation on holiday.

In this sense, is the new generation not more generous and thoughtful than its predecessors?