NANNY KNOWS BEST

Reflecting on the perennial relevance of using an English public school as a metaphor for the hierarchical framework of British society, David Sherwin, the screenwriter of Lindsay Anderson’s landmark 1968 movie ‘If…’, remarked that ‘the world has always been governed by prefects’; even for those who received a state education, the meaning behind his statement is not lost in translation. In a Britain that has been divided and ruled by ex-public schoolboys for almost twenty years, it doesn’t take a great leap of the imagination to see the ghastly superior arrogance of the prefects who attempt to beat Malcolm McDowell’s rebel into conformity being played by Blair, Mandelson, Cameron and Osborne.

In the early 70s, the late great barrister, champion of free speech, writer and ‘Rumpole’ creator John Mortimer engaged in a televised Cambridge Union debate on the subject of pornography, advocating the libertine side of the argument in opposition to the obligatory Mary Whitehouse as well as future Tory leader Michael Howard. Mortimer remembered Howard as insufferably smug and self-satisfied, behaving as though he was already resident on the frontbench in the Commons even before he’d stood for Parliament. In a sense, this was the prefect theory writ large: a young man indoctrinated with the patrician belief that he knows what is best for the rest on account of his being trained to govern those less fortunate than himself, just as his predecessors were trained to govern the colonies.

The argument of the educated elite back then was the same as it always has been – that certain books, images or films are fine for those whose supposedly wide-ranging education has taught them to appreciate things that the lower orders don’t have the intellectual capabilities to appreciate in the same way; should these things be made available to the irresponsible masses, moral depravity can be the only outcome. They cannot be trusted and need to be guided, advised and protected; such an attitude was the motivation for the founding of the British Board of Film Censors in 1912 and continues to dictate government policy where the plebs are concerned, evident in everything from the smoking ban to advice on how many alcoholic units are allowed per day. Those who make the laws clearly cannot be corrupted by the immorality inherent in that which they seek to save the people from. Their education provides them with an intellectual and emotional immunity.

Mervyn Griffith-Jones, prosecuting barrister at the Lady Chatterley Trial of 1960, famously asked the jury if DH Lawrence’s long-banned erotic novel was a book ‘you would even wish your wives or servants to read’, a question that contained within it all the prejudices and sense of superiority that educational privilege can inculcate. After a brief window of just over thirty years in which the country was led by five successive Prime Ministers who weren’t raised in this environment, the ruling elite reverted to type and once again the British public find themselves being talked down to and treated like children by their ‘betters’. It is no coincidence that children are usually invoked when our betters attempt to justify their outrage on our behalf, as though every adult in the country has an innocent little angel handcuffed to their person 24/7, inadvertently exposed to everything we voluntarily expose ourselves to – laws on pornography or smoking or illicit substances must be tightened because of the children. Children are a convenient tool to prick the moral conscience, but why not kittens or puppies?

The news this week that long-overdue renovations to the Palace of Westminster may require a temporary debating chamber has thrown Parliament into disarray due to the fact that the proposed location for this chamber doesn’t allow the consumption of alcohol. Because the venue is owned by an ‘Islamic company’, drinking there is forbidden, so there would be no bar. A workplace in which alcohol is banned?! Imagine such a thing! How could industry cope if the nation’s workforce were denied the right to booze? After all, the old licensing laws were only introduced in the first place because the government feared the effect of inebriated labourers on the war effort in 1914. The mortified response of Honourable Members to the possibility that they could be deprived of their liquid perks speaks volumes as to their belief that they should be allowed privileges not afforded to the rest. Nobody yet knows if the underground rifle-range at Westminster will also be threatened by the imminent refurbishments, but I dread to think how MPs will cope if that too is rendered out of bounds. British steelworkers currently facing redundancy must wonder if their own workplace rifle-ranges are under threat along with their livelihoods.

That ego and the seductive allure of power can induce an even greater feeling of Us and Them amongst politicians who ascend to ministerial heights is something that few dispute; MPs caught either with their hands in the till or their trousers round their ankles are common side-effects of coming to believe their status enables them to get away with indiscretions ordinary mortals would be punished for – something to which both the expenses scandal and the career of the recently deceased Cecil Parkinson testify. The current crop of prefects seem to blame the evils of an over-powerful state on an abundance of public services, hence their determination to strip them down to the bone; yet the real evil the public have to contend with is the state’s willingness to outlaw everything that distinguishes a democracy from an autocracy, with their nanny free to infantilise them with unchallenged impunity. They regard you as a lower life form and always will if you let them.

TERRY WOGAN (1938-2016)

The death of Sir Terry Wogan at the age of 77 is the latest in a series of celebrity deaths that has marked out a year only one month old as a graveyard for famous faces that have been part of the nation’s cultural wallpaper for as long as many of us can remember. Of course, that these names were in the free bus-pass demographic means the Grim Reaper had already rented a property in their neighbourhood, waiting for the call; but though childhood may sometimes still seem within touching distance when we snatch a glance over our shoulders, the fact remains that those of us born in the 1960s or 70s are travelling away from it with the same gathering speed as the stars that illuminated it.

As a child, I thought of Radio 2 as Radio 1 for my grandparents; I initially associated it with holidays at their house, when my grandma’s caramel-coloured radio would be tuned in to Jimmy Young playing Doris Day and other 50s mainstays. Later, when even the briefest spin of a hit lumped in with Punk was more than my father could stomach, his in-car radio crossed the no-man’s land between the Hairy Cornflake and Wogan to infect my ears with contemporary MOR. It was the pre-breakfast TV era, when radio was the exclusive link to the outside world, as it had been for generations, and radio’s role as a means of starting the day has never really left me. Television before the late afternoon still seems wrong.

Whereas Radio 1 DJs wore satin bomber jackets à la Smashie & Nicey, Radio 2 DJs dressed like Val Doonican; there wasn’t actually much of an age gap between them, but at least the Radio 2 crowd didn’t pretend to be younger than they actually were, bar the one concession to middle-aged vanity, the toupee. But it was the voice that mattered more than the look, and Terry Wogan had one of the best, a warm Irish twist on the aural central heating of Oliver Postgate at his most soothing.

At one stage in the 1980s, Wogan was omnipresent to the point of annoyance. He’d presented ‘Blankety Blank’ for years and was then hosting a chat show on primetime BBC1 three nights a week; he presented ‘Children in Need’ every November and provided commentary for the Eurovision Song Contest every spring. He’d only just packed in his long-running morning show on Radio 2, the vehicle that had brought him to the country’s attention, as his workload left him with little room to focus on it. By the turn of the twenty-first century, however, his TV profile had dipped somewhat, washed up on the barren shores of Channel 5’s daytime schedule; wisely, he’d gone back to basics on Radio 2 and helped establish the station as the nation’s most popular. By this time, past antagonism towards him had softened, as it often does with the survivor; those dulcet tones were once again associated with distant mornings in which they had bantered with Jimmy Young, and Eurovision underwent a similarly nostalgic reappraisal as Wogan’s witty and sardonic commentary made it worth watching.

Wogan finally retired from his daily Radio 2 show in 2009, handing the slot to Chris Evans as part of Radio 2’s occasional facelift when it drops down a generation. He continued to broadcast a weekend show until November last year, when health problems also caused him to miss ‘Children in Need’ for the first time. His passing doesn’t end an era any more than the death of fellow veteran broadcaster Ed Stewart a few weeks ago did, for the era ended long before either hung up their microphones…innit.

© The Editor

SPIRITS OF ’76

The middle moment of a decade can often mark a peak or a trough, depending on where popular culture happens to be. In both the 1960s and 90s, the middle constituted a definite peak, whereas the 1970s and 80s found the cul-de-sac between the two ends of the respective decades wallowing in a complacent calm before an unexpected storm. With history traditionally written by the winners, however, looking back forty years to 1976 requires detachment from the official cultural chronicle of the 70s that has spewed forth from the pens of Parsons, Savage, Morley et al ever since they and their generation ascended to the mainstream media in the 80s. Widening the spectrum beyond music, it’s certainly evident that 1976 was a watershed year, but not in the ways the angry old men of the twenty-first century would have anyone who wasn’t there believe.

Most years are marked by endings of some sort, but there seemed an unusually high amount in 1976, as though a line was being drawn under what had gone before along with many of the figures that had defined it. A crucial change in chart trends was clear when the year saw no top ten sightings at all for some of the biggest pop acts of the preceding four or five years – no Slade, Sweet, Gary Glitter, The Osmonds, David Essex, Suzi Quatro or Alvin Stardust, and even the all-conquering teen act of ’75, The Bay City Rollers, only managed one appearance. Meanwhile, pop’s most notable Englishman abroad, John Lennon, failed to release any new recorded material for the first time since the debut Beatles single in 1962; as his former colleagues were also finally free from their Apple contracts, nobody knew then that Lennon wouldn’t release another record until 1980.

But it wasn’t just chart careers that ended in ’76. The year opened with the death of Jason, the very first ‘Blue Peter’ cat, a feline counterpoint to Petra since the monochrome days of 1963; his passing was marked by the final in-studio appearance of Valerie Singleton as a regular presenter on the programme. In one fell swoop, two iconic figures from one iconic TV show were gone. For a child, the sudden removal of fixtures from a picture they have always been part of is genuinely unsettling, making one aware for the first time that nothing is permanent. Younger viewers also noticed that the military backup for ‘Doctor Who’, UNIT (a much-loved mainstay of the series since the very first episode of the decade), had quietly vanished from the supporting cast, removing the earth-bound aspect of the series in the process.

A BBC series that even predated the 1958 debut of ‘Blue Peter’ ended in 1976 as ‘Dixon of Dock Green’ finally hung up its truncheon after twenty-one years. Lead actor Jack Warner was 80 by this point, but he’d continued to play the part of Sgt Dixon as a uniformed Bobby up until the final series, when he was then seen in civvies as a deskbound police collator. Though the series maintained an exceptional standard of writing and acting, Warner’s advancing years and failing health contributed to the retirement of ‘Dixon’, and he only outlived it by five years. ‘Z-Cars’ had been seen as a grittier alternative to ‘Dixon’ in the early 60s, with the hardboiled CID characters of Barlow and Watt eventually being transferred to a spinoff, ‘Softly Softly’. From 1969, the series was renamed ‘Softly Softly: Taskforce’, but by 1976 even that was viewed as pedestrian in comparison to ITV’s tough new kid on the cop block, ‘The Sweeney’, and Thamesford Constabulary followed Sgt Dixon into retirement that same year.

Sid James died on 26 April; the man with the dirtiest laugh in showbiz was starring in his own primetime ITV sitcom, ‘Bless This House’, which was still being produced when he died of a heart attack en route to hospital after collapsing onstage. While ‘Bless This House’ dealt with middle-aged parents struggling to handle their teenage children’s place in the Permissive Society, ITV’s other great sitcom of the era, ‘Man About the House’, removed the parental figures and instead focused on a twenty-something trio of flat-sharers, two of whom were girls and one of whom was a guy. The premise was slightly risqué when the series had begun three years previously, but the running gag about the potential saucy goings-on in the flat rented from the Ropers was gradually superseded by storylines in which the living arrangements of the three lead characters were secondary. ‘Man About the House’ ended its run after six series in April 1976, marking the symbolic TV demise of the archetypal early 70s Jack-the-Lad character (represented by Richard O’Sullivan’s Robin Tripp), who thereafter lingered on in the cinema via Robin Askwith before being sucked down the postmodern plughole.

On the very same night that ‘Man About the House’ came to an end with Paula Wilcox’s Chrissy breaking the heart of Richard O’Sullivan’s Robin by marrying his smug brother Norman, ‘Coronation Street’ experienced the abrupt disappearance of Minnie Caldwell. The deceptively timid sidekick of resident battleaxe Ena Sharples had been one of the show’s great characters from its beginnings in 1960, yet the actress Margot Bryant’s increasing inability to remember her lines forced her to leave, with her character receiving an off-air departure that robbed the viewers of a proper goodbye. That she was eventually diagnosed with Alzheimer’s explained a great deal, something that may also have been apparent to another figure who had been part of the nation’s wallpaper for over a decade, the Prime Minister Harold Wilson.

The Labour leader’s eight overall years as PM may have been interrupted by the three-and-a-half-year tenancy at No.10 of his nemesis Edward Heath, but the strain of steering the country through such turbulent waters was beginning to take its toll on him by the time he won his fourth General Election in October 1974. When Heath was deposed by Thatcher the following February, the fight seemed to ebb away from Wilson as his duelling partner of a decade was no longer facing him across the dispatch box, and his weariness was writ large on his haggard countenance. The EEC Referendum of June 1975, when he’d allowed his Cabinet to dispense with collective responsibility, exposed serious divisions in the party that would grow deeper in the years to come, and that would have been a good enough reason to bail out. But it was the worrying failure of Wilson’s razor-sharp memory during his last year or so in office that was perhaps the final straw; he announced his shock resignation just five days after turning sixty in March 1976. Even ‘Private Eye’ editor Richard Ingrams later admitted it was hard to imagine the country without Wilson at the helm.

Members of the general public outside the 15-25 age bracket were not necessarily regular readers of the music press, so when The Sex Pistols and Punk Rock became overnight household words as 1977 was just weeks away, it seemed as if the Four Horsemen had ridden out of nowhere to call time on the 70s. In a way, they had; but it wasn’t until round about 1979 that the true mainstream impact of the unlovable spiky-tops was evident in the fresh crop of chart acts that pointed the way towards the 80s, whether Tubeway Army or The Police, not to mention shorter hair and straighter trousers. That their arrival should have come at the end of twelve months in which so much had drawn to a close belies the myth that Johnny Rotten and co swept everything away the moment they let rip in their summit meeting with Bill Grundy. The evidence is that the cluttered cultural landscape had already been cleared, as if in subconscious preparation for the Apocalypse.

© The Editor

CLIMB EVERY MOUNTAIN

The death of Henry Worsley, which was announced on Monday, served as a sober reminder that even in this age of satellite communications and scientific knowledge of the damage a severe environment can do to the human body, some expeditions remain as dangerous a challenge as in the days of Shackleton and Scott a century ago. The 55-year-old ex-army man was attempting to cross Antarctica alone, to achieve what Sir Ernest famously failed to; he came within just 30 miles of his goal, almost 70 miles closer to the South Pole than Shackleton had managed in 1909, yet when he was airlifted to hospital last Saturday, Worsley was found to be suffering from complete organ failure and died not long after.

The currency of exploration has been devalued somewhat over the past half-century; after Hillary stood at the top of Mount Everest in 1953, that now-chronically unfashionable Victorian/Edwardian notion of a Great British ‘Boy’s Own’ Hero began to rapidly diminish. Perhaps, if you’ll pardon the pun, the concept had reached its peak. It partially resurfaced in the likes of the dashing, driven Donald Campbell as he was breaking world speed records in the 50s and 60s, but that was really its last hurrah.

A hundred years back, someone such as Henry Worsley would have been a celebrated household name and an inspiration to a generation of little boys, embarking upon his mission for King, Country and Empire as well as the ego-boosting kudos that came with being the first man to do this or that. Planting the Union Jack in the soil of his destination would have enshrined him in the history books and earned him a place alongside Drake or Burton. These days, with the general assumption being that every inch of the earth has been conquered, an explorer has to interest the public in his endeavours by raising money for charity – and Henry Worsley did just that, selecting one that aids wounded servicemen.

Even after the discovery of the New World in the fifteenth century, exploration was seen as a viable pursuit when myths and legends of lost kingdoms still retained a grip on the romantic imagination of nations. Although many explorers were wealthy gentlemen dilettantes whose yearning to locate uncharted land was mainly motivated by the prospect of garnering further riches, the imperial element also constituted a crucial aspect of the quest. Claiming virgin territory for their monarch ensured social elevation once they sailed back home, with a knighthood and possible peerage on the cards. When the great naval explorations undertaken by professional seamen rather than amateur adventurers and commissioned by the state came to an end in the nineteenth century – ones that mapped the oceans and opened the doors for European colonisation of Africa and Asia – the Polar Regions were then looked upon as amongst the last challenging landscapes left on earth, and the gentlemen once again took to the high seas, bringing poetry and artistry to exploring.

What became known as the Heroic Age of Antarctic Exploration followed the decline of official state-sanctioned expeditions; the privately-funded adventure became more prevalent when a few high-profile tragedies led to a loss of appetite for adventure on the part of the Royal Navy. One of the most notorious was the ‘Lost Expedition’ of Sir John Franklin in 1845, which set out to locate once and for all the long-rumoured shortcut between Europe and Asia, AKA the Northwest Passage. Within two months of setting sail, however, the 129-strong crew disappeared, and it took the best part of 150 years to piece together the probable events that caused the expedition to claim the lives of all involved, a long-running detective story that eventually climaxed with the discovery of Franklin’s ship, HMS Erebus, as recently as 2014. As was tradition during Franklin’s lifetime, his heroic failure was turned into a triumph, with his likeness preserved in statues and paintings, a posthumous immortality that also awaited Captain Scott.

After Amundsen’s successful South Pole expedition of 1911, the next challenge was Mount Everest, with the British determined to be one step ahead of the competition by launching the first official expedition in 1921; the team, including George Mallory, were there as a reconnaissance party, but saw enough to calculate that Everest could be conquered. A second British expedition followed in 1922 and a third in 1924, both including Mallory. His attempt to reach the summit on the third expedition ended in mystery when he and his climbing partner Andrew Irvine vanished whilst making the ascent, never to be seen again after last being sighted around 800 vertical feet from the summit. Even though Mallory’s body was finally discovered in 1999, no one will ever know if he reached the roof of the world three decades before Hillary and Tenzing.

In the aftermath of the first Moon Landing in July 1969, Richard Nixon claimed Armstrong, Aldrin and Collins were worthy of being spoken of in the same breath as Columbus, as though the future of exploration lay in the stars rather than down here on earth. If that were indeed the case, perhaps it is also a factor behind the altered perception of earth-bound explorers in an age of health and safety: they are now seen as reckless individuals taking their lives in their hands for purely egotistical reasons, as opposed to national heroes engaging in a patriotic duty. The only way to soften attitudes seems to be to add charity to the mix as a means of justifying the adventure, for neither adventure nor adventurer is enough anymore. The era of the heroic explorer is long gone. Having said that, one still cannot but admire the devil-may-care simplicity of Mallory’s famous reply when asked why he wanted to conquer Everest. ‘Because it’s there’ should serve as a mantra for anyone with the slightest inkling of achieving anything their environment and upbringing have conspired to deny them. It’s worth a thousand self-help books and motivational lectures.

© The Editor

FOUR LEFT FEET

In retrospect, it was destined to fail in its original incarnation; it was a fragile four-way partnership from the beginning. Like Crosby, Stills, Nash and Young recording one landmark LP before internal conflict ceased to inspire creativity, competing egos scuppered any chances of long-term success. But hopes were certainly high in some quarters 35 years ago today, when Roy Jenkins, David Owen, Shirley Williams and Bill Rodgers issued the Limehouse Declaration, a media event that confirmed the Gang of Four’s split from Labour and the formation of a new centre-left political party, the SDP. Who can forget that logo – a post-punk graphic classic? Using the same font ‘Melody Maker’ was then employing as its masthead, it made an instant impact at a time when political parties weren’t branded as they are today. In its own small way, that logo said as much about the desire to break with British political tradition as the party’s policies did. For a brief moment in 1981, some saw the birth of the Social Democratic Party as the light at the end of the gloomy tunnel the nation had been travelling through for far too long.

From the mid-70s onwards, there had been a series of defections from the Labour party – whether the likes of Dick Taverne and Eddie Milne standing against Labour candidates as independents, having been threatened with de-selection by local branches infiltrated by Trotskyites, or the likes of Christopher Mayhew and Lord Chalfont joining the Liberal cause – something that suggested Harold Wilson’s performance as the pacifist parent keeping his warring offspring on the Left and the Right from engaging in ideological fisticuffs was not entirely effective. It was evident that trouble was brewing beneath the united front, and even the election of the moderate Jim Callaghan as Wilson’s successor couldn’t prevent growing dissatisfaction from other moderates within the party over the increasing influence of the far Left. As the Tories took the country to the Right in the wake of the 1979 General Election, the appointment of Michael Foot as Labour leader was the last straw for the Gang of Four.

Despite the shaky start Margaret Thatcher made as Prime Minister – there were even rumours of a coup at one stage in 1981 – the Labour party was already regarded as unelectable; its move towards the far Left in the wake of the 1979 General Election defeat had been a disaster for Labour in the opinion polls, with only its most loyal, diehard supporters sticking by Michael Foot, a principled and passionate politician, yes, but one utterly unsuited to the task of leading the country. The public had blamed the industrial chaos of the late 70s on the more militant tendencies of the Left, and a Labour party still in thrall to the unions, led by a wild-haired eccentric who resembled a mad professor from a Children’s Film Foundation movie, was never going to be elected to office. The calamitous drubbing Labour received at the 1983 General Election had been foreshadowed by the departure of a quartet of Labour moderates stranded by the party’s lurch leftwards.

Jenkins, the avuncular old-timer, Owen, the suave matinée idol for the housewives, Williams, the bossy headmistress, and Rodgers, the Ringo of the band, were all former Labour Ministers who had become disillusioned with their party’s self-destructive policies and felt there was an unoccupied middle ground within British politics at a time when Labour and Conservative were positioned at ideologically-opposed extremes. Some still argue they bottled the challenge of wresting control of Labour from Foot and Benn in a civil war they perhaps knew they could never win. But the formation of the Social Democratic Party, coming when Thatcher’s popularity was at its lowest ebb and Michael Foot was Public Enemy Number One in the eyes of the right-wing tabloid press, was generally well-received as a breath of fresh air.

There were plenty of floating voters around in 1981 whose faith in Labour and the Tories was waning after the two had swapped places over the past decade without any discernible improvement in the country’s fortunes, and they welcomed something different. Twenty-eight Labour MPs and one Tory eventually joined the SDP, and a series of by-election victories leading up to the 1983 General Election suggested the new party was a force to be reckoned with, achieving an opinion poll rating of 50% at one stage in 1981.

The arrival of the SDP had also been welcomed by several senior members of the Liberal Party, including their leader, David Steel; it was felt by many Liberals that the two parties were far more ideologically matched than the Liberals had been with Labour during the short-lived Lib-Lab Pact of 1977/78. Perhaps a partnership between the SDP and the Liberal party was inevitable, and the two entered into a mutually beneficial union at the end of 1981 as the SDP-Liberal Alliance. Although their instant popularity received a knockback when Thatcher’s standing rose considerably in the wake of the Falklands War of 1982, the Alliance polled 25% of the national vote at the 1983 General Election – though the ‘first past the post’ British electoral system resulted in only 23 Alliance MPs being elected. They fared worse at the 1987 General Election, by which stage the party’s honeymoon period was long gone and Mrs Thatcher was at the peak of her powers.

A complete union between the SDP and the Liberals had long been mooted, but disputes over who should lead them, and growing ideological differences, continued to plague the two parties as the initial promise of the SDP appeared to have foundered in the eyes of voters. Splits within the SDP were compounded when they and the Liberals officially combined as the Liberal Democrats in 1988, a new party that drew most of its numbers from the SDP, yet was led by a former Liberal, Paddy Ashdown. SDP members who opposed the union, most prominently David Owen, staggered on before eventually disappearing from the mainstream political map in the late 1990s, whereas the Lib Dems gradually became the most significant third party in British politics for more than a generation, peaking with a tally of 62 seats at the 2005 General Election under Charles Kennedy.

Some have reduced the SDP to a footnote in British political history, but there’s no doubt that it proved to the ailing Labour party that it was possible to move towards the middle ground as the warring extremes of Left and Right began to turn many potential voters away from politics by the mid-80s. Labour’s own ideological shift started in earnest in the aftermath of the humiliating ’83 Election hammering, with the appointment of Neil Kinnock as party leader, the first step on the long and winding road to New Labour and power. 35 years on, with the Lib Dems reduced to eight measly MPs (even fewer than the Liberals had in 1981) and Labour again led by an old-school Socialist with his head in the clouds, could history repeat itself? If so, any breakaway from Labour would require the presence of figures with a little more clout than Chuka Umunna, Liz Kendall or Tristram Hunt. But there aren’t any.

© The Editor

LAND OF MAKE-BELIEVE

A work of philosophical fiction published 500 years ago, one that paints a portrait of an egalitarian republic that advocates the abolition of private ownership and tolerates all forms of religious worship; this is a country that can boast free health care, divorce, married priests of both sexes, and legal euthanasia; it has an exclusively rural economy without serfdom, elects its head of state and opposes gambling, lawyers, hunting, cosmetic adornment and war. Essentially classless, it nevertheless has a large slave population, though this one real concession to the times in which it was written shouldn’t overshadow the radical and revolutionary concept of society it promotes.

Seemingly anticipating Enlightenment thinking, as well as both Socialism and Communism, one would imagine the author to be a humanist visionary who despised the faults of his own era and looked to a more idealistic future. Yet ‘Utopia’, first printed in 1516, came from the pen of Thomas More, a man whose later position as Lord Chancellor under Henry VIII saw him instigate a ruthless campaign against ‘heretical’ Protestants, until his devotion to the Catholic Church cost him his head when he refused to recognise Henry’s religious supremacy following the break with Rome.

Originally written in Latin (as was the custom), ‘Utopia’ was hardly the ’50 Shades of Grey’ of its day in terms of sales, appearing when the majority of Europe was largely illiterate; it was not actually published in More’s homeland and native tongue until 16 years after his execution, 35 years after it first appeared on the Continent. Like most people reading this, I would imagine, I’ve never read it myself; but then I’ve never read the Bible or the Koran either; that doesn’t necessarily mean I’m ignorant of their content or unaware of their far-reaching influence.

As far as ‘Utopia’ is concerned, the title itself – derived from Ancient Greek – is perhaps now more famous than the book. The word and its opposite, Dystopia, have continued to permeate popular culture up to the present day; a recent Channel 4 drama borrowed the title, even though one feels this was ironic, as the society its cast of unpleasant characters inhabited was far from Utopian. Utopian and Dystopian fiction have crossed all manner of literary genres, including horror and (mainly) science fiction, though ‘Gulliver’s Travels’ probably remains the most entertaining, not to mention satirical, interpretation of the Utopian idyll. The concept has also been prevalent in cinema for several decades; one only has to think of ‘Brave New World’, ‘Nineteen Eighty-Four’, ‘A Clockwork Orange’, ‘Zardoz’ and ‘Equilibrium’. TV has dabbled too, from various episodes of ‘The Twilight Zone’ through to ‘The Prisoner’, ‘Survivors’ and ‘The Walking Dead’.

As early as the aftermath of the English Civil War, with Puritan groups such as the Diggers, the ideas More espoused in ‘Utopia’ began to infect numerous political ideologies, and they have continued to do so in the centuries since its publication. The seismic societal shift of the Industrial Revolution and the increasing chase for profit at the expense of human wellbeing in the nineteenth century saw Socialist philosophies emerge in Britain, despite their brutal suppression across the Channel following the initial optimism of the French Revolution. The publication of Marx and Engels’ ‘The Communist Manifesto’, arriving just in time for the Revolutions of 1848, updated some of More’s theories and expanded them into a more cohesive manifesto for social change. Although Marx regarded some of More’s ideas as naive, the presence of slavery on the island of Utopia and the notion that privacy did not equate with freedom were cited as highly relevant to the Soviet model of Communism by Aleksandr Solzhenitsyn after his time in the Gulags. The claiming of More as a Communist Godfather by the nascent USSR was emphasised by his name’s presence on a list of similarly regarded figures carved into Moscow’s Obelisk of Revolutionary Thinkers, a monument that stood near the Kremlin for 95 years before being controversially (not to say symbolically) demolished without warning on the instructions of the autocratic President Putin in 2013.

Some have claimed More’s inspiration for the society depicted in ‘Utopia’ came from his experience of monastic communities as well as his time spent as an envoy in Europe, where he was exposed to Renaissance Humanism. The fact that Utopia is situated in the New World seems to suggest the author believed a better future for mankind lay away from the endlessly warring European kingdoms and the unequal societies they had evolved into over centuries. There is also an argument that he was influenced by the ideas of Ancient Greek philosophers; the text is peppered with knowing references to Ancient Greek, mainly in the names of Utopia’s cities and in the central character of Raphael, whose surname translates as ‘dispenser of nonsense’. It’s quite possible More had his tongue in his cheek whilst writing ‘Utopia’, as well as reflecting the idealism of youth before that idealism was corrupted by power and wealth.

Whatever the genesis and subsequent influence of ‘Utopia’, the fact that it was written so long ago, and by a man one doesn’t instantly associate with the ideals the book depicts, makes it a fascinating example of thinking outside the conventions of the age in which More conceived it. Unlike, say, Darwin’s ‘On the Origin of Species’, ‘Utopia’ was something of a slow-burner when it came to its cultural impact, though considering the severe reprisals dished out by religious and monarchical institutions to anyone who questioned or challenged their right to rule in the sixteenth century, perhaps that was just as well for More. Imagine living in a time when freedom of thought and the ability to express it in writing were condemned and crushed without mercy…

© The Editor

BEAR NECESSITIES

It was an assassination straight out of a Cold War spy novel – an East European dissident resident in London stands at a bus-stop, waiting to get to Bush House where he will broadcast on the BBC World Service; a sharp pain suddenly shoots through the back of his thigh; he gazes behind him and sees an anonymous man walking away carrying an umbrella. Only when he develops a fever that evening is a connection made between the sickness and what resembles a bee-sting on his leg. Gradually convinced he has been poisoned by foreign agents, he dies before any action can be taken. He was Bulgarian novelist, playwright and critic of his country’s Communist regime, Georgi Markov, and a pellet containing the poison ricin had been jabbed into his leg via the tip of the umbrella carried by a man believed to be a KGB operative. Nobody has ever been charged with his murder, which took place in September 1978.

Both Markov’s minor celebrity and the novelty of the method used to assassinate him garnered headlines at the time, even though agents of both East and West were routinely engaged in such cloak-and-dagger hits back then. The notoriety attached to Markov’s murder merely confirmed that the activities of operatives as played out in the pages of John le Carré novels were far from fantasy; le Carré’s own day-job at MI6 gave him an insight into the realities of international espionage that infused his books with the kind of accuracy Ian Fleming had avoided. When the Cold War officially came to an end with the collapse of the Soviet Union and Boris Yeltsin’s attempts to impose a Western-style free-market economy on Russia in the 90s, many assumed the age of East-West antagonism was over. Not so for an ex-KGB officer and rising star in Yeltsin’s administration, Vladimir Putin.

For the last half-decade of his 16-year KGB career, Putin was stationed in East Germany, where he worked in tandem with the Stasi. As the DDR entered its final days, he was instrumental in the burning of KGB files and was then recalled to Moscow; during the KGB-engineered coup to topple Gorbachev in 1991, Putin sensed which way the wind was blowing and switched sides. After a period in local government in St Petersburg, Putin transferred to Moscow and was promoted into President Yeltsin’s team, falling back on his KGB experience to head its successor, the FSB. By the end of the 1990s, his hardline approach to the conflict in Chechnya earned him public plaudits and he was awarded the post of Prime Minister before winning the nomination as Yeltsin’s Presidential heir, a position he ascended to in 2000.

During his first stint as President (2000-08), Putin’s aim seemed to be to restore pride and prestige to a country that had lost an empire and had suffered the exploitation of Yeltsin’s economic reforms by a handful of businessmen who acquired immense riches whilst the majority of the population endured hardship. Putin’s persecution of the Oligarchs unsympathetic to his rule forced most of them into exile before they could be imprisoned, whereas his crackdown on free speech and media independence attracted his first heavy criticism from the West.

When he stepped down as President in 2008, Putin was accused of being a backseat driver when his successor Dmitry Medvedev appointed him Prime Minister, a position he held for the next four years before running for President again in 2012. By this time, Putin had moved the Presidential goalposts, altering the constitution in order to extend Presidential terms from four years to six.

His re-election was widely condemned as rigged, but despite protests both at home and abroad, Putin’s grip on power was reinforced. The expansion of the Russian military and interventions in Chechnya, Georgia and Ukraine, as well as the annexation of Crimea, all of which were initiated to return the country to its Soviet-era size, served to restore the only kind of Russian pride and prestige a man raised in the USSR and employed by the KGB could relate to. What some have referred to as a Mafia-style concept of governance is not without its home-grown critics, one of whom was Alexander Litvinenko. The former FSB officer had defected to the UK as his ex-boss Putin was poised to be inaugurated President, and embarked upon a career in journalism and as an adviser to the British secret service. His accusations that both the FSB and Putin were complicit in several acts of terrorism, including the 2006 murder of Russian journalist Anna Politkovskaya, earned him the enmity of his countrymen.

Just a couple of weeks after implying Putin had ordered the assassination of Politkovskaya, Litvinenko suddenly fell ill on the same day he had met two ex-KGB agents in London; hospitalised, his condition worsened as it was discovered he had traces of polonium-210 in his blood. Evidence that he had been poisoned by a rare radioactive substance backed up Litvinenko’s claim that Putin had been behind the poisoning, but he died within three weeks of being admitted to hospital. Litvinenko’s widow fought long and hard for a public inquiry into her husband’s death and was finally given the green light in January 2015. Now, twelve months later, the inquiry has concluded that her husband was indeed deliberately poisoned and effectively murdered by two fellow Russians, named as Dmitry Kovtun and Andrei Lugovoy, the pair he had met on the day he fell ill.

Relations between Russia and Britain, strained by the murder of Litvinenko and not helped by Putin’s subsequent military adventures and persistent macho posturing, now appear to be at their worst since the Cold War. Russia’s refusal to allow the extradition of the two men accused of Litvinenko’s murder and its counter-accusation that the inquiry has been ‘politicised’, along with Theresa May’s vocal condemnation of the Russian state’s response, suggest this particular case will remain a thorn in the side of British-Russian relations for a long time to come. All accused parties deny any involvement in Litvinenko’s death, but to paraphrase dear old Mandy Rice-Davies, they would, wouldn’t they?

William Gladstone believed that Britain and Russia should be natural allies due to their geographical similarities – situated on opposing edges of Europe, both part of and yet outside of the continent – an alliance he surmised would be a useful means of preventing a dominant France or Germany in the centre. 150 years later, the two nations have never seemed so distant from one another; and neither France nor Germany will lose much sleep over that.

© The Editor

A MENTAL CASE

New Year, New Grandiose Statements. David Cameron’s been making loud announcements over the past seven days; this week he’s been banging on about immigrants and the need to speak the lingo as crucial to embracing the culture. The British Citizenship Test is crammed with the kind of historical references to this country that half of the natives wouldn’t be able to answer, on account of great chunks of British history – especially that nasty imperialist empire-building stuff – being excised from the curriculum for fear of offending ethnic minorities, despite the unavoidable fact that most owe their presence here to the existence of the old colonies. Perhaps any aspiring Brits who fail rather than pass the test should be awarded citizenship, in that failure is more of a mark of their native credentials.

Last week, the PM expressed his desire to eradicate estates from the landscape – not the green and pleasant ones he and his horsey chums ride across when hunting foxes, of course, but the so-called sink estates that the plebs live on. He probably learned about them from watching ‘Benefits Street’. He complemented this newfound concern for those whose desire to escape the miserable poverty of such estates has been scuppered by his administration’s ruthless cuts by pledging to invest £1 billion in various forms of mental health care. The pledge attracted the kind of headlines a Prime Minister usually attracts when announcing a new policy in which a lot of money will be splashed about, but this sudden public declaration of concern on the issue of mental health seems very much at odds with the policy his government has pursued over the past five years, especially towards those mental health sufferers whose conditions render them unable to hold down a regular job and who are faced with little choice but to claim benefits.

The Work Capability Assessment was introduced by Gordon Brown’s Government in 2008, and a Tory-friendly policy that demonised and degraded disability benefits claimants was eagerly taken on by the incoming Coalition in 2010 as a canny way of tackling the deficit. Media horror stories that portrayed any benefits claimant as scrounging scum helped create a climate in which sympathy and genuine assistance were bound to be in short supply, thus giving government the green light to ride roughshod over disability and mental health campaigners. The controversial outsourcing of testing benefits claimants’ capability for work to the French company Atos ended in August last year, but for all the negative publicity Atos received in the media following a series of baffling decisions to pass severely ill claimants as fit for work, the fact is they were under immense pressure from the Government, particularly DWP Tsar Iain Duncan Smith.

Stories emerged a couple of years back from ex-Atos employees that the essential nature of the task entrusted to them was to bring down the unemployment figures and to alter the unhelpful conclusions of fitness testers should they find a claimant genuinely unfit for work. Statistics later revealed that from 2011-2014, 2,380 claimants died – several by their own hands – less than two months after their claims came to an end courtesy of Atos.

With their offices picketed and their employees receiving death threats, Atos have attempted to distance themselves from the damage done in their name since their contract was prematurely terminated; their website states that the contract was rewritten by the Government, conveniently handing Atos responsibility for the disability benefit aspect of so-called welfare reforms. Granted, there may well be an element of passing the buck, but the DWP called the overall shots, and it seems pretty evident now that Atos were merely obeying orders. After all, the dodgy US insurance company Unum has been issuing advice regarding disability claimants to successive British Governments since the 90s, and played a key part in designing the testing system that the DWP then implemented and hired Atos to carry the can for.

Awarding the contract to Atos in the first place was hardly a move that spoke of concern about mental or physical disability on the Government’s part; the lack of medical or psychiatric qualifications Atos employees possessed, not to mention zero empathy with the claimants’ condition, said it all. Would one hire a lifetime teetotaller to run an AA group or someone whose drug experience stretches no further than swallowing the odd Anadin to work as a counsellor in a rehab clinic?

It was evident early on that the vulnerable on the bottom rung of society’s ladder were once again in line to bear the brunt of an economic situation they hadn’t caused, and that their lives were being placed in the hands of a company that didn’t even pay Corporation Tax. Following the suicide of one claimant Atos had decided was fit for work, the coroner overseeing the case concluded that Atos had reached its decision without taking any of the doctors’ reports into consideration during the 90-minute assessment. Who is most qualified to judge whether someone’s mental or physical condition prevents them from stacking supermarket shelves or cold-calling members of the public?

The former Chief Economist at the Cabinet Office, Jonathan Portes, said the assessment programme Atos carried out for the DWP was ‘the biggest single social policy failure of the last fifteen years’; with a record number of appeals lodged with tribunals, increased antidepressant dependency amongst those who have been subjected to the assessment, and a string of suicides, the programme is an appalling example of a government whose alleged concern for mental health sufferers is clearly secondary to boasting of falling unemployment figures. For David Cameron to then don the mantle of a man who cares does tend to stick in the throat a little – not that an alien body lodged in the windpipe would prevent anyone being passed as fit for work, though.

© The Editor

GENTLEMEN AND PLAYERS

Any embryonic art form with commercial potential is looked upon by envious eyes, just as the Martians observed planet Earth in the late nineteenth century, according to HG Wells. The first such art form in the age of the mass media was cinema; a glut of little film companies sprang up in the early silent era, many of which grew into the giants we all recall from their memorable idents – MGM’s lion, Paramount’s mountain, Universal’s spinning globe, 20th Century Fox’s giant logo illuminated by searchlights and so on. Smaller companies that had flourished during the heyday of the silents had been absorbed into the larger outfits come the talkies, and those larger outfits soon achieved virtual dominance of the global cinema market. Running parallel with the rise of the Hollywood studio system was the music business, first through publishing and then through the formation of record companies to facilitate the advances in reproducing recorded sound.

There were mutually beneficial relations between the movie studios and record labels in the pre-rock age, when the majority of hit records emanated from hit movies or hit Broadway shows. Songwriters rated higher than performers in terms of earnings, as the same song could be covered by a variety of singers, giving rise to the ‘standard’ that no individual performer had an exclusive claim on. Singers were little more than serfs as far as record companies were concerned, limited to pitiful royalty rates and chained to contracts signed without reading the small print.

At the beginning of the 1960s, British record companies such as EMI and Decca were headed by Old Etonians who could just as easily have been running the Bank of England, addressing everyone by their surnames; their recording studios were staffed by men in white coats who treated the job like any other 9-to-5 occupation. Maverick outsiders such as producer Joe Meek were very much on the fringes of the elite, running cottage industries from their own homes and leasing the end results to labels, as had been the case during the brief eruption of US rock ‘n’ roll before it was eaten alive by the big bucks of the big boys.

By the end of the 60s, the logo in the centre of the vinyl had not only become as identifiable as the movie studio ident, but many embodied a particular style of music, giving the listener an inkling as to what to expect even before they placed the needle on the record. Most majors had an offshoot label to cater for the more experimental end of the pop spectrum, and the men who ran these offshoots were often counter-cultural dudes with a business brain, a long way from the titled toffs at the top of the pyramid – at least on the surface. Even if there was an apparently extreme contrast between the man who condescendingly welcomed a new signing into his office with a tray of tea brought in by his secretary and the one who opened a box on his desk housing several ready-rolled joints, what headmaster and head-boy shared was a very British sense of amateurism. To use old cricket terminology, they were gentlemen rather than players.

All that changed in the 70s and 80s; the phenomenal money-spinner that the 60s had turned pop into gave birth to what was truly a music industry, and the players wrested control from the gentlemen. Despite Punk spawning numerous independent labels, these were gradually bought up by the majors to supersede the old subsidiary labels that changing musical fashions had removed from the record racks. Improved awareness by artists as to just how much their predecessors had been ripped off was matched by the new professional rock manager, personified by Led Zeppelin’s Peter Grant.

Few acts that sprang to prominence during the halcyon days of rock-as-business saw ownership of their golden eggs slip through their fingers via poorly-executed deals in the way The Beatles had. Record companies as well as their artists and management were now all well enough versed in the pros and cons of making a mint from music to run the industry with a slick ruthlessness that ensured a splendid time was guaranteed for all.

With the benefit of hindsight, it’s evident that record companies assumed this state of affairs would be permanent; as they began buying each other up in an act of corporate cannibalism, the new kid on the block called the internet was poised to shake up an industry that had grown fat and complacent on the profits of past glories. The arrival of illegal downloading sites such as Napster opened a divide between producers and consumers, issuing a challenge to the Godlike authority of record companies that they thought they could dismiss with a campaign similar to the short-lived ‘home taping is killing music’ crusade of the 80s. They claimed songwriters and performers would be denied their due royalties if the public weren’t prepared to pay for music, yet copyright claims on the likes of YouTube can be easily enforced, thus neutralising any income the poster of a song or music video stands to derive from it.

A couple of months back, assembling a Beatles CD compilation for my own personal listening, I sourced the songs I needed via YouTube, as all my Beatles albums are on vinyl. I had no problem locating the material and the CD was put together quickly. Typing in other Beatles songs for a different compilation a couple of days ago, I found most of them had disappeared. Those that survived did so in visual form only, stripped of their musical content; the excuse given was that the copyright was ‘owned by UMG’ – not Parlophone, not even EMI, but UMG. UMG – who’s he when he’s at home? The record industry’s very own Judge Dredd, that’s who.

UMG stands for Universal Music Group, a behemoth of a corporation that is a subsidiary of the French media monster Vivendi and is based in California. It owns the vast majority of pop’s most valuable back catalogues and has sought to counteract any unlicensed music videos appearing on YouTube with its Vevo brand. Famous names that fall under the UMG umbrella include Geffen, Chess, A&M, Capitol, Island, Def Jam, Decca, Polydor, Motown, Virgin and EMI. The latter was purchased in 2012 for £1.2 billion. UMG has recently started throwing its weight around online, cracking down on anyone daring to post favourite songs that UMG ‘owns’. It would be nice to think UMG is simply a collective of music lovers ensuring songwriters and artists are paid their due royalties, but it isn’t. It has more in common with News Corp, and is very much in synch with a one-time art form that is now little more than a leisure industry whose aim is to generate a feel-good factor for its fast-food consumers. So, RIP the record company, from Sir Joseph Lockwood to Colonel Sanders in fifty years.

© The Editor

VAPID RESPONSE

A 1999 Jeremy Paxman interview with David Bowie, excavated and aired anew during this past week, sees the man (whose abrupt withdrawal from the stage still takes some getting used to) make some remarkably prescient predictions about the way in which the then-infant internet would alter the cultural landscape. Music’s method of delivery has arguably undergone the greatest change of all in the seventeen years since a cynical Paxo doubted Bowie’s foresight; but when the visionary interviewee warned that cyberspace had the potential to generate both good and bad, the reactions on social and antisocial media to the week’s events vindicated his soothsaying.

‘Channel 4 News’ anchor Jon Snow is the cause of the latest online lather for attempting to inject a tediously obvious gag into a discussion with actor Richard Wilson on the premature passing of fellow thespian Alan Rickman. When Wilson guest-starred as himself in a classic ‘Father Ted’ episode, losing his rag as his ‘One Foot in the Grave’ catchphrase was barked in his face at an inappropriate moment, Victor Meldrew was still on screen at the time and ‘I don’t believe it’ was an omnipresent utterance within popular culture. For Jon Snow to ask Wilson if he believed it when hearing of Rickman’s death is an example of an ageing media man presuming the catchphrase retains its relevance; not that this has been acknowledged by Outraged of Everywhere, however.

Actress Emma Watson has also been castigated for her response to Rickman’s death, this time accused of exploiting it to push her own agenda. Her crime was to post as a tribute a quote on feminism from her ‘Harry Potter’ co-star; as Watson is known for her own feminist beliefs, should it really come as a great surprise that she chose a quote that chimes with her opinions and thus emphasises why she was fond of the late actor? Facebook and Twitter were both awash with David Bowie quotes, each selected because whoever posted them felt they possessed the most relevance to their own lives.

I found Watson’s online tribute considerably less nauseating than the rash of crass tokenisms that routinely appear whenever a public figure passes away. The likes of David Cameron have no choice; as Prime Minister, it is a given that the death of a cultural giant such as David Bowie will result in the subject being raised when a microphone is stuck in his face, and he has to respond in the way both media and public expect him to. When John Lennon died in 1980, incoming US President Ronald Reagan was asked for his reaction, as was 60s PM Harold Wilson; it’s hard to imagine either politician putting their feet up of an evening and listening to ‘Plastic Ono Band’, just as the thought of Cameron nodding his head to side two of ‘Low’ is pure fantasy. These men are required to have pat buzzwords on standby for every such occasion; that is their excuse.

Where non-politicians are concerned, the standard ‘OMG – our thoughts go out to his/her family’ comment has become so familiar that the absence of genuine emotion or feeling in the statement tends to be overlooked. What matters is that the poster has adhered to the unwritten law of social media by saying something, however meaningless and devoid of personal sentiment. That is what counts now, yet another legacy of the vicarious grief pioneered during the Diana hysteria of 1997. Ever since Her Majesty was condemned for failing to make an official comment within hours of the fatal journey into the Paris tunnel, being seen to issue one has been compulsory for everyone indulging in online life.

If someone famous I detest were to suddenly drop dead tomorrow – whether Kelvin MacKenzie or James Corden – I would not be declaring on Facebook or Twitter how gutted I was merely because it has become the done thing; it would be dishonest and disingenuous, akin to the fawning ramblings that emanated from ancient peers in the Lords when Thatcher died – ramblings that annoyed even Norman Tebbit, especially when they came from the lips of those whose loyalty to the Iron Lady was in short supply at the time she was deposed.

One of the many problems with social media’s virtual friendships is that expressions of upset can also inhabit the virtual world – emotional holograms with all the heartfelt authenticity of the gooey verse in a greetings card. For those whose social interaction has been shaped by cyberspace and who have no memory of life without it, a pre-programmed response to death – one characterised by its vacuous content and the fact it says nothing about the person who has died – is now so obligatory it may as well be included in the terms and conditions nobody bothers reading when signing up. The impenetrable glut of such comments also makes it hard to spot the genuine article when someone attempts to pay an individual tribute; Emma Watson did just that, and look at how it was received. Don’t veer from the script, dear.

© The Editor

IT’S A MIXED-UP, MUDDLED-UP, SHOOK-UP WORLD

‘Bangkok Chick-Boys’ was the documentary Alan Partridge alleged he wanted to switch off in favour of ‘Driving Miss Daisy’ on his hotel cable TV, though he mysteriously found himself unable to work out a way of doing so. Grilling his Geordie sidekick about his experience of Ladyboys during military outings to the Far East, Partridge displayed a fascination with these exotic self-made hybrids that isn’t uncommon, as the queues of western male tourists eager to sample their talents will testify. Elsewhere in Asia, Indian culture has the Hijras, castrated men dressed as women who are supposedly blessed with gypsy-like mystical powers to bestow bad luck upon those who seek to banish them from society; despite this, most simply end up living a grubby existence as low-level prostitutes (I won’t describe them as ‘sex-workers’, as that implies a degree of career choice in their miserable little lot).

As a collective group, the Ladyboys and the Hijras largely refrain from seeking recognition as Real Women. True, their appearance may dupe the odd unsuspecting foreign punter, but they are clearly posing as the opposite sex by exaggerating stereotypical feminine traits. The same could be said of the old Warhol transvestite superstars such as Candy Darling and Holly Woodlawn, who knowingly resided in a harmless fantasy world that reinvented them as the most glamorous divas Tinsel Town never had. The pair’s roles in Paul Morrissey’s trashy early 70s underground movies contained humour and an undeniable degree of risqué excitement that inspired both Bowie and Divine; as far as the heavyweight drag queen was concerned, his cinematic collaborations with John Waters took the humour to a glorious plateau of bad taste that has never been bettered.

Quentin Crisp, a remarkably brave man who took his life in his hands every time he stepped out into 1930s London with his painted face and nails and dyed red hair, was once criticised by the New York gay ‘community’ – an early example of libertine censorship – for daring to air reservations over the OTT excesses with which being out and proud had to be advertised in the manner of a New Orleans carnival. Crisp did so with his customary caustic wit, though this didn’t square with the witless, fanatical demands to be ‘accepted’ by a straight society whose acceptance Crisp himself had never sought.

Ah, yes – wit, the vital element missing from the rulebook of the transgender police who pretend the glorified middle-aged Ladyboy, Bruce ‘Caitlyn’ Jenner, is a woman. With their endless additions to the Uxbridge English Dictionary and on-the-spot fines for those who dare to use terms that are no longer allowed in polite society, these humourless enforcers would actually find their Orwellian credo very much at home in Iran. There, any man prepared to publicly proclaim his homosexuality is encouraged to undergo a sex change, which the state will pay for. Subsidised gender reassignment has become commonplace in Iran, and those who emerge from the operating theatre are thereafter officially recognised as Real Women. Who’d have thought it? The transgender capital of the world is the land of the Ayatollah.

At one time, donning the apparel and mannerisms of the opposite sex was a deliberate act of subversion, a conscious affront aimed at the straight society that associated any hint of gender bending with deviancy – or in other words, homosexuality. The thought that a heterosexual man could adorn himself with cosmetics was such a challenge to the stringent specifications of what maketh a man that it carried genuine rebellious connotations, even in a country like Britain, with its rich history of theatrical female impersonation stretching through the music hall and all the way back to the days when pre-pubescent boys had to play Shakespeare’s female parts on account of women being banned from the stage. Whether Mick Jagger in a dress or Marc Bolan sprinkling stardust on his cheeks, there was always a playful, mischievous aspect to the practice that reflected the traditional British sense of the absurd; across the wider canvas of America, with its far more pervasive macho lineage, such behaviour was restricted to isolated pockets of resistance like LA and New York. The chic freaks rejected the straights and their society and didn’t want to be embraced by it.

How times have changed. A man paints his lips or eyes today and he’s immediately claimed by fanatical lobbyists demanding he be recognised as a woman in order that he can be neatly categorised, labelled and accepted. How would the transgender police have reacted upon entering the cornucopia of sexually ambiguous individuals dancing the night away at Steve Strange’s Blitz club in the early 80s? Standing out from the crowd was crucial to any adoption of female accoutrements back then; nowadays the crowd mentality, whereby everyone has to be part of some ‘community’, has become so entrenched that the natural assumption is that a man in makeup is not expressing his individuality but seeking to be co-opted by an officially-sanctioned group. Stripped of its fearless sartorial radicalism, what was once the ultimate outsider’s challenge to the masculine straitjacket has been stolen by those who have no comprehension of the thrill embodied in blurring gender lines as a means of spurning safety in numbers; they ask why anybody would not want to belong, whereas I ask why anybody would want to.

© The Editor