THIS YEAR’S MODEL

It says a lot about ‘terrorism fatigue’ that the latest atrocity – 14 dead in Barcelona to date – is something I’m struggling to write about without being overwhelmed by déjà-vu. Spain hasn’t experienced this kind of attack since the appalling Madrid bombings of 2004, but Blighty hadn’t undergone anything on the scale of 7/7 until Westminster, Manchester and London Bridge in our ‘Spring of Discontent’ earlier this year. By the time the third of these casual massacres came around, the media clichés were becoming familiar enough to induce the kind of reaction that dilutes the brutality of the slaughter and renders it almost on a par with all the other eye-rolling headlines that newspaper proprietors concoct to arrest falling sales figures.

The censorship of the gruesome reality is part of the game. There was an almighty storm on Twitter last night in which some thought it vital to show images from Barcelona whereas others regarded doing so as insulting to the people who lost their lives. Key to their recruitment policy, ISIS don’t spare the gory details in screening the aftermath of allied bombing raids on innocents abroad; seeing pictures that news outlets prefer not to show us has an impact that the Jihadi mindset responds to with a sense of vindication for their own retaliatory actions. What, one wonders, would the response in the west be were our broadcasters to practise a similarly uncompromising disregard for the editor’s scissors in the wake of another terrorist incident? Perhaps their very anxiety over the response it might inspire is itself significant.

Whereas television news initially picked up the fearless baton from cinema newsreels and broadcast the grim warts-and-all facts in vision from the 60s through to the 80s, recent trends have seen oversensitive censoring that leaves the reality to the viewers’ imaginations. Footage of Nazi death-camps may not have emerged until six years of conflict were already reaching their climax, but the horrific sight solidified hatred of the Germans for a generation and offered further justification for the Second World War, even if it was hardly still needed by 1945. Programmes this week marking the 70th anniversary of the partition of India have screened archive film of the bloodbaths in the wake of the British exit from the Subcontinent, yet it’s almost as though the fact that the grim images are in monochrome and from so long ago means they’re permissible in a historical context – akin to a false assumption that this kind of brutality is something the civilised world left behind more than half-a-century ago.

Hearing of one more massacre on European soil and being denied the evidence transforms mass murder into an abstract concept and distances it further from the gut reaction images naturally provoke. When the world was shown the 1982 butchery at the Sabra and Shatila refugee camps in Beirut, the Israeli military’s attempts to absolve itself of responsibility led to impassioned demonstrations in Tel Aviv that spilled over into Israel’s parliament; merely hearing of what had happened probably wouldn’t have inspired the same level of outrage as seeing the images did.

But seeing the hideous truth of precisely what it is Jihadists are capable of would tarnish the fatuous script Theresa May recited with routine precision last night – the whole ‘standing with…’ speech, which has no doubt already been accompanied by complementary appropriation of the Barcelona FC badge as a makeshift profile picture on social media. The pat sentiment of this speech, echoed across Europe in the respective languages of all the other leaders who recycled it, says nothing about the issue and fails to address it because to address it would leave the harmonious Utopian narrative in tatters. Jeremy Corbyn’s dismissal of Sarah Champion for having the nerve to say a fact out loud is symptomatic of this brush-it-under-the-carpet and don’t-frighten-the-children attitude which is fine for an ostrich but won’t prevent another atrocity in another European city before the year is out.

Unrelated on the surface, though sharing the same spirit, are the increasingly fanatical demands by the Puritan militants to remove public monuments to long-dead American heroes whose philosophies are out of kilter with contemporary mores (no surprise when most have been deceased for over a century). Confederate generals are the current target, though one enlightened online idiot apparently advocated the blowing-up of Mount Rushmore yesterday. The fact that most of the first handful of US Presidents were slave-owners and that the White House itself was built by slave labour – something Obama at least acknowledged with a refreshing absence of froth in his mouth – means any rewriting of American history on this level will require the removal of a good deal more than a statue of Robert E Lee from the landscape.

The Taliban or ISIS destroying ancient antiquities and Islamic iconography that they find offensive or insulting to their twisted take on the faith is no different from what is being allowed to take place in America at the moment; to condemn one and condone the other is hypocrisy of the highest order. These are not the symbolic gestures of revolutionary rebellions emanating from a subjugated populace breaking the chains of totalitarian bondage, but the product of those indoctrinated in the ideology of fanaticism. Whether on an American campus, in a Middle Eastern Jihadi training camp, or inside English churches under the reign of Edward VI, it matters not; the motivation is the same, and it is this unswerving tunnel vision that drives the greatest threats to freedom of thought, speech and living we are confronted by in 2017.

© The Editor

THE SECRET SERVICE

I’ve used the term ‘Star Chamber’ on more than one occasion as a derivative description for a clandestine collective of decision-makers operating behind closed doors – most recently with regards to the new censorious regime on YouTube. However, when it comes to decisions being made that are a good deal more serious than having one’s uploaded video slapped with a ‘not advertiser-friendly’ label, one need look no further for a genuine Star Chamber than the smug and sinister network of box-ticking, back-slapping, self-righteous do-gooders operating under the umbrella banner of social services.

Long-term readers of this blog may recall a couple of posts I penned last year (https://winegumtelegram.wordpress.com/2016/11/21/a-social-disservice/ and https://winegumtelegram.wordpress.com/2016/11/29/consensual-healing/) on the subject of a severely mentally handicapped child whose mother is a friend of mine. Her child, a ten-year-old I referred to as X, was placed in a temporary care unit for children of similar conditions last November because her single mother could no longer cope with the day-to-day demands of looking after such a challenging child alone. The authorities were reluctant to take on this responsibility (and that’s putting it mildly), forcing the desperate mother to adopt desperate measures, such as refusing to collect the child from school the day after she’d been fobbed off on the phone when begging for assistance, thus leaving the authorities with no choice but to re-home X there and then.

Since this traumatic incident at the back-end of last year, the child has been living in a temporary care unit that currently only has two other resident children; the mother has established a pattern of visiting three times a week and taking the child back to her home for a couple of hours on each occasion. These occasions usually involve allowing X to indulge in the simple pleasures that make her happy, ones that don’t come within the narrow, rigid remit endorsed by the powers-that-be overseeing the care unit – basically enabling X to enjoy foodstuffs frowned upon by them, and exercising a degree of realism absent from the fatuous positivity practised by the ludicrously long list of employees on the social services gravy-train, all trained to believe X’s condition is one that can be ‘rehabilitated’.

This training imbues its recipients with a superiority complex and emphasises that parents are an irritant if they express views that are contrary to those deemed appropriate by state employees – even though the parents may have spent many years 24/7 with the child and therefore know what makes it tick. Parents are viewed as something of an encumbrance to the system because some of them can see the system is getting it wrong re their children’s best interests and are prepared to puncture the positivity balloon by pointing this out. Social services aren’t keen on those not in their exalted position of faux-authority telling them the system they’re trained to obey with unswerving subservience sucks.

When X lapsed into a spate of self-harming – mainly biting her arms and hitting herself on the head, behaviours that were new and that began when she entered the care environment – her mother instantly knew what the problem was. X does this when she’s bored or hungry; her capacity for expressing her frustration in any way other than self-harming is virtually zilch. But no one in authority wanted to discuss or even admit that this was happening. It wasn’t until the mother presented photographic evidence of appalling bruises and bite-marks that the self-harming was actually acknowledged.

Initially, when the staff at the care unit placed food on her plate such as noodles, spaghetti or anything she couldn’t hold and chomp on like Henry VIII with a chicken-leg, she refused to partake of the meal and lost a good deal of weight as a consequence; this was due to what are called ‘sensory processing issues’, and until the mother pointed it out, the staff wouldn’t provide X with a replacement meal, refusing to veer from a menu that caters for a mere three children. There have been other incidents where the staff have taken X swimming at a time when she would normally eat, a decision flying in the face of common sense. Very much a creature of repetitious habit, as befits the most extreme outer limits of the autistic spectrum, X reacts to any alteration in the schedule by reverting to her worst traits, even if (as her mother constantly points out to employees of the system) these traits can be avoided.

The entire county in which X resides has the one solitary temporary care unit for children in her condition; a fourth child who had attacked X on several occasions was recently relocated to another care unit, but this time down in Shropshire – a considerable distance from home. In a way, the process of relocation is akin to when convicts are moved from one prison to another, often hundreds of miles from where the con’s family live, thus adding to the time and expense of travel come visiting day. And, just as the families of prisoners have no say in where the authorities choose to dispatch their loved ones, social services will place children wherever the hell they like if they have ultimate charge of the child; parents aren’t consulted because parents aren’t important.

Yesterday, X’s mother was belatedly informed by X’s social worker (incidentally, the nineteenth X has had in her ten short years) that the social services’ Star Chamber had held a secret meeting the day before in which they’d decided they would effectively assume power of attorney over X, stripping her parents of all rights and claims to her. The parents were not informed in advance, and no review was held that would have given a platform to their concerns and enabled them to express a view on future plans for X when a permanent home for her eventually needs to be found.

If this goes ahead via the intended court order, the social services can place X anywhere in the country and the parents will have no say whatsoever in the matter; X’s mother has established a routine with X that benefits X and brings a modicum of pleasure into a life that has a paucity of it; if X is relocated hundreds of miles away, all that will cease. Is this really being done in X’s best interests or is it another penny-pinching exercise conducted by overpaid, arrogant authorities whose PR machine sells the uninformed public a different reality to the one parents such as X’s mother have been battered around the head by?

Post-Savile, it would appear police and social services have swapped places. The boys in blue’s politicisation over the past five or six years, underlined by borderline-spoof Twitter accounts from obscure officers declaring their PC credentials in prioritising ‘Hate Crime’ and the rights of minorities, has seen them adopt the right-on tactics once associated with the social worker; at the same time, social services have been transformed into a veritable secret police, granted powers to swoop unchallenged on parents they deem unfit and ill-informed as though overcompensating for the numerous well-publicised failures of social services to prevent actual abuse of children. For most parents in X’s mother’s position, the social services add to the burden the child represents, something that completely contradicts their purpose.

For the last decade, X’s mother has been exposed to a side of the welfare state that mercifully few of us have to contend with, and it has understandably left her so cynical that she simply doesn’t trust the state to do what’s best for her daughter. Therefore, the only choice she can see is to take X back into her home – narrowing the scope of her day-to-day life yet again as she reverts to the role of carer and gaoler for a child whose brain will remain that of a three-month-old baby, but whose body is physically maturing as normal. Next birthday, X will be eleven. And her mother will be exhausted. Again.

© The Editor

LEAVING LAS VEGAS

This might be the second post in a row to begin with a reference to Princess Diana, though that’s neither intentional nor some sort of preparation for a gushing post come the last day of August. For this particular post, I exhume our Queen of Hearts once again solely in relation to the media tsunami that accompanied the aftermath of her death; for one specific generation, this was the first moment when the demise of a ubiquitous household name was afforded such blanket coverage. For me, that first moment came precisely twenty years earlier with the death of Elvis Presley – forty years ago today.

As it was the middle of the school summer holidays, I got up in August 1977 when I felt like it rather than being dragged out of bed as I would be during term-time. Therefore, I was denied the playground reaction the day I heard, but I was informed about what had happened by my mum the moment I appeared for breakfast; she’d been watching ‘News at Ten’ the night before and they’d announced it on there. Up until Elvis died, I can only recall a small handful of famous people whose deaths I was made aware of at the time they happened. There was Roger Delgado, the actor who’d played The Master opposite Jon Pertwee’s Doctor Who; there was racing driver Graham Hill; there was Chairman Mao; and there was ‘Record Breakers’ star Ross McWhirter. But none of those demises prepared me for the event of Elvis’s death.

Elvis mattered, and he meant something to a far greater number of people than any of the aforementioned names bar Mao. A mate of my dad’s wore all-black for a full week after Elvis died – an unusual gesture to make in the sartorially colourful 70s; a friend of mine once told me he remembered his father strolling out into the garden when he heard the news and standing in tearful silence out there on his own for a good ten minutes. At that age, I’d never witnessed the passing of a person nobody I knew had ever met having that kind of impact. But even in an age before the internet and 24/7 TV news, it was impossible to avoid the worldwide outpouring of emotion that Elvis’s death provoked.

John Lennon’s oft-quoted opinion when a reporter shoved a microphone in his face that day in 1977 was ‘Elvis died when he joined the army’. This off-the-cuff statement may have had a grain of truth to it re the ‘pure’ undiluted Elvis as a relevant musical force, but Presley’s military sojourn in Germany had introduced him to the profoundly unhealthy diet that eventually killed him, so Lennon wasn’t far off the mark. At the same time, Elvis’s charisma and popularity seemed undimmed by his slow slide towards a premature end; his most devoted fans almost regarded him as immortal, which was why his death shook them so much. The King of Rock ‘n’ Roll continued to exert a powerful influence over the generation who’d been around when he’d dropped like a pop atom bomb into the static music scene of twenty years earlier.

I had grown up with fat Vegas Elvis in his white, rhinestone-studded jumpsuit, and though it sounds sacrilegious now, when I was round about five I used to get him mixed up with Gary Glitter! However, I gradually became aware via the odd old movie of his on TV that Elvis had once been young, slim and sexy; and it’s strange to think now that the Elvis of my childhood was only in his late 30s/early 40s. He seemed so much older. After almost a decade squandered on the diminishing returns of Hollywood, Elvis had re-emerged as a live act at the end of the 60s and began making decent records again; it was a respectable renaissance, yet his self-destructive personality and isolation from anyone bar his yes-men mafia soon saw the Elvis roadshow become as damaging to his reputation as the movie conveyor belt had been.

Subsequently seeing his physical and mental deterioration via concert footage from the months leading up to his death, one comes away feeling both disgust that such a beautiful-looking human being could let himself degenerate into such rack and ruin at so young an age and sadness at the waste of talent. It’s as tragic to see the obscenely bloated Elvis drenched in sweat and mumbling his way through his set-list as it is to see the audience still whooping and cheering despite the blatant evidence before them that Elvis is virtually dead already. But by the mid-70s, Elvis had become little more than a barely animated tourist attraction. Not that this was initially reflected in the reaction to his death; the gruesome details seeped out in the years afterwards. In August 1977, even the NME – then the bible for the Punk scene that was at its height – put the young Elvis on its front cover and declared ‘Remember Him This Way’.

When I watched television images of the huge number of fans besieging the Graceland mansion in 1977, their collective mass reminded me of crowds en route to the Cup Final or hysterical girls chasing The Bay City Rollers; but the novel aspect of these images was that the fans – none of whom were children – were all crying. You didn’t see adults cry in public very often when I was a kid. The coverage of the reaction to Elvis’s death wasn’t just limited to the day after either; it seemed to go on all week. By the time ‘Top of the Pops’ came round again, his current single, ‘Way Down’, had zoomed up to No.1; even the Christmas schedules four months later bowed to the demand, and one of Elvis’s movies – of which there are perhaps three or four actually worth watching – was screened every morning on BBC1 over the holidays.

We’re used to all this now; but it was new to me then. Three years later, the same response greeted John Lennon’s death, though the nature of Lennon’s passing was far more shocking. Whenever a major pop culture figure dies today – and we’ve had quite a few in recent years – we tend to view it through the post-Diana prism, something enhanced and intensified by social media. But Elvis got there before her.

© The Editor

GARDENERS’ QUESTION TIME

Maybe a glorified paddling pool was the most fitting tribute to our ‘Queen of Hearts’; the Diana Memorial Fountain in Hyde Park, which opened to an underwhelming fanfare in 2004, was, like the public image of the woman it was supposed to be a tribute to, an impressive triumph of style over substance. In the years immediately following the premature death of the former Princess of Wales, many such fanciful schemes were suggested, some decidedly abstract and bearing little obvious relation to the woman herself. Perhaps the unintentionally hilarious statue of Diana and Dodi in Harrods was at the forefront of the minds concocting these prospective memorials.

Joanna Lumley, an actress who doesn’t seem to act very much anymore, suggested a ‘floating paradise’ as one more bizarre tribute to Diana barely a year after events in Paris; this somewhat vague concept eventually morphed into the notion of a bridge that also doubled up as a garden – with or without the Diana brand attached to it. Following her successful campaign to gain Gurkhas the right to settle in the UK, Lumley suddenly had a public platform that proved immensely attractive to politicians hoping some of Purdey’s star quality would rub off on them. One such politician was the then-London Mayor, Boris Johnson. Riding high on the PR victory of the 2012 London Olympics, Bo-Jo gave the green light to what became the London Garden Bridge project.

Ah, yes – the Garden Bridge. It is now officially an ex-bridge, bereft of life and all that. The ambitious (if rather impractical) idea of another shortcut across the Thames that would serve as a novel rural facsimile in the heart of the capital looked good on paper, yes; but the proposed location wasn’t a part of London in desperate need of another bridge and the locals whose lives would be disrupted by its protracted construction weren’t even consulted as Boris took it upon himself to be the project’s salesman; when he gained planning permission in 2014, Johnson’s record in facilitating the ongoing despoiling of the capital’s skyline by constantly ruling in favour of developers over opposition didn’t give cause for optimism.

Initially, the public were told the bridge would be financed by private investors, but the struggle to raise the required funds necessitated the diverting of taxpayers’ money into the project – a total that now stands at around an estimated £46.4m. As Chancellor, George Osborne promised Boris £30m from the public purse, and a chunk of that squandered cash found its way into the black hole of the Garden Bridge courtesy of David Cameron; Dave ignored the advice of his civil servants by throwing more taxpayers’ money at it when the failure to recruit enough private investors revealed a £56m shortfall in the accounts of the trust set up to handle the lucre.

The Garden Bridge had its critics from day one; they viewed it as an expensive vanity project that could be to Cameron’s Government what the Millennium Dome was to Blair’s. Its proponents, such as chairman of the trust, Lord Davies, claimed the Bridge would be a ‘beautiful new green space in the heart of London’; but it’s not as though Central London, for all its traffic bottlenecks and overcrowded pavements, doesn’t already have an abundance of spacious parks and green squares to breathe in – most of which have been part of the London landscape for well over a century.

The Garden Bridge could well have gone ahead as a felicitous white elephant for Japanese tourists if enough private investors had been prepared to pay for its construction as well as the projected £3m a year needed for maintenance once open; but for so much public money to have been squandered on ‘a public space’ without public consultation is outrageous, especially now the whole thing has been abandoned.

A review into the project chaired by Dame Margaret Hodge was severely critical of the methods of raising money for it and also of Boris Johnson for his inability to justify the public expense; Hodge’s conclusion was that it would be better to call time on the Garden Bridge before any further costs were unwittingly incurred by taxpayers. Johnson’s successor as London Mayor Sadiq Khan has finally pulled the plug on it following the findings of the review, though some say he could have spared even more expense had he done so earlier; his predecessor claims Khan has killed the Bridge out of spite, saying ‘The Garden Bridge was a beautiful project and could have been easily financed’, though his own failure to finance it without regular recourse to the public purse hardly backs up his response to the Mayor’s belated decision.

As another cheerleader for the Garden Bridge, even Lord Davies admitted earlier this year that the project was not currently ‘a going concern’. The trust still hadn’t purchased the land on the South Bank of the Thames that would serve as the bridge’s southern landing, and no private investors had been persuaded to part with their pennies for a full twelve months. The total pledged by private investors is alleged to be around £70m, though how much of the public money wasted on the project was spent on courting potential private investors is unknown.

Ultimately, the London Garden Bridge can join a list of other intended attractions for the capital that never made it beyond the drawing board, though some came closer to succeeding. Watkin’s Tower – London’s planned answer to the Eiffel Tower in the 1890s, which, had it been completed, would still be taller than the Shard – made it as far as 154ft before being abandoned and then demolished, eventually making way for Wembley Stadium. But it’s interesting to note that one of the proposed ideas for the Wembley site prior to the partial construction of the Tower was a replica of the Great Pyramid of Giza, in which food would be grown in hanging gardens. Perhaps the committee responsible for the Garden Bridge should have studied their London history books beforehand.

© The Editor

SONG OF THE SOUTH

When Belfast City Council voted to break with tradition in 2012 by reducing the flying of the Union Flag atop City Hall from 365 to 18 days a year, the more vociferous wing of the Unionist community greeted the announcement with violent protests. A couple of days ago, marking the anniversary of the Battle of the Boyne, bonfires were lit across Unionist strongholds of the province, many of which were decorated with photos of prominent Sinn Fein politicians. I only nod to our neighbours over the Irish Sea to make a roundabout point on how the issues that enflame passions on both sides of the sectarian divide in Northern Ireland barely register on the mainland; they’re viewed by the rest of the UK (with the possible exception of Glasgow) as parochial concerns unique to Ulster and characteristic of a land with an extremely long memory.

Even with the high profile suddenly afforded the DUP in the wake of Theresa May’s golden handshake, the ‘street politics’ of Northern Ireland rarely attract outsiders to the barricades, something that can’t be said of another divided community from a region with a similarly turbulent history several thousand miles away – Virginia. The dramatic and ugly events that took place in Charlottesville, Virginia at the weekend didn’t have their source in religious divisions but in race – the most contentious of all American issues, and the one that just won’t go away. Not even eight years of a black President could sort it.

Virginia was one of the four slave states of the ‘Upper South’ – the others being Arkansas, Tennessee and North Carolina – that joined the original seven Southern secessionist states in the Confederacy during the Civil War. Its history, now so bound-up with the Confederacy and its aftermath, predates that era considerably, with Virginia being the first English colony in the New World, established as far back as 1607. But it was also prominent among the 13 colonies that broke with British rule and has a claim to being the birthplace of the USA; it certainly was the birthplace of eight US Presidents, for one thing.

Like the rest of the states in the South, Virginia had a segregationist policy in place until the civil rights movement of the 1960s gradually led to a repeal of the remaining Jim Crow laws; but its past, like many of its neighbours’ pasts, continues to attract the attention of those for whom integration remains a greater threat to making America great again than the hardware in Kim Jong Un’s toy-box.

Recent attempts to reduce the high visibility of the Confederate Flag in the Southern states have gone hand-in-hand with a concerted programme to remove statues of, and monuments to, Confederate heroes from public places; and these efforts at erasing a history that sits uncomfortably on the shoulders of modern America have served to ignite the ire of white Southern natives proud of their inheritance, as well as white supremacists from different parts of the country who exploit the situation to promote their cause. When Washington belatedly addressed the iniquities and inequality of the South in the 60s by outlawing its segregationist traditions, the white population claimed the rest of the US didn’t understand the South and there’s probably a grain of truth in that. The South was seen as something of an embarrassment that contradicted America’s international reputation as the Land of the Free; the South was a place where the past remained present.

The ongoing contemporary operation to change the perception of the South, not only for outsiders but also for those who live there, has been characterised by the official removal of ‘negative’ symbols relating to its past; though whereas the pulling down of statues during an uprising or revolution tends to come from the emancipated population itself, the removals taking place across the South of late are decisions handed down by city and state authorities. Many have viewed this as symptomatic of rewriting American history, a rewrite that fails to acknowledge aspects of it that don’t complement the image America likes to project of itself. There are also concerns that by erasing the visible legacy of the Confederacy, future generations are being presented with a lopsided story of their country, one with the warts airbrushed out, and one depriving them of a history they could learn from.

Plans to remove a statue of Robert E Lee, Confederate Civil War general, in Charlottesville led to the town being invaded on Saturday by a ‘Unite the Right’ march, bringing in angry white men from all over America for a rally that was destined to be met with a counter-rally. Whatever valid points had a right to be made didn’t stand a chance of being heard; both sides were infiltrated by those whose intentions were obvious from the start, many of whom had little or nothing to do with the part of the country they headed for.

The relatively liberal college town of Charlottesville was hijacked by opposing sides looking for a battlefield. The far-from spotless ‘Black Lives Matter’ crowd were accompanied by the masked men from ‘antifa’ – an abbreviation of ‘anti-fascist’ – who have a reputation as violent left-wing anarchists; they were the group responsible for the trouble that occurred in Washington on the day of Donald Trump’s inauguration. Those under the Alt-Right banner included neo-Nazis as well as that old mainstay always up for a fight, the Ku Klux Klan. The KKK are almost to the South what the Orangemen are to Ulster, though for all their shared pseudo-Masonic ritualism and shameful record of gerrymandering, the Orangemen are a long way from the Klan when it comes to provoking and stoking hatred in the most sinister manner.

What was already a predictable and unedifying clash on Saturday plumbed especially appalling depths when one lunatic ironically took a leaf out of the Jihadi manual and drove a car directly at protestors; his efforts were responsible for 19 injuries and one death. The white supremacists, who view President Trump as ‘their man’, were gratified that the Donald seemed reluctant to attribute blame for events to them, though the majority of the Alt Right (to whom Trump owes a great debt) probably regard the extremists who descended upon Charlottesville with the same abhorrence as the left views the ‘antifa’. It would certainly suit the narrative of the moment to lump together anyone who questions or challenges the anti-Trump consensus into one hate-fuelled, racist mob; but unfortunately, it’s not quite so…erm…black and white.

© The Editor

FAMILY ENTERTAINMENT

As with the two Peters, Hitchens and Oborne, Paul Joseph Watson is not a media figure whose every pronouncement provokes a nod of the head, yet as with those aforementioned grumpy grandees of Fleet Street, he often nails the ludicrousness of the world we live in simply by daring to challenge it. An unapologetic ambassador of the so-called ‘Alt Right’, Watson is the face of the UK branch of ‘Info Wars’, the US conspiracy theorist site fronted by the ranting human foghorn Alex Jones. Watson doesn’t adopt the breathless bluster of his American sponsor; taking that approach with a British audience would reduce him to the level of Jeremy Clarkson. Instead, he sometimes comes across as Owen Jones through the looking-glass, the flipside mirror image of the pocket Northern Socialist.

Watson has posted a series of regular videos on YouTube over the past couple of years, both highlighting and ridiculing the increasingly fatuous fanaticism of the extreme left’s PC storm-troopers, especially on the other side of the Atlantic; as a result, he’s made as many enemies as fans, and while one may not always concur with his conclusions, there’s no doubt he’s highlighted a lot of things that needed highlighting. Until now, that is.

Watson has temporarily drawn the blinds on his YouTube window because he can no longer make a living from it, thanks to a new Star Chamber of YouTube judges installed by parent company Google to police the medium and crack down on any questioning of the consensus. Many may be unaware that ‘monetising’ one’s uploads to YT can bring in a little revenue depending on the number of views the videos receive; Watson’s videos received astronomical views and no doubt brought in a nice little profit on a monthly basis. However, the crackdown on anyone saying anything that could be perceived as ‘offensive’ means all of Watson’s videos have now been deemed ‘not advertiser-friendly’, meaning he can’t make a penny from them anymore.

I’ve written on more than one occasion in the past of the transformation of YouTube in recent years. What was initially an invaluable platform for, amongst others, lovers of archive footage unavailable on DVD and rarely screened on TV – often uploaded from decrepit off-air VHS recordings or sourced from actual television vaults by insiders – has slowly seen passionate promoters of the rare and obscure edged to one side by The Man and his corporate bullyboys. Copyright laws have been tightened to the point whereby every piece of film not actually shot on one’s own camera is subjected to a ‘third party infringement’ order, regardless of how minimal its use may be. I once had a video stamped with copyright claims simply because I used the BBC4 ident for a handful of seconds as the intro to it.

This OTT enforcement of copyright has made navigating such rules something of an art-form for veteran uploaders, but perhaps responding to criticisms of alleged lax attitudes to ‘hate’ videos, YouTube has now embarked upon a censorious crusade in which any video that doesn’t promote the Coca-Cola ideal of a harmonious multicultural/LGBT/Islam-with-a-smiley-face society is penalised; anyone who takes the piss out of or merely questions this bland make-believe Utopia is denied an income as a consequence. People regularly air their grievances with the BBC as pandering to a left-leaning notion of ‘Right-On’ politics – often justified, viz. the hardly unbiased four-person panel of prominent Muslims discussing the latest Pakistani grooming network on ‘Newsnight’ this week; but YouTube has suddenly usurped Auntie Beeb as an intolerant home for one view and one view only.

Infuriatingly vacuous American airheads who call themselves ‘vloggers’ – usually squeaky-voiced teenage Disney Princess types who exude the air of hyperactive six-year-olds albeit bereft of infantile charm – make millions from their vapid videos that appeal to a generation whose heads have already been ground to slurry by being force-fed media sedatives; and these are the future of YouTube, not anybody with anything to say. My own personal speciality area tends to be satire, but satire is now as welcome on YouTube as a copy of Charlie Hebdo would be in a Parisian mosque.

A couple of days ago, the new YouTube constabulary provided me with a long list of my videos their panel has decided I can no longer make any money from. To be honest, I don’t make much, anyway – around £120 a year; I have a loyal following who will view my output whatever I upload and I also pick up casual viewers en route, but I’m a cult presence and probably always will be. I accept that some of my output is coarse in the Derek & Clive tradition, but YT already had an age-restriction system in place where rude words were concerned, so anybody stumbling upon them knew what to expect beforehand.

None of the previous rules in place to protect a ‘family audience’ were apparently sufficient, however, for the strict new boundaries have narrowed the range of opinions on offer even further. Many of my own videos parody the politically-incorrect 1970s and therefore need to be viewed with that in mind, yet the humourless martinets Google has recruited to purge YouTube of the lingering vestiges of its original freewheeling spirit can’t even tolerate that. One particular video of mine was a spoof 70s BBC trailer previewing a night of programmes marking ‘National Smoking Day’; it’s so obviously a piss-take, yet it’s been labelled ‘not advertiser friendly’. Despite infringing no copyright, I can’t earn anything from it anymore.

I’ve attached another innocuous video in this style to the post and ask you to watch it so you can decide whether or not it’s remotely ‘offensive’. I challenged the ‘banning’ of the video in question as a source of income; when I did so, I was informed the team won’t review the status of a video subjected to this treatment unless it receives over a thousand views in 28 days; some of my videos can take months to reach that number of views, so I haven’t got a cat in hell’s chance of reversing the judgement. It’s a rip-off and it’s an outrage. But it’s 2017. Sign up to the consensus or be cast out into the online free-speech wilderness.

© The Editor

THE LINEMAN FOR THE COUNTY

I may be a day or two late, but as I’d rather not write about the coming attraction of World War III (life’s too short – literally), I decided it’s never too late to pay tribute to the great Glen Campbell. His death at the age of 81 was attributed to the crippling effects of Alzheimer’s, which forced his retirement from performance and recording five years ago; but the passing of this unsung link between the once vehemently opposed worlds of Country and Rock brings to an end a career that helped bridge a divide that had widened when Rock ‘n’ Roll took what it needed from C&W in the mid-50s.

A hybrid of many contemporaneous styles (as all the best musical genres usually are), Rock ‘n’ Roll possessed an outlaw element stolen from the Blues that Country, in its slow journey from hillbilly folk music to Grand Ole Opry conservatism, responded to with a reactionary redneck rage. From being the soundtrack of the poor white rural population, Country had become an audio comfort blanket for its audience, weighed down by sentimental schmaltz and insular wallowing in its own suffering. Rock ‘n’ Roll was younger, sexier and essentially black. Johnny Cash may have kept its original intent alive into the 1960s, but the British Invasion – selling coals back to an American record-buying public oblivious of its own heritage – made Country seem emblematic of the old world order.

Glen Campbell may have emanated from that old world order, but he was young enough to have been affected by changes to the musical landscape from the mid-50s onwards, and his skill on the fret-board carried him from Arkansas poverty to the profitable LA session scene of the 60s. He was a vital member of the so-called ‘Wrecking Crew’, the talented collective of musicians who provided the backing on virtually every great US pop hit outside of Motown, including the best of Phil Spector; Campbell played on records by everyone from Frank Sinatra and Elvis Presley to Simon and Garfunkel, The Monkees, The Righteous Brothers, The Mamas & the Papas, Sonny & Cher and numerous others. One of the acts reliant on the Wrecking Crew’s ability to get the job done in the allocated studio time was The Beach Boys, and when Brian Wilson’s musical horizons began to widen beyond churning out surfing dirges on the grinding touring circuit, the Beach Boys’ resident genius stayed at home to write while Campbell filled in for him on the road.

Whilst making a decent living as a session musician, Campbell was simultaneously releasing his own records without much in the way of success. It wasn’t until Campbell started to mine the riches from the pen of songwriter and producer Jimmy Webb that he stumbled upon the kind of musical partnership that could yield the success he’d so far struggled to find as a solo artist. Despite his exposure to a different world in LA, Campbell still held onto his conservative outlook, declaring draft-card burners during the early years of the Vietnam War should be hanged, and greeting Jimmy Webb upon their first meeting with a curt ‘Get a hair-cut’. He also starred in an acting role alongside ultra-conservative Republican flag-waver John Wayne in 1969’s ‘True Grit’. However, as Country and Rock remained at loggerheads, Campbell found a middle ground in the Middle of the Road and helped pave the way for reconciliation.

Only when Bob Dylan returned from his two-year exile with ‘John Wesley Harding’, embracing a Country style that then surfaced in key works by The Byrds and The Band, did Country slowly make its peace with Rock ‘n’ Roll; solo artists such as Kris Kristofferson, Emmylou Harris, Neil Young and Gram Parsons, along with bands such as The Allman Brothers, America and The Eagles were able to have a foot in both camps as the 70s dawned, but it was Glen Campbell who pioneered the tricky route from Country to Rock and back again. His seminal collaborations with Jimmy Webb, peaking with the glorious ‘Wichita Lineman’, saw Campbell established as a star – in 1967 he scooped Grammys in both the Country and Pop categories – and he was rewarded with his own TV series, which included guests from Rock as well as Country. He also recorded a string of successful duets with another artist who managed to appeal to a wider audience than Country could traditionally call upon, the wondrous Bobbie Gentry.

By the mid-70s, Campbell was as regular a presence in the upper echelons of the US Hot 100 as he was in the specialist Country charts, hitting the top spot with ‘Rhinestone Cowboy’ in 1975 and again with ‘Southern Nights’ a couple of years later. Unfortunately, the trappings of mainstream success that had claimed many a rock star began to leave a mark on him, resulting in a serious cocaine habit, alcoholism, and a disastrous relationship with another Country act 22 years his junior, Tanya Tucker.

His faith appeared to come to his rescue in the 21st century and, like Johnny Cash before him, Campbell found that covering songs by contemporary acts brought his work to a new audience; his 2008 album ‘Meet Glen Campbell’ included numbers written by Green Day, Foo Fighters and U2. When he went public over his Alzheimer’s diagnosis in 2011, he embarked upon a farewell tour and recorded his final album, which was eventually released just a couple of months ago. The critics belatedly acknowledged the pivotal part he’d played in reuniting two musical genres with a shared lineage, and as his condition continued to deteriorate it was only a matter of time before he inevitably checked out for good. That moment came two days ago. The last song he recorded was titled ‘I’m Not Gonna Miss You’, but plenty of people are gonna miss him.

© The Editor

FROM AVARICE TO AUSTERITY

This wet and windy month of August 2017 is, if nothing else, awash with anniversaries – fifty years since the 1967 Sexual Offences Act; forty since Elvis Presley died; twenty since Diana died…and so on. Perhaps the focus on these anniversaries helps distract us from the imminent apocalypse courtesy of Mr Trump and Mr Jong-Un, though one anniversary highlighted today is hardly cause for celebration, even though its subject has not only dictated the global political landscape for the last ten years but has also impacted upon all our lives in one way or another.

If we cast our minds back a mere decade (easily done when you’re over 21), we find Gordon Brown, the so-called Iron Chancellor as was, finally ascending to the position he always felt his predecessor had promised him during their legendary dinner at an Islington restaurant thirteen years previously. However, it was ironic that a man who supposedly had his fiscal finger on the financial pulse could be so short-sighted when it came to the inevitable bust of the boom he had been happy to take credit for. Perhaps the prospect of finally getting his hands on the key to No.10 was too great a distraction for Gordon Brown in the summer of 2007 and he took his eye off the ball; after all, the US housing market bubble had already burst by the time he succeeded Blair; surely, just as what goes up must come down, a boom must be followed by a bust?

Although he was obviously keen to put his own stamp on the role of Prime Minister, Gordon Brown gave no initial sign that his predecessor’s fondness for dishing out knighthoods and Peerages to prominent financiers was about to be discontinued. The bankers were still the bosom buddies of New Labour; the partnership they had entered into on the eve of the 1997 General Election retained its cosy nature, and Peter Mandelson – poised to return to Government as Brown sought to shore up his fresh-faceless Cabinet with a few experienced old hands – was the embodiment of New Labour’s love affair with the wealthy, forever being sighted on the yachts of Russian oligarchs or quaffing champers in the City. Should the Government be accused of giving the bankers too much leeway and allowing them to exercise too much influence over the economy, Labour could simply point to the population and their plentiful status symbols as evidence that affluence was now available for all.

However, the crisis arising from the inevitable collapse of the US subprime mortgage market – with America already borrowing heavily from China (which had saved its pennies as the west was recklessly throwing its own about like bloody confetti) and banks no longer lending to each other – began to seriously affect international finance in the summer of 2007 and the first British casualty was the high-street bank, Northern Rock.

Northern Rock had centred its financial practices on securitisation – borrowing both at home and abroad to fund the mortgages it sold, before re-selling those mortgages in the international capital markets; but when investors’ demand for securitised mortgages plummeted, Northern Rock could no longer repay the loans it had taken out around the world when business was booming. Moreover, ever since being caught napping by the dot-com crash, the financial markets had tended to second-guess potential crises, and the ensuing publicity afforded Northern Rock’s shaky foundations only served to bring about the disaster that was being predicted. Northern Rock approached the Bank of England for a liquidity support facility to compensate for the sudden loss of the funds it had previously raised in the markets; this move, reported with sensationalist relish by the British press, triggered the first run on a British bank in over a century.

Further scaremongering reports from the media, as customers queued around the block to empty their accounts before (so they feared) their savings evaporated, evoked the panic at George Bailey’s Building and Loan in ‘It’s A Wonderful Life’; it seemed as though the collapse of Northern Rock was becoming a self-fulfilling prophecy as the public believed the hype and sought to withdraw every penny they had stored in the bank’s coffers.

This remarkable event should have been anticipated by Gordon Brown; after all, he’d ceased to be Chancellor only three months before Northern Rock’s dramatic downfall leapt from the financial section of the papers to the front page. But Alistair Darling was now in charge of the nation’s purse-strings, and accusations of responsibility for a financial crisis that may have had its genesis on Gordon Brown’s watch were met with an effective ‘It weren’t me, Guv’. The Government observed from the sidelines as two attempts to rescue Northern Rock came to nothing and then belatedly intervened by taking the bank into state ownership. It was ironic that a party that had spent the best part of fifteen years dispensing with the nationalisation programme that had been integral to its constitution since its inception should have to end its days in Government nationalising the one industry that had always prided itself on its independence from the state; but once Northern Rock was pulled back from the precipice by nationalisation, the legacy of living beyond one’s means began to spread.

If Gordon Brown imagined slipping into Tony Blair’s shoes would be achieved without the need for a shoe-horn merely because he’d had his beady eye on those shoes for ten years, then the honeymoon period that had raised his popularity high enough for him to have won the autumn Election he’d baulked at calling was short-lived indeed. The Northern Rock debacle would act as the harbinger of an economic meltdown that would dog Brown throughout what would turn out to be the brief tenure of his premiership. Northern Rock was no one-off drama that could be glossed over as an aberration from the prosperous state of affairs Brown had overseen during his residence at No.11.

The global markets were sliding towards recession in 2007, and Britain, saddled with debts accumulated during the Blair boom years – a boom that had benefitted everyone from the million-pound bonus banker at the top to the designer-clad Chav at the bottom – was poised to pay the price for its reckless dependency on credit. A decade later, with the average British salary the same as the average British consumer debt per person (£28,000), the country is still paying the price.

© The Editor

GROWING-UP IN PUBLIC

One of the many dreaded factors in introducing one’s boy/girlfriend to one’s mother has always been ‘the potty picture’. The best tea-set being dusted down and mum bizarrely transforming into an air hostess when serving it is an uncomfortable enough experience; but if the new other half passes muster, chances are the childhood photo album will then be excavated. And, naturally, every childhood photo album opens with a baby sat on a potty. Why do mothers feel the need to a) capture a crap on camera and b) show it to their offspring’s partner decades later? It remains a perplexing aspect of parenting that non-parents like me will always be mystified by. Perhaps it’s a symbolic surrender of emotional ownership and an acknowledgement that the other half will at some point in the relationship see said partner on the loo too. As a portrait of man and woman’s mutual vulnerability, sitting on the loo is probably a greater leveller than death.

As horrific as this handover ceremony has been for generations, the one saving grace of it has been that the ritual takes place behind closed doors, only endured by those present in the room. Not for the first time, be thankful the visual documentation of your formative years was restricted to the Kodak Brownie or (at a push) the Super-8 cine-camera. Imagine you’d been born on the cusp of the millennium or immediately thereafter. The potty picture would be the opening image in your online gallery of embarrassment, shared with, if not necessarily the world, then your mother’s circle of family and friends and – as a consequence – their offspring and their family and friends.

Eight out of ten mothers (probably) think their little angel is inherently superior to any other child on the planet, so are instinctively compelled to broadcast this information to anyone within earshot; backstage at the Miss World contest must seem like a veritable picture of communal harmony compared to the level of competitiveness at the school-gates. The Yummy Mummy movement, bolstered by the celebrity mother industry, daytime TV, dozens of websites, and a plethora of ‘How To…’ guidebooks, has turned this traditional rivalry between mums into a deadly game of one-upmanship that now has an additional dimension that takes it above and beyond the parochial battlefield – social media.

Twenty-first century boys and girls are the first generation to have their entire lives so far uploaded to a worldwide database, using the lead character in ‘The Truman Show’ as a blueprint for growing-up. It’s not a pleasant thought, especially when one considers they’ve had no say in the matter. From the initial ‘aaah’ shot to appear on Facebook barely days (or in some cases, hours) after the sprog’s arrival all the way to the ‘first day at school’ shot, the internet has been utilised as cyber apron-strings by mothers too blinded by their perfect child to appreciate the future ramifications of their actions.

Another element of crass Americanisation to pollute British culture, the aforementioned ‘first day at school’ shot takes its place alongside even greater demands on the parental coffers such as the insidious establishing of ‘the prom’ as an end-of-term beauty contest; not only does the latter introduce a new financial burden previously reserved for Catholic parents and their communion dresses, it also places pressure upon the children themselves. It was bad enough when this alien tradition infiltrated high schools; the fact it has now seeped into the primary school social calendar means mothers now have yet more opportunities to earn online bragging points whilst bankrupting themselves in the process.

The generation who welcomed the internet into their lives from adolescence onwards have already become accustomed to documenting every aspect of their existence online, but the generation coming up behind them, who will have never known a time without it, have had it thrust upon them as a normal state of affairs. It’s too early to say how this will shape their self-perception in years to come, but the threat of these images remaining accessible for eternity has been every bit as worrying as Facebook’s refusal to allow the accounts of the deceased to be deleted – until, it would appear, now.

Yesterday it was announced by Matt Hancock, Digital Minister (yes, that’s a real job title), that the EU’s General Data Protection Regulation laws are to be transferred onto the UK statute book in an overhaul of Britain’s own data protection laws. The most encouraging upshot of the proposals is that it should not only become easier for people to withdraw their consent for personal data to be shared online, but that people should also be able to request the removal of childhood photographs uploaded by parents years before. In theory, this could spell the end of the potty photo’s online life.

Anyone well-versed enough in cyber practices will of course be aware that it’s hardly rocket-science to copy and paste an image from the internet, so the chances are some images can be uploaded over and over again in perpetuity; but at least the proposals in this new bill might provide the unfortunate cyber star with some legal clout to get his or her own back on Mommie Dearest. The right of the individual in question to upload childhood photos of their own choice is something those of us who grew up in private already have – as the image illustrating this post demonstrates. And I will always defend that seven-year-old’s right to have worn those trousers.

© The Editor

OVER THE RAINBOW

Amidst the celebratory coverage of the 1967 Sexual Offences Act’s fiftieth anniversary, it is certainly worth being reminded precisely how limiting the freedoms contained within the ‘consenting adults in private’ law actually were, and how these limitations made it easily open to abuse by the powers-that-be. It’s surprising to realise that more gay men were prosecuted after the Act was passed than before it. Perhaps the understandable precautions that had been crucial prior to 1967 were perceived to be unnecessary once decriminalisation came into force; the illusion of legality blinded many to the numerous areas in which homosexuality remained criminal, and the police and politicians focused on those areas with renewed crusading vigour in the years thereafter.

A timely reminder of this uncomfortable truth came via Peter Tatchell’s excellent and eye (or ear)-opening Radio 4 documentary, ‘The Myth of Homosexual Decriminalisation’, broadcast on Saturday evening; it documented how 1967 was not so much an end as a beginning, the start of the long road to abolishing discrimination, altering attitudes and achieving an equal age of consent with heterosexuals – none of which were dealt with in the imperfect Act that came into being half-a-century ago.

Scotland, Northern Ireland, the armed forces and the merchant navy – all exempt from decriminalisation in 1967; much anti-homosexual legislation remained on the statute book for decades after 1967 and queer-bashing was a legitimate police pastime well into the 1980s. For out and proud young men today, barely old enough to even remember the last century, all of this must seem insane. The prejudices openly unleashed upon gay men and largely unchallenged by the majority of society combined with the AIDS hysteria (AKA ‘The Gay Plague’) and Clause 28 to create a climate of moral panic that would be unthinkable to anyone under, say, 30 in 2017. Perhaps the inability to comprehend how we used to live has played its part in a lack of perspective where those too young to remember are concerned.

The sins of their forefathers in allowing this state of affairs to linger for so long without challenge have undoubtedly fuelled a militant bullishness amongst the young; this reaction demands the law and society in general adopt the consensus they’ve developed to serve as a severe redress to the past. It comes partly from retrospective guilt and is not unlike America’s similar response to historical racism via the slave trade and segregation. At its most extreme, the new consensus is imposed with the same level of illogical fanaticism once employed by those who upheld and endorsed the previous prejudices this consensus reacts against, portraying anyone who is white as inherently racist and anyone who is heterosexual as inherently homophobic.

But the ironic outcome can often seem like less of a striving for genuine equality between the different sexual demographics – which is surely what should be aimed for – and more of a determined campaign to ensure the poacher is elevated to gamekeeper and vice-versa. The new consensus cannot alter the past, but the slightest sign of any attitude bearing a passing resemblance to the past – however mild in comparison – dumps the wrongs of the past on the doorstep of the present. The ‘gay cake’ saga in Northern Ireland a couple of years ago seemed indicative of this mindset; a refusal to countenance that there are many out there for whom homosexuality remains a difficult concept has created a climate of intolerance that excludes debate. If you don’t embrace this consensus, you are a homophobic bigot – end of. ‘Inclusivity’ does not include those who deviate from the script.

The clamour to be seen as endorsing the consensus by political parties and other establishment organisations that maybe weren’t viewed as so gay-friendly in the past resulted in the virtue signalling of the National Trust edict stating that volunteers dealing with the public at Norfolk’s Felbrigg Hall (whose last resident, Robert Wyndham Ketton-Cremer, was recently posthumously ‘outed’) must wear rainbow gay pride badges. Those who weren’t comfortable with wearing them were to be relegated to the backrooms of the property. The case was taken up by certain Fleet Street tabloids and predictably labelled a right-wing cause célèbre by the likes of the Grauniad; but the sudden reversal of the edict so that wearing the badges is now optional rather than compulsory seems a more sensible compromise that recognises inclusivity should mean what it says.

Many of the archive recordings of attitudes towards homosexuality excavated for Peter Tatchell’s Radio 4 retrospective were as gobsmacking to hear as similar excerpts of unashamedly racist language from the same era; but whilst these attitudes survive on a smaller scale in private, the cheerleaders for our liberated society still turn a blind eye to one publicly vocal section of it. Some of the vilest and most bigoted opinions on homosexuality expressed today emanate from Islam, yet the ultra-liberal left gives Islam the kind of leeway it won’t tolerate in any other faith, let alone secular discourse. Why? Perhaps it’s due to the fact that Muslims have been designated the left’s persecuted pets; they are above and beyond the kind of criticism others are fair game for.

Of course, not every Muslim is virulently anti-gay any more than every Christian or every person without any religion whatsoever; I think most people aren’t really that bothered, to be honest. It’s just a shame the person who retains a problem with the notion of homosexuality – usually down to simple ignorance and lack of education – is lumped in with the genuinely homophobic in a rainbow that has no shades of grey.

© The Editor