THE SOLITARY LIFE

When I was a kid, the only people I knew who lived on their own were a few old ladies. I assumed they were all widows, going by the sepia-tinted portraits of Brylcreemed young men I sometimes spied on the sideboard. They probably lost their husbands in the war; but if they were of actual pensionable age (rather than merely ‘looking old’, as anyone over 40 did back then), I guess some of them might have been widowed in the ’14-18 bash as much as the sequel. It was a long time ago. I was given the impression the solitary life was reserved for a very narrow demographic; there was nobody in my wider family who lived on their own, for example. Aunts and uncles already out of their teens remained at home until they got married; that was presented to me as the natural order of things. None of them went to university either, so they didn’t even get to experience what now seems to be the routine route to liberation – even if returning to the nest as a debt-addled graduate with little hope of being a homeowner is the inevitable anticlimax to this adolescent interregnum.

Unlike in my childhood, those who live on their own today aren’t necessarily ageing widows, and being the sole resident of one’s abode is no longer viewed as especially unusual, or even a little suspect, if you don’t resemble Ena Sharples or Minnie Caldwell. A 16% increase in the number of Brits living alone in the 20 years between 1997 and 2017 pushed the numbers up to 7.7 million; and whilst widows and widowers naturally still figure, higher divorce rates have played their part too, and what used to be referred to as spinsters and bachelors are also far more abundant today than they were 40-odd years ago.

Interestingly, whilst there has been a 16% fall in the 25-44 age groups, the 45-64 demographic has seen a rise of 53%, with a higher proportion of both the divorced and the never-married filling the stats, reflecting changing social mores. Men living alone outnumber women – particularly in the 25-34 groups – until we reach the 55-64 groups, when the numbers even themselves out. The former groups mostly consist of the unmarried, whereas divorcees dominate the latter. All age groups, however, are less likely to own their home than married couples without children. Rented accommodation in later life can bring with it specific uncertainties and insecurities; higher levels of anxiety and lower levels of happiness are also attributed to living alone when compared to couples.

Of course, being alone doesn’t necessarily equate with being lonely; as Bryan Ferry once so memorably said, ‘loneliness is a crowded room’. Indeed, for every Eleanor Rigby, there is someone quite content to own their own space, especially if they’ve experienced an unhappy marriage or have made a conscious choice to avoid matrimony altogether. One’s profession can also play a part in the success of one’s living arrangements; some jobs are conducive to domestic bliss, whereas others encompass antisocial hours or are simply designed for solitude. I can certainly vouch for the latter. Unless writers could work out a way to ‘jam’ in the manner musicians do – perhaps sitting in a circle hammering away at their laptops in sync – writing is very much an activity allergic to the communal and enhanced by the absence of company.

The creative process can last days, weeks and sometimes months, during which time a writer must be the least desirable spouse it’s possible to imagine. Married men-of-the-pen who managed to make a handsome living from it have at least enjoyed the luxury of a ‘writing shed’ at the bottom of the garden; both Dickens and Roald Dahl famously retreated to theirs when the muse struck, and their families understood this meant ‘do not disturb’. Rented flats on the top floors of houses aren’t quite as accommodating, though at least the solitary life in such circumstances ensures what Virginia Woolf famously called ‘a room of one’s own’.

Naturally, for every plus to living alone there is a minus. Whilst there are many solitary dwellers whose boy/girlfriends regularly sleep over and therefore enjoy a ‘part-time’ relationship that can work for both parties, there are plenty more bereft of that option. If one has nobody to come home to of an evening or wake up with in the morning, the opportunities to indulge in self-abuse (and I’m not talking strictly masturbation) are myriad. With nobody to watch over you or rein in your excesses, the temptation to overdo it can be overwhelming. The problem is, unlike being an overgrown Macaulay Culkin, the novelty of the home-alone scenario wears off once it becomes the norm. It’s easy to slip into the mindset that nobody gives a shit, so why should you; and that’s a hard habit to break, one that fuels such self-indulgence. Drugs are a popular passport to personal oblivion; but when it comes to writers, the demon drink appears to be the most common means of not knowing when to stop.

Although a considerable stretch from ‘proper alcoholic’ status, I admit that until relatively recently I was well on my way to a serious drink problem; what had initially emerged as a psychological crutch following a personal tragedy quickly morphed into the clichéd components of the author’s armoury. I completely fell for the stereotype of the death-wish wordsmith with the bottle of scotch and packet of fags as his constant companions, and suffered the consequences in terms of the damage that does to those around you. When I belatedly recognised the damage it was doing to me, I finally did something about it – even though I left it far too late to salvage what it had already cost me. Interestingly, my stint as Ray Milland had no adverse effect on the work – which probably made it easier to avoid addressing the issue – but its slow-burning impact on my life beyond the written word was devastating.

I can take a less-than-nourishing crumb of comfort from the fact I was a ‘funny’ drunk rather than nasty (like my father) or violent (like those found in Saturday city centres); but it’s not much in the way of solace when I reflect on what a selfish, nihilistic dickhead I was. In truth, I am profoundly ashamed of the way I behaved, and no apology to the injured party can ever be good enough. But at least I’ve narrowed my consumption down from an average daily intake of two bottles of wine, a dozen glasses of whisky and half-a-bottle of vodka to a solitary Chardonnay one evening a week. I might drink it wholly alone, but at least it’s all I drink.

Food can be another casualty of the solitary life. The appeal of a hearty meal doesn’t necessarily escape those living alone, but the lengthy preparation can feel like an immense demand on both time and energy when there’s only one mouth to feed; the easy alternative of some microwaveable plastic that can be unsealed, heated and scoffed in barely five minutes reflects the fact that solitude sometimes breeds hostility towards the ceremonies reserved for couples. In contrast to the instant meal for one, preparing, cooking, stirring and serving a proper dinner for two is a ritual that can span an hour or more, albeit a ritual that – if shared – can be as exquisitely intimate an experience as any that two people can enjoy with their clothes on.

Living on your own, as with sharing your life, has the potential to be either a blessing or a curse depending on the circumstances; both arrangements have their advantages as well as disadvantages, and both should be tried at least once. I’ve known many a miserable soul trapped in a loveless relationship, just as I’ve known many a life and soul for whom the thought of having to share their space is anathema. Ironically, when one examines the statistics, one is very much not alone in being alone.

© The Editor

URBAN MYTHOLOGY

Whilst the majority of last week’s D-Day anniversary events were fitting tributes to those who fought them on the beaches, it was inevitable a degree of nostalgia – even for such dark days – would creep into the commemorations. In the case of the Second World War, we have the comforting hindsight of a happy ending, which participants were denied at the time; but nostalgia – whether for the War via ‘Dad’s Army’ or talking-heads TV celebrating more recent cultural epochs – is a romantic electric blanket that is at its warmest when the chilly present seems to lack certainties. There don’t appear to be any certainties at all right now, and nobody has any idea what comes next other than predicting the worst. By contrast, the past is a benevolent piece of furniture we can curl up in and know where we are.

That said, distance sometimes enables us to discern jewels that were hidden when we were busy living in the past – as Jethro Tull perhaps once pointed out. For example, I’d only have to glance at a handful of posts on here from 2016 to conclude it was a terrible year – yet, from my own personal 2019 perspective, I can now see it was one of the happiest times of my life. If anything, this serves as a salutary lesson to enjoy what one has whilst one has it, instead of waiting for it to be claimed by nostalgia and the belated appreciation that is tinged with wistful regret. But I digress.

Watching the 60s/70s drama ‘Public Eye’ recently, amidst the inevitable presence of so many elements of British life long since gone, a particular plotline caught my eye: lead character Frank Marker moves from one town to another and has to make an appointment to meet the man who is now his bank manager in order that his account can be transferred from his old branch to his new one. Despite Reg Varney making history with his inaugural withdrawal in 1967, hole-in-the-wall cash machines were hardly a fixture on every street corner throughout the 1970s. Alfred Burke’s character couldn’t simply relocate elsewhere and continue to withdraw money from anywhere he happened to be – neither could he manage his financial affairs himself online; all of his payments were physical, and if he wanted to deposit or withdraw, he needed to go to an actual building and make the exchange over the counter by engaging with a fellow human being.

In a week in which I witnessed the doors of yet another neighbourhood bank branch close for good, this scene from ‘Public Eye’ also reminded me how that mainstay of 70s sitcom jokes, the bank manager, was once an office almost on a par with that of the local vicar, GP or police constable in terms of ‘civic dignitaries’; they no doubt still count for something in Ambridge, but in urban areas the bank manager is virtually an extinct species. If you, like me, reside in an urban area, you won’t have a bank manager either – nor will you probably know a vicar, a copper or even a GP, at least if your experience of the impersonal surgeries in which a different doctor dispenses medication every time you visit is anything like mine.

In most cases, the clout such professions carried has gone because the environment that elevated them has gone. The absence of belonging that many in an alienating metropolis feel can partly be traced back to the point where the strands of benign authority that helped bind communities together became frayed and then snapped; from village elder to local squire to Sgt Dixon, the people required at least one go-to figure to resolve their disputes. Even if they still do, those figures aren’t around anymore; and, anyway, if authority equates with age, the village elder is most likely now rotting away in a care home. We can’t rely on the police to come running when we dial 999, we can’t get an appointment to see a GP, and our bank no longer has a branch on the high street. Even if you favour collectivism, you’d be hard pushed to generate it in such a fragmented landscape.

The old concept of community, in which everyone had a part to play and a function to perform, had developed from the village roots of towns, which had in turn arisen from ancient tribal divisions of labour; in those parts of the world where the literal meaning of ‘tribe’ still applies, one tends to find these roles remain intact and crucial to the community’s survival. In the west, where communities had grown through being supported and sustained by one specific industry, a sense of place was strong in a way that – following the black hole of underinvestment that came with the industry’s collapse – has been rendered utterly redundant. A town’s residents can connect with someone on the other side of the world but might not know a single person living on their street.

Today, community can be more of an abstract concept, often equating with identity; the general trend is for the rejection of shared common ground in favour of individual separateness. Even when people defined by their differences or ‘diversity’ are quick to gather in a facsimile of community, their emphasis on individuality precludes genuine community – hence the splitting of every identity-based community into endless subdivisions, underlining how diversity can diversify to the point where nobody has anything in common anymore. The 21st century incarnations of the People’s Front of Judea and the Judean People’s Front are permanently engaged in social media spats that make unity seem like something people only did in the old days. We receive a tantalising taste of it when we pause to commemorate lives lost in conflicts that required unity to succeed; but the fact that WWII will soon cease inhabiting living memory and join the Napoleonic Wars as mere history keeps it firmly in the context of the past.

Politicians, being the cynical old manipulators of the public mood that they instinctively are, of course sell themselves to the electorate by appealing to the craving for community as it used to be. The pitches of the wretched hopefuls vying to become the new Tory leader (and, unfortunately, Prime Minister) are crammed with fatuous references to ‘bringing the nation together’ as they line up like a bunch of vacuous suits waiting to be sneered at by Alan Sugar. The fact that they all appear to be falling over each other to see who can produce the best drug-taking anecdote is a bizarre development that could be viewed as either an attempt to appear human (not easy for a Conservative MP) or to pre-empt any dirty digging on the part of their opponents. Personally, my opinion of Michael Gove has not changed one iota now that I know he snorted coke 20 years ago; and to be honest, if I was married to Sarah Vine I’d probably be permanently off my tits on mushrooms, seeing that as the only viable means of achieving domestic bliss.

Understandably, one response to this strange rash of substance abuse confessions from the kind of people you really don’t want to picture snorting or skinning-up has been accusations of hypocrisy. For decades, the Conservative Party has repeatedly opposed any grown-up discussion of the antiquated drugs laws and has constantly played the finger-wagging nanny against anyone daring to indulge recreationally. Then again, this ‘do as I say, not as I do’ approach that the current confessions appear to emphasise is perhaps especially grating because it sounds so parental, albeit emanating from the most uncaring and irresponsible parents imaginable. If we need our village elders today, Westminster is not the village where we’ll find them.

© The Editor

THE SWINGEING SIXTIES

A couple of anniversaries worth marking, I thought; a regular feature of this here blog, but always a welcome break from contemporary concerns, what with most of them being pretty grim. Today marks a decade since the UK smoking ban came into force; but firstly, fifty years ago today, the Times published an editorial that remains one of the few (if not the only one) ever to have a considerable impact on pop culture, as well as marking a significant turning point in the Us and Them battle that divided young and old in mid-60s Britain. The emergence of Teddy Boys, Rock ‘n’ Roll, Beatlemania and Mods Vs Rockers had given rise to the belief amongst the generations that had fought in two World Wars and then ran the country that the nation was coming apart at the seams.

None provoked such Blimp-ish rage in establishment circles as those shaggy-haired scruffs The Rolling Stones; their appearance alone was deemed offensive enough, but the thought that these 12-bar wonders might have any kind of influence over the young beyond simply cajoling them into buying their records seemed symbolic of the decline and fall of western civilisation. Things got worse as the Stones began to adopt a more erudite, cultured persona, the arty influence of girlfriends like Marianne Faithfull and Anita Pallenberg stretching their ambitions and aspirations beyond merely recycling the Blues. They appeared to be encroaching upon the Highbrow, which was bad enough; and then they began extolling the virtues of chemical mind-expansion, something previously reserved for revered (and safely dead) intellectuals like Aldous Huxley.

Fining the band for peeing against a garage wall when the petrol pump attendant refused them access to the loos was one thing; but in order to stop this repulsive revolution in its tracks, there needed to be something bigger that could bring about the desired effect. In 1967, the opportunity presented itself and the cohabiting coterie of press, police and judiciary seized upon it. The loose lips of Brian Jones in a London club, unknowingly endorsing LSD to an undercover journalist, led to said Stone being mistakenly identified in print as Mick Jagger; Jagger sued the News of the World but, like Oscar Wilde’s legal action against the Marquess of Queensberry, this response provoked the enemy into making its move, which it did a week later.

The raid on Keith Richards’ Redlands home, interrupting the aftermath of a ‘drugs party’, has long been woven into both Stones and Rock mythology – with poor Marianne Faithfull still dogged by the utterly fabricated ‘Mars Bar’ rumour; but the outcome for Mick and Keith at the time wasn’t quite so entertaining, the former charged with possession of four amphetamine tablets and the latter with allowing cannabis to be smoked on his property. They were tried at the Chichester Assizes in June 1967 and were both found guilty, with Jagger sentenced to three months’ imprisonment and Richards to a year. They both immediately launched appeals and were released on bail after a night behind bars.

The severity of the sentences and the dubious collusion between Scotland Yard and the News of the World raised many questions. The Stones’ contemporaries reacted with a show of support, The Who rush-releasing cover versions of ‘Under My Thumb’ and ‘The Last Time’ as a single; but the most unexpected backing came not from Us, but from Them. Sensing an injustice had been done simply to teach these loutish upstarts a lesson, none other than William Rees-Mogg (yes, father of Jacob) intervened. Rees-Mogg was the editor of the Times – viewed as a bastion of the same establishment intent on persecution and punishment where the Swinging 60s were concerned – and he made an eloquent, passionate plea in the Times’ editorial on 1 July 1967, under the title ‘Who breaks a butterfly on a wheel?’

‘If we are going to make any case a symbol of the conflict between the sound traditional values of Britain and the new hedonism,’ wrote Rees-Mogg, ‘then we must be sure that the sound traditional values include those of tolerance and equity. It should be the particular quality of British justice to ensure that Mr Jagger is treated exactly the same as anyone else, no better and no worse. There must remain a suspicion in this case that Mr Jagger received a more severe sentence than would have been thought proper for any purely anonymous young man.’

Coupled with the widespread outrage amongst the young, the Times editorial prompted the authorities to bring the appeal hearings forward, and a month after they had been handed down, the sentences were quashed. Mick and Keith walked away from court free men again, and Jagger was more or less immediately flown by a helicopter hired by ambitious Granada producer John Birt to take part in a special ‘World in Action’ debate with three members of the establishment (chaired by Rees-Mogg), who seemed to look upon him as elderly scientists would a fascinating new species of butterfly. But Jagger’s easy-on-the-ear middle-class accent and reassuring, unthreatening demeanour charmed both his inquisitors and the television audience.

The intervention of William Rees-Mogg and the belated realisation by the Great British Public that maybe these demonised heroes of the young weren’t quite as great a threat to the future of mankind as the atom bomb marked a sea-change in the way the transforming society was perceived by its elder statesmen. The same year as the cause célèbre of the Mick & Keith trial, homosexual acts between consenting adults in private were decriminalised, abortion was legalised, and ‘Sgt Pepper’s Lonely Hearts Club Band’ was embraced by young and old alike as Art. The affair also gifted Keith Richards, previously overshadowed in the media spotlight by Jagger and Brian Jones, the outlaw image he’s maintained ever since as the ‘soul’ of the band. There were casualties, however.

Brian Jones, targeted by the drugs squad in a separate raid and increasingly isolated within the band, embarked upon a rapid downward slide that culminated in his mysterious premature death two years later; Marianne Faithfull, denounced from the pulpit as a harlot and mercilessly mocked over the Mars Bar myth, began her own downward slide, one that led all the way to being a homeless heroin addict in the 70s. But the Times stepping back from the great divide to survey it with objective sagacity was the first step towards the acceptance of youth culture as a valid and relevant force within society by those too old to participate. Bar the odd moral panic over Punk Rock and Acid House, it has been recognised as such ever since, as thousands of books, documentaries and humble little articles such as this will testify.

© The Editor

ACID DROPS

Depending on where one stands on conspiracy theories, the fact that many man-made chemicals began in the hands of the military and secret services as experimental mind-control weapons before filtering down to civilians as recreational drugs could be seen as confirmation that ‘The Man’ knew all along it would be far easier to let the people destroy themselves by allowing them to believe they were getting one over on the authorities. The permanently illegal status of the most popular recreational drugs certainly imbues those who purchase and use them with a cavalier sense of flouting the law, after all; and what better way to bestow ‘cool’ upon something than to ban it, knowing the ban will do your dirty work for you in the process?

I once heard the opinion aired that heroin didn’t flood into the most socially deprived and politically militant urban areas of Scotland until opposition to Margaret Thatcher’s Government was recognised as a potential problem; suddenly, an entire generation of possible agents provocateurs was neutered by smack, thus negating the likelihood of any mass civil unrest. Hey, it’s a theory. One could also point to the way in which gin saturated the ghettos of Georgian England at a moment when the governing class’s fear of the Mob and of anarchy provoked by the poor was at its height. Gin certainly subdued that, as Hogarth knew only too well.

Whether or not the infiltration into society of substances whose consumers instinctively exceed moderate dosages really is a conscious ploy of subversive control by The Man, the official line that Drugs are Bad seems destined to remain intact; and in some respects, this system upholds an equilibrium whereby the user is left feeling like some social desperado living on the edge while the authorities continue to appeal to Daily Mail Man – the balance of power is maintained and both parties are happy. Bring peace to the War on Drugs by making all drugs legal and there’s no fun in it anymore for either side.

One of the problems with any drug that produces effects the user enjoys is curbing the user’s natural desire to prolong those effects by ingesting more. Why stop at a couple of glasses of wine when there’s a whole bottle in front of you that can make you feel even better if you drink it all? At one time, I used to have the occasional spliff, but it was always in a social situation, and social situations for me – even when I’ve been at my most sociable – have tended to be separated by weeks. I’ve had friends for whom rolling a spliff has been something they’ve indulged in upwards of half-a-dozen times a day; as a consequence, they’d spend the majority of that day stoned. That to me dilutes the pleasurable impact a drug can have; to keep its power, it needs to remain the kind of treat a bar of chocolate was when I was a child. If it’s there on tap, it’s no treat at all, and trying to recapture the initial impact as one’s bloodstream becomes accustomed to the sensation through overuse means increasing the dosage.

With many drugs classed as ‘A’, the user’s addiction tends to arise from this futile chasing of the original impact, like the heavy drinker who doesn’t know when to stop. The only real way to ensure such an impact hits every time is to use the drug in moderation, but that’s easier said than done, especially when the effects of the drug may be the only effects life offers that make life worth living. The pot of gold that constitutes the must-have accessories we are taught from an early age to aim for is for some a poor alternative to the ecstatic rush of a Class A drug, especially now that so many of that pot’s contents are increasingly unattainable – not to say unaffordable.

Outside of recreational use, the occasional calls for giving a drug like cannabis special medicinal dispensation on the grounds of its proven positive effects on sufferers of diseases such as MS tend to face the same response. The belief perpetuated by the powers-that-be and their media mouthpieces that cannabis, like all drugs, is evil always seems to override any intelligent discussion or sensible progress. We are taught to believe if we smoke a spliff it’ll be the first step on the stairway to crack, as though anyone who samples the apothecary’s forbidden fruit is incapable of not succumbing to chemical obesity. Granted, that can happen (as I’ve already pointed out), but that doesn’t have to be the case for everyone.

The news that a small minority of people in the UK who experience mental health difficulties have been self-medicating with psychedelic drugs in tiny daily doses – so-called ‘micro-dosing’ – appears to have taken LSD full circle. Not long after Swiss chemist Albert Hofmann discovered its psychedelic effects in 1943, the US military and the CIA tested the drug on mostly unknowing guinea pigs before abandoning it as too erratic a method of mind control; psychiatric hospitals came to similar conclusions.

However, as soon as LSD slipped into the hands of artists and writers, who viewed it in a different light altogether, the blue touch-paper for the 60s counter-culture had been lit and the predictable outlawing of Acid in the US and UK fuelled its mystique further. Unfortunately, the over-indulgence of its guitar-strumming salesmen followed a familiar pattern and any talk of LSD as a beneficial substance for medical complaints was abandoned.

Considering the racket that is the pharmaceutical industry, perhaps it’s no wonder people have turned once again to the possibilities inherent in the drugs no drugs corporation will touch. A London-based psychiatrist named James Rucker recently oversaw a trial at Imperial College that treated clinical depression with magic mushrooms, reversing decades of aversion to psychedelics within the profession. He doesn’t endorse self-medicating micro-dosing, but perhaps only because he hasn’t yet tested the process under trial conditions. Amidst the usual fears aired as to the dangers of ‘taking too much’, it would seem some have discovered a means of administering a drug that works for them by trusting their adult ability to practise restraint rather than being society’s naughty child incapable of resisting addiction.

© The Editor

THE DRUGS DON’T WORK

Side-effects are one of the prices paid for the positive outcome of any prescribed medication; indeed, many of these are specified in small print via the folded-up leaflets that accompany the packages. Some are as long as shopping lists; every unwelcome addition to our daily hang-ups over physical appearance seems to be a possibility – weight-gain, mood swings, spots, insomnia, loss of appetite, drowsiness, dehydration and so on; you name it, it’s a potential side-effect. In fact, the roll-call of probable side-effects is so ridiculously varied and vague that it often makes one wonder whether the drugs companies are utterly clueless or simply hedging their bets against future litigation on the part of the consumer.

With this in mind, it’s no surprise that medication intended to combat a disease as serious as malaria is bound to contain its fair share of side-effects; and Lariam appears to boast an abundance of the worst kind. However, it is its regular use by the Ministry of Defence to protect troops dispatched to parts of the world where malaria tends to strike that has hit the headlines today – particularly with the admission by former head of the British Army Lord Dannatt that, though he was running the show whilst soldiers were being given Lariam, he himself wouldn’t touch it with a bargepole.

To justify his aversion, Dannatt cites the negative experience of his own son with the drug after taking it as a precaution prior to an African holiday in the 90s; Dannatt says the side-effects made his son withdrawn and depressed, common side-effects where Lariam is concerned, along with suicidal thoughts and violent outbursts. Although Lariam isn’t the sole drug deployed to combat malaria, it was the standard anti-malarial drug given by the MoD to troops from 2007-15 – upwards of 17,000 soldiers. Lord Dannatt was in overall charge of the British Army from 2006-9, at a time when Lariam was being used, yet despite his awareness of the damage it could do, he said nothing.

With the army bogged down in Iraq and Afghanistan during Dannatt’s tenure, the issue of anti-malarial drugs was pushed onto the backburner, the troops largely engaged in conflict in areas where malaria wasn’t prevalent. One ex-soldier who has gone public was dispatched to Sierra Leone in 2000, though, and such precautions were necessary in that case. He claims Lariam had an immediate impact, turning him into an ‘ogre’; it’s not an especially comforting image to realise trained armed men were on a foreign field in a precarious mental condition, though in response the MoD says that Lariam has only been given to troops after ‘individual risk assessments’ since 2013.

Dannatt has tentatively apologised for any damage done by the drug to British soldiers on his watch, though it obviously didn’t damage every soldier exposed to it; the vague list of potential side-effects in the leaflets included in every box of prescribed medication covers all eventualities, though as a regular user of prescribed medication myself I recognise that few of the myriad side-effects conjured up in print have surfaced whenever I’ve been following the recommended course. Both the World Health Organisation and Public Health England endorse Lariam as an effective aid against malaria, and the company that manufactures it has issued a statement reinforcing its faith in the MoD to prescribe the drug with due care. This doesn’t detract from the fact that some of the more disturbing side-effects of Lariam have indeed ruined lives, and Lord Dannatt’s belated public acknowledgment of this will probably be little comfort to its sufferers. Mind you, the military does have a history of caring for its cannon-fodder with somewhat casual nonchalance – particularly when it comes to chemical-related matters.

The original 1962 movie of ‘The Manchurian Candidate’ reflected the concurrent experimental tests by the CIA that sprang from a sordid little project known as Operation Paperclip, whereby the agency’s response to alleged Soviet mind-control was to hire scientists who were actually Nazi war criminals and use unknowing military personnel as guinea pigs. The film depicts a character played by Laurence Harvey as an ex-US Marine captured by the enemy during the Korean War and brainwashed into becoming a sleeper assassin, triggered into action years later. It’s all Cold War paranoia, of course, but Operation Paperclip and its various illegal offshoots that contradicted the Nuremberg Code agreed to by the US in the aftermath of the Second World War were all too real. Long before its elevation to compulsory recreation by the hippies, LSD was a regular drug of choice for such experiments, though the thought of attempting to issue orders to anyone tripping off their tits on Acid does seem like an exceptionally futile exercise.

Whilst the use of Lariam by the British Army pales next to the appalling operations indulged in by the CIA in the 50s, the fact that Lord Dannatt has waited until now to come clean about his reservations is perhaps more indicative of the mindset within the MoD that sent British troops into war-zones ill-equipped and under-funded. If a country asks young men to lay down their lives for it, the least that country can do is to safeguard against that likelihood as best it can. Those running the British Army have come up short too many times. And that’s simply not good enough.

© The Editor