‘Making Plans for Nigel’, the first top 20 hit for that most brilliantly quirky of British bands, XTC, tells the tale of parents mapping out their son’s future without much consultation with the son himself. ‘We only want what’s best for him,’ they say. ‘Nigel’s whole future is as good as sealed’ because ‘he has his future in British Steel.’ The wry, witty lyrics portray parents fixed in the belief that a job with a state-run industry – steel was first nationalised in 1949 and again in 1967 – is a job for life. Within months of the single’s release, steelworkers went on strike, presenting the Thatcher Government with its introduction to industrial unrest and presumably wrecking the plans of Nigel’s parents.

British towns that morphed into cities during the white heat of the Industrial Revolution bore the scars of their industries like an old boxer’s broken nose and cauliflower ears. Most were smelly, dirty and dark. Should anyone head north from the capital – or west to Wales – they would have embarked on an eye-opening journey into the black underbelly of the Empire’s engine, confronted by skies discoloured with the discharge of chimneys and cooling towers, the toxic fallout from the industry that had put the city on the map and served as the supplier of employment for its population. It was a Faustian pact that the population entered into, and however miserable their living conditions and however low their wages, pride in the industry their city was associated with was instilled as generations earned a living from it. The industry was a benevolent cushion that kept its beneficiaries from being thrown upon the mercy of the state.

The decline of Britain’s manufacturing industries in the decades following the end of the Second World War went hand-in-hand with the death of Empire and the country’s diminishing status on the global stage. In the space of a generation, Britain had to make its biggest adjustment to its place in the world since the advent of the era that had made it such a force to be reckoned with in the first place. Certainties that provided the nation with a great deal of its identity for more than a century, whether a map sporting huge swathes of pink landmasses or an industry that sprawling metropolises had grown around, were clung to with a sense of futile desperation, as though the populace were scrambling for the boats lowered from a sinking ship.

The slow, painful extinction of Britain’s mining industry perhaps spawned the most extreme reaction from people who had known nothing else; losing the battle decimated their communities, smashing their framework without offering a tangible alternative, and many remain stuck in the shadows of that defeat to this very day.

The reasons for the decline of British industry are myriad: competition from nations whose economic fortunes improved in the aftermath of WWII as ours nosedived; the cheapness of imports over home-grown produce; a lack of long-term investment; the materialistic, self-aggrandising greed of union leaders, followed by the often ruthless individual profiteering from privatisation at the expense of collective benefits; EU restrictions on government subsidies and bailouts; the increasing preference for service industries over manufacturing ones; the financial sector being awarded seemingly ‘special’ status. Many of these factors have played their part in the sorrowful situation in which the British steel industry finds itself today.

Unusually for a declining industry, demand for steel is high and supply is abundant – perhaps too abundant. China’s steel industry outgrew its own domestic demand and began to export its steel at knockdown prices that quickly rendered the native steel industries of the countries buying in the cheaper Chinese version unprofitable. Despite the money that Tata, the Indian company that purchased what used to be called British Steel in 2007, has poured into the business, it continues to operate at a loss, making this week’s announcement that the industry is up for sale the kind of apocalyptic news that everyone who works in it has been dreading. As was the case with mining, the steel industry isn’t centred on just one specific region of the UK, and the loss of jobs if the industry is allowed to go under has the potential to affect far more lives than if it were restricted to a solitary city or county.

There’s no doubt that cities which suffered the loss of their major industry have emerged from the wreckage as cleaner, healthier and less polluted places to live; but their spirit has been broken in the process, leaving youngsters without the guarantee of following in the footsteps of their forefathers little option but to up sticks and try their luck elsewhere, getting on their bikes and looking for work as someone once famously said. Consett, Corby, Ebbw Vale, Shotton and Motherwell all suffered high unemployment in the wake of their steel plants closing down; workers at the remaining plants in South Yorkshire, South Wales, Scunthorpe, Teesside and Scotland now bite their collective fingernails to see if the Government will intervene and prevent similar devastation laying waste to their own corner of the country.

© The Editor


When Radovan Karadzic received a forty-year sentence after being found guilty of genocide, war crimes and crimes against humanity at the International Criminal Tribunal for the former Yugoslavia in The Hague last week, the trial of the former Bosnian Serb politician, poet, fugitive and advocate of ethnic cleansing during the Bosnian War had taken six years to reach its conclusion. By contrast, the trial of Nazi war criminal Adolf Eichmann in Israel half-a-century ago took just eight months between the opening of the prosecution case and Eichmann being sentenced to death. At the time of Eichmann’s capture in Argentina by Mossad agents in 1960, a mere fifteen years had passed since the end of the Second World War; when Karadzic was finally found ‘hiding in plain sight’ in Belgrade, over a decade separated his arrest from the conflict in which he had earned his notoriety. His crimes were as fresh in the memory of those affected by them as Eichmann’s had been for death camp survivors in the early 60s. In many respects, they remained present tense.

Karadzic’s ability to evade arrest for over a decade was aided by his standing amongst many Serbs, who didn’t regard his actions as those of a criminal; neither did he himself, privately expressing his conviction that a ruthless policy of ethnic apartheid in the former Yugoslavia was both morally and intellectually indisputable. Adolf Eichmann had said towards the end of WWII that he would ‘leap laughing into the grave because the feeling that he had five million people on his conscience would be for him a source of extraordinary satisfaction’. Did he fulfil his promise when the noose was slipped round his neck in June 1962, I wonder? Neither man appears to have recognised or acknowledged that his warped judgement on his fellow-man was in any way flawed or repugnant, and any notion of remorse was conspicuous by its absence.

The need for justice to be seen to be done is understandable in the case of both Eichmann and Karadzic. Their respective decisions led to the deaths of thousands (in the case of Karadzic) and millions (in the case of Eichmann); the innocent lives lost as casualties of inhumane ideology resonated through families for generations, and the natural human desire for somebody to be held accountable galvanised the searches, arrests, imprisonments and trials of both men. The gap between crimes and capture is fairly similar, though the difference in the time it took between the beginning and end of their respective trials is considerable. In terms of speed, Eichmann’s trial was reminiscent of the Nuremberg Trials in the immediate aftermath of the Second World War, where 22 leading members of the Third Reich were tried and sentenced in the space of a year. These were military tribunals, of course, with judges provided by Britain, France, the US and the USSR, and were carried out at a moment when much of Europe still stood in ruins and the thirst for retribution was at its height.

The most prominent Serb war criminal to stand trial at The Hague prior to Karadzic was ex-Yugoslav President Slobodan Milosevic, who died in 2006 before the trial was completed; at the time of his death, proceedings were a mere five years in. The International Criminal Tribunal for the former Yugoslavia and the International Criminal Tribunal for Rwanda, established in 1993 and 1994 respectively, both predate the official formation of the International Criminal Court, which didn’t come into being until as recently as 2002 – one of the reasons why Milosevic refused to recognise the legitimacy of the tribunal that had begun his trial the year before. That it took decades for the ICC to be established after such a concept was first mooted as far back as 1919 is perhaps reflected in the unsatisfactory length of time it takes for a judgement to be reached. When one considers that 22 Nazi military and political figures were tried, convicted and sentenced within a year from 1945-46, it would seem the institution sitting at The Hague still has a lot to learn from historical precedents.

Although victims of Eichmann gave evidence at his trial in 1961, one of the ICC’s key provisions is that victims of whoever happens to be in the dock have the right to make their voices heard at a trial, which might perhaps explain why the process drags on for so long. There is also the factor of state participation in ICC indictments; an incumbent head of government cannot be forced to surrender to the body, and many rulers accused of crimes against humanity, particularly in African countries, cling onto power as it continues to provide them with immunity. The difficulties in dragging a defendant to The Hague in the first place are then compounded by the tedious legal chess game that constitutes the trial itself, fuelling the opinion in many quarters that the International Criminal Court is an insufficient means of dealing with the problem of war crimes.

One thing that can be said in the ICC’s favour is that the trials at The Hague tend to be held no more than a decade after the crimes of which the defendant is accused were allegedly committed. Justice rather than vengeance is the prime objective. When it comes to frail men in their 90s being wheeled into the dock and charged with something they are alleged to have done long before most of those present in court were even born, concepts of justice become more problematic.

If someone who regards an octogenarian as a young man is hauled before a court and charged with a crime committed when he was in his 20s, the paucity of witnesses to and survivors of his crime often suggests vengeance is the prime motivation for his presence there. After all, what kind of prison sentence is someone within a whisker of being a hundred years old really expected to serve? The fact that the vast cast of Nazi war criminals who evaded justice in the first few years after the end of the War have already died seems to be no impediment to the hunger for vengeance, and the pursuit of the few that are left has seen guards at concentration camps carry the can for the actions of their superiors. They were, as the cliché goes, ‘only obeying orders’. There has to be a cut-off point for a conflict that ended over 70 years ago now, just as there has to be in all cases of ‘historic crime’ of whatever nature. Vengeance is not the same as justice, though domestic courts in this country have yet to accept this. Just ask Wen Zhou Li.

© The Editor


If only we could blame it on the EU or Muslims – I suspect that was the first thought that entered the heads of Mail and Telegraph journos when it was pointed out that the word ‘Easter’ had been quietly dropped from the branding on boxes containing the springtime produce of the chocolate manufacturers. It would certainly chime with some of the silliest Brussels directives of recent years, ones concerning new specifications over the size of sausages, ones that prompt Sun campaigns to ‘save the British Banger’ and so on. It would also fit the classic Fleet Street narrative of Loony Left local councils banning Christmas decorations on the grounds that they might offend the non-Christian community. Alas, no. This particular move appears to have come from the chocolate companies themselves.

Nestlé deny there has been a deliberate decision to drop the word Easter from their produce at this time of year, but it has disappeared nonetheless, just as it has from the eggs sold by their rival Cadbury. Is this a sinister conspiracy to erase the key word from a Great British Tradition? I don’t think so; it probably has more to do with the fact that businesses – which is, lest we forget, what Nestlé and Cadbury are – have merely picked up on which way the popular wind is blowing and gone with the secular flow.

The front cover of the Radio Times – in many respects an unsung barometer of the zeitgeist – this week displays a cartoon bunny. When I was a child, the cover of the Easter issue would always be graced with religious imagery; at one time, I suspect only Father Christmas had appeared on more RT front covers than Jesus. The programming on the BBC reflected the roots of the Easter festival as well; aside from the token church services, I remember seeing the enjoyably kitsch 1973 movie of ‘Jesus Christ Superstar’ on BBC1 at Easter 1979, and even ITV exhibited a reverence for the Christian tradition by spending a fortune on ‘Jesus of Nazareth’, its epic retelling of the Christ legend with Robert Powell in the title role.

After donning a turban for a visit to a Sikh temple in order to win the ‘ethnic vote’ during the last General Election, David Cameron then went on to nail his colours to the Church of England mast in stressing Britain was still a Christian country. On paper, yes it is; but it’s not the Christian Britain I had shoved down my throat in early schooldays, when assemblies would be dominated by the RE teacher reading a Biblical fable, and the entire class would have to close its eyes and recite the Lord’s Prayer in unison at the end of every day before being allowed to leave at 3.30. It’s not even the Christian Britain that marked Harvest Festival with each pupil bringing a tin of food to school that would then be distributed to the pensioners of the parish. I’m not up to speed on the curriculum these days, but perhaps if such a ritual still exists it now bears a name of something like The Non-Denominational, All-Inclusive Multi-Faith Festival?

While the likelihood of a ‘Jihadi Egg’ being manufactured in the shape of a severed head and featuring a cartoon incarnation of a sword-wielding ISIS assassin on the packaging is probably a step too far even for the paranoid fantasies of a right-wing tabloid editor, the removal of a word that a Nestlé spokesman says is so associated with chocolate eggs that it’s no longer necessary to put it on the box isn’t quite the end of the world as we know it. One could argue the disappearance of references to a religious festival from something so frivolous should enhance the presumed dignity of the occasion rather than detract from it.



Thirty years ago, I bought the first issue of a new national newspaper; today, I bought the last. Yes, the Independent remains an online presence, but it’s not quite the same, is it? I don’t buy a paper often, but when I do, I tend to opt for the Indy; the content always seemed well-balanced between left and right to me, attempting to navigate a middle ground where other papers are incurably partisan, something I found refreshing. But what I really liked about the Independent, a factor it will be impossible to reproduce online, was the design of the front cover.

A cover featuring a single image was pioneered by the Daily Mirror at the turn of the twentieth century, when the Times famously featured classified ads on its cover. Ever since it adopted this eye-catching tactic for major news stories, the Independent has stood out from the other cluttered covers on the newsstands, in which a hysterical headline is hemmed in by free gift offers, quotes from a columnist whose column features inside, and photos of celebrity fashion faux-pas. By its very nature, an online newspaper’s homepage is cluttered, with the need to include links to every section. The single image that gave the print edition of the Independent’s front cover its aesthetic uniqueness cannot have the same impact when viewed on a monitor or a mobile.

The paper may be moving with the times, but by doing so it has undoubtedly lost something special in the process. The Independent stood out from its competitors in the newsagents, but it won’t online. It can’t.

© The Editor


There will always be the debate – Stanley Matthews or George Best? Pele or Maradona? For me, the man who played the beautiful game with greater artistry than anyone ever to slip into a pair of studded boots was Johan Cruyff, whose death from lung cancer at the age of 68 has been announced. The outspoken and opinionated Dutchman displayed an undoubted arrogance and absence of modesty both on and off the pitch because he knew he was the best; and he was. Although he made his professional debut as a 17-year-old in 1964, domestic football in Holland had only recently emerged from its amateur age and the Netherlands had never made any notable impact at either club or international level. Cruyff’s club, whose books he had been on from the age of ten, was Ajax of Amsterdam. When former Ajax player Rinus Michels returned to the club as coach, he saw in Cruyff a player around whom he could build a team playing a style of football that would eventually turn Ajax into the world’s greatest club side.

Ajax first became a name to reckon with when reaching the 1969 European Cup Final; though they lost to AC Milan, there were clear signs that a little tinkering here and there could take the team to the next level. Bringing a crop of remarkably talented youngsters such as Johan Neeskens, Johnny Rep, Barrie Hulshoff, Ruud Krol, Gerrie Muhren and Arie Haan into the side, Michels, with Cruyff as his eyes and ears on the pitch, developed what was christened ‘Total Football’; this system required all outfield players to be equally adept up front and in defence, a fluid, rotating form of play that was expertly orchestrated by Cruyff, who acted as a veritable puppet-master, pulling the strings of the team and directing events as a virtual footballing auteur. Thanks to an act of serendipitous superstition, Cruyff wore No.14 on his shirt at both club and international level (at a time when teams were exclusively numbered 1-11), as though his unique talents couldn’t be contained within the numerical strictures of the game. The style of football was special; the team were special; and he was special.

The Total Football system swept all competition aside in the early 1970s. Ajax won three consecutive European Cups in 1971, ’72 and ’73 – a feat that had only previously been achieved by the great Real Madrid side of the late 50s. On the domestic front, Ajax were almost unbeatable. In the 1971-72 season, they won the domestic treble as well as their second European Cup and the Intercontinental Cup, a contest between the top teams in Europe and South America in which the brutal approach to the game typified by the latter could not have contrasted more sharply with that of the Dutch. Come the 1974 World Cup in West Germany, the time had arrived to showcase Total Football on the international stage.

Although Rinus Michels had by this time relocated to Barcelona (with Cruyff joining him there after Ajax’s third consecutive European Cup win), he took control of the Dutch national side for the 1974 World Cup tournament and filled the team with players he’d brought to fruition at Ajax. Holland illuminated that summer; World Cups traditionally draw in viewers who don’t follow domestic soccer but become bedazzled by the event, and watching Cruyff in full flow at the peak of his powers transformed him and Total Football into a global phenomenon. That Dutch team resembled rock stars and remain the coolest-looking group of players the World Cup has ever seen. Watching Cruyff was like watching Nureyev; he had the same God-given grace in his feet and it seemed only right that he should lift the new FIFA World Cup trophy in the Final. Unfortunately, the all-conquering Dutchmen came up against the host nation.

The significance of Holland taking on West Germany in Munich just thirty years after the Netherlands had been subjugated by the German jackboot cannot be overestimated. Many members of that Dutch side had lost family members under the Nazi regime, and it would be hard to deny that the arrogance which often comes with greatness was evident for more than mere football reasons in the Final. Cruyff showed he meant business straight from the kick-off, cutting a swathe through the West German defence and being brought down in the penalty area. With barely a minute gone, Holland were awarded a penalty which Neeskens scored; they were 1-0 up with West Germany yet to touch the ball. Rather than building on that early lead, the Dutch chose to humiliate the Germans on their own soil, using their breathtaking talent to toy with and taunt what was undoubtedly a great German side, one containing the likes of Beckenbauer, Muller and Breitner. This time, however, Dutch arrogance backfired and sabotaged what should have been the nation’s crowning glory as the Germans clawed their way back into the game, taking a 2-1 lead by half-time that remained the score when the ref blew the whistle after 90 minutes. Holland’s greatest team had failed at the final hurdle. It was the swansong for Total Football.

Cruyff chose not to participate in the 1978 World Cup in Argentina, perhaps stung by the memory of Independiente, the Argentine team that had kicked Ajax off the pitch in the 1972 Intercontinental Cup. He cited disapproval of the military junta then running Argentina as his main reason for pulling out, which seems a remarkable stance to take from today’s money-grabbing football perspective. How many international soccer superstars will decline to go to Russia for the 2018 World Cup on political grounds, I wonder?

Despite his voluntary retirement from the national side in 1977, Cruyff’s domestic career continued until 1984, when he ironically ended his twenty-year journey at Ajax’s bitter rivals Feyenoord. As soon as he hung up his boots, he entered management and proved to be as successful on the touchline as he had been on the pitch, coaching Ajax, Barcelona and the Catalan national side, despite the latter remaining unrecognised by FIFA. For most watchers of the game, however, it was his role conducting the footballing orchestra of both Ajax and the Dutch national team in the early 70s that left us with the quintessential memories of Johan Cruyff, gone at just 68, but never to be forgotten.

© The Editor


It’s a ghastly side-effect of warfare that those who haven’t signed up for either side always end up in the firing line. Whether their turf is contested by professional national armies, amateur guerrillas or nihilistic terrorists, the civilian population is invariably caught in the crossfire while attempting to go about its daily business. To the left is a remarkable photograph taken on a London street in 1944: two women step off a bus onto the pavement while the cloud from a V-1 flying bomb that exploded a second before the photographer clicked the camera shutter can be seen rising in the background. It’s the juxtaposition of the ordinary with the extraordinary in the image that speaks volumes as to the tenacity of the human spirit when necessary routine is confronted by a potentially fatal threat.

By 1944, of course, London’s population had already survived the devastation of the Blitz, rendering the introduction of the flying bomb – Hitler’s last petulant throw of the dice towards a country he couldn’t conquer – the latest in a long line of man-made meteors falling from the sky, faced with weary resignation. The people may not have been entirely desensitised to the horrors inflicted by the V-1, but having endured half-a-decade of bombardment on the home front made this new spin on the art of mass murder perhaps less shocking in 1944 than it would have been in 1940. Such experience counts for a lot when the enemy devotes so much time and energy to devising more effective ways to kill you.

Life goes on in Israel and Iraq just as it went on in London when V-1s were being launched from the French coastline and as it went on in Ulster at the height of the Troubles; when the possibility of carnage is an ever-present, acceptance of the fact is one way of being able to deal with it, it would seem. Otherwise, every metropolitan area with the permanent likelihood of premeditated bloodshed hanging over it would simply consist of streets filled with people running up and down screaming all day long.

Ever since the Madrid Bombings of March 2004, when 191 lost their lives in ten separate bomb blasts on the rail line, Europe has had to contend with the threat of large-scale slaughter that puts past masters such as ETA, the Red Brigades and the IRA firmly in the shade. What happened in Madrid twelve years ago was the first mega-massacre of modern times on European soil and the first in which the Allied Invasion of Iraq was held up by the perpetrators as justification; a year later, Islamic terrorism hit London when 52 died in the 7/7 attacks. After 148 innocent lives were lost in the two Paris attacks that bookended a gruesome twelve months for the French capital last year, it seemed to be only a matter of time before the next one, and it came yesterday in Brussels, with a death toll so far of 34.

These major incidents have been interspersed with smaller acts of terrorism, often carried out by a lone wolf and lacking the meticulous planning of a Madrid or Paris attack, but ones that have nevertheless still claimed lives. The kind of hideous event that occurred in the Belgian capital yesterday remains mercifully rare. One year and four months separated Madrid and 7/7; almost a full decade separated 7/7 from the Charlie Hebdo attacks; there was a gap of eleven months before the second, even more devastating, assault on Paris last November; and, rather worryingly, just four months between that and Brussels.

Whereas previous terrorist movements used political ideology or nationalism to justify the murder of civilians, the faith element of Al Qaeda or ISIS is a throwback to the state-sponsored slaughter that flourished across Europe several centuries ago. A continent largely governed by contemporary secular values was unprepared for the resurgence of religion as a convenient excuse, but the way in which Radical Islam has taken a grip on young men of Middle Eastern descent across Europe can arguably be viewed as a failure on the part of the authorities to provide their young with that most necessary of human goals – hope.

An adolescent male raised in a poverty ghetto is susceptible to the allure of fast cash that drug-dealing and other forms of petty crime can bring, especially in the kind of desperate environment the financial crash of 2008 gave birth to. When he inevitably ends up behind bars, this is when religion can be sold as salvation. Someone who feels powerless will grab at anything that offers the illusion of power, and the alienation from their fellow man that social deprivation can engender is a dangerous weapon in the hands of recruiters for the cause. The public cease to be viewed as merely annoying, uncaring idiots and are transformed into vermin that can be wiped out without conscience.

Dehumanisation is a necessary aspect of warfare: it persuades the soldier he has the right to kill his enemy without being plagued by doubt. Anyone not signed up to the ISIS agenda is therefore regarded as a viable target and can be exterminated without the perpetrator being kept awake at night by the lives of strangers he has ended – if he lives to survey the carnage he has created, of course.

The suicide bomber element of Radical Islam is an innovation in the urban war-zone, with the only comparable precedent being the Kamikaze pilots of World War II, young Japanese men indoctrinated with the fanatical belief that dying for their Emperor was a worthy act of heroism; substitute Hirohito with Allah and the indoctrination is identical. The key difference is that the inheritors of the Kamikaze mantle aren’t conducting their suicide missions out in the middle of the Pacific Ocean, but on the street, one indistinguishable from the street you’ve set foot on in the last 24 hours.

Whether one is the ‘collateral damage’ of Obama’s insidious indiscriminate drones or a European commuter boarding public transport in order to simply get from A to B, Civvy Street is today’s Agincourt, Waterloo or Verdun – a battlefield for the non-conscripted. We’re all in the army now.

© The Editor


The Pox, Consumption, Dropsy, Scurvy, Scarlet Fever – anyone who has read the canon of classic English literature penned in the eighteenth and nineteenth centuries will be familiar with references to diseases that could at worst kill and at the very least leave a lifelong mark on the health of those exposed to them. In the absence of modern medical science, ancient herbal remedies were applied across the social divide until the aristocracy were introduced to the benefits of inoculation against Smallpox in the 1720s. The prominent lady of letters Lady Mary Wortley Montagu advocated inoculation after witnessing its success on her travels through the Ottoman Empire and convinced Caroline, Princess of Wales that something previously untested in the West could serve as protection. Princess Caroline famously had her two daughters successfully inoculated, and less celebrated guinea pigs received the same treatment at Newgate Gaol. Offered a choice of execution or inoculation, seven condemned prisoners unsurprisingly chose the latter, survived the experiment, and were all granted their freedom as a consequence.

However, in an age of limited healthcare for those unable to pay for it – an age not noted for its excellent sanitation – sicknesses now viewed as completely curable claimed lives on a scale that we would today associate with the Third World. A society rigid in the primitive belief that diseases travelled on the air was ignorant of causes we now accept as a given, and it wasn’t until the widespread adoption of penicillin as a medical treatment from the 1940s onwards that many of the old illnesses ceased to be potential death sentences. Within living memory, a simple cut could lead to infection and then to death very quickly; penicillin and the drugs that were developed from it changed that forever. Or so we imagined.

Despite the publicised global eradication of Smallpox in 1980 – the only time in history an infectious disease had been eradicated until the animal disease Rinderpest was also wiped out five years ago – the belief that antibiotics ensure immunity from the majority of sicknesses has taken a knock in recent years due to the gradual evolution of various bacteria to resist the drugs designed to neutralise them. A battle medical science appeared to have won is now seeing its battle lines redrawn.

Whether this state of affairs can be held responsible for the resurgence of certain diseases that remain associated with the past is debatable, though the sudden rise in cases of Syphilis since the turn of the millennium was something few saw coming (if you’ll pardon the uncomfortable pun). A couple of hundred years ago, Syphilis was common among men who regularly frequented prostitutes, and a painful game of pass-the-parcel between punter and prostitute could then be handed on to babies born to an infected parent, causing appalling facial deformities. More prevalent as the cause of misery for small children, however, was the bacterial disease Scarlet Fever.

I first became aware of Scarlet Fever when watching one of those superb period dramas BBC1 used to produce as part of their children’s programming in the 1970s. I can’t remember the name of the particular serial in which one of the main characters was struck down by the sickness, but the evocative, lyrical name Scarlet Fever stuck in my head. It sounded like the title of a horror film or even a prog rock band. But the disease was not quite as attractive as its name, being one of the major contributors to the high infant mortality rate for centuries. Even if a child survived, heart and kidney disease in later life could be the lethal legacy of Scarlet Fever.

Both internally and externally painful, Scarlet Fever symptoms include headaches, hallucinations, a sore throat, swollen glands and the rash that gives the sickness its name, a bright red sandpaper-like varnish visible on both the skin and the tongue. The old Central Office of Information catchphrase, ‘Coughs and sneezes spread diseases’, could have been coined with Scarlet Fever in mind, as that was how it was usually passed from one person to another. Considering the cramped and insanitary conditions the poorer working classes dwelt in up until at least the halfway point of the twentieth century, it was no surprise that children were extremely vulnerable to a disease that could be caught quickly and could be fatal.

In 1900, a serum drawn from the blood of horses was developed in Vienna. The serum had a considerable impact on the patients it was tested on, reducing the mortality of Scarlet Fever by as much as 40%, an impressive success rate for something that had traditionally been resistant to treatment. But the serum was never widely available, and scientists continued to develop other vaccines they hoped would work up until the mass introduction of penicillin in the immediate post-war era. Antibiotics served to diminish the curse of this long-term plague upon children and Scarlet Fever became one of those archaic diseases that successive generations would mostly associate with ‘the olden days’.

This week’s news, that there have been more than 6,000 cases of Scarlet Fever in Britain over the last six months – the highest number for several decades – has served as a sobering reminder that medical science cannot afford to rest on its laurels when it comes to age-old diseases we imagined had been banished to the distant periphery of modern life. We may have a far greater awareness of diseases and their causes today, but bacteria remain a wily adversary we ignore at our sophisticated peril.

© The Editor


1

The annual survey by the Sunday Times to name the most perfect place to live in Britain has revealed the winner. Hold on a minute, Grimsby, Workington and Hartlepool – and step forward Winchester. According to the compilers of the survey, Hampshire’s county town offers ‘a tasty slice of authentic history, with great transport links and fine schools. It also has an irresistible mix of food, festivals and feel-good factor.’ I’ve no reason to doubt this brief summary of Winchester’s plus points; though I’ve never visited this most perfect of places, I know enough about the Home Counties to recognise a uniquely English ideal of picturesque beauty when I hear it described.

So, let’s all go live there, yeah? If you’ve got around £715,000, you can pick up a nice detached property for yourself; roughly £444,000 will get you a terraced property; and if you can only stretch to a flat, just over £300,000 will do. Not as much as the pricier corners of the capital, true; but considerably costlier than Burnley, where the average house price is around £40,000. Hands up who’d rather relocate to urban Lancashire. I’m already there.

As in the old Miss United Kingdom contest, each geographical region of the country has its nominees for this prestigious contest. Ballycastle in County Antrim received the Northern Irish vote; Wales had Penarth in the Vale of Glamorgan; Scotland got Stockbridge in Edinburgh; for London, it was Fitzrovia; the South East pick was Midhurst in West Sussex; the South West was Falmouth in Cornwall; Orford in Suffolk was the choice for the East; the Midlands got Ledbury in Herefordshire; Oop North saw Harrogate named for the North East (even though it’s in Yorkshire), and Whalley in Lancashire was the North West’s representative.

The factors taken into account when compiling such a survey are such things as crime rates, house prices and the performance of schools, all of which suggests a specific demographic is the target audience for the tourist boards of the respective locations. Couples in well-paid professions with (or intending to have) children clearly figure highly in the list of desirables. Spinsters with a brood of cats or bachelors with a library of 90s porn videos are probably not as welcome; okay, so I know I’m generalising terribly, but it makes for convenient (if admittedly lazy) shorthand. It’s something of a given that everyone would – or should – want to live somewhere that isn’t going to be populated by feral hoodies or street-corner crack-dealers and isn’t constructed entirely of concrete. Though it may surprise the electorate in David Cameron’s constituency, not many of the people who reside in such neighbourhoods actually want to live there either. But most of them have no choice.

Personally, I like the ground to be coloured green and I also like stepping outdoors and not inhaling enough petrol fumes to power a fleet of juggernauts. I was born into a densely-packed urban area with factory chimneys pumping toxic discharge into the atmosphere and coating the surrounding houses in a grimy patina of dirt; it’s no wonder my mother says I was always sniffy as an infant, and it’s a miracle children weren’t still issued with gas-masks in the early 70s. Of course, the majority of the industries that rendered the North such a Dark Satanic landscape for over a century have now all disappeared. Those factories that weren’t demolished stood derelict for decades before being converted into luxury apartments, sturdy Victorian constructions competing for the attention of the Young Professional with twenty-first century towers. Service industries superseded the manufacturing ones to attract investment; and thus the Northern Powerhouse was born!

Forgive me if I don’t get too celebratory over this fact, unlike the council running one of the North’s most neglected cities, Hull, as they prepare to turn a metropolis boasting an impressive collection of boarded-up businesses and shops into a City of Culture. Winchester must be crapping its pants at the prospect. Of course, I’m sure Hull has its scenic areas as well as its shit-holes, as most big cities do; indeed, some might argue the ‘edge of the world’ feel that the bleak grey vista of the North Sea generates possesses a beauty of its own, albeit one that has more in common with Scandinavia, a part of Europe much closer to Hull than Hampshire. No wonder it appealed to Philip Larkin.

Beyond the undoubted allure of the Green and Pleasant Land evident in the South East, not to mention all the social elements that contributed towards Winchester’s poll-topping position, what is it that makes a location truly great? New York was teetering on the brink of bankruptcy in the mid-70s, degenerating into the sewer so eloquently described by Travis Bickle in ‘Taxi Driver’, and yet that turbulent period in the history of the Big Apple produced Blondie, Patti Smith, The Ramones, Talking Heads and Television. When the Mersey Beat sound conquered the Hit Parade (as they used to say) in 1963, it sprang from a city containing thousands of homes officially labelled as not fit for human habitation. Sheffield was undergoing the painful decline of its traditional industry when it spawned The Human League, Heaven 17, ABC, Pulp and…erm…Def Leppard, and Coventry wasn’t exactly a boomtown when 2 Tone exploded into life.

Naturally, for those who weren’t forming future chart-topping bands in New York, Liverpool, Sheffield or Coventry, life was hard and the appeal of somewhere like Winchester would have been understandable if what one sought from life was a good job, a home of one’s own in a crime-free neighbourhood, and children who could receive an education that wasn’t a prep school for prison. Let’s face it, that’s what most people seem to want from life, so it’s not fair that so many of them will never taste that ‘tasty slice of authentic history, with great transport links and fine schools’. At one time, maybe – just maybe – social mobility might have made the dream a possibility, however faint. But that kind of mobility has been slowing down to a stationary position in the last few years, and the likelihood of it gathering speed again, let alone hitting the M3, seems sadly remote.

© The Editor


Untitled

Who’d have thought it? Work and Pensions Secretary Iain Duncan Smith, arch-advocate of cutting benefits to the bone for the best part of a decade, has resigned on the pretext that the cuts to disability benefits proposed by the Chancellor in the Budget went too far. Yes, you heard right. I know it sounds about as plausible as Nick Griffin regarding Oswald Mosley as someone who was a bit extreme, but that’s what ‘the quiet man’ said in his resignation letter as he walked out of the Cabinet.

George Osborne had again exhibited his charmless talent for embodying the Nasty Party mantle that continues to plague the Conservatives when unveiling this week’s Budget. This time – surprise, surprise – the recipients of his purse-string-pruning belonged to one of the few sections of society that he and his spivvy cronies can’t make a profit from: the sick and the disabled. I’m sure I wasn’t the only one whose memories of a ‘Not the Nine O’Clock News’ sketch spoofing Chancellor Geoffrey Howe were evoked, the one where he announces taxes on wheelchairs, white sticks and guide-dogs, adding ‘I am deliberately targeting those who can’t fight back’. So far, so predictable – but wait! There are actually some Tories sitting in the House who didn’t endorse his proposed disability benefit cuts, some who don’t fit the born-to-rule profile, some who are decent constituency MPs concerned that the wrong people are being punished again, some who are even threatening to stage a backbench rebellion if Gideon attempts to force the measure through Parliament.

The backtracking has already begun, barely 48 hours after Osborne proclaimed the policy with his customary brand of misanthropic smugness; Education Minister Nicky Morgan – wearer of a curious expression that implies she’s being permanently goosed – has hastily stepped in to declare that Osborne’s Personal Independence Payment cuts were ‘just a suggestion’. Of course, Gideon has been here before – just last year, as a matter of fact. Remember his attempts to slash £4 billion from Working Tax Credits? That’s the one that was famously thrown out by the Lords and resulted in the Chancellor’s Autumn Statement being dominated by the humiliating abandonment of the idea. And if that concept was regarded as an attack on David Cameron’s favourite standby of ‘hard-working families’, how will this latest example of Osborne’s arrogance and conceit blinding him to his own miscalculations be welcomed?

One would expect the Opposition to oppose Osborne’s idea; it’s their job to do so, after all. Labour leader Jeremy Corbyn accused the Chancellor and his party of waging war on the disabled, but I doubt anybody would have anticipated less. However, the fury of disability campaigners – 25 charities have wasted little time in composing a joint letter asking the Government to think again – seems to be complemented by an unexpectedly sympathetic response to their concerns from within the Conservative Party itself.

Iain Duncan Smith, in his role as the man with whom the buck stops when it comes to benefit cuts, has responded to Osborne’s plans by suddenly agreeing with anyone in possession of a heart. ‘I have for some time,’ he writes, ‘and rather reluctantly come to believe that the latest changes to benefits to the disabled, and the context in which they’ve been made, are a compromise too far.’ For a man who has already overseen more than £30 billion in cuts to the welfare budget to exit government on such a pretext sounds a bit rich, yet Duncan Smith goes on to cite the unfairness of a Budget that benefits higher-earners and penalises those at the bottom. He knows he would have been in the firing line had these cuts been implemented and he also knows his position as a long-term Euro-Sceptic, in direct opposition to Osborne, would have rendered his post even more intolerable at such a politically perilous moment for Britain’s EU membership. Iain Duncan Smith has ironically quit on a day when Cameron and Osborne have quickly distanced themselves from these controversial proposals, but the fact that the quiet man hasn’t gone quietly is further evidence of Tory tensions as the EU Referendum edges closer.

For all IDS’s apparent U-turn on benefit cuts, one cannot but see this resignation in the context of the Brexit issue. It colours everything in Tory circles right now. One could even be cynical – perish the thought! – and suggest the backbenchers who oppose Osborne’s plans might just be doing so because Gideon represents the anti-Brexit faction and they’re making the most of every opportunity to give him a bloody nose.

George Osborne has gleefully promoted himself as the main man in the Remain camp along with scaremongering Dave, yet he increasingly seems to be going further out on a limb in a party that can call on some of its most prominent heavyweights to sell the opposing message. Another Budget cock-up is the last thing Gideon needed; that it has resulted in the voluntary exit of a man he hoped would deflect the vitriol of disability campaigners away from him is an additional blow that doesn’t bode well for his Prime Ministerial ambitions. If that’s the case, I suspect there won’t be a moist eye in the House.

© The Editor


Reaper

Yesterday it was Sylvia Anderson; today it’s the turn of Paul Daniels and Cliff Michelmore; last week it was George Martin and Keith Emerson. As far as famous names go, the one area of the economy currently experiencing a boom is undertaking. The Grim Reaper has seemingly never had it so good. A year only midway through its third month has seen a remarkably sweeping clear-out of characters from the entertainment firmament on a scale unseen since the mid-to-late 80s, when it felt as though every day saw the curtain come down on careers that had begun during the golden age of Hollywood. The 30s, 40s and 50s have been comprehensively covered and now the shadow of the scythe has fallen on the 60s and 70s.

What we have to remember, however, is that these two iconic decades are now forty and fifty years distant from the here and now, just as the 20s and 30s were when Charlie Chaplin and Groucho Marx died within a few months of each other in 1977. The silent era felt a long way away then because it looked very much like a different world, almost another century; but perhaps that was largely because those of us born in the post-war years had no first-hand memory of it. Then again, many aspects of the 60s and 70s still have a ring of present tense to them; look at any up-and-coming rock band (if there are any) and you will recognise the same sullen expressions, floppy hair and shoulder-shrugging sang-froid as emanating from the blueprint patented by the Stones in 1964. Indeed, any contemporary musician worth their weight in gold discs still has one foot in the age when music meant more than a ringtone. And if you’re that way inclined, look at young women on the move along the high-street; the length of the hair and the shortness of the hemline hark back to the decades in which their parents were children.

Any regular viewer of documentaries who has been recording broadcasts off-air since the 80s will experience an acute awareness of the passing years when coming across one taped twenty or thirty years ago. The interviewees often seem well-preserved in their forties when appearing as talking heads – a slight smattering of grey hairs, but jowls not yet sagging; it’s only when catching sight of them as they were a couple of decades back and then comparing them to their present-day selves that the ageing process really hits home. The sixties and seventies are the ages of man when time really begins to have fun with the face; but we have to remember that we too have aged during the same span. Perhaps their ageing acts as a trigger for reluctant realisations of our own.

The generation that grew up with the silver screen as the source of heroes and heroines will have undergone the same sensations, when their adolescent icons began dropping like flies, that the generation raised on a diet of television and pop is experiencing now that theirs are following suit. Although cancer and other incurable conditions have a lot to answer for, most of the giants still revered are approaching an age when natural causes will also begin to play a part. Moreover, there’s the dismal fact to contend with that few comparable successors have emerged to serve as compensation in recent years; it’s not much comfort to think we may have lost Bowie but we’ve still got Bieber.

Possibly what makes the latest crop of cultural cremations especially difficult to deal with is that the majority belong to the baby-boomer generation and therefore made their mark at a moment when thirty suddenly became a cut-off point. Previously, entertainers like Sinatra or Judy Garland reached a peak as performers when they were way past thirty; indeed, the former arguably made all of his finest recordings when he was in his forties. By contrast, there are few artists who appeared in the 60s or 70s who did much of any real value beyond thirty; all their best work was done when they were remarkably young. Ringo Starr was the eldest member of The Beatles when ‘Sgt Pepper’s Lonely Hearts Club Band’ was released, and he was just 26 at the time. Youth was the raison d’être of their era. Pete Townshend famously wrote that he hoped he’d die before he got old, but he wrote that from the perspective of a twenty-year-old for whom there was only the present, and he’s had to deal with the ageing process every time Roger Daltrey sings that bloody line ever since – even with his earplugs in.

In a way, the emphasis on youth and the writing-off of what comes next has condemned that group of performers to the unenviable task of attempting to live up to their younger selves for the rest of their performing lives, something their predecessors didn’t have to worry about; Sinatra’s only concession to the passing of the years was his toupee. Whenever Keith Richards is interviewed, nobody wants to know about some new Stones album; they want to hear him talk about Brian Jones or the 1967 Redlands bust or Altamont. The band’s creative spark was condensed into a period of roughly seven or eight years, which must be a curious situation to be in – always perceived by the public as the embodiment of eternal youthful rebellion when you’re now older than Noel Coward was back when you were at the peak of your powers.

2016 has already seen the loss of Pierre Boulez, Ed Stewart, David Bowie, Alan Rickman, Glenn Frey, Paul Kantner, Jacques Rivette, Frank Finlay, Terry Wogan, Harper Lee, Tony Warren, George Martin, Keith Emerson, Peter Maxwell Davies, Asa Briggs, Sylvia Anderson, Cliff Michelmore, Paul Daniels and Frank Sinatra Jr, to name just a few. Some were as old as 90, but none were under 65. Such is life – and death. And, to paraphrase Benjamin Franklin, the latter and taxes are the only certainties the world has to deal with.

© The Editor


Rowland

As far as serendipitous scheduling goes, the decision to repeat a documentary about the career of The Carpenters on BBC4 on Monday night was somewhat ironic. The sad story of Karen Carpenter’s surrender to eating disorders was screened on a day when veteran broadcaster and Labour Peer Joan Bakewell publicly apologised for comments she made on the subject of anorexia nervosa. I haven’t seen the original piece in which Dame Joan aired her opinions, though from what I can gather she suggested the sickness was a consequence of narcissism amongst adolescent girls. I presume the comments provoked the usual storm in the Twitter teacup, hence the climb-down by a woman from an era in which opinions were expressed without the need to retract them 24 hours later.

Without reading the original statement, it’s not fair to base one’s judgement on a hasty apology, though if Baroness Bakewell did indeed uphold the outdated notion that eating disorders are the province of white, middle-class teenage girls who have the ‘luxury’ of starving themselves to death, she should really have done her research first. Anorexia is a physical manifestation of mental illness, and like all mental illnesses, it doesn’t recognise race, class, wealth or social situation. I speak with a degree of experience in that I have suffered from eating disorders during large chunks of my adult life, and supermodels or social media had nothing to do with it.

For me, it all stemmed from puberty. Anyone old enough will recall ‘Row-land’ from ‘Grange Hill’, the bespectacled podgy schoolboy whose torment at the hands of resident bully Gripper Stebson was a long-running story in the early 80s. I was that boy! My fat year was 1980, and while I usually attribute the sudden weight gain to both the onset of puberty and the fact that my mother’s sandwich shop provided a steady stream of crisps and chocolate bars before tea, I can also see that a traumatic rejection at school probably played its part. If the common consensus is that only girls are bitchy little shit-stirrers, think again; I had three close friends who inexplicably ejected me from their group without any valid reason, and I was an overnight Billy No Mates for several months, which is pretty heavy going at 12.

Falling in with another crowd and then facing up to the fact that I had a weight problem was my salvation; but though I’d lost the excess plumpness by the time I entered my teens, the painful memory of being that roly-poly four-eyes never really left me. Around 15, I developed an obsession with David Bowie, to the point of putting his name on a school exercise book in place of my own and leaving a particular teacher to presume I was actually called David Bowie. Although this was the era of the healthy, tanned Bowie of ‘Let’s Dance’, I’d been seduced by his previous incarnations of Ziggy Stardust and the Thin White Duke and began to emulate their physiques.

This process continued even when the intensity of my Bowie fixation lessened, and it reached a peak (or low) in the early 90s when I remember once weighing myself and seeing I had shrunk to six stone, something I was both thrilled and scared by. I don’t think I’ve ever weighed myself since, as I don’t regard it as a particularly good idea for someone with a penchant for self-loathing. After that, I suspect my weight hovered around the eight-to-nine-stone mark, if the few photos from a three-year ‘drug’ period in the late 90s are anything to go by. The last fifteen years have been characterised by a lingering paranoia over any sign of perceived fatness, leading to prolonged bouts of what I call a ‘Stalin’ diet, imagining the menu from a Soviet gulag standing on the dining table. Bar one day a week, I still don’t have proper meals, tending to snack my way through the day. That’s how long this thing can remain a fixture in your head.

To restrict eating disorders to teenage girls from a specific social stratum suggests a lack of first-hand experience; yes, it is a common condition amongst that demographic, but not an exclusive one. Much as I would have fancied it, I never actually was a teenage girl. When Radio 4’s ‘You and Yours’ opened the topic up to listeners’ phone calls, it received stories of people in their 80s undergoing treatment for anorexia, as well as middle-aged men’s accounts of the illness. Whilst one or two praised Joan Bakewell for raising the subject and daring to link it to narcissism, the general consensus was that eating disorders spring from the polar opposite. A narcissist loves what he or she sees in the mirror; most anorexia victims detest their reflection, some even going to the extreme of removing every looking-glass in the house. It is true that one’s mind becomes detached from one’s body and it often feels as though it belongs to somebody else, somebody one hates. You take out your frustrations on them.

As some voices pointed out on the ‘You and Yours’ phone-in, controlling the flow of food through the mouth is sometimes the only degree of control one possesses when external events appear beyond one’s control. At times of personal crisis, having that solitary power to wield is a curious comfort, albeit a dangerously self-destructive one. Like most mental illnesses, each individual’s experience is unique to them, making the concept of a cure-all solution nigh-on impossible. Even if professional treatment can appear to put the brakes on what is perhaps the slowest form of suicide outside of alcoholism, it never entirely leaves you. If one is lucky, being able to control it supplants the sense of ‘control’ it generates at the height of its nihilistic powers.

© The Editor