DRINKING VIOLETS

Little girls and ladies who’ve been through ‘The Change’ – presumably the two female demographics the manufacturers of alcoholic drinks should henceforth aggressively target. Women of ‘childbearing age’ are now apparently verboten, at least according to the ever-dependable World Health Organisation, so sales of the most prominent ‘lady drinks’ are destined to plummet unless the prepubescent and postmenopausal are encouraged to swarm into their local off-licence. That’s right – the WHO didn’t say ‘childbearing women’, but ‘women of childbearing age’. As girls are able to become pregnant once they start riding the menstrual cycle and can pretty much keep popping them out until they hit the menopause, that’s a pretty wide area to make an alcohol-free zone.

The World Health Organisation hasn’t exactly covered itself in glory over the past eighteen months, so the timing of this bizarre recommendation seems especially odd, particularly when it stands to reduce the WHO to an even more contemptible laughing stock than it already is. The response to the proclamation has been pretty universally derisive; the WHO was accused of paternalism and sexism, and both charges seem fair. The ongoing infantilisation of women has taken numerous fatuous forms over the last few years, often emanating from a position of seeking to protect the precious little shrinking violets from the malevolent male of the species. However, it sometimes feels like the Suffragettes never happened, so patronising and Victorian have many of the proposals been, and this latest laughable WHO advice treats women like the archetypal ‘sickly child’ of the 19th century novel.

Mary Wollstonecraft, the 18th century author and thinker routinely (and rightly) cited as the Godmother of feminism, railed against the way in which young women continued to be treated as children both socially and legally in her landmark 1792 book, ‘A Vindication of the Rights of Woman’. Her passionate and groundbreaking work is an ideological foundation stone unearthed during each successive feminist wave, yet were she around today Wollstonecraft would see in this WHO recommendation precisely the same condescending tone women of her era were confronted by whenever they sought to assert any sort of independence befitting a fully-grown adult. Amidst increased marginalisation by the loud, screeching voices of trans-activism and the capitulation of institutions, public bodies and the corporate world to this unhinged take on biology, women are now being informed that their childbearing years – essentially the prime years of their lives – should be years of teetotal temperance, presumably so they can perform their sole duty as breeding machines.

Almost 30 years ago now, a friend of mine who was a smoker didn’t pack in the habit during her first pregnancy; the baby was born healthy and the impact of cigarettes on the womb appeared to be nonexistent. Around a decade or so later, a friend of a friend who also smoked when pregnant often spoke of the ‘dirty looks’ she received if lighting up in public while carrying such a prominent bump. Move on another decade and a bit and it’s hard to imagine a woman having the nerve to grab a quick fag in private when with child, let alone in public. My point is that smoking during pregnancy is now such a social black mark against the mother-to-be that it has practically been outlawed. Drinking when pregnant doesn’t provoke quite the same horror in the observer, but it’s still regarded as ill-advised and a sign of potentially bad parenting. The WHO proclamation unsurprisingly references this, recommending that ‘appropriate attention’ should be given to the prevention of drinking ‘among pregnant women’ – which is what you would expect them to say – but then adds the more contentious inclusion of ‘women of childbearing age’.

Christopher Snowdon from the Institute of Economic Affairs didn’t mince his words. ‘This is classic World Health Organisation idiocy,’ he said. ‘Not content with repeatedly dropping the ball on Covid-19 and dishing out awards to politicians for banning vaping, it now thinks most of the world’s women should abstain from alcohol. The idea that it is unsafe for women of childbearing age to drink any alcohol is unscientific and absurd. Moreover, it is none of the WHO’s business.’ One wonders if any of the experts who put this WHO recommendation together are mothers of young children for whom a glass of wine at the end of a stressful day is such a vital shot of medicine that it should probably be available on prescription. Even the chief executive of Alcohol Change UK, whilst sticking to the ‘drinking when pregnant is bad’ narrative, was critical of the WHO advice. ‘Drinking alcohol in the early stages of pregnancy…can be very damaging for a foetus,’ said Dr Richard Piper before going on to add that it was ‘vital we balance this against each adult’s right to make informed decisions about what we do with our bodies, no matter our age or sex.’

Joining in the chorus of disapproval was the Portman Group, which regulates alcohol in Britain. ‘We are extremely concerned by the WHO calling on countries to prevent drinking among women of childbearing age in their latest action plan,’ said chief executive Matt Lambert. ‘As well as being sexist and paternalistic, and potentially restricting the freedoms of most women, it goes well beyond their remit and is not rooted in science. It is wrong to scaremonger in this irresponsible way and associate women’s alcohol-related risks with those of children and pregnant people.’ He could have done without saying pregnant ‘people’ – the word ‘women’ would have sufficed; but the fact even organisations like the Portman Group and Alcohol Change UK have reacted in such a manner perhaps shows what an own-goal this WHO ‘action plan’ really is.

A current storyline on ‘The Archers’ concerns the alcoholism of young mother Alice Aldridge; the character drank during pregnancy and the baby was born prematurely, thus reinforcing the public health edict that drinking when pregnant can be damaging for one’s baby. Had the WHO’s ‘global alcohol action plan 2022-2030’ concentrated on that as well as the children and teenagers it also mentioned as groups who should be dissuaded from hitting the bottle, few would’ve batted an eyelid. However, to include women alongside babies, kids and teens – regardless of whether or not they intend to have a family – seems to bracket women back in the same infantile limbo they occupied during Mary Wollstonecraft’s lifetime.

NHS advice on alcohol consumption is awash with the familiar language of ‘units’, recommending that not exceeding 14 of them a week is the act of a responsible drinker. Apparently, that translates as ten small glasses of low-strength wine or half-a-dozen pints of average-strength beer. A recent study revealed binge-drinking remains an issue for one in three adults; despite regular claims that today’s adolescents spurn the practice in comparison to their predecessors of 10-20 years ago, it would seem the grownups still like a binge – particularly those at the extreme ends of the income scale. ‘Highly-educated women’, AKA the middle-class Alice Aldridge types, are also cited as being most at risk. Were the World Health Organisation not possessed by the same crusading moralistic zeal that appears to afflict every institution with a remit for improving public health, maybe people could actually be persuaded to alter their more unhealthy habits; as it is, by overreaching that remit and extending even further into the private sphere, any sensible suggestions are lost amidst the anger and derision this latest WHO missive deserves.
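For anyone who fancies checking that arithmetic, here is a minimal sketch of the standard UK unit sum (strength in ABV% multiplied by volume in millilitres, divided by 1,000); the glass and pint sizes and strengths below are illustrative assumptions rather than official NHS serving definitions.

```python
# UK alcohol units: ABV (%) x volume (ml) / 1000
def units(abv_percent: float, volume_ml: float) -> float:
    """Return the number of UK alcohol units in a single drink."""
    return abv_percent * volume_ml / 1000

# Illustrative assumptions: a 125ml glass of 11% wine, a 568ml pint of 4% beer
small_wine = units(11.0, 125)   # ~1.4 units
pint_beer = units(4.0, 568)     # ~2.3 units

print(f"10 small glasses of wine: {10 * small_wine:.1f} units")  # ~13.8, roughly the weekly 14
print(f"6 pints of beer: {6 * pint_beer:.1f} units")             # ~13.6, roughly the weekly 14
```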

© The Editor

THE FAT OF THE LAND

The great recruitment programme for the Boer War at the end of the nineteenth century was the first eye-opener for the British Army as to how the nation’s diet had substantially altered in an extremely short space of time. From a mid-Victorian population that recent research has shown was healthier than any generation since, the health of England’s cannon-fodder had been ruined by food imports from the colonies; salt-heavy tinned meat, syrup-heavy canned fruit and sugar-laden condensed milk had served to wreck the iron constitution of John Bull. A different kind of diet, though no less damaging, was exposed this week following emergency surgery on a defector from North Korea, surgery that revealed a body riddled with grisly parasites.

Apologies if you’re eating as you read this, but the defector – also a military man – was operated on in Seoul to repair injuries sustained during his escape from South Korea’s neighbour. One parasitical worm removed from the injured man was 27cm long, extracted from his digestive tract by a surgeon claiming to have only ever come across such internal infections in medical textbooks before. One would assume a major qualification for joining any army is to have an above average level of physical fitness, so if this soldier is in such bad condition, what does that imply about the rest of the North Korean people?

Nutrition and hygiene in North Korea have long been suspected of being pretty appalling, though the closed shop the country remains has prevented any sustained study of the nation’s diet. Most of the conclusions made by outsiders are dependent upon examinations of recent defectors, and the kind of parasites discovered during the operation on the latest escapee were apparently commonplace in South Korea half-a-century ago until economic improvements all-but wiped them out. Again, apologies are in order if you’re perusing this post with your egg & chips, but some believe the use of ‘night soil’ (i.e. human excrement) as fertiliser in North Korea could have a lot to answer for. The drying-up of state-supplied chemical fertiliser from the 90s onwards has resulted in this desperate scenario, encouraged by the far-from-malnourished Kim Jong-un, a man who probably doesn’t have to eat his own shit.

Corn was also prevalent in the soldier’s stomach; more and more North Koreans are dependent on cheap imported corn from China (49,000 tonnes this year so far) following a series of droughts in the country. The scraps of info available, such as that supplied by the World Food Programme, paint a bleak picture of a populace decimated by drought, famine and a totalitarian regime viewing it as utterly dispensable. According to the WFP, North Koreans are on average 5 inches shorter and 15 pounds lighter than their South Korean counterparts due to decades of poor diet with a distinct absence of protein and fats; a quarter of pre-school children are estimated to suffer from chronic malnutrition. The contents of the escaped soldier’s stomach appear to serve as evidence of what a lifetime of a limited diet imposed by Government can do.

Of course, the West’s health worries are of a different nature; unlike North Koreans, we have an abundance of choice, albeit both good and bad. The plague of obesity may contrast sharply with the widespread malnutrition in Kim Jong-un’s backyard, though even the relatively recent upsurge in home-grown fatties is nevertheless something we’ve been sliding towards over the last affluent hundred years. It can be traced all the way back to the point in the nineteenth century when processed sugar and salt-based foodstuffs superseded the previous dependency on fresh veg, fruit, fish, eggs and nuts. The impact of just one generation hooked on such a diet was evident to doctors examining volunteers for the Boer War, just as any exploitative Channel 5 documentary about ‘Britain’s Fattest Bastard’ today shows how dangerously pivotal the innovations of the late Victorian dinner-table have become to the twenty-first century appetite. Ironically, Kim Jong-un has the kind of physique more characteristic of the West than the Far East, though he (like us) has the choice to overindulge if he so wishes.

However, whilst the imposition of physical ill-health via the portly gangster running North Korea may be unique to dictatorships, the mental malnutrition that goes hand-in-hand with it isn’t. A nation such as ours might be able to boast a higher standard of living for its people than North Korea, though the austerity measures of the past seven years, which have hit the poorest hardest, have long been linked to the increasing tendency of more people than ever to prop themselves up with antidepressants. A new report even attributes 120,000 deaths to Tory policies since 2010. Mortality rates declined steadily between 2001 and 2010, but the authors of the study claim this trend has been reversed from the Coalition onwards, with more than 45,000 more deaths than anticipated during the first four years of Dave’s stint at No.10 as funding for health and social care fell in real terms.

It’s hardly rocket science that if healthcare provision is underfunded, those most reliant on it are at greater risk of their lifespan being reduced. The social care budget between 2010 and 2014 dropped from 2.20% to 1.57%, and the spending constraints then coincided with a sudden rise in the death rates. One of the paper’s authors referred to austerity policies as ‘a public health disaster. It is not an exaggeration to call it economic murder’. Critics have described the conclusions drawn in the study as ‘speculative’, though I often marvel at the fact that the entire population hasn’t formed an orderly queue at Beachy Head, considering the increasing paucity of reasons to keep buggering on. Then again, at least we’re not living off ‘night soil’. Yet.

© The Editor

https://www.amazon.co.uk/Mr-Yesterday-Johnny-Monroe/dp/154995718X/ref=sr_1_1?s=books&ie=UTF8&qid=1510941083&sr=1-1

OBESITY ROLLERS

The votes have been counted and verified, and the pies have been eaten; the results can be announced! Yes, the latest statistics reveal England’s leading Fatty Town is Rotherham, followed by another South Yorkshire Area of Outstanding Natural Beauty, Doncaster; hot on their heels is Halton in the North West. Waddling its way towards the top spot, Rotherham can boast more fatties amongst its population than anywhere else in the country; just under half of the town’s entire population are overweight, with 32.6% classed as obese.

Apparently, as much as a quarter of the population of the entire UK are now obese, but even though 40.4% of the English are overweight and 24.4% obese, England isn’t as abundant in blubber as Scotland, according to the NHS; last time the NHS looked, 27% of Scots were obese, with Wales ranking at 22%. The figures that awarded Rotherham the unenviable heavyweight crown were collated between 2013 and 2016, though England’s fattest geographical region is the North East; 41.5% of Tyne, Tees and Wear-siders are overweight, whereas 27.1% fall into the obese category. The Yorkshire & Humber region runs a close second, with the East Midlands behind that.

In a sense, the statistics paint a wider picture of what our own eyes are telling us whenever we’re out and about. A couple of times this week, I’ve been sat in a friend’s car whilst she’s popped into a shop on a retail park, and as I observed passers-by, I reckoned well over half of them were what could genuinely be called fat. When I was a child, fatties weren’t unheard of; most classes at school had the token fat kid, and most of us had a fat uncle or aunt; but there’s no doubt they were a far rarer sight than today, almost something of a novelty.

The blame game is inevitable when such a dramatic alteration to the national character as this occurs. From my own childhood experience, I know instant and frozen foods certainly existed, though they co-existed with meals consisting of fresh vegetables and the dreaded ‘greens’ that had such an unsavoury reputation. With parents raised on the legacy of wartime digging for victory and grandparents still possessing vivid memories of days that might go by without any food whatsoever, it was no wonder the importance of greens and meals cooked from scratch remained high. This thinking also extended to school dinners; but in order to make the far-from desirable recipe of cabbage, beetroot, spam fritters, lumpy mash and hard peas remotely tolerable, the prize at the end of this gastronomic obstacle course was a pudding bathed in custard, awarded to everyone who managed to grin and swallow their way through the first course, and washed down with that most basic of table wines – water.

Anyone of a certain age will recall that the chocolate bar Milky Way used to be advertised as ‘The sweet you can eat between meals without ruining your appetite’, with the emphasis on can. This carried clout with kids of my generation; if this statement was broadcast on TV, then it had to be true – right, mum? Therefore, it’s okay to guzzle one before teatime, yeah? It was also a canny tagline by the manufacturers because eating between meals was so frowned upon at the time that a chocolate bar sold as a sweet that wouldn’t interfere with the compulsory cleaning of the plate might just be a smart way round the unwritten rules of the nation’s children’s diet.

It’s no wonder the corner-shop did a roaring trade in penny sweets both on the way to and on the way back from school. Such cheap confectionery was within the budget of most kids (even those who helped themselves when the newsagent fatally turned his back) and wasn’t considered substantial enough to damage appetites for the next meal. The main accusation levelled at sweets was that they rotted your teeth if taken to excess, so most parents tolerated them as long as they were consumed in moderation. A proper chocolate bar boasting a big brand name or even a packet of crisps was a little pricier and therefore had an air of ‘treat’ about it, something one could look forward to perhaps once a week, though not much more often than that. They even used to print the actual price of the item on the wrapper then, as if to emphasise the gulf between it and the more accessible penny varieties stocking the shelves.

The sudden colonisation of the country by the burger-bar, something that seemed to happen from the second half of the 1980s onwards, is regularly blamed as the biggest cause of rising child obesity, and there’s no denying the proliferation of such fast-food quick-fix solutions to the headache of being a weekend dad hasn’t helped. But the collapse of the old system when it comes to a daily dietary regime probably has more to answer for than the cure-all option of a Big Mac & Fries – specifically, the gradual abolition of the not-eating-between-meals rule. Many of today’s parents had their childhood eating habits governed by the old order, yet unlike their own parents, have decided not to impose it on the next generation, instead turning their kitchen cupboards into an all-you-can-eat buffet. They no doubt blame McDonald’s, or the electronic gadgets that keep their kids indoors even when the weather is ideal for playing-out. But they should really look a little closer to home, and in the mirror.

Outside of the actual food consumed, the subject of exercise is also unavoidable. One wonders how much of an impact the selling-off of school playing fields to developers and the cutting of extracurricular sporting activities have had, let alone the establishment of ‘the school run’, whereby walking to and from school has been superseded by the internal combustion engine. Throw in the reluctance of parents to let their children loose come summer holidays for fear of the prowling Paedo and it’s no wonder their offspring are waddling as much as their parents are.

Food that is deemed good for you today – sugar-free, organic and free of artificial colouring – is expensive and therefore only within the regular budget of the relatively affluent, whereas food that is deemed bad for you – loaded with sugar, salt and all those other tasty ingredients that clog-up arteries – is not only affordable for those on low incomes, but also more readily available. That the South East and London register at the bottom of the obesity chart speaks volumes, but idleness and ignorance play their part too. It is possible to eat healthily on a tiny budget; cooking healthy food is as cheap an option as opting for a pre-packaged and processed ready-meal crammed with chemicals, though why that message is failing to get through could be down to simple laziness. I myself purchased some broccoli and a courgette this morning, costing less than a quid. Carrots, cabbage, onions, lettuce and the rest remain cheaper than any packet of pound shop frozen plastic; but you can’t just bung them in the microwave for ten minutes. Say no more.

© The Editor

HEALTHY, UNWEALTHY AND WISE

Health-wise, the Victorian era tends to throw up images of malnourished urchins with rotten teeth and rickets living on hard bread and mouldy cheese (if they were lucky) after a hard day’s work up a chimney. Not that this wasn’t the case where many of the working poor were concerned, but as a dietary portrait of an entire society it can not only have the effect of imbuing smugness and superiority in the Victorians’ descendants; it can also prove highly misleading, as has been pointed out in ‘How the Mid-Victorians Worked, Ate and Died’, a new study published in MDPI’s International Journal of Environmental Research and Public Health.

The Mid-Victorian period roughly covers 1850-75 and the MDPI report claims the generation that lived through that quarter-century was the healthiest this country has probably ever had in its history, a statement which certainly contradicts received wisdom. The notoriously high mortality rates in infancy could often be the greatest challenge to the prospect of a long life, but if one could make it to the age of 5, the compilers of the study say life expectancy was more or less the same as, if not better than, it is today; and contrary to popular belief, we live on average no longer than they did.

Most fatalities in Mid-Victorian society were due to infections that a combination of improved sanitation and modern science has now rendered non-fatal if treated early enough; workplace accidents were a far higher cause of death as well, though the chronic degenerative diseases that are such prominent killers today were as much as 90% less prevalent than they are now. And for all the moral panic over drunkenness amongst the poor, the alcoholic content of beer (the most commonly consumed drink during this period) was lower due to it being watered down to an extent that no publican could get away with in the twenty-first century. Even cancers were far rarer, especially those of the lung variety, as the Mid-Victorian era predates the mass industrial production of cigarettes.

The more physical nature of work played its part in the health of the Mid-Victorians as much as what they ate; unlike 2016, very few professions consisted of sitting at a desk all day. They may have ingested between 50% and 100% more calories than us, but they burned them off through work; obesity was associated with the idleness of the wealthy and virtually unknown amongst the working-classes. Public transport was threadbare in comparison to now (not to mention pricey), with the majority walking to and from their workplace, something Dickens vividly described when witnessing the march of the workers at the crack of dawn. Overall, physical activity far exceeded levels we indulge in today, a factor that undoubtedly contributed towards the healthier condition of the Mid-Victorians.

The Mid-Victorian diet in the study is compared to the Mediterranean diet and is regarded as superior to any government dietary recommendations issued now, let alone what the populace actually eats in the twenty-first century. Most fruits and vegetables were affordable for even the poorest households, largely due to the growth of the railways, which enabled food to be delivered to markets and shops in far greater quantities. Onions, leeks, watercress, carrots, cabbage, turnips, peas, beans, artichokes, apples, plums, cherries and gooseberries were commonplace and cheap. Nuts, particularly chestnuts, were consumed with more regularity than they are today, whereas the presence of backyard hens provided a constant supply of eggs. The nature of the meat eaten back then may not appear especially appetising now (on the bone with accompanying offal), but it would seem the large amounts of fish and seafood that constituted the diet also aided good health.

Bereft of margarine and processed foods, and with a lower salt intake, the Mid-Victorians were of a sufficiently healthy constitution to power the engine of Empire as well as provide the armed forces with fitter men than ever. Ironically, Britain’s global dominance made it a target market for edible imports that began to flood into the country towards the end of the nineteenth century. Tinned meats were salt-heavy, canned fruit was syrup-heavy, and condensed milk was laden with sugar. It was the sudden increase in sugared foods that sowed the seeds of decline in this brief period of good health, rapidly ruining teeth to the extent that many of the foods previously consumed in large quantities could no longer be eaten. This decline, which was so evident when men were being recruited for the Boer War at the turn of the twentieth century, served to create the popular image of the undernourished working poor that we now tend to associate with the whole of the Victorian era.

The MDPI report concludes that today’s intensively grown crops are less beneficial than the organically grown fruit and veg the Mid-Victorians ate, that their meat was all free-range, and (to get technical for a mo) that their diet contained ‘pharmacological levels’ of phytonutrients offering effective protection against the cancers, heart diseases and other degenerative disorders so abundant today. It makes for surprising, sober and fascinating reading. Not only does it shine a fresh light on an era retrospectively (and, it would seem, erroneously) regarded as a low-point in public health; it also makes one realise that twentieth and twenty-first century advances in medical and pharmaceutical science are only ever a stay of execution when we’re being dished up crap.

© The Editor