I recently visited a friend whose Edinburgh street hosts a free library comprising two small shelves of books, protected by a wooden door, bolted to railings. Anyone can borrow or donate books, and I felt like browsing. Both shelves were crammed with paperbacks, and the previous donor had wedged their hardback into the cupboard, supported only by the closed door. When I opened it, An Illustrated Anthology of Erotica (Little, Brown and Co, 1992) fell into my arms.
On the cover, directly beneath a large-font ‘EROTICA’, was Heinrich Lossow’s painting of Leda and the Swan. Leda hadn’t gone to the riverbank to feed the ducks; that was clear. So too was the fact that I hadn’t heard the approaching footsteps of an elderly gentleman of military bearing. His look of disdain as I spun around — affording him the chance to register ‘EROTICA’ above the recumbent Leda in my trembling hands — was withering, and I scurried off, unwilling to wrestle the troublesome anthology back into place.
“Mmm,” was my wife’s unenthusiastic response to this true account, and ‘mmm’ is my unenthusiastic response to this anthology; the visual equivalent of a symphony played on one note. Having plunged in, as it were, to emerge with my eyes duly boggled, I turned to the foreword, which quotes the German art collector Eduard Fuchs (1870-1940), who claimed that “eroticism lies at the root of all human life”. Really? In that case, I can’t help feeling how dull I am.
When does eroticism become pornography? Possibly when subdued lighting yields to floodlit spectacle. And pornography is about more than sex, otherwise there wouldn’t be a demand, say, for a dominatrix in a jacuzzi full of custard.
But despite the absurdity of blank-eyed purveyors of customised — even ‘custardised’ — sex, pornography is no joke. For example, in Raymond Cleaveland’s Pornography and Priestly Vocations (www.catholicculture.org/culture/library/view.cfm?recnum=6182), his tale of porn-drenched priests includes the observation: “Like never before in the 2,000-year history of the Roman Catholic Church, the young men called to the priesthood today have been immersed in a pornographic world from the day of their birth.” Unsurprising perhaps, since as Shakespeare’s character Angelo demonstrates in Measure for Measure, nothing corrupts like virtue.
And if, as Fuchs notes above, eroticism does lie at the root of all human life, then there’s the risk of courting a sexually transmitted disease, especially among those who cavort on computer screens. For instance, when Hill et al considered ‘Condom use and prevalence of sexually-transmitted infection (STI) among performers in the adult entertainment industry’, their results, published in the International Journal of STD & AIDS (2009, 20: 809-810), showed that men who had sex with men (MSM) reported higher rates of condom use “than either women or heterosexual men (25 per cent, versus 0 per cent, versus 7 per cent, respectively)”. There was a high prevalence of STIs among the study group, who had attended the STI clinic over 400 times. Thus, eight MSM had a total of 123 STIs, 14 heterosexual men had 135 STIs, and 15 heterosexual women had 22 STIs.
Given examples such as these, let us leave aside the question of morality and ask whether pornography is a public health problem, as the US state of Utah declared in 2016. Perhaps the Utah state legislators had read Dr Mary S Calderone’s ‘Pornography as a public health problem’ in the American Journal of Public Health (March 1972, 374-376). Calderone complained that our capacity to judge when the expression of eroticism is appropriate “is so poorly developed in such a large proportion of individuals that the term ‘pornography’ begins to be applied with faulty judgment — one might almost say — promiscuously”.
But when it comes to determining what is eroticism and what is pornography, who, asks Calderone, “is to arbitrate and adjudicate the question, and on what basis?”
The answer appeared to be public health physicians. Dr Calderone, incidentally, was the Executive Director of the Sexuality Information and Education Council of the United States, which she founded in 1964.
Which dovetails into Prof Irving Zola’s essay Healthism and Disabling Medicalisation, where he cites Dr P Henderson’s 1971 address to the British School Health Service Group. Henderson called for school health workers’ involvement in a series of “health problems”. Zola cites 10, including maladjustment, juvenile delinquency and children in care. “One wonders,” asks Zola, “who or what is left out?” Pornography, I suggest.
My inclination is to favour the views expressed by David Boaz, writing in the Business Journal (22 July 2016), whose title asserts that “Porn is not a ‘public health crisis’”.
He warns that bureaucracies are notorious in their wish to expand: “So, true to form, the public health authorities have broadened their mandate and kept on going.” And it seems to me that the exclusion of a pornography subtype of hypersexual disorder from the DSM-5 was probably a close-run thing.
Erotica or pornography? Either way, it seems to me that sex is about a feeling… which a tastefully-illuminated dominatrix in a jacuzzi of custard will never evoke.
“Everyone needs an editor,” observed London-born American writer Tim Foote (1926-2015), pointing out that Hitler’s original title for Mein Kampf (1925) was Four-and-a-Half Years of Struggle against Lies, Stupidity and Cowardice.
Leaving aside the thought that what Hitler needed more than an editor was an early demise, his rise to power and the collapse of the Weimar Republic enfeebled Germany’s hitherto vibrant medical press. For example, in his Racial Hygiene: Medicine Under the Nazis (1988: p70), Robert N Proctor notes how, in December 1933, Germany’s leading medical journal, the Deutsches Ärzteblatt, drooled over the benefits conferred by Nazism: “Never before has the German medical community stood before such important tasks as that which the national socialist ideal envisions for it.”
This “national socialist ideal” included human experimentation of appalling depravity. Decades later, the ethical backwash of those crimes stimulates contemporary debate in the biomedical community. For example, Dr Robert L Berger, in the New England Journal of Medicine (17 May 1990), considers ‘Nazi Science — The Dachau Hypothermia Experiments’, finding that some scientists insist on banning citations of these tainted data, while others advocate their dissemination, claiming that they might save lives.
Yet it would be a mistake to assume that the lessons of publishing and/or citing ethically-suspect experimental results were heeded by those who exercised medical editorial clout in the post-war era. For example, published results acquired from the infamous 40-year Tuskegee Syphilis Study in Alabama evidently met editorial and ethical standards of scrutiny.
But in what other ways might patients’ lives be risked for want of moral and editorial rigour? In a recent paper in the Journal of Medical Ethics, the philosopher Prof Thomas Ploug asks: ‘Should All Medical Research be Published? The Moral Responsibility of Medical Journal Editors’, contemplating the permissibility of conducting research but not publishing results. One example Ploug cites in support of his contention that publishing certain studies may harm patients is the analysis by Abramson et al in the BMJ (22 October 2013), titled ‘Should People at Low Risk of Cardiovascular Disease Take a Statin?’, where they note that “[t]he side-effects of statins… occur in about 20 per cent of people treated… ”
Ploug cites one study’s claim that “as many as 200,000 people in the UK have stopped taking statins” following media coverage, apparently increasing the likelihood of harmful effects to these people.
Ploug’s faith in editors’ ability to foresee “the potential harmful effects of publishing research” is evident because “[m]ost if not all editors have a background in research and therefore must be expected to be able to understand and critically engage with the content of research publications. And, very importantly, they are aided in these efforts by the reviewers, who may have a firmer grasp of specific areas within a particular field of medical research.”
So, did the editors/reviewers of the British Journal of Sports Medicine (online 21 January 2018) act irresponsibly in publishing ‘Statin Wars: Have we Been Misled About the Evidence? A Narrative Review’, a challenge to the efficacy of statins from Dr Maryanne Demasi, stating that “[d]octors and patients are being misled about the true benefits and harms of statins, and it is now a matter of urgency that the raw data from the clinical trials are released”?
Conversely, the notorious Naudé review raises questions about the extent to which certain editorial boards and reviewers are “able to understand and critically engage with the content of research publications”. In July 2014, the journal PLoS One published Naudé et al’s ‘Low Carbohydrate versus Isoenergetic Balanced Diets for Reducing Weight and Cardiovascular Risk: A Systematic Review and Meta-Analysis’. It reported that when the energy consumed by people following low-carbohydrate and balanced diets was similar, there was no difference in weight loss. This conclusion was crucial in Prof Tim Noakes — a leading promoter of low-carbohydrate, high-fat diets — being charged with “disgraceful conduct” by the Health Professions Council of South Africa two months later; a charge which he defeated… twice.
In ‘Lore of Nutrition: Challenging Conventional Dietary Beliefs’ (2017) by Noakes and Sboros, Noakes describes how his and UK researcher Dr Zoë Harcombe’s reanalysis of the review uncovered 15 material errors. Harcombe and Noakes corrected these errors, repeated the meta-analysis and found that the lower-carbohydrate diet “produced significantly greater weight loss than did the balanced diet”.
Harcombe and Noakes reported their findings in the South African Medical Journal (2016, 106: 1179-1182), asking whether the Naudé review was “mistake or mischief?”, with Noakes commenting in ‘Lore of Nutrition’ (p 128) that “the reluctance of the editors of PLoS One to properly investigate the nature of the material errors raises questions of who the journal is protecting, and why”. To date, the Naudé review remains unretracted. A correction that was issued on 2 July 2018 fails to address many remaining substantive issues raised by Harcombe and Noakes.
Scientific facts and human values are intertwined, entailing a moral responsibility on researchers and editorial boards to promote a spirit of genuine enquiry and constructive criticism based on ethically-acceptable content, not on unsubstantiated opinion or authorial reputation.
Many years ago, we lived in London’s Braemar Avenue, where Ginger Baker of the super-group Cream had stayed in the 1960s. Despite my fondness for Ginger’s drumming, it was another resident — our local GP — who eclipsed Braemar Avenue’s rock god.
Impeccably clad, often in a Marengo dinner suit, white shirt, bow tie and cummerbund, this fine Asian gentleman received patients in his well-appointed consulting room.
The atmosphere evoked Edwardian Bloomsbury, not 1970s Neasden, and one felt less like a patient than a member of an establishment club. “So, how’s that cyst doing, hmm?” he might ask in his received-pronunciation English, reclining in a leather-upholstered chair and rolling an unlit cigar between manicured finger and thumb. We could have been dining in Belgravia, recalling an Old Etonian chum, now administering a distant outpost of Empire. The last thing our immaculately-tailored GP looked at was his watch. Consultation over, diagnosis made, treatment finalised, one emerged to find an orderly queue, its members drawn to his unhurried temperament and determination to treat patients, not diseases.
So how to respond to Cohen et al’s recent ‘An observational study of patients’ attitudes to tattoos and piercings on their physicians: The ART study’ in the Emergency Medicine Journal (doi:10.1136/emermed-2017-206887)? This American study of over 900 emergency department patients concluded: “In the clinical setting, having exposed body art does not significantly change patients’ perception of the physician.”
Well, if I were admitted to an emergency department with a cardiac arrest or a smashed kneecap, I would uncomplainingly endure the ministrations of a medic as ‘inked’ as a tugboat captain and with ornamental ironmongery dangling from his — or her — ears like carabiners from a mountaineer’s hip. But faced with such an individual in the relatively relaxed surroundings of a consulting room, I might be less interested in what’s causing my gall bladder to swell like a wind-sock in a typhoon, and more interested in why, precisely, my doctor has tattooed on his — or her — forearm a shotgun-toting chipmunk.
In these ‘anything goes’ times, many would assert that it’s none of my business if a physician chooses to heal the sick with the Gettysburg Address tattooed down one side of his — or her — neck. Yet for all I know, beneath his impeccable Savile Row threads, Braemar Avenue’s star GP of the 1970s might have sported the Battle of Thermopylae all over his back; although my impression was that no tattooist’s ink had despoiled his skin. And one’s impressions help determine the success of a medical consultation. For example, writing in Clinical Paediatrics (2016, 55: 915-920), Johnson et al considered the ‘Adverse effects of tattoos and piercing on parent/patient confidence in health care providers’. In this American study, 314 voluntary participants were shown photographs of tattooed and non-tattooed practitioners, both with and without facial piercings. The participants “rated tattooed practitioners with lower confidence ratings when compared with non-tattooed practitioners and reported greater degrees of discomfort with greater degrees of facial piercing”.
And would you blame me if I were to hesitate before giving a surgeon — whose gamboge-hued bicep portrayed allegiance to ancient human-sacrificing Aztecs — the free run of my innards for a few hours of an afternoon? I only ask having been given cause to speculate that the psychological landscapes of some individuals who opt for tattoos might contain one or two darker contours than those who choose to remain unadorned. This was contemplated by Dr Viren Swami, whose ‘Written on the body? Individual differences between British adults who do and do not obtain a first tattoo’ appears in the Scandinavian Journal of Psychology (2012, 53: 407-412). His investigation of 136 British residents who visited a tattoo parlour found that “… compared to individuals who did not subsequently obtain a tattoo, individuals that did were significantly less conscientious, more extraverted [sic], more willing to engage in sexual relations in the absence of commitment and had higher scores on sensation-seeking”.
All in all, I prefer my physicians not to be inked… at least not so that I can see, but nor do I expect them to achieve the sartorial heights attained by Braemar Avenue’s star GP (although his ability to express himself clearly in plain English is a gift that has deserted many present-day practitioners and is one to be prized). However, I do recommend the findings of Dr Selena Au, who considered ‘Physician attire in the intensive care unit and patient family perceptions of physician professional characteristics’ in JAMA Intern Med (2013, 173: 465-467) and found that most respondents “indicated that it was important for physicians to be neatly groomed, be professionally dressed and wear visible name tags, but not necessarily a white coat”.
To many, I suspect, such strictures may represent a reluctance to change with the times; and given the grave nature of these times, a tattooed medic is surely a trivial matter. Indeed, it is… which is my point. As Howard Jacobson once observed in the context of sexism: “To trivialise is also to dishonour.”
A tattooed medic dishonours the star GP of Braemar Avenue.
In his novel The Doctor (1812), Robert Southey warns: “Beware of those who are homeless by choice.” By way of context, Southey maintained that a man who cared no more for one place than another was someone “who loves nothing but himself”. Southey was long dead by the time George Price (1922-1975) came along to disprove his assertion. Price, an evolutionary biologist, was born in America, lived in England and far from loving nothing but himself, arguably loathed nothing but himself, finally taking a pair of nail scissors to his throat in a London squat.
Trained as a chemist, and with a Harvard doctorate, Price worked on equations first devised by William Hamilton, the founding father of sociobiology, who discovered a mathematical formula suggesting a genetic basis for human idealism. Price showed that while self-sacrificing behaviour exists among animals and humans, it is a behaviour shorn of nobility: Only altruism which spreads the genes that cause it can achieve long-term survival.
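For readers curious about the mathematics alluded to here, the covariance relation Price derived — now known as the Price equation, and not quoted in the column itself — is conventionally written (in one standard form, a sketch rather than Price’s exact 1970 notation) as:

```latex
% Price equation: partitions the change in a population's mean trait value
% \bar{z} across one generation into a selection term and a transmission term
\bar{w}\,\Delta\bar{z}
  = \underbrace{\mathrm{Cov}(w_i, z_i)}_{\text{selection}}
  + \underbrace{\mathrm{E}\!\left(w_i\,\Delta z_i\right)}_{\text{transmission}}
```

where \(w_i\) is the fitness of individual \(i\), \(z_i\) its trait value (say, degree of altruism), and \(\bar{w}\) the mean fitness. The covariance term is what makes Price’s bleak insight explicit: a self-sacrificing trait persists only insofar as it is statistically associated with the propagation of the genes that cause it — exactly the point summarised above.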
This insight plunged Price into a malignant sadness, and depression ensued. In 1972, in a letter to biologist John Maynard Smith, with whom he was preparing a paper for Nature, Price wrote: “I am now down to exactly 15p and my visitor’s permit for staying in the UK expires in less than a month.” Yet he remained optimistic, buoyed by the knowledge that there were baked beans in the fridge. Embracing the unconditional altruism embodied by The Good Samaritan, he cleaved without equivocation to the motto: “Sell all you have and give to the poor.” When Price helped tramps, alcoholics and assorted outcasts, they stole his money and he fled to a squalid Euston squat — where his hand found the nail scissors.
Many medical professionals press the tenets of evidence-based medicine into service as a means of uncovering scientific truth. But when it comes to health-related aspects of life as experienced by ‘the homeless’, George Price’s story contributes to an evidence base supporting the fact that ‘the homeless’ do not exist; at least they don’t exist as anything beyond a category. Categorisation can help in the search for scientific truth, but while the round hole of a ‘homeless’ category containing both George Price and the alcoholics who robbed him might bulge with data, to become meaningful, it needs a synthesis that acknowledges the square pegs of the individual lives that are rammed into it. Such a synthesis, I suggest, cannot occur without striving to accommodate at least a grain of critical truth. The challenge is to reconcile the fact that whereas the pursuit of scientific truth demands brains, the search for critical truth recruits the mind.
And it is here that the importance of the medical humanities asserts — or should assert — itself. Take, for example, the recent review by Gulati et al of the University of Limerick, published in the Irish Journal of Psychological Medicine (doi:10.1017/ipm.2018.15), which investigated ‘The Prevalence of Major Mental Illness, Substance Misuse and Homelessness in Irish prisoners: Systematic Review and Meta-analyses’. The prevalence of psychotic disorder was 3.6 per cent; affective disorder, 4.3 per cent; those homeless on committal, 17.4 per cent; alcohol use disorder, 28.3 per cent; and substance use disorder, 50.9 per cent. Noting that Ireland has the lowest per capita availability of secure psychiatric beds in developed countries, Gulati et al not only call for more beds within Irish mental health services, but also for “changes in attitudes towards mentally-disordered offenders” (my emphases).
In this context, it was encouraging to read an editorial in the journal Medical Humanities (2012, 38: 1) in which Dr Deborah Kirklin considers “the isolation, loneliness and helplessness of being homeless”, observing that she finds it “entirely plausible that changes in attitude and consequent realignment of priorities and outcomes can be effected by art”. While acknowledging that if such changes were to happen, they would be subtle, occur beneath the evidence-based radar and be unquantifiable, Kirklin states they would also be “to my mind at least, no less clinically significant”.
One can infer from the work of Gulati et al that raw data are essential to help define the problems associated with, say, homelessness and imprisonment. However, attitudinal change — perhaps including approaches advocated by, for example, Kirklin — might help to shape a refined response that addresses these problems.
On the other hand, it cannot be denied — and I write as someone who struggles to see modern art as anything more than talentless would-be celebrities daubing on canvas — that art isn’t all it’s cracked up to be. As John Carey reminds us in What Good are the Arts? (2005, p 140), Hitler excelled as a patron of the arts, informing Goebbels “at the height of the Stalingrad campaign… of his pleasure in Bruckner’s symphonies, and concluded by comparing the philosophies of Kant, Schopenhauer and Nietzsche”.
Yet despite this caveat, I suggest that one’s search for scientific truth can be much enhanced when it is tempered by an awareness of the need to also sharpen a humanities-derived, non-scientific gift which we all possess: Criticism.
Perhaps if he’d lived, George Price might have worked out an equation for it.
I waited for the trail to steepen before making my move. The lady in front – in her 60s – appeared to be struggling, and I offered a few words of encouragement as I passed. We were approaching the halfway mark of a 38-mile cross-country ultra-race. Thirty minutes later, faced with the first of the route’s three peaks, I had slowed to a stuttering jog, slathered in sweat, my legs burning. Taking off my rucksack, I rummaged for a banana sandwich, cramming it into my mouth.
“Keep going,” said the same lady, overtaking me and easing up the tussocky hillside en route to the summit. Gorging on my food, I hoped she would interpret my “Hmphh rrggh” as “thank you”. I never saw her again, but in our brief exchanges we had shared the life-enhancing pleasure and pain of sport. We were competitors, and the better person had finished before me. To me, this was athletics at its best.
Athletics at its worst used to be associated with disgraceful doping behaviour by athletes. But even this has now been eclipsed by disgraceful dopey behaviour from athletics administrators. This is exemplified by the award-winning fracas that the International Association of Athletics Federations (IAAF) has taken great pains to generate. And with a display of the same steely determination that earned him a multitude of middle-distance medals and records, IAAF President Lord Sebastian Coe is bent on sustaining its mind-boggling folly.
It began when the South African athlete Caster Semenya won the 2009 African Junior Championships 800 metre women’s final in 1:56.72. By the time she won the World Championships in 1:55.45 later that year, questions over whether Semenya’s ‘masculine’ physique conferred an unfair advantage were being raised, not least by those she had beaten, with sixth-placed Elisa Piccione complaining “For me she is not a woman.” Meanwhile, as Jordan-Young et al (BMJ, 28 April 2014) note in “Sex, health, and athletes”, Semenya had undergone intensive medical and psychological examinations and justifiably commented that she had been “subjected to unwarranted and invasive scrutiny of the most intimate and private details of my being”.
By May 2011 the IAAF – endorsed the following month by the International Olympic Committee (IOC) – had introduced the “hyperandrogenism” rule that women participating in elite sport must have a serum testosterone concentration of less than 10 nmol/L. According to the IAAF, “there is a difference in sporting performance between elite men and women, that is predominantly due to higher levels of androgenic hormones in men”. But this was contradicted by Ferguson-Smith and Bavington. Writing in Sports Medicine (2014, DOI 10.1007/s40279-014-0249-8), they acknowledged that although there is a relationship between exogenous testosterone and muscle mass and strength in men, “no such relationship between endogenous testosterone levels and muscle mass and strength has been established for women or, in particular, for female athletes”. Ferguson-Smith and Bavington further cite a study of hormone profiles in 693 post-competition elite athletes suggesting “serum testosterone is not related to athletic performance in women”, yet despite this, “the new regulations on female hyperandrogenism that now replace the rules for gender verification perpetuate the view that testosterone levels are the critical factor”.
As Prof Peter Sonksen notes (British Journal of Sports Medicine, 2018; 52: 219-229), the IAAF ruling led to several elite female athletes being banned from competition and at least four subjected to unnecessary medical and surgical intervention. Sonksen makes the point that “many eminent scientists felt that it was unfair, scientifically unsound and discriminatory”. Yet although the ruling was subsequently rescinded, the IAAF has now introduced draft revised “Hyperandrogenism Regulations”, which would apply to women’s track events between 400 metres and the mile and are currently under consideration by the Court of Arbitration for Sport in Lausanne, Switzerland.
This dogged attempt by the IAAF to introduce such blatantly discriminatory regulations prompted Prof Steve Cornelius to resign from the IAAF’s disciplinary tribunal, telling Lord Coe in a stinging letter that he did not wish to be associated with an organisation that “insists on ostracising certain individuals, all of them female, for no other reason than being what they were born to be”.
In her response to the proposed ruling, Caster Semenya showed that while she might have raised serum concentrations of testosterone, she certainly has an abundance of intelligence: “I don’t talk about nonsense,” Semenya said.
At a time when there is much talk about gender identity, women’s rights and inclusivity, it is disappointing to read the comments of former marathon world champion Paula Radcliffe, who supports the proposal, observing that without a ruling “girls on the start line know they’re never going to get a medal in an 800m. That’s the bottom line”.
No, the bottom line is that we’re all different. For all I know the lady in her 60s who coasted past me may have been hyperandrogenic, although my guess is that she was a better runner who had trained harder. But if she had been hyperandrogenic, so what? That’s life, so let’s get on with it and stop whining.
In August 2013, the UK Parliament voted to reject possible military action against the Syrian regime to deter the use of chemical weapons. In September 2013, the then US President Barack Obama said: “If we fail to act, the al-Assad regime will see no reason to stop using chemical weapons… What kind of world will we live in if the United States of America sees a dictator brazenly violate international law with poison gas, and we choose to look the other way?” (My emphases.)
Obama, like the UK Parliament, failed to act and chose to look the other way.
Between the onset of the Syrian conflict in 2011 and Obama’s address, the world had indeed seen “a dictator brazenly violate international law with poison gas”, resulting in 1,414 verified fatalities. These figures are from the Syrian American Medical Society, which, in February 2016, published a report titled A New Normal: Ongoing Chemical Weapons Attacks in Syria.
The report records 161 documented chemical attacks in Syria (a further 133 chemical attacks could not be fully substantiated), resulting in 1,491 deaths and 14,581 injuries. It further notes that of the 161 documented attacks, “77 per cent have occurred after the passage of United Nations Security Council Resolution 2118 in September 2013, which created a framework for the destruction of Syria’s declared chemical weapons stockpiles.”
And what did we, the international community, do? As Dr Mohammed Tennari of Syria’s Idlib governorate observed in A New Normal: “In response to chemical attacks in Syria, the international community sends us more antidotes. This means that the world knows that chemical weapons will be used against us again and again. What we need most,” Dr Tennari continued, “is not antidotes — what we need is protection, and to prevent another family from slowly suffocating together after being gassed in their home.”
Having read this, I opened my copy of Tongues of Conscience: War and the Scientists’ Dilemma (1970) by Robert William Reid. In a chapter titled ‘The Shape of War to Come’, Reid considers chemical weapons, noting: “Unlike the atomic bomb, they have yet to be put to use in war… ”
It’s unlikely that Saddam Hussein and al-Assad read Reid’s words and thought, “now there’s an idea”. But whatever brought them to share the tawdry distinction of being the first leaders since the end of the Second World War to use chemical weapons against civilians, they wouldn’t have been troubled by the possibility that the so-called ‘international community’ might have roused itself to intervene beyond words of censure.
One of the recommendations by the authors of A New Normal is that “… financial support from States must occur alongside an active effort by all States to end chemical attacks and other international humanitarian and human rights law violations and hold perpetrators accountable for violations”. (My emphases.)
So it was a small step, which I welcomed, when last month the US, supported by the UK and France, responded to al-Assad’s deployment of chemical weapons against civilians by launching missile strikes at selected Syrian targets. However, it was a forlorn hope that this ‘active effort’ by three States might attract a groundswell of international support. In my view, the reason for such inertia was that many asked themselves why they should stand up for a point of principle when it was being asserted by a President of questionable moral rigour.
And coming on the 15th anniversary of the invasion of Iraq, am I alone in detecting echoes of the strident assertions of many in the anti-war movement back then? Their prime motivation appeared to be strident anti-Americanism and a loathing for a President who couldn’t pronounce correctly the word “nuclear”. Far from embodying left-leaning values, here were conservatives uniting to preserve a cruel regime — governmental rape squads and a Baath Party inspired by many aspects of Nazism — while upholding a system of international law that was unable to implement basic ethical tenets of civilised behaviour.
But at least there were some who recognised that irrespective of what international law might say, humanitarian intervention is a moral right that can, and should, be asserted. One of these was France’s former health minister Dr Bernard Kouchner, who co-founded Doctors Without Borders/Médecins Sans Frontières. Another was Nobel Peace Laureate José Ramos-Horta, who points out in his piece in the Wall Street Journal (13 May 2004), titled ‘Sometimes a War Saves People’, that he supported Vietnam’s 1978 invasion of Cambodia to oust Pol Pot, and Tanzania’s invasion of Uganda in 1979 to eject Idi Amin, both without United Nations or international approval; applauded the French who deposed “Emperor” Jean Bokassa in the then Central African Empire; endorsed the NATO intervention in Kosovo without a UN mandate; and “I rejoiced once more in 2001 after the US-led overthrow of the Taliban liberated Afghanistan… ”
It is time to get used to the fact that Trump is in the White House; to shake free of the cultural relativism that prizes national sovereignty over human rights; and pay attention to the Dr Mohammed Tennaris of this world.
In October 2017, 20 doctors at Warsaw’s Paediatric Hospital went on hunger strike, demanding greater health expenditure from the government; and in 2015 two doctors staged a 24-hour hunger strike at London’s Parliament Square in protest at the treatment of NHS whistleblowers.
In the Polish strike, doctors were monitored by colleagues, with those whose health raised concerns being replaced by healthy volunteers; and the London case was self-limiting. Thus, some hunger strikes undertaken to effect political change cause, at worst, discomfort to the participants. But hunger strikes undertaken by the incarcerated can sometimes prove lethal, posing legal and ethical challenges for medical personnel.
Two recent studies by Gulati et al of the University of Limerick and University College Cork consider these issues in the Irish Journal of Psychological Medicine, with one addressing ‘Hunger strikes in prison: A legal perspective for psychiatrists’. They highlight guidelines from the World Medical Association, noting that autonomy is favoured over beneficence; that the neutrality of physicians is to be upheld; and that “force-feeding of an individual with capacity who refuses the same is not acceptable”.
Which makes perfect sense. The only problem is that lodged in our skulls, psychiatrists’ included, is an often impenetrable barrier separating the ice-block of legal reasoning from a cauldron of disparate beliefs, political opinions and views. My inference is that this separation is acknowledged in Gulati et al’s second study, ‘Hunger strikes in prisons: A narrative systematic review of ethical considerations from a physician’s perspective’, where they state: “Whilst there seems to be an overall consensus favouring autonomy over beneficence, tensions along this fine balance are magnified in jurisdictions where legislation leads to a dual loyalty conflict for the physician.”
The extent to which “dual loyalty conflicts” can arise can be gauged from the titles of two chapters in the British Medical Association’s Medicine Betrayed: The participation of doctors in human rights abuses (1992). With chapter four entitled ‘Medical involvement in torture’ and chapter five entitled ‘Abuse of psychiatry for political purposes’, here are reminders that one’s professional status does not confer immunity from uncivilised behaviour.
A more recent reminder is supplied by Bringedal et al in the Journal of Medical Ethics (2018, 44: 239‒43) in their paper ‘Between professional values, social regulations and patient preferences: Medical doctors’ perceptions of ethical dilemmas’. This Norwegian survey of over 1,200 doctors asked them to judge whether each of a range of supplied scenarios constituted a medical dilemma. One was the ethics of force-feeding a person on hunger strike, and 76 per cent judged it to qualify as a medical dilemma. To my surprise, however, only 42 per cent said they would not force-feed a hunger striker, and 39 per cent admitted they did not know what they would do. In the context of a supposedly enlightened Scandinavian model of social attitudes, these data are concerning.
Equally concerning is an apparently unquestioning attitude to the widely accepted concept of autonomy. Garasic and Foster, writing in Medicine and Law (2012, 31: 589-98), remind us of “a huge and dangerous political elephant in the room”. In ‘When Autonomy Kills: The case of Sami Mbarka Ben Garci’, we read that: “Autonomy rights (and therefore the right to die) are often accorded to hunger strikers who come from classes perceived to be undesirable, but withheld (or trumped by other considerations) in the case of strikers from more desirable classes.”
When the Muslim Tunisian Sami Mbarka Ben Garci was imprisoned in Pavia, Italy, charged with rape, he went on hunger strike, asserting his innocence. Assessed as competent, his autonomy was respected and he died on 5 September 2009.
Garasic and Foster note the silence of the Roman Catholic Church in this case: “It is not usually slow to comment about cases where the sanctity of life is at stake. Was this because, as well as not being seen as part of the body politic, [Ben Garci] was not part of the Catholic body ecclesiastical?”
This approach to autonomy, illustrated by the Ben Garci case, prompts the thought that treatment decisions are seldom made in a political vacuum, in which case the concept of autonomy in the context of hunger strikes may not always be clear-cut.
If it is acceptable not to force-feed a mentally competent hunger striker who wishes to die, is it acceptable to force medication on a mentally incompetent individual who doesn’t wish to die, to make that person sane so that they can be executed? Writing in Medicine, Health Care and Philosophy (2013, 16: 795-806), Garasic considers ‘The Singleton case: Enforcing medical treatment to put a person to death’. In October 2003 the United States Supreme Court ruled that Arkansas prison officials could force schizophrenic convicted murderer Charles Singleton to take drugs that would render him sufficiently sane to receive a lethal injection. He was executed on 6 January 2004, having undergone a series of therapeutic interventions aimed at achieving a decidedly non-therapeutic outcome: Death.
The conflicts between patient autonomy, medical ethics and political jurisdictions ought to be capable of resolution. Who would have thought that ‘ought’ could tax us humans so much?
“Whassa pointa that?” I moaned to Smicker during our English Lit class. “The ‘pointa that’, Winter,” thundered my teacher — who could hear a participle drop at 40 yards — “is that I say there is a point to learning poetry by heart!” And it was one that only took me decades to grasp: That a readily-accessible verse or poetic fragment can be a life-enhancing experience.
But life enhancement of a different nature was sought urgently many years ago when my wife somehow survived an initial assault from a grade 5 sub-arachnoid haemorrhage. She was taken from intensive care to a neuro ward prior to surgery the next day. Without an operation (which proved successful) to repair the aneurysm, she would die, so there was no questioning its necessity. What I did question — albeit silently — was the procedure for obtaining her consent. And that’s when, inexplicably, schoolboy memories of Keats’s On First Looking into Chapman’s Homer (1816) arose to provide a split perspective.
I sat at her bedside; she lay semi-conscious, the effects of the sedative slowly ebbing away. Then a junior doctor — let’s call him ‘Chapman’ — arrived, together with a demeanour whose unspoken declaration was that this ward was ‘ruled as his demesne’. I wasn’t sure what was happening “till I heard Chapman speak out loud and bold”. Focused as a spotlight, he was armed both with paperwork and a determination to ensure that when he left, all box-shaped elements of this exercise in obtaining consent for tomorrow’s operation would be well and truly ticked.
I felt less “like some watcher of the skies when a new planet swims into his ken” and more that we had been noisily interrupted. My wife, stirred by the rumpus, succeeded in opening her eyes and tried to focus on the document now brandished in front of her — a futile attempt, as her contact lenses had been removed days before. But now a new challenge arose: She — we — had to assimilate and consider some of the many risks the operation courted, and which Chapman was reciting like a cantor on Benzedrine; so, if she could just sign… here? On his way out, Chapman pirouetted and asked, “Any questions?” We “look’d at each other with a wild surmise” and said “Duhhh”, much as Homer — the Simpson, not the Greek — might have uttered when faced with choosing between a crate of beer and a box of donuts. Chapman interpreted my wife’s head moving slowly in bewilderment as a ‘no, you’ve made it all perfectly clear’, said thanks, and left.
With time comes perspective, however, together with a more charitable recollection. In retrospect, it may well have been that a stressed-out Chapman had been up all night, all by himself, tending to patients and that he just wanted to complete this administrative chore before going home to fall into a deep sleep. So, I was intrigued to come across ‘All by myself: Interns’ reports of their experiences taking consent in Irish hospitals’, published recently in the Irish Journal of Medical Science. Heaney et al’s aim was to evaluate a 12-point questionnaire returned by each of 60 interns at three Irish teaching hospitals to determine their roles in the surgical consent process and to identify their concerns.
Noting that 44 interns (73.3 per cent) had never been supervised by a senior doctor in obtaining consent, and that among the 58 interns who had obtained consent, “six interns (10.3 per cent) reported knowledge of ‘all’ the steps of the procedure [and]… only five interns (8.6 per cent) reported that they were aware of all the risks of the procedures”, the authors concluded that most of the “interns reported that they had taken consent for a procedure without full knowledge of the procedure and its complications”.
Citing the Medical Council’s view that “an intern is deemed to be an unsuitable delegate”, the authors speculate “whether [interns] should be involved in the consent process at all”. It seems that more-senior doctors need to be involved in this sometimes tricky undertaking.
I would make a further suggestion, having looked at Consent to Medical Treatment in Ireland (2015) by the Medical Protection Society, which states (page three) that there are three components to valid consent: Capacity, information and voluntariness. Bearing these three in mind, and speaking as a non-medic whose experience of the consent process is narrow but sharp, on a patient sample of one, I suggest that consenting would be a more meaningful process if the patient were fully conscious at the time consent is sought.
But there are other aspects of the process that merit further consideration. For example, Donovan-Kicken et al considered “sources of patient uncertainty when reviewing medical disclosure and consent documentation” in Patient Education and Counseling (2013, 90: 254-60), finding four distinct areas of uncertainty: Language; risks and hazards; the nature of the procedure; and the composition and format of documentation. The most important area is language, and interns trying to impart lucid explanations in what might not be their mother tongue — or their patient’s — may struggle.
If only the consenting process, like poetry, could be learned by heart.
The Irish Hospice Foundation Strategic Plan 2016-2019 reminds us that 57 per cent of people in Ireland say there is insufficient discussion about death and dying. I agree… or rather, that’s what I would have said, had I not taken another look at the front cover of the Strategic Plan, which includes a photograph of two women standing in front of a poster with the caption: ‘Think Ahead: Speak for Yourself’. This prompted the thought: What about those who are neither able to think ahead nor speak for themselves? And I remembered the case of Charlie Gard.
Born in August 2016, Charlie was admitted to London’s Great Ormond Street Hospital (GOSH), where he was diagnosed with the rare infantile-onset encephalomyopathic mitochondrial DNA depletion syndrome. Charlie was deaf, paralysed, needed ventilatory support and had a poor prognosis. In January 2017, Charlie’s parents identified an experimental nucleoside treatment in the US and launched a fund-raising effort.
But when the infant developed seizures, GOSH doctors objected to what they considered futile treatment; they applied to the High Court for permission to withdraw life support. This was granted and subsequently upheld, with a ruling in June 2017 from the European Court of Human Rights exhausting all legal options. The GOSH team made plans to withdraw medical treatment.
Which brings me to the point about whether there is enough discussion about death and dying. In the case of Charlie Gard, there seemed to be discussion aplenty, if only to the extent that here was an opportunity for the public at large to deploy social media to offer anyone who cared to read or listen the benefit of their insights. Unfortunately, these insights were gleaned largely from a clamouring press whose support for Charlie receiving treatment in the US, I suggest, was based more on emotional rabble-rousing than a dispassionate consideration of the issues involved. Some of the roused rabble even threatened medical staff at GOSH.
Apparently unable to resist throwing themselves into the media maelstrom too were the Pope and the President of the US. As far as Trump is concerned, I’ll opt for Wittgenstein’s ‘Whereof one cannot speak, thereof one must be silent’ and leave it at that. The Pope — as Patrick Greenford reported in The Guardian (3 July 2017) — prayed that Charlie’s parents’ “wish to accompany and treat their child until the end isn’t neglected”. Yet following the infant’s death on 28 July 2017, Elise Harris, writing in Catholic Online (17 November 2017, http://www.catholic.org/news/hf/faith/story.php?id=76383) quoted the Pope as stressing that medical options “must avoid the temptation either to euthanise a patient or to pursue disproportionate treatments which do not serve the integral good of the person” (my emphasis).
So would it have been better to allow Charlie to travel to the US for treatment as a last resort? I don’t know. And even when Wilkinson and Savulescu considered ‘Hard lessons: Learning from the Charlie Gard case’ in the Journal of Medical Ethics (published online 2 August 2017) they conceded that they disagreed with each other about what the right course of action ought to have been. They did agree, however, on the importance of considering whether we should have lower thresholds for undertaking experimental treatments when no other option exists; how limited resources should be allocated; and whether resolution of treatment disputes can be achieved without legal recourse.
Ethical decisions are based on facts and values, and while few would dispute the facts of a case, values pose a trickier challenge. For example, when Sprung et al, writing in Intensive Care Medicine (2007, 33: 1,732-39), considered ‘The importance of religious affiliation and culture on end-of-life decisions in European intensive care units’ among patients, doctors and families, they concluded: “Significant differences associated with religious affiliation and culture were observed for the type of end-of-life decision, the times to therapy limitation and death, and discussion of decisions with patient families.”
The notion that ethics is intelligible only in the context of religion is risible, foolishly recruiting false reasoning to the interpretation of divine commands. Nor can a fully-rounded, ethically-based set of precepts be usefully and uniformly applied to the same situation by individuals motivated by different values. Perhaps, as Wilkinson et al argue in Bioethics (2016, 30: 109-118), we should be — as the title of their paper asserts — “in favour of medical dissensus: Why we should agree to disagree about end-of-life decisions”.
Agreeing to disagree doesn’t alleviate the stress of wrestling with dilemmas such as those posed by the Charlie Gard case, but perhaps we can feel our way by disputation and discussion towards something approaching resolution. But prolonged reasoning is ridiculed today. Our screen-based zeitgeist dictates that mass values determine what passes for discussion, with 280 characters deemed sufficient for a point to be made. Our coarsened, consumption-obsessed society is neither an educated nor a moral one, despite having facts — literally — at our fingertips. Perhaps Eliot anticipated this when he asked in The Rock (1934): “Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?”