Medicine

The department for patients no-one else wants – The BMJ

This blog by Chris Bulstrode was written well before Covid. The future needs to be different: it’s the doctors, stupid.

However both of these issues pale into insignificance compared with the challenge of providing life-long job satisfaction in a career consisting of endless night and weekend shifts. The consequence is that if we don’t do something radical soon, I fear that the emergency department may collapse and bring the whole NHS system down with it. Sometimes, when I am not on duty, I muse about how our society as it is now will be seen in 100 years’ time. Children might be taught in history classes that in 1948 a small island off the west of Europe set-up a revolutionary advance in civilization which was the talk of the rest of the world. It was called the NHS and provided free healthcare for all. My fear is that the next sentence from the teacher will be that it collapsed in the year 201..? [emphasis added]

Shareholder value

The myth of healthy smoking

Menthol cigarettes were first promoted to soothe the airways of “health conscious” smokers. Long used as an analgesic, menthol evokes a cooling sensation that masks the harshness of tobacco smoke. In the competition to capitalize on the growing menthol market, the industry’s marketing experts “carved up, segmented, and fractionated” the population, exploiting psychology and social attitudes to shape product preferences.

[emphasis added]

For reasons of State

Stephen Sedley · A Decent Death · LRB 21 October 2021

A sharp pen from Stephen Sedley, a former appeal court judge, in the LRB.

Absurdly and cruelly, until the 1961 Suicide Act was passed it was a crime to kill yourself. While those who succeeded were beyond the law’s reach, those who tried and failed could be sent to jail. In the 1920s the home secretary had to release a Middlesbrough woman with fourteen children who had been given three months in prison for trying to kill herself. There is a Pythonesque sketch waiting to be written about a judge passing a sentence of imprisonment for attempted suicide: ‘Let this be a lesson to you and to any others who may be thinking of killing themselves.’ In fact, by the mid 19th century the law had got itself into such a tangle that a person injured in a failed attempt at suicide could be indicted for wounding with intent to kill, an offence for which Parliament had thoughtfully provided the death penalty.

But the repeated resort by doctrinal opponents of assisted dying to the need for safeguarding tends to be directed not to resolving any difficulties but to amplifying and complicating them to the point of obstruction – the kind of argument which, as Gore Vidal once put it, gives intellectual dishonesty a bad name.

[emphases added]

What do I need to know to pass the exam?

by reestheskin on 24/06/2021

Comments are disabled

After thirty-five years of teaching medical students dermatology, I find that the GMC’s 2021 Medical Licensing Assessment (MLA) content map makes for dispiriting reading. The document states that it sets out the core knowledge expected of those entering UK practice. It doesn’t.

My complaint is not the self-serving wish of the specialist who feels that his subject deserves more attention — I would willingly remove much of what the GMC demand. Nor is it that the document elides basic clinical terminology such as acute and chronic (in dermatology, the terms refer to morphology rather than just time). Nor is it, bizarrely, that it omits mention of those acute dermatoses with a case-fatality rate higher than that of stroke or myocardial infarction: bullous pemphigoid, pemphigus, and Stevens-Johnson syndrome/toxic epidermal necrolysis are curiously absent. No, my frustrations lie with the fact that the approach taken by the GMC, whilst superficially attractive, reveals a lack of insight into, knowledge of, and expertise in medicine. The whole GMC perspective, based on a lack of domain expertise, is that somehow they can regulate anything; that somehow there is a formula for ‘how to regulate’. This week, medicine; next week, the Civil Aviation Authority. The world is not like that — well, it shouldn’t be.

Making a diagnosis can be considered a categorisation task in which you not only need to know about the positive features of the index diagnosis, but also those features of differential diagnoses that are absent in the index case (for Sherlock Holmes aficionados, the latter correspond to the ‘curious incident of the dog in the night-time’ issue). It is this characteristic that underpins all the traditional ‘compare and contrast’ questions, or the hallowed ‘list the differentials, and then strike them off one-by-one’.

Take melanoma, which the MLA content guide includes. Melanoma diagnosis requires accounting for both positive and negative features. For the negative features, you have to know about the diagnostic features of the common differentials that are not found in melanomas. This entails knowing something about the differentials, and, as the saying goes, if you can’t name them, you can’t see them. A back-of-the-envelope calculation: for every single case of melanoma there are a quarter of a million cases, made up of five to ten diagnostic classes, that are not melanomas. These include melanocytic naevi, solar lentigines, and seborrhoeic keratoses; such lesions are ubiquitous in any adult. But the MLA fails to mention them. What is a student to make of this? Do they need to learn about them or not? Or are they to be left with the impression that a pigmented lesion that has increased in size and changed colour is most likely a melanoma (answer: false)?
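
To make that back-of-the-envelope arithmetic concrete, here is a minimal sketch using only the ratios quoted above; the 99% sensitivity and the 1-in-1,000 false-positive rate in the second half are my own illustrative assumptions, not figures from the MLA or from anywhere else.

    # Back-of-the-envelope sketch of the base-rate problem described above.
    # Inputs taken from the text: roughly 250,000 benign pigmented lesions,
    # spread across some 5-10 diagnostic classes, for every one melanoma.
    benign_per_melanoma = 250_000

    for n_classes in (5, 10):
        per_class = benign_per_melanoma / n_classes
        print(f"{n_classes} benign classes -> ~{per_class:,.0f} lesions per class "
              f"for every single melanoma")

    # Illustrative (assumed) figures: even a diagnostician who spots 99% of
    # melanomas and wrongly flags only 1 in 1,000 benign lesions would still
    # generate hundreds of false alarms for every true melanoma found.
    sensitivity = 0.99            # assumed, for illustration only
    false_positive_rate = 0.001   # assumed, for illustration only
    false_alarms = benign_per_melanoma * false_positive_rate / sensitivity
    print(f"False alarms per melanoma detected: ~{false_alarms:,.0f}")

Which is only a restatement of the point above: without knowing the common mimics, the rare diagnosis is unfindable.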

Second, the guide essentially provides a list of nouns, with little in the way of modifiers. Students should know about ‘acute rashes’ and ‘chronic rashes’ — terms, I should say, that jar on the ear of any domain expert — but which conditions are we talking about, and exactly what about each of these conditions should students know?

In some domains of knowledge it is indeed possible to define an ability or skill succinctly. For instance, in mathematics, you might want students to be able to solve first-order differential equations. The competence is simply stated, and the examiner can choose from an almost infinite number of permutations. If we were to think about this in information-theory terms, we would say that what we want students to know can be compressed faithfully (losslessly). But medicine is not like this.

Take psoriasis as another example from the MLA. Once we move beyond expecting students to know how to spell the word, watch what happens as you try to define all those features of psoriasis you wish them to know about. By the time you have finished listing exactly what you want a student to know, you have essentially written the textbook chapter. We are unable to match the clever data-compression algorithms that give us MP3 files or compressed photographs. Medical texts do indeed contain lots of annoying details — no E=mc² for us — but it is these details that constitute domain expertise. But we can all agree that we can alter the chapter length as an explicit function of what we want students to know.

Once you move to a national syllabus (and for tests of professional competence, I am a fan) you need to replace what you have lost; namely, the far more explicit ‘local’ guides such as ‘read my lecture notes’ or ‘use this book but skip chapters x, y and z’ that students could once rely on. The most interesting question is whether this is now better done at the level of the individual medical school or, as for many non-medical professional qualifications, at the national level.

Finally, many years ago, Michael Power, in his book The Audit Society: Rituals of Verification, demolished the sort of thinking that characterises the whole GMC mindset. As the BMJ once said, there is little in British medicine that the GMC cannot make worse. Pity the poor students.

Which experts?

by reestheskin on 16/06/2021

Comments are disabled

The following is from Pulse, a magazine aimed at GPs. My point is not so much about the specifics as about something more general.

Headache, runny nose and sore throat top three symptoms of Delta variant, says researcher – Pulse Today

Professor Spector said cases were rising exponentially and people who have only had one vaccine dose should not be complacent.

‘The UK really does now have a problem and we’ll probably be seeing, in a week, 20,000 cases and by 21st June well in excess of that number,’ he said. ‘Most of these infections are occurring in unvaccinated people. We’re only seeing slight increases in the vaccinated group and most of those in the single vaccinated group,’ he said.

He goes on to say:

Covid is also acting differently now. It’s more like a bad cold in this younger population and people don’t realise that and it hasn’t come across in any of the government information. This means that people might think they’ve got some sort of seasonal cold and they still go out to parties and might spread around to six other people and we think this is fuelling a lot of the problem.

He added:

The number one symptom is headache, followed by runny nose, sore throat and fever. Not the old classic symptoms. We don’t see loss of smell in the top ten any more, this variant seems to be working slightly differently.

He advised people:

who were feeling unwell to stay at home for a few days, use lateral flow tests with a confirmation PCR test if they get a positive result.

Now comes the boilerplate Orwellian response from the Department of Health and Social Care:

[A] spokesperson said: ‘Everyone in England, regardless of whether they are showing symptoms, can now access rapid testing twice a week for free, in line with clinical guidance.

Experts keep the symptoms of Covid-19 under constant review and anyone experiencing the key symptoms – a high temperature, a new continuous cough, or a loss or change to sense of smell or taste – should get a PCR test as soon as possible and immediately self-isolate along with their household.’ [emphasis added]

Two points:

  1. The spokesperson, as usual, is not named, nor are their credentials available. How are we to assess their competence or reliability? At least you can look Prof Spector up and check out his work.
  2. Following on from the first point, which experts are we talking about? Most expertise, we know, does not sit within the Dept of Health. One of the most interesting features of the pandemic has been the recognition that neither the government nor the state (including the Dept of Health) has a monopoly on knowledge. Of course, we know they will seek to conceal and dissemble for political reasons. But the fact remains that many people now appreciate that knowledge is diffused more widely within society. David King and his alternative SAGE group have played an important role — beyond just Covid.

Body lice

by reestheskin on 14/06/2021

Comments are disabled

It is one of dermatology’s tedious and fun facts that, in contradistinction to, say, scabies or head lice, you treat body lice by treating not the patient (directly) but the clothing. The pharmacological agent is a washing machine. But the excerpt quoted below tells you something wonderful about science: you get things out that you never expected. Spontaneous generation — not in the real world, but in the world of ideas. Well, almost.

How clothing and climate change kickstarted agriculture | Aeon Essays

Scientific efforts to shed light on the prehistory of clothes have received an unexpected boost from another line of research, the study of clothing lice, or body lice. These blood-sucking insects make their home mainly on clothes and they evolved from head lice when people began to use clothes on a regular basis. Research teams in Germany and the United States analysed the genomes of head and clothing lice to estimate when the clothing parasites split from the head ones. One advantage of the lice research is that the results are independent from other sources of evidence about the origin of clothes, such as archaeology and palaeoclimatology. The German team, led by Mark Stoneking at the Max Planck Institute for Evolutionary Anthropology, came up with a date of 70,000 years ago, revised to 100,000 years ago, early in the last ice age. The US team led by David Reed at the University of Florida reported a similar date of around 80,000 years ago, and maybe as early as 170,000 years ago during the previous ice age. These findings from the lice research suggest that our habit of wearing clothes was established quite late in hominin evolution.

Questionable blood

by reestheskin on 07/06/2021

Comments are disabled

Infected blood scandal: government knew of contaminated plasma ‘long before it admitted it’ | Contaminated blood scandal | The Guardian

From the Guardian.

Among the victims of the contaminated blood scandal, which is the subject of a public inquiry, were 1,240 British haemophilia patients, most of whom have since died. They were infected with HIV in the 1980s through an untreated blood product known as Factor VIII.

In 1983, Ken Clarke, then a health minister, denied any threat was posed by Factor VIII. In one instance, on 14 November 1983, he told parliament: “There is no conclusive evidence that Aids is transmitted by blood products.”

However, documents discovered at the national archives by Jason Evans, whose father died after receiving contaminated blood and who founded the Factor 8 campaign, paint a contrasting picture.

In a letter dated 4 May 1983, Hugh Rossi, then a minister in the Department of Health and Social Security (DHSS), told a constituent: “It is an extremely worrying situation, particularly as I read in the weekend press that the disease is now being transmitted by blood plasma which has been imported from the United States.”

(HIV screening for all blood donated in the UK only began on 14 October 1985.)

Rossi’s letter was considered damaging enough for the government to seek to prevent its release in 1990 during legal action over the scandal, by which time Clarke was health secretary.

In another letter uncovered by Evans, dated 22 March 1990, a Department of Health official wrote to government lawyers saying it wanted to withhold Rossi’s letter, despite admitting the legal basis for doing so was “questionable”.

Clarke has a legal background. There is a large logical gap between denying ‘any threat’ and the statement that there is ‘no conclusive evidence’. The Department of Health would be better named the Department without Integrity. Recent events suggest things are no better now. It didn’t all start with Johnson.

We know less than we pretend

by reestheskin on 27/05/2021

Comments are disabled

Jonathan Flint · Testing Woes · LRB 6 May 2021

Terrific article from Jonathan Flint in the LRB. He is an English psychiatrist and geneticist (mouse models of behaviour) based at UCLA, but, like many, he has turned his hand to other domains (beyond depression). He writes about Covid-19:

Perhaps the real problem is hubris. There have been so many things we thought we knew but didn’t. How many people reassured us Covid-19 would be just like flu? Or insisted that the only viable tests were naso-pharyngeal swabs, preferably administered by a trained clinician? Is that really the only way? After all, if Covid-19 is only detectable by sticking a piece of plastic practically into your brain, how can it be so infectious? We still don’t understand the dynamics of virus transmission. We still don’t know why around 80 per cent of transmissions are caused by just 10 per cent of cases, or why 2 per cent of individuals carry 90 per cent of the virus. If you live with someone diagnosed with Covid-19, the chances are that you won’t be infected (60 to 90 per cent of cohabitees don’t contract the virus). Yet in the right setting, a crowded bar for example, one person can infect scores of others. What makes a superspreader? How do we detect them? And what can we learn from the relatively low death rates in African countries, despite their meagre testing and limited access to hospitals?

That we are still scrambling to answer these questions is deeply worrying, not just because it shows we aren’t ready for the next pandemic. The virus has revealed the depth of our ignorance when it comes to the biology of genomes. I’ve written too many grant applications where I’ve stated confidently that we will be able to determine the function of a gene with a DNA sequence much bigger than that of Sars-CoV-2. If we can’t even work out how Sars-CoV-2 works, what chance do we have with the mammalian genome? Let’s hope none of my grant reviewers reads this.

Medicine is always messier than people want to imagine. It is a hotchpotch of kludges. For those who aspire to absolute clarity, it should be a relief that we manage effective action based on such a paucity of insight. Cheap body-hacks sometimes work. But the worry remains.

What will tomorrow bring

by reestheskin on 07/04/2021

Comments are disabled

Wonderful piece by Janan Ganesh in the FT on the life choices made by young bankers and corporate lawyers, and the crazy (work) demands placed on them. I was surprised that he also has junior doctors in his sights.

Yes, the graduates knew the deal when they joined, but the appeal to free will is an argument against almost any labour standards whatever. Nine-year-old Victorian chimney sweeps knew the deal. As for all the talk of character-forging, of battle-hardening: to what end, exactly? The point of a corporate career arc is that work becomes more strategic, less frenzied over time. The early hazing should not be passed off as a kind of Spartan agoge.

The ageing process — as I have lived it, as I have observed it in friends — has convinced me of one thing above all. The deferral of gratification is the easiest life mistake to make. And by definition among the least reversible. A unit of leisure is not worth nearly as much in late or even middle age as it is in one’s twenties. To put it in Goldman-ese, the young should discount the future more sharply than prevailing sentiment suggests.

The first reason should be obvious enough, at least after the past 12 months. There is no certainty at all of being around to savour any hard-won spoils. The career logic of an investment banker (or commercial lawyer, or junior doctor) assumes a normal lifespan, or thereabouts. And even if a much-shortened one is an actuarial improbability, a sheer physical drop-off in the mid-thirties is near-certain. Drink, sex and travel are among the pleasures that call on energies that peak exactly as graduate bankers are wasting them on work.

I don’t know enough to be confident about clinical medicine but I do often wonder how things will look in a decade or so. Many junior medical jobs are awful, the ties and bonds between the beginning, middle and end of medical careers sundered. Many drop out of training, some treading water in warmer climes — but what proportion will return? Some — a small percentage perhaps — move into other jobs, and the few I know who have done this, I would rate among the best of their cohort. Of those who stick to the straight and narrow, many now wish to work less than full time, although whether this survives the costs of parenthood, I do not know. At the other end all is clear: many get out as soon as they can, the fun long gone, and the fear of more pension changes casting an ever larger shadow, before the final shadow descends.

Medicine remains — in one sense — a narrow and technical career. The training is long, and full confidence in one’s own skills may take twenty years or more to mature. By that time, it is hard to change course. This personal investment in what is a craft leaves one vulnerable to all those around you who believe success is all about line-managing others and generic skills.

I am unsure how conscious (or how rational) many decisions about careers are, but there may well be an invisible hand at play here, too. I imagine we may see less personal investment in medical careers than we once did. It’s no longer a vocation, just a job, albeit an important one. Less than comforting words, I know — especially if you are ill.

Smallpox again

by reestheskin on 16/03/2021

Comments are disabled

For some reason — COVID of course — I keep coming back to perhaps the greatest vaccine success ever: the eradication of smallpox (here, here and here). But the figures quoted by Scott Galloway make you sit up and notice both the magic — and value — of science.

Values in America – Scott Galloway on recasting American individualism and institutions | By Invitation | The Economist

International bodies are immolated. Consider the World Health Organisation. Mr Trump’s decision to pull America out of the WHO in the midst of a pandemic (reversed under President Joe Biden) was galling, particularly as the WHO is responsible for one of humanity’s greatest public-health accomplishments: the eradication of smallpox in the 1970s. To appreciate the magnitude of this, Google images of “smallpox” and glimpse the horror that once killed millions each year. It was a victory for co-operative, state-funded projects and it cost a mere $300m. By one estimate, America, the largest contributor, recoups that value every 26 days from savings in vaccinations, lost work and other costs. [emphasis added]
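
Taking the Economist’s figures at face value, the arithmetic behind that 26-day claim is worth spelling out. A rough sketch; the underlying estimate is theirs, only the division is mine.

    # Rough arithmetic on the figures quoted above: a one-off ~$300m programme
    # whose value the largest contributor alone is said to recoup every 26 days.
    eradication_cost_usd = 300e6   # quoted cost of the eradication programme
    payback_period_days = 26       # quoted payback period for the US alone

    implied_annual_savings = eradication_cost_usd * 365 / payback_period_days
    print(f"Implied annual savings: ~${implied_annual_savings / 1e9:.1f} billion per year")
    # Roughly $4.2 billion per year, every year, from a single $300m outlay --
    # and that is the saving for one country only.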

A Pox on You All

by reestheskin on 26/02/2021

Comments are disabled

Or not, as the case may be.

Smallpox is the greatest success story in the history of medicine. It once took huge numbers of lives — as many as half a billion people in the 20th century alone — and blinded and disfigured many more.

So writes the distinguished historian of science Steven Shapin in the LRB (A Pox on the Poor, February 4, 2021). He is reviewing The Great Inoculator: The Untold Story of Daniel Sutton and His Medical Revolution.

In historical times you had a one in three chance of getting smallpox, and, if you got it, the case-fatality was 20%. Some outbreaks, however, had a case-fatality of 50% and, unlike Covid-19, its preferred targets were children.

My exposure to smallpox was (thankfully) limited. My mother told me that there was an outbreak in South Wales and the West of England when I was around five or six. There was widespread vaccination, but kids like me, with bad eczema, were spared, with the parent advised to ‘shield’ the child indoors (my sympathies go to my mother). (The risk was from the grandly named, but sometimes fatal, Kaposi’s varicelliform reaction, which was due to the vaccinia virus — not smallpox — running riot on the diseased skin.)

As a med student, I remember learning how to distinguish the cutaneous lesions of smallpox from chicken pox. Smallpox was declared eradicated in 1980, but, as a dermatology registrar, seeing the occasional adult with chickenpox who seemed so ill (in comparison with kids), I often had doubts that I had to reason away. Perhaps those stores held by the US and USSR were not so secure…

Before Jenner

Jenner and smallpox vaccination go together in popular accounts, but the history of this particular clinical discovery is much older and richer — at least to me.

As ever, in medicine, and in particular for anything involving the skin, the terminology is confusing. The Latin name for cowpox is Variolae vaccinae, meaning the pox from the cow (vacca). It was Pasteur who subsequently honoured Jenner by deciding that all forms of inoculation be called vaccination.

Edward Jenner took advantage of the already-known fact that whilst milkmaids tended to be afflicted with the far milder cowpox virus, they rarely suffered from the more serious smallpox (the two are different, but related, viruses). Jenner, in 1796, inoculated an eight-year-old boy with the pus from a milkmaid’s cowpox sore. When subsequently exposed to smallpox material, the boy appeared immune, in that he did not suffer adversely.

Once Jenner’s method was accepted as safe, Acts of Parliament introduced free vaccination in 1840, and vaccination became obligatory in 1853.

I had never been quite certain of the distinction between inoculation and vaccination, but there is history here too. Shapin writes that the term inoculation was borrowed from horticulture — the grafting of a bud (or ‘eye’) to propagate a plant (something I was taught how to do in horticulture lessons when I was aged 11, in school in Cardiff, by Brother Luke, who I thought so old that he might have been a contemporary of Jenner). Why the name is apt is explained below.

Before vaccination, inoculation was actually meant to give you a direct form of smallpox (this was also referred to as variolation, after variola, the term for smallpox). The source material, again, was from a lesion of somebody with smallpox. The recipient, it was hoped, would develop a milder version of the disease. Shapin writes:

The contract with the inoculator was to accept a milder form of the disease, and a lower chance of death, in exchange for a future secured from the naturally occurring disease, which carried a high chance of killing or disfiguring.

Shapin tells us that Lady Mary Wortley Montagu, when holed up with her husband in Constantinople in 1717, heard stories about how such ‘in-grafting’ was in widespread use by the ‘locals’. She was scarred from smallpox, and therefore she had the procedure carried out on her then five-year-old son. The needle was blunt and rusty, but her son suffered just a few spots and the procedure was judged a success. He was now immune to smallpox.

Not surprisingly, the story goes back further: inoculation was folk medicine practice in Pembrokeshire as early as 1600, and the Chinese had been blowing dried, ground-up smallpox material up the nose for many centuries.

There is capitalism and then there is capitalism.

The London medical establishment were apparently not too impressed with the non-British origin of such a scientific advance, nor with its apparent simplicity (and hence low cost). So they made the procedure much more complicated, with specific diets being required, along with advice on behaviour, and, of course, blood-lettings and laxatives, all in addition to not just a ‘prick’ but a bigger incision (payment by the inch). The physicians’ ‘fees’ no doubt rose in parallel. Not a bad business model, until…

There is plenty of room at the bottom.

The London physicians’ ‘add-ons’ put the treatment costs of inoculation out of reach of most of the population, restricting it, for decades, to the ‘medical carriage trade’. Along comes Daniel Sutton, a provincial barber-surgeon, with no Latin or Greek, no doubt, who effectively industrialised the whole process, making it both more profitable and cheaper for the customer.

Based in a village near Chelmsford, he inoculated tens of thousands locally. The method was named after him, the Suttonian Method. On one day he inoculated 700 persons. Incisions (favoured by the physicians) were replaced by the simpler prick, and patients were not confined, but instead told to go out in the fresh air (day-case, anybody?). Product differentiation was of course possible: spa-like pampering in local accommodation was available for the top end of the market, with a set of sliding fees depending on the degree of luxury.

Shapin writes:

Splitting the market and niche pricing were aspects of Sutton’s business success, but so too was control of supply. The extended Sutton clan could satisfy a significant chunk of provincial demand, but Daniel also worked out a franchising system, which ‘authorised’ more than fifty partners throughout Britain and abroad to advertise their use of the ‘Suttonian System’ — provided they paid fees for Sutton-compounded purgatives, kicked back a slice of their take, and kept the trade secrets. Control was especially important, since practically anyone could, and did, set up as an inoculator. The Suttons themselves had become surgeons through apprenticeship, but apothecaries, clergymen, artisans and farmers were inoculating, and sometimes parents inoculated their own children. The profits of the provincial press were considerably boosted as practitioners advertised their skills at inoculation and their keen prices. Daniel went after competitors — including his own father-in-law and a younger brother — with vigour, putting it about that the Suttonian Method depended on a set of closely held secrets, to which only he and his approved partners had access. His competitors sought to winkle out these secrets, occasionally pouncing on Sutton’s patients to quiz them about their experiences.

Sutton admitted that he had ‘lost’ five out of forty thousand patients (due to smallpox). He offered a prize of 100 guineas to anybody who could prove he had ever lost a patient due to inoculation, or that any patient got smallpox a second time. More confident, and perhaps more generous, than modern Pharma, I think. By 1764, Sutton had an annual income of 6,300 guineas — over one million pounds sterling in today’s money.
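
A quick sanity check on those numbers, using only figures quoted in this post; the comparison with the natural case-fatality and the shillings-to-pounds conversion are mine.

    # Sutton's claimed record versus the natural disease, using figures quoted
    # in this post: 5 deaths in 40,000 inoculations, against a ~20% case-fatality
    # for naturally acquired smallpox.
    inoculation_deaths, inoculated = 5, 40_000
    natural_case_fatality = 0.20

    inoculation_mortality = inoculation_deaths / inoculated
    print(f"Mortality after Suttonian inoculation: ~{inoculation_mortality:.4%}")
    print(f"Case-fatality of natural smallpox:      {natural_case_fatality:.0%}")
    print(f"Roughly a {natural_case_fatality / inoculation_mortality:,.0f}-fold difference")

    # The income claim: a guinea was 21 shillings and a pound 20 shillings, so
    # 6,300 guineas is 6,615 pounds in 1764 money -- which the post equates to
    # over one million pounds today.
    pounds_1764 = 6_300 * 21 / 20
    print(f"6,300 guineas = {pounds_1764:,.0f} pounds (1764)")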

What’s in a name?

by reestheskin on 03/02/2021

Comments are disabled

Metabolic surgery versus conventional therapy in type 2 diabetes – The Lancet

I like the parlour game of inventing collective nouns for doctors — a ganglion of neurologists, a scab of dermatologists, and so on — and I also cannot help but smile when Mr Butcher turns out to be, well, a surgeon, and Lord Brain is a… You get it.

I saw this article when I was skimming through the Lancet the other week, and something from long back tweaked in my mind.

Metabolic surgery versus conventional therapy in type 2 diabetes. Alexander D Miras & Carel W le Roux

A few more details: “… report their trial in which they randomly assigned patients to metabolic surgery or medical therapy for type 2 diabetes. 60 white European patients (32 [53%] women) were evaluated 10 years after Roux-en-Y gastric bypass (RYGB), biliopancreatic diversion (BPD), or conventional medical therapy.”

Now, I suspect the name Roux is not rare, but, having checked with Wikipedia, I can remind you:

César Roux – Wikipedia

César Roux (23 March 1857, in Mont-la-Ville – 21 December 1934, in Lausanne) was a Swiss surgeon, who described the Roux-en-Y procedure.

No relation to Ross-on-Wye.

Can Medicine be Cured?

by reestheskin on 29/01/2021

Comments are disabled

I am probably biased as my mother was Irish, one of a large O’Mahony clan who were born in or around Cork. She moved to Dublin in her early teens, crossed the water in 1941, and, a few years later, after meeting a Welshman from Neath, set up home with him in Cardiff. Cardiff had a long-established Irish community, one that was good at replenishing itself with fresh blood from the motherland, in tune with the rhythms of economic cycles. I was sent to a Christian Brothers’ school where many of the brothers were Irish exports. In junior school, at least, I can remember having some green plant pinned to my lapels on March 17. I didn’t stand out; many of the other kids were similarly tattooed. If I count my extended family — including my brother — they mainly reside across the water.

Without any grand theory at hand I have always thought there must be something special about schooling in Ireland, even if the supposed benefits are not intentional, nor shared equally. Historically, there are many bad things; this is well known. But if I cast an eye over medics who I judge to write deeply about medicine, there is a disproportionate number of Irish doctors.

Anthony Clare was the first example I came across. I was a wannabe psychiatrist when I was a medical student in Newcastle, and I spent undoubtedly the most enjoyable three months of my undergraduate medical course, in Edinburgh, on an elective on the unit headed by Prof Bob Kendell. For most of this time, since it was summer, there were no Edinburgh students, and so I was viewed by the staff as useful. One of the tragedies of being a medical student nowadays is that you don’t feel useful simply because for most of the time you are not useful. The thousand-year-old laws that guide apprenticeship learning have not so much been forgotten but well and truly trashed.

Clare wrote a wonderful book Psychiatry in Dissent when he was still a Senior Registrar at the Maudsley. Despite the title, it was a calm and measured critique of medicine and psychiatry. Reading it felt dangerous, but more than that, it was the voice of quiet reason and a call to arms. It stands as an example of the difference between a medical education and a medical training. The GMC don’t do the former, nor does the NHS.

Another Irish psychiatrist whose writings I have admired is David Healy. Healy is now in Canada and, as far as I can see, has been blackballed by the UK medical and psychiatric establishment. His three-volume history of Psychopharmacology (The Psychopharmacologists) is a masterpiece. Sam Shuster was the first person to introduce me — over coffee— to how many of the revolutionary drugs that changed medicine in the middle of the last century were developed, but Healy’s scholarship cast it in a larger and richer framework. Healy has done lots of other things, too, and possesses a well of moral courage that puts to shame most of the so-called leaders of the profession.

James McCormick was a professor of general practice at Trinity, in Dublin. I first came across him when he contributed a chapter to Bruce Charlton’s and Robert Downie’s book on medical education (The Making of a Doctor: Medical Education in Theory and Practice). I have only recently returned to this book, but reading his chapter is disturbing because it shocks you that you could ever have been taken in by the pabulum of the modern world of ‘primary care’ and its apologists. The late and singular Petr Skrabanek found his academic home with McCormick in Trinity. Petr was on holiday in Ireland when the Russian tanks rolled into his hometown of Prague in 1968. Yes, not Irish, but if you read his writings about medicine (check out Wikipedia), and have dipped into Havel and Flann O’Brien, you see he is of that place.

Seamus O’Mahony, a ‘Cork-man’, who used to work in Edinburgh before returning to Ireland, has published two books about medicine. The first — which I haven’t read — is named The Way We Die Now. His more recent book, published in 2019, is titled Can Medicine Be Cured? The full title is Can Medicine Be Cured?: The Corruption of a Profession. You get the message, and the jury didn’t take long to realise which clause required an affirmative verdict.

The book covers a lot of ground, yet the pages fly by. There are chapters on how much medical research is dysfunctional — when it is not criminal; on the corruption of the academy; and on the oft-hidden problems of combining the practice of science and medicine. He talks about pharma (of course), the invention of disease (there isn’t enough money in treating the sick, so we need to treat the non-sick), the McNamara fallacy (data, just data, dear boy), and the never-ending bloody ‘War on Cancer’. He picks apart the lazy confusion between needs, wants, and consumerism, and highlights the fact that the directionless NHS is run by politicians who want to do everything — except politics: they just want to be loved. Meanwhile, the medics, tired of the ever-faddish evidence-based medicine, turned to sentimentality-based medicine, allowing ‘empathy’ and superstition to ride roughshod over the ability to reason about the natural world, and about their patients. Among doctors and medical students, a facile sense of feeling good about yourself has overtaken technical mastery, allowing all to wallow in kumbaya around the campfire they now pretend to share with their charges. Not so. If doctors were once labelled as priests with a stethoscope, we have cast our tool away.

O’Mahony writes well. I particularly liked his metaphors that are familiar to anybody brought up in Catholicism even if they left the bus before it (and they) returned to the terminus. A few examples:

The decadence of contemporary biomedical science has a historic parallel in the mediaeval pre-Reformation papacy. Both began with high ideals. Both were taken over by careerists who corrupted these ideals, while simultaneously paying lip-service to them. Both saw the trappings of worldly success as more important than the original ideal. Both created a self-serving high priesthood. The agenda for the profession is set by an academic elite (the hierarchy of bishops and cardinals), all the day-to-day work is done by low status GPs and hospital doctors (curates, monks). This elite, despite having little to do with actual patient care, is immensely powerful in the appointment of doctors.

The Czech polymath and contrarian Peter Skrabanek (1940–94) taught these skills at Trinity College Dublin medical school during the 1980s and early 1990s, and lamented that “my course on the critical appraisal evidence for medical students can be compared to a course on miracles by a Humean sceptic for prospective priests in a theological seminary”.

And on attending a consensus conference of medical experts in Salerno (you only have a consensus when there bloody well isn’t any…):

I found a picture online of the experts gathered at Salerno, and was reminded of a fresco in the Sistine chapel depicting the first Council of Nicea in A.D. 325. The council was convened by the Emperor Constantine to establish doctrinal orthodoxy within the early Christian Church. The industry-sponsored get-together in Salerno had similar aims. … The aim is expansionist: the establishment of a new disease by consensus statement, the Big Science equivalent of a papal bull. Non-coeliac gluten sensitivity has been decreed by this edict, just as papal infallibility was decreed by the first Vatican Council in 1870.

As for the sick and needy

Meanwhile, the sick languish. The population are subjected to more and more screening programs (for breast cancer, cervical cancer, colon cancer, high blood pressure, cholesterol levels etc.), but if they become acutely ill and need to go to hospital, it is likely that they will spend hours on a trolley in an emergency department. When they are finally admitted to a ward, it is often chaotic, squalid and understaffed. Hospices have to rely on charity just to keep going, and have so few beds that ten times as many people die in general hospitals than hospices.

And, as for David Cameron (lol!) and his plans to fund cancer drugs that were rejected by NICE, well, he was a nice (not NICE) guy, and he was on the side of the people. O’Mahony points out that this money alone would have funded all UK hospices for over a year.

Populism doesn’t cure cancer, but it trumps justice, evidence and fairness every time.

Along with Henry Marsh’s Do No Harm, O’Mahony’s book deserves to become a classic. Buy it and read it. Just don’t turn it into a multiple-choice exam. A brave medical school might even add it to the freshers’ pack — well, I can still dream.


The photo, facing towards Kerry, is from Penglas, a mainly Welsh hamlet in West Cork.

Pathologies of power

by reestheskin on 16/01/2021

Comments are disabled

COVID

UK COVID-19 public inquiry needed to learn lessons and save lives – The Lancet

It is hard not to be moved, and angered, on reading the editorial in this week’s Lancet, written by three members of the Covid-19 Bereaved Families for Justice group.

The UK Prime Minister Boris Johnson has previously suggested that an immediate public inquiry into the government’s handling of COVID-19 would be a distraction or diversion of resources in the fight against COVID-19. We have long proposed that quite the opposite is true: an effective rapid review phase would be an essential element in combating COVID-19.

An independent and judge-led statutory public inquiry with a swift interim review would yield lessons that can be applied immediately and help prevent deaths in this tough winter period in the UK. Such a rapid review would help to minimise further loss of life now and in the event of future pandemics. In the wake of the Hillsborough football stadium disaster on April 15, 1989, for example, the Inquiry of Lord Justice Taylor delivered interim findings within 11 weeks, allowing life-saving measures to be introduced in stadiums ahead of the next football season.

I will quote Max Hastings, a former editor of the Daily Telegraph and Evening Standard, and a distinguished military historian, writing in the Guardian many years ago. He was describing how he had overruled some of his own journalists who had suspected Peter Mandelson of telling lies.

I say this with regret. I am more instinctively supportive of institutions, less iconoclastic, than most of the people who write for the Guardian, never mind read it. I am a small “c” conservative, who started out as a newspaper editor 18 years ago much influenced by a remark Robin Day once made to me: “Even when I am giving politicians a hard time on camera,” he said, “I try to remember that they are trying to do something very difficult – govern the country.” Yet over the years that followed, I came to believe that for working journalists the late Nicholas Tomalin’s words, offered before I took off for Vietnam for the first time back in 1970, are more relevant: “they lie”, he said. “Never forget that they lie, they lie, they lie.” Max Hastings

Two of Hastings’s journalists at the Evening Standard were investigating the funds Peter Mandelson had used to purchase a house.

One morning, Peter Mandelson rang me at the Evening Standard. “Some of your journalists are investigating my house purchase,” he said. “It really is nonsense. There’s no story about where I got the funds. I’m buying the house with family money.”

I knew nothing about any of this, but went out on the newsroom floor and asked some questions. Two of our writers were indeed probing Mandelson’s house purchase. Forget it, I said. Mandelson assures me there is no story. Our journalists remonstrated: I was mad to believe a word Mandelson said. I responded: “Any politician who makes a private call to an editor has a right to be believed until he is proved a liar.” We dropped the story.

Several months later

…when the Mandelson story hit the headlines, I faced a reproachful morning editorial conference. A few minutes later, the secretary of state for industry called. “What do I have to do to convince you I’m not a crook ?” he said.

I answered: “Your problem, Peter, is not to convince me that you are not a crook, but that you are not a liar.”

The default, and most sensible course of action, is to assume that the government and many of those who answer directly to the government have lied and will continue to lie.


Canadians’ tolerance of mediocrity

Where Health Care Is a Human Right | by Nathan Whitlock | The New York Review of Books

An article discussing Canadian health care with echoes of the UK’s own parochial attitude to health care (and don’t mention Holland, Germany, France, Switzerland…).

How do such gaps and problems persist? Part of the problem, ironically, is the system’s high approval ratings: with such enthusiasm for the existing system, and with responsibility for it shared between federal and provincial or territorial governments, it’s easy for officials to avoid making necessary changes. Picard sees our narrowness of perspective as a big obstacle to reform: “Canadians are also incredibly tolerant of mediocrity because they fear that the alternative to what we have is the evil US system.” Philpott agrees that Canadians’ tendency to judge our system solely against that of the United States can be counterproductive. “If you always compare yourself to the people who pay the most per capita and get some of the worst outcomes,” she told me in a recent Zoom call, “then you’re not looking at the fact that there are a dozen other countries that pay less per capita and have far better outcomes than we do.”


The longest-lasting health care corporation in the world

Holy See – Wikipedia

The Holy See is thus viewed as the central government of the Catholic Church. The Catholic Church, in turn, is the largest non-government provider of education and health care in the world. The diplomatic status of the Holy See facilitates the access of its vast international network of charities. [emphasis added]


The antithesis of science is not art, but politics

There is a famous quote (I don’t have a primary source) by the great Rudolf Virchow:

“Medicine is a social science, and politics is nothing more than medicine on a large scale.”

I know what Virchow was getting at, but if only.

Who — or what — cares

by reestheskin on 14/12/2020

Comments are disabled

I think the quip was from the series Cardiac Arrest: the ITU used to be called the ICU (intensive care unit) until they realised nobody did.

In March, 2019, a doctor informed 78-year-old Ernest Quintana, an inpatient at a hospital in California, USA, that he was going to die. His ravaged lungs could not survive his latest exacerbation of chronic obstructive pulmonary disease, so he would be placed on a morphine drip until, in the next few days, he would inevitably perish. There was a twist. A robot had delivered the bombshell. There, on a portable machine bearing a video screen, crackled the pixelated image of a distant practitioner who had just used cutting-edge technology to give, of all things, a terminal diagnosis. The hospital insisted that earlier conversations with medical staff had occurred in person, but as Mr Quintana’s daughter put it: “I just don’t think that critically ill patients should see a screen”, she said. “It should be a human being with compassion.”

From Care in crisis – The Lancet

Retirement and the Curse of Lord Acton

by reestheskin on 10/12/2020

Comments are disabled

According to a helpful app on my phone that I like to think acts as a brake on my sloth, I retired 313 days ago. One of the reasons I retired was so that I could get some serious work done; I increasingly felt that professional academic life was incompatible with the sort of academic life I signed up for. If you read my previous post, you will see this was not the only reason, but since I have always been more of an academic than clinician, my argument still stands.

Over twenty years ago, my friend and former colleague, Bruce Charlton, observed wryly that academics felt embarrassed — as though they had been caught taking a sly drag round the back of the respiratory ward — if they were surprised in their office and found only to be reading. No grant applications open; no Gantt charts being followed; no QA assessments being written. Whatever next.

I thought about retirement from two frames of reference. The first was about finding reasons to leave. After all, until I was about 50, I never imagined that I would want to retire. I should therefore be thrilled that I need not be forced out at the old mandatory age of 65. The second was about finding reasons to stay, or better still, ‘why keep going to work?’. Imagine you had a modest private income (aka a pension): what would belonging to an institution as a paid employee offer beyond what is achievable as a private scholar or an emeritus professor? Forget sunk cost; why bother to move from my study?

Many answers straddle both frames of reference, and will be familiar to those within the universities as well as to others outwith them. Indeed, there is a whole new genre of blogging about the problems of academia, and employment prospects within it (see alt-ac or quit-lit for examples). Sadly, many posts are from those who are desperate to the point of infatuation to enter the academy, but where the love is not reciprocated. There are plenty more fish in the sea, as my late mother always advised. But looking back, I cannot help but feel some sadness at the changing wheels of fortune for those who seek the cloister. I think it is an honourable profession.

Many, if not most, universities are very different places to work in from those of the 1980s when I started work within the quad. They are much larger, they are more corporatised and hierarchical and, in a really profound sense, they are no longer communities of scholars or places that cherish scholarly reason. I began to feel much more like an employee than I ever used to, and yes, that bloody term, line manager, got ever more common. I began to find it harder and harder to characterise universities as academic institutions, although from my limited knowledge, in the UK at least, Oxbridge still manage better than most 1. Yes, universities deliver teaching (just as Amazon or DHL deliver content), and yes, some great research is undertaken in universities (easy KPIs, there), but their modus operandi is not that of a corpus of scholars and students, but rather increasingly bends to the ethos of many modern corporations that self-evidently are failing society. Succinctly put, universities have lost their faith in the primacy of reason and truth, and failed to wrestle sufficiently with the constraints such a faith places on action — and on the bottom line.

Derek Bok, one of Harvard’s most successful recent Presidents, wrote words to the effect that universities appear to always choose institutional survival over morality. There is an externality to this, which society ends up paying. Wissenschaft als Beruf is no longer in the job descriptions or the mission statements2.

A few years back, via a circuitous friendship, I attended a graduation ceremony at what is widely considered one of the UK’s finest city universities3. This friend’s son was graduating with a Masters. All the pomp was rolled out and I, and the others present, were given an example of hawking worthy of an East End barrow boy (‘world-beating’ blah blah…). Pure selling, with the market being overseas students: please spread the word. I felt ashamed for the Pro Vice Chancellor, who knew much of what he said was untrue. There is an adage that being an intellectual presupposes a certain attitude to the idea of truth, rather than a contract of employment; that intellectuals should aspire to be protectors of integrity. It is not possible to choose one belief system one day, and act on another the next.

The charge sheet is long. Universities have fed off cheap money — tax-subsidised student loans — with promises about social mobility that their own academics have shown to be untrue. The Russell group, in particular, traducing what Humboldt said about the relation between teaching and research, have sought to diminish teaching in order to subsidise research, or, alternatively, have claimed a phoney relation between the two. As for the “student experience”: as one seller of bespoke essays argued4, his business model depended on the fact that in many universities no member of staff could recognise the essay style of a particular student. Compare that with tuition in the sixth form. Universities have grown more and more impersonal, and yet claim a model of enlightenment that depends on personal tuition. Humboldt did indeed say something about this:

“[the] goals of science and scholarship are worked towards most effectively through the synthesis of the teacher’s and the students’ dispositions”.

As the years have passed by, it has seemed to me that universities are playing intellectual whack-a-mole, rather than re-examining their foundational beliefs in the light of what they offer and what others may offer better. In the age of Trump and mini-Trump, more than ever, we need that which universities once nurtured and protected. It’s just that they don’t need to do everything, nor are they for everybody, nor are they suited to solving all of humankind’s problems. As has been said before, ask any bloody question and the universal answer is ‘education, education, education’. It isn’t.

That is a longer (and more cathartic) answer to my questions than I had intended. I have chosen not to describe the awful position that most UK universities have found themselves in at the hands of hostile politicians, nor the general cultural assault by the media and others on learning, rigour and nuance. The stench of money is the accelerant of what seeks to destroy our once-modern world. And for the record, I have never had any interest in, or facility for, management beyond that required to run a small research group, and teaching in my own discipline. I don’t doubt that if I had been in charge the situation would have been far worse.

 

Reading debt

 

Sydney Brenner, one of the handful of scientists who made the revolution in biology of the second half of the 20th century, once said words to the effect that scientists no longer read papers; they just Xerox them. The problem he was alluding to was the ever-increasing size of the scientific literature. I was fairly disciplined in the age of photocopying, but with the world of online PDFs I too began to sink. Year after year, this reading debt has increased, and not just with ‘papers’ but with monographs and books too. Many years ago, in parallel with what occupied much of my time — skin cancer biology and the genetics of pigmentation, and computerised skin cancer diagnostic systems — I had started to write about topics related to science and medicine that gradually bugged me more and more. It was an itch I felt compelled to scratch. I wrote a paper in the Lancet on the nature of patents in clinical medicine and the effect intellectual property rights had on the patterns of clinical discovery; several papers on the nature of clinical discovery and the relations between biology and medicine in Science and elsewhere. I also wrote about why you cannot use “spreadsheets to measure suffering” and why there is no universal calculus of suffering or dis-ease for skin disease (here and here); and several papers on the misuse of statistics and evidence by the evidence-based-medicine cult (here and here). Finally, I ventured some thoughts on the industrialisation of medicine, and the relation between teaching and learning, industry, and clinical practice (here), as well as the nature of clinical medicine and clinical academia (here and here). I got invited to the NIH and to a couple of AAAS meetings to talk about some of these topics. But there was no interest on this side of the pond. It is fair to say that the world was not overwhelmed with my efforts.

At one level, most academic careers end in failure, or at least they should if we are doing things right. Some colleagues thought I was losing my marbles, some viewed me as a closet philosopher who was now out, and partying wildly, and some, I suspect, expressed pity for my state. Closer to home — with one notable exception — the work was treated with what I call the petit-mal phenomenon: there is a brief pause or ‘silence’ in the conversation, before normal life returns after this ‘absence’, with no apparent memory of the offending event. After all, nobody would enter such papers for the RAE/REF — they weren’t science with data and results, and since of course they weren’t supported by external funding, they were considered worthless. Pace Brenner, in terms of research assessment you don’t really need to read papers, just look at the impact factor and the amount and source of funding: sexy, or not?5

You have to continually check in with your own personal lodestar; dead reckoning over the course of a career is not wise. I thought there was some merit in what I had written, but I didn’t think I had gone deep enough into the problems I kept seeing all around me (an occupational hazard of a skin biologist, you might say). Lack of time was one issue; another was that I had little experience of the sorts of research methods I needed. The two problems are not totally unrelated; the day-job kept getting in the way.

 

 


The last three patients: dermatology

by reestheskin on 27/11/2020

Comments are disabled

Patient 3

He was nearer seventy than sixty, and not from one of Edinburgh’s more salubrious neighbourhoods. He sat on the examination couch unsure what to do next. His right trouser leg was rolled up, exposing a soiled bandage crusted with blood that had clearly been there for more than a few days. He nodded as I walked into the clinic room and I introduced myself with a shake of his hand. This was pre-Covid.

I knew his name because that was typed on the clinic list alongside the code that said he was a ‘new’1 patient, but not much else. Not much else because his clinical folder contained sticky labels giving his name, address, date of birth and health care number only. That was it. As has become increasingly the norm in the clinic room, you ask the patient if they know why they are here.

He had phoned the hospital four days earlier, he said, and he was very grateful that he had been given an appointment to see me. He thanked me as though I was his saviour. If true, I didn’t know from what or from whom. If he was a new patient he would have seen his GP and there should be a letter from his GP in his notes. But no, he hadn’t seen his GP for over a year. Had I seen him before? No, he confirmed, but he had seen another doctor in the very same department about eighteen months previously. I enquired further. He said he had something on his leg — at the site of the distinctly un-fresh bandage — that they had done something to. It had now started to bleed spontaneously. He had phoned up on several occasions, left messages and, at least once, spoken to somebody who said they would check what had happened and get back to him. ‘Get back to you’ is often an intention rather than an action in the NHS, so I was not surprised when he said that he had heard nothing back. His leg was now bleeding and staining his trousers and bed clothes, hence the bandage. He thought that whatever it had been had come back.

Finally, four days earlier, after he had relayed his story one more time over the phone, he had been given this appointment. He told me again how grateful he was to me for seeing him. And no, he didn’t know what diagnosis had been made in the past. I asked him whether he had received any letters from the hospital. No, he replied. Could he remember the name of any of the doctors he had seen over a year previously? Sadly not. Had he been given an appointment card with a consultant’s name on it? No.

There was a time when nursing and medicine were complementary professions. At one time the assistant who ushered him into the clinic room would have removed the bandage from his leg. In my clinical practice, those days ended long ago. I asked him if he would unwrap the bandage while I went in search of our admin staff to see if they knew more than me about why he was here.

He had been seen before, just as he had said, around eighteen months earlier. He had seen an ‘external provider’, one of a group of doctors employed via commercial agencies who are contracted to cope with all the patients that the regular staff employed by the hospital are unable to see. That demand exceeds supply is the one feature of the NHS that all agree on, whatever their politics. It outlives all reorganisations. Most of these external provider doctors travel up for weekends, staying in a hotel for one or more nights, and then fly back home. They get paid more than the local doctors (per clinic), and the agency takes a substantial arrangement fee in addition. This has been the norm for over ten years, and of course it makes little clinical or financial sense — except if the name of the game is to be able to shape waiting lists in phase with electoral or political cycles, turning the tap on and off. Usually more off than on.

The doctors who undertake this weekend work are a mixed bunch. Most of them are very good, but of course they don’t normally work in Scotland, and medicine varies across the UK and Europe, and even between regions within one country. It is not so much the medicine itself that differs as the way the different components of care fit together organisationally. This hints at one fault line.

That the external doctors are more than just competent is important for another reason. The clinic lists of the visiting doctors are much busier than those of the local doctors, and are full of new patients rather than patients brought back for review. The NHS and the government consider review appointments as wasteful, and that is why all the targets relate to ‘new’ patients. It’s a numbers game: stack them high, don’t let the patients sit down for too long, and process them. Meet those government targets and move in phase with the next election cycle. Consequently, the external provider doctors are being asked to provide episodic care under time pressure; speed dating rather than maintaining a relationship. For most of the time, nobody who actually works in Edinburgh knows what is going on with the patient. But the patients do live in Edinburgh.

Old-timers like me know that one of the reasons why review appointments are necessary is that they are a safety net, a backup system. In modern business parlance, they add resilience. Like stocks of PPE. In the case of my man, a return appointment would have provided the opportunity to tell him what the hell was going on and to ensure that all that had actually been planned had been carried out. There is supposed to be a beginning, a middle and an end. There wasn’t.

An earlier letter from an external provider doctor was found. It was a well-written summary of the consultation. The patient had a lesion on his leg that was thought clinically to be pre-malignant. The letter stated that if a diagnostic biopsy confirmed this clinical diagnosis — it did — then the patient would require definitive treatment, most likely surgical. The problem was that in this informal, episodic model the original physician was not there to act on the result, nor to observe that the definitive surgical treatment had not taken place, because review appointments are invisible in terms of targets. They are wasteful.

Even before returning to the clinic room, without sight of anything but the blood-stained bandage, I knew what was going on. His pre-malignant lesion had, over the period of ‘wasteful’ time, transformed into full-blown cancer. He now had a squamous cell carcinoma. His mortality risk had gone from effectively zero to maybe 5%.

I went back to the clinic room, apologised, explained what had gone on and what needed to happen now, and apologised again. The patient picked up on my mixture of frustration, shame and anger, and it embarrasses me to admit that I had somehow allowed him — mistakenly — to imagine that my emotions were a response to something he had said or done. I apologised again. And then he did say something that fired my anger. I cannot remember the whole sentence but a phrase within it stuck: ‘not for the likes of me’. His response to the gross inadequacy of his care was that it was all people like him could expect.

He was not literally the last patient in dermatology I saw, but his story was the one that told me I had to get out. When a pilot or an airline engineer says that an aircraft is safe to fly, there is an unspoken bond between passengers and those who dispense a professional judgement. But this promise is also made by one human to another human. I call it the handshake test, which is why I always shook hands when I introduced myself to patients. This judgement, which is both professional and personal, has to be compartmentalised away from the likes of sales and marketing, the share price — and government targets or propaganda. This is no longer true of the NHS. The NHS is no longer a clinically led organisation; rather, it is a vehicle for ensuring that one political gang gains ascendancy over the other at the next election. It is not so much about money as about control. True, if doctors went down with the plane, in this metaphor, there would be a much better alignment of incentives. Doctors might be yet more awkward. Better still, we might think about where we seat the politicians and their NHS commissars.

Most doctors keep a shortlist of other doctors who they think of as exceptional. These are the ones they would visit themselves or recommend to family. If I had to rank my private shortlist, I know who would come number one. She is not a dermatologist, but a physician of a different sort, and she works far away from Edinburgh. She has been as loyal and tolerant of the NHS as anybody I know — much more than me. Yet she retired before me, and her reasoning and justification were as insightful and practical as her medical abilities. Simply put, she could no longer admit her patients and feel able to reassure them that the care they would receive would be safe. It’s the handshake test.

I don’t shake hands with patients any more.

  1. A ‘new’ outpatient is usually a patient you are seeing for the first time, after they have been referred by their GP or another consultant. During this ‘illness episode’, if you see them again, they are a ‘review’ patient. Once they have been discharged from hospital review, they may of course re-enter the system — say many years later — as a ‘new’ patient once more, with the same or a different condition.

Link to patient 1

Link to patient 2

The last three patients: general medicine

by reestheskin on 25/11/2020

Comments are disabled

Patient 2

It hasn’t happened to me often — maybe on only a handful of occasions — but often enough to recognise it, and dread it. I am talking to a patient, trying to second-guess the future — how likely is it that their melanoma might stay away for ever, for instance — and I find myself mouthing words that a voice in my head is warning me I will regret saying. And the voice is not so much following my words as anticipating them, so I cannot cite ignorance as an excuse; nor is it a whisper or unclear in any way, and yet I still charge on. A moment later, regret will set in, a regret I could share with you at that very moment if you were there with me.

The patient was a young man in his early twenties, who lived with his mother, just the two of them at home. He had dark curly hair, was of average height, and he lived for running. This was Newcastle, in the time of Brendan Foster and Steve Cram. He had been admitted with pyrexia, chest pains and a cough. He had bacterial pneumonia, and although he seemed pretty sick, none of us were worried for him.

After a few days, he seemed no better, and we switched antibiotics. Medics reading this will know why. He started to improve within a day or so, and we felt we were in charge, pleased with, and confident of our decisions. This was when I spoke with his mother, updating her on his progress. Yes, he had been very ill; yes, we were certain about his diagnosis; and yes, the change of antibiotics and his response was not unexpected. I then said more. Trying to reassure her, I said that young fit people don’t die from pneumonia any more. That was it. All the demons shuttered.

At this time I was a medical registrar and I supervised a (junior) house officer (HO), and a senior house officer (SHO). In turn, my boss was a consultant physician who looked after ‘general medical’ patients, but his main focus was clinical haematology. In those days the norm was for all of a consultant’s patients to be managed on their own team ward. On our ward, maybe half the patients were general medical, and the others had haematological diseases. Since I was not a haematologist, I was solely tasked with looking after the general medical patients, and mostly acted without the need for close supervision (in a way that was entirely appropriate).

One weekend I was doing a brief ‘business’ ward round on the Sunday morning. Our young man with pneumonia was doing well, his temperature had dropped, and he was laughing and joking. We would have been making plans to let him home soon. The only thing of note was that the houseman reported that the patient had complained of some pain in one calf. I had a look and although the signs were at best minimal I wondered whether he could have had a deep vein thrombosis (DVT). Confirmatory investigations for DVTs in those days were unsatisfactory and not without iatrogenic risk, whilst the risks from anticoagulation in a previously fit young man with no co-morbidities were minimal. We started him on heparin.

A few days later he was reviewed on the consultant’s ward round. I knew that the decision to anticoagulate would (rightly) come under review. The physical signs, once subtle, were now non-existent, and the anticoagulation was stopped. A reasonable decision, I knew, but one that I disagreed with, perhaps more because of my touchy ego than deep clinical judgement.

Every seven to ten days or so I would be the ‘resident medical officer’ (RMO), meaning I would be on call for unselected medical emergencies. Patients might be referred directly to us by their general practitioner, or as ‘walk-ins’ via casualty (ER). In those days we would usually admit between 10 and 15 patients over a 24-hour period; and we might also see a further handful of patients who we judged did not require hospital admission. Finally, since we were resident, we continued to provide emergency medical care to the whole hospital, including our own preexisting patients.

It was just after 8.30am. The night had been quiet, and I was in high spirits as this was the last time I would act as an RMO. In fact, this was to be my last day as a ‘medical registrar’. Shortly after, I would leave Newcastle for Vienna and start out as an academic dermatologist, a path that had been planned many years before.

The clinical presentation approaches that of a cliché. A patient, with or without various risk factors, who has been ill with one of a myriad of different conditions, goes to the toilet to move their bowels. They collapse, breathless, and go into shock. CPR may or may not help. A clot from their legs has broken free and blocked the pulmonary trunk. Sufficient blood can no longer make the circuit from the right side of the heart to the left. The lungs and heart are torn asunder.

When the call went out, as RMO, I was in charge. Nothing we did worked. There is a time to stop, and I ignored it. One of my colleagues took the decision. Often with cardiac arrests, you do not know the patient. That helps. Often the call is about a patient who is old and with multiple preexisting co-morbidities. That is easier, too. But I knew this man or boy; and his mother.

That was the last patient I ever saw in general medicine.

[Link to Patient 1]
[Link to Patient 3]

The last three patients: general practice

by reestheskin on 23/11/2020

Comments are disabled

Patient 1 

When I was a medical registrar I did GP locums for a single-handed female GP in Newcastle. Doing them was great fun, and the money — she insisted on BMA rates — was always welcome. Nowadays, without specific training in general practice, you can’t act as a locum as I did then. This is probably for the best but, as ever, regulations always come with externalities, one of which is sometimes a reduction in overall job satisfaction.

I worked as a locum over a three-year period, usually for one week at a time, once or twice a year, covering some of the GP’s annual leave. Weekdays were made up of a morning surgery (8.30 to 10.30 or later), followed by house-calls through lunchtime to early afternoon, and then an evening surgery from 4.30 to around 6.30. I also ran a short Saturday morning surgery. Within the working day I could usually nip home for an hour or so.

From 7pm till the following morning, the Doctors Deputising Service (DDS) took over for emergency calls. They also covered the weekends. The DDS employed other GPs or full-time freelancers. Junior hospital doctors often referred to the DDS as the Dangerous Doctor Service. Whether this moniker was deserved, I cannot say, but seeing patients you don’t know in unfamiliar surroundings is often tricky. Read on.

Normally, the GP would cover the nights herself, effectively being on call 24 hours a day, week in, week out. Before she took leave, she used to manage her patients proactively, letting some of her surgery ‘specials’ or ‘regulars’ know she would be away, and that they might therefore be better served by waiting for her return. Because she normally did her own night calls, she was aware that a small group of patients might request night visits that could be judged unnecessary. I think the fee the DDS charged her depended on how often a visit was requested, so, as far as was reasonable, she tried to ensure her patients knew that when she was away they would only get a visit from a ‘stranger’ — home night-time call-outs should be for real emergencies. I got the strong impression that her patients were very fond of her, and she of them. Without exception, they were always very welcoming to me, and I loved the work. Yes, I got paid, but it was fun medicine, and it offered a freedom that you didn’t feel in hospital medicine as a junior (or senior) doctor.

The last occasion I undertook the locum was eventful. I knew that this was going to be the last occasion, as that summer I was moving on from internal medicine to start training in dermatology — leaving for Vienna in early August. A request for a house-call, from a forty-year-old man with a headache, came in just as the Friday evening surgery was finishing, a short while after 6.30pm. My penultimate day. I had been hoping to get off sharpish, knowing I would be doing the Saturday morning surgery, but contractually I was covering until 7pm, so my plan was to call at the patient’s house on the way home.

I took his clinical paper notes with me. There was virtually nothing in them, a fact that doctors recognise as a salient observation. He lived, as did most of the surgery’s patients, on a very respectable council estate that literally encircled the surgery. I could have walked, but chose to drive, knowing that since I had locked up the surgery, I could go straight home afterwards.

When I got to his house, his wife was standing outside, waiting for me. She was most apologetic, informing me that her husband was not at home, but had slipped out to take his dog for a walk. I silently wondered why, if this was the case, he couldn’t have taken the dog with him to the surgery, saving me a trip. No matter. Grumbling about patient behaviour is not unnatural, but it is often the parent of emotions that can cloud clinical judgement. There lie dragons.

The patient’s wife ran to the local park to find her husband, who came running back to the house at a fair pace a few minutes later, wife and dog in tow. The story was of a headache on one side of his head, posterior to the temple, that had started a few hours earlier. The headache was not severe, he told me, and he felt well; he didn’t think he had flu. His concern was simply that he didn’t normally get headaches. There was nothing else remarkable about his history; he was not on any medication, and had no preexisting complaints or diseases beyond the occasional cold. Nor did the headache itself provide any diagnostic clues. On clinical examination he was apyrexial, with a normal pulse and blood pressure, and a thorough neuro exam (as in one performed by somebody who had recently done a neuro job) was normal. There was no neck stiffness or photophobia, and the fundi were visualised and clear. The best I could do was wonder about a hint of erythema on his tympanic membrane on the side of the headache, but there was no local tenderness there. I worried I was making the signs fit the story.

I told him I couldn’t find a good explanation for his headache, and that my clinical examination of him was essentially normal. There was a remote possibility that he had a middle ear infection, although I said that since he had no history of previous ear infections, this seemed unlikely. I opted to give him some amoxycillin (from my bag) and said that whilst night-time cover would be provided by the DDS, I would be holding a surgery on the Saturday morning, in just over 12 hours’ time. Should he not feel right, he could pop in to see me, or I could visit him again. He and his wife thanked me for coming round, I went home and, as far as I knew, that was the end of the story of my penultimate day as a locum GP. He did not come to my Saturday morning surgery.

Several weeks later, when I was back doing internal medicine and on call for urgent GP referrals, the same GP phoned me up about another of her patients who she thought merited hospital assessment. This was easily sorted, and I then asked her about some of the patients of hers I had seen when I was her locum. There was one in particular, with abdominal pain, whom I had sent into hospital, and I wanted to know what had happened to him. She then told me that the patient had meningitis. There was a moment of confusion: we were not talking about the same patient.

The story of the man with the headache was as follows. I had seen him just before 7pm: apyrexial, fully conscious, with a normal pulse and blood pressure, and no neuro signs. By 8pm his headache was much more severe, and his wife put a call in to the DDS, who saw him before 9pm but could not find anything abnormal. By 10.30pm he was barely conscious, and his wife called the DDS again, only to be told there would be a delay. Soon after, she dialled 999. He was admitted, diagnosed with bacterial meningitis, and treated. The GP told me he had made a prompt and complete recovery.

That was the last patient I ever saw in general practice.

[Link to patient 2]

[Link to patient 3]

You would have to be mad…

by reestheskin on 20/11/2020

Comments are disabled

“I have been this close to buying a nursing school.” This is not a sentence you expect to hear from a startup founder. Nursing seems a world away from the high-tech whizziness of Silicon Valley. And, to use a venture-capital cliché, it does not scale easily.

This was from an article in the Economist a while back. As ever, there is a mixture of craziness and novelty. The gist of the article is about Lambda School, a company that matches ‘fast’ training with labour-force shortages (hence the nursing angle). When I first read it, I had thought they had already opened a nursing school, but that is not so. Nonetheless, there are aspects that interest me.

We learn that

  1. Full-time students attend for nine months, five days a week from 8am to 5pm. Latecomers risk falling behind, but for most classes 85% of students who begin a course finish it. Study is online but ‘live’ (rather than via pre-recorded videos). These completion rates are a lot higher than for many community colleges in the US.
  2. Lambda only gets paid after its students have landed a job which pays them more than $50,000 a year. Around 70% of those enrolled land such a job within six months of graduation. Lambda then receives about a sixth of their income for the next two years, until they have paid about $30,000 (or they could pay $20,000 up front). A rough worked example follows this list.
  3. One third of the costs are spent on finding jobs for graduates, another third on recruitment and only one third on the actual teaching. Scary.
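
To make the arithmetic in point 2 concrete, here is a minimal sketch in Python. It assumes a deliberately simplified model (a flat salary over the whole repayment period) and uses only the figures quoted above; the real contract terms will differ.

    # A deliberately simplified model of the income-share agreement described
    # above: nothing is owed below the salary threshold; otherwise a fixed
    # share of income is paid for a fixed number of years, capped at a total.
    # Figures are those quoted in the article, not Lambda's actual terms.

    def isa_repayment(annual_salary: float,
                      share: float = 1 / 6,
                      years: int = 2,
                      threshold: float = 50_000,
                      cap: float = 30_000) -> float:
        """Total repaid, assuming a flat salary over the whole period."""
        if annual_salary < threshold:
            return 0.0
        return min(share * annual_salary * years, cap)

    print(isa_repayment(60_000))   # 20000.0 (a sixth of income for two years)
    print(isa_repayment(120_000))  # 30000.0 (hits the cap)
    print(isa_repayment(45_000))   # 0.0 (below the $50,000 threshold)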

The Economist chimes in with the standard “Too often students are treated as cash cows to be milked for research funding.” Too true, but to solve this issue we need to massively increase research costings, have meaningful conversations with charities and government (including the NHS) about the way students are forced to involuntarily subsidise research, and cut out a lot of research in universities that is the D of R&D.

But this is not a sensible model for a university. On the other hand it is increasingly evident to me that universities are not suitable places to learn many vocational skills. The obvious immediate problem for Lambda is finding and funding a suitable clinical environment. That is exactly the problem that medical (or dental) schools face. A better model is a sequential one, one which ironically mimics the implicit English model of old: university study, followed by practical hospital clerkships. Just tweak the funding model to allow it.

It’s just business

by reestheskin on 19/11/2020

Comments are disabled

I have rich memories of general practice, and I mean general practice rather than primary care 1. My earliest memories centre around a single-handed GP, who looked after my family until we left Wales in the early 1970s. His practice was in his house, just off Whitchurch village in Cardiff. You entered by what once may have been the back gate or tradesman’s entrance. Around the corner and a few steps up, you found the waiting room. Originally, I guess, it might have been a washroom or utility room for a maid or housekeeper. By the standards of the Rees abode the house was large.

The external door to the waiting room was opposite the door into the main part of the doctor’s house, and on the adjacent sides were two long benches. They were fun for a little boy to sit on because, since your legs couldn’t touch the floor, you could shuffle along as spaces became available. When you did this adults tended to smile at you; I now know why. If you were immobile for too long your thighs might stick to the faux-leather surface; pulling them away fast resulted in a fart-like noise, although in those days I was too polite to think out loud.

Once you were called — whether by the doctor or his wife I cannot remember — you entered his ‘rooms’. The consulting room was, by my preferred unit of measure — how far I could kick a ball — large, with higher ceilings than we had at home. The floorboards creaked and the carpet was limited to the centre of the room. If there was a need for privacy there was what seemed like a fairly inadequate freestanding curtained frame. For little boys, obviously, no such cover was deemed necessary.

I can remember many home visits: two stand out in particular: mumps, and an episode of heavily infected eczema in which my body was covered in thousands of pustules, and where I remember pulling off sheets of skin that had stuck to the bedclothes. The sick-role was respected in our home: if you were ill and off school you were in bed. Well, almost. Certainly, no kicking the ball against the wall.

Naturally, the same GP would look after any visitors to my home. Although my memories are influenced by what my mother told me, on one occasion my Irish grandmother’s valvular heart failure decompensated when she was staying with us (her home was in Dublin). More precisely, I was turfed out of my bed so she could occupy it. The GP phoned the Cardiff Royal Infirmary, explaining that the patient needed admission, and would they oblige? The GP, however, took ten years or so off her true age. Once he was off the phone, my mother corrected him. He knew better: ‘If I had told them the truth they would have refused to admit her,’ he said. (This was general practice, not state medicine, after all.) The memory of this event stuck with me when I was a medical student on a geriatrics attachment in Sunderland circa 1981. Only those under 60 with an MI were deemed suitable for admission to the CCU, with the rest left in a large Nightingale ward with no cardiac monitoring 2. I thought of my father, who was then close to 60.

I was lucky enough to be able to recognise this type of general practice — albeit with many much-needed changes — as a medical student in Newcastle, to be taught by some wonderful GPs, and even to do some GP locums when I was a medical registrar. And although I never met the late and great Julian Tudor-Hart face-to-face, we were linked by a couple of mutual Welsh friends, and we exchanged the odd email over the years.

So, why do I recall all of this? Nostalgia? Yes, I own up to that. But more out of anger that what was unique about UK general practice has been replaced by primary care and “population medicine”, and that many patients are worse off because of this shift. Worse still, it now seems all is viewed not through the lens of vocation but through the egregious ‘it’s just business’. Continuity of care and “personal doctoring” have been lost.

I write after being provoked by a comment in the London Review of Books. Responding to a terrific article by John Furse on the NHS, Helen Buckingham of the Nuffield Trust states — as many do — that “The reality is that almost all GP practices are already private businesses, and have been since the founding of the NHS.” (LRB 5/12/2019 page 4).

Well, for me, this is pure sophistry. There are businesses and businesses. If you wish, you might call the Catholic Church a business, or Edinburgh University a business, or even the army a business. You might even refer to each of them as a corporation. But to do so misses all those human motivations that make up civil society. Particularly the ability to look people in the eye and not feel grimy. There is no way on earth that the GP who looked after me would have called what he did a business. Nor was he part of any corporation. And the reason is simple: like many think tanks, many modern corporations — especially the larger ones — have no sense of morality beyond the dollar of the bottom line3, often deploying their undoubted skills in wilfully arbitraging the imperfections of regulation and honest motivation. It does not have to be this way.

  1. Here I am echoing the arguments made by Howie, Metcalfe and Walker in the BMJ in 2008: The State of General Practice — not all for the better. Comments on this article effectively said: the halcyon days of general practice are over, get used to it! I am not convinced. What has happened is that ‘government-led population/public health’ has gobbled up ‘personal doctoring’. For the latter, it appears, you will need more than the NHS.
  2. Many epidemiologists argued that there was no need for CCUs as no RCTs had shown their benefit. Ditto for parachutes and renal transplantation, no doubt.
  3. You can insert your own favourite du jour: Pfizer and Flynn for raising the price of an anti-epilepsy drug by up to 2,600 per cent, or GSK, or Crapita, or Test and Trace, etc. The list goes on, well before we get to the likes of Facebook or the Financial Services Industry.

On Idle thoughts and Flexner

by reestheskin on 18/11/2020

Comments are disabled

I have previously commented on Abraham Flexner on this site. The Flexner report is the most influential review of US medical education ever published, although some would argue that the changes it recommended were already working their way through the system. For a long time I was unaware of another project of his, an article with the title The Usefulness of Useless Knowledge 1. For me, there are echoes of Bertrand Russell’s In Praise of Idleness, and the fact that Flexner’s essay was published at the onset of World War 2 adds another dimension to the topic.

As for medical education, the ever-growing pressure is to teach so much that many students don’t have time to learn anything. I wish comments such as the following from Flexner opened every GMC dictum on what a university medical education should be all about.

“Now I sometimes wonder,” he wrote, “whether there would be sufficient opportunity for a full life if the world were emptied of some of the useless things that give it spiritual significance; in other words, whether our conception of what is useful may not have become too narrow to be adequate to the roaming and capricious possibilities of the human spirit.”

  1. The essay, originally published in Harper’s Magazine, was republished with a companion essay by Robbert Dijkgraaf by Princeton University Press in 2017.

On rejection by editors and society

by reestheskin on 16/11/2020

Comments are disabled

The history of science is the history of rejected ideas (and manuscripts). One example I always come back to is the original work of John Wennberg and colleagues on spatial differences in ‘medical procedures’, and the idea that it is not so much medical need that dictates the number of procedures as the supply of medical services. Simply put: the more surgeons there are, the more procedures are carried out1. The deeper implication is that many of these procedures are not medically required — it is just the billing that is needed: surgeons have mortgages and tuition loans to pay off. Wennberg and colleagues at Dartmouth have subsequently shown that a large proportion of the medical procedures or treatments that doctors undertake are unnecessary2.

Wennberg’s original manuscript was rejected by the New England Journal of Medicine (NEJM) but subsequently published in Science. Many of us would rate Science above the NEJM, but there is a lesson here about signal and noise, and how many medical journals in particular obsess over procedure and status at the expense of nurturing originality.

Angus Deaton and Anne Case, two economists, the former with a Nobel Prize to his name, tell a similar story. Their recent work has been on the so-called Deaths of Despair — where mortality rates for subgroups of the US population have increased3. They relate this to educational levels (the effects are largely on those without a college degree) and other social factors. The observation is striking for an advanced economy (although Russia saw increased mortality rates after the collapse of communism).

Coming back to my opening statement, Deaton is quoted in the THE:

The work on “deaths of despair” was so important to them that they [Deaton and Case] joined forces again as research collaborators. However, despite their huge excitement about it, their initial paper, sent to medical journals because of its health focus, met with rejections — a tale to warm the heart of any academic whose most cherished research has been knocked back.

When the paper was first submitted it was rejected so quickly that “I thought I had put the wrong email address. You get this ping right back…‘Your paper has been rejected’.” The paper was eventually published in Proceedings of the National Academy of Sciences, to a glowing reception. The editor of the first journal to reject the paper subsequently “took us for a very nice lunch”, adds Deaton.

Another medical journal rejected it within three days with the following justification:

The editor, he says, told them: “You’re clearly intrigued by this finding. But you have no causal story for it. And without a causal story this journal has no interest whatsoever.”

(‘no interest whatsoever’ — the arrogance of some editors).

Deaton points out that this is a problem not just for medical journals but for economics journals, too; he thinks the top five economics journals would have rejected the work for the same reason.

“That’s the sort of thing you get in economics all the time,” Deaton goes on, “this sort of causal fetish… I’ve compared that to calling out the fire brigade and saying ‘Our house is on fire, send an engine.’ And they say, ‘Well, what caused the fire? We’re not sending an engine unless you know what caused the fire.’”

It is not difficult to see the reasons for this fetish for causality. Science is not just a loose-leaf book of facts about the natural or unnatural world, nor is it just about A/B testing or theory-free RCTs, or even just ‘estimation of effect sizes’. Science is about constructing models of how things work. But sometimes the facts are so bizarre in the light of previous knowledge that you cannot ignore them, because without these ‘new facts’ you cannot build subsequent theories. Darwin and much of natural history stand as examples here, but my personal favourite is that provided by the great biochemist Erwin Chargaff in the late 1940s. Wikipedia describes the first of his ‘rules’:

The first parity rule was that in DNA the number of guanine units is equal to the number of cytosine units, and the number of adenine units is equal to the number of thymine units.

Now, in one sense a simple observation (C=G and A=T), with no causal theory. But run the clock on to Watson and Crick (and others), and see how this ‘fact’ gestated an idea that changed the world.
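
As an aside, and purely as a toy illustration rather than an analysis of real data, here is a minimal sketch in Python of what the parity rule amounts to when you tally the bases of double-stranded DNA (the sequence below is invented):

    # Toy illustration of Chargaff's first parity rule: in double-stranded
    # DNA the amount of adenine matches thymine, and guanine matches cytosine.
    from collections import Counter

    def parity_holds(duplex: str, tolerance: float = 0.02) -> bool:
        """Check A~T and G~C in a sequence representing both strands."""
        counts = Counter(base for base in duplex.upper() if base in "ACGT")
        total = sum(counts.values())
        frac = {base: counts[base] / total for base in "ACGT"}
        return (abs(frac["A"] - frac["T"]) <= tolerance
                and abs(frac["G"] - frac["C"]) <= tolerance)

    # An invented strand joined to its reverse complement stands in for the
    # duplex; complementary base pairing is what produces the parity that
    # Chargaff measured in bulk DNA.
    strand = "ATGCGGCTATTAGC"
    complement = strand.translate(str.maketrans("ACGT", "TGCA"))[::-1]
    print(parity_holds(strand + complement))  # True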

  1. The original work was on surgical procedures undertaken by surgeons. Medicine has changed, and now physicians undertake many invasive procedures, and I suspect the same trends would be evident.
  2. Yes, you can go a lot deeper on this topic and add in more nuance.
  3. Their book on this topic is Deaths of Despair and the Future of Capitalism, published by Princeton University Press.

A carry-on of professors

by reestheskin on 11/11/2020

Comments are disabled

There was a touching obituary of Peter Sleight in the Lancet. Sleight was a Professor of Cardiovascular Medicine at Oxford, and the obituary highlighted both his academic prowess and his clinical skills. Hard modalities of knowledge to combine in one person.

Throughout all this, at Oxford’s Radcliffe Infirmary and John Radcliffe Hospital, Sleight remained an expert bedside clinician, who revelled in distinguishing the subtleties of cardiac murmurs and timing the delays of opening snaps.

And then we learn:

An avid traveller, Sleight was a visiting professor in several universities; the Oxford medical students’ Christmas pantomime portrayed him as the British Airways Professor of Cardiology. [emphasis added]

This theme must run and run, and student humour is often insightful (and on occasion, much worse). I worked somewhere where the nickname for the local airport was that of a fellow Gold Card professor. We often wondered what his tax status was.

Where there is muck, there is…science

by reestheskin on 20/10/2020

Comments are disabled

The background is the observation that babies born by Caesarean section have different gut flora from those born vaginally. The interest in gut flora arises because many believe it relates causally to some diseases. How do you go about investigating such a problem?

Collectively, these seven women gave birth to five girls and two boys, all healthy. Each of the newborns was syringe-fed a dose of breast milk immediately after birth—a dose that had been inoculated with a few grams of faeces collected three weeks earlier from its mother. None of the babies showed any adverse reactions to this procedure. All then had their faeces analysed regularly during the following weeks. For comparison, the researchers collected faecal samples from 47 other infants, 29 of which had been born normally and 18 by Caesarean section. [emphasis added]

Healthy childbirth — How to arm Caesarean babies with the gut bacteria they need | Science & technology | The Economist

They always put me on hold. Thank You for Being Expendable

Years after I first returned from Iraq and started having thoughts and visions of killing myself, I’d call the Department of Veterans Affairs. They always put me on hold. 

NYT

Mindlessness

by reestheskin on 10/10/2020

Comments are disabled

I cannot see the future, but like many, I have private models that I use to order the world, and for which I often have very little data. For instance, I think it obvious that the traditional middle-class professions (medicine, law, veterinary medicine, architecture, dentistry, academia) are increasingly unattractive as careers1. I am not complaining about my choices — far from it; I benefited from the tailwinds of the dramatic social change that wars and other calamities bring. But my take on what has happened to school teachers and teaching is the model for what will happen to many others. I say this with no pleasure: there are few jobs more important. But the tragedy of schoolteaching — which is our tragedy — will continue to unfold as successive gangs of politicians of either stripe, armed with nothing more than some borrowed bullet points, play to the gallery. Similarly, in higher education, within a timescale of almost 40 years I have seen at first hand changes that would make me argue that not only are the days of Donnish Dominion (to use Halsey’s phrase2) well and truly over, but that most UK universities will be unable to recruit the brightest to their cause. I think we see that in clinical academia already — and not just in the UK. Amidst all those shiny new buildings moulded for the student experience (and don’t forget the wellness centres…), the ennui of corporate mediocrity beckons. The bottom line is the mission statement.

As for medicine, a few quotes below from an FT article from late last year. I assume that, without revolutionary change, we will see more and more medical students being trained, and more and more doctors leaving mid-career. If you keep running to stand still, the motivation goes. And that is without all the non-COVID-19 effects of COVID-19.

One of the major factors for doctors is the electronic record system. It takes a physician 15 clicks to order a flu shot for a patient, says Tait. And instead of addressing this problem, healthcare companies end up offering physicians mindfulness sessions and healthy food options in the cafeteria, which only frustrates them further…[emphasis added]

Over the past few years, efforts have been made to increase the number of medical schools in the US to ensure that there is no shortage of doctors. “When you think about how much we’ve invested to create, roughly, 10 to 12 new medical schools in the last decade, at hundreds of millions of dollars per school, just to increase the pipeline of physicians being trained, we also need to think at the far end of the physicians who are leaving medicine because of burnout,” says Sinsky.

Take the case of a final-year resident doctor in New York, who spends a considerable part of his shift negotiating with insurance companies to justify why his patient needs the medicines he prescribed. “When I signed up to be a doctor, the goal was to treat patients, not negotiate with insurance providers,” he says.

According to Tait, 80 per cent of the challenge faced by doctors is down to the organisation where they work, and only 20 per cent could be attributed to personal resilience.

Re the final quote, 80:20 is being generous to the organisations.

Burnout rife among American doctors | Financial Times

  1. Richard and Daniel Susskind provide a good take on this theme. See The Future of the Professions, OUP, 2015.
  2. A. H. Halsey, Decline of Donnish Dominion: The British Academic Professions in the Twentieth Century, OUP, Oxford, 1992.

The biomass of parasitism approaches unity.

by reestheskin on 09/10/2020

Comments are disabled

Many years ago I was expressing exasperation at what I took to be the layers and layers of foolishness that meant that others couldn’t see the obvious — as defined by yours truly, of course. Did all those wise people in the year 2000 think that gene therapy for cancer was just around the corner, or that advance in genetics was synonymous with advance in medicine, or that the study of complex genetics would, by the force of some inchoate logic, lead to cures for psoriasis and eczema? How could any society function when so many of its parts were just free-riding on error, I asked? Worse still, these intellectual zombies starved the new young shoots of the necessary light of reason. How indeed!

William Bains, he of what I still think of as one of the most beautiful papers I have ever read1, put me right. William understood the world much better than me — or at least he understood the world I was blindly walking into much better. He explained to me that it was quite possible to make money (both ‘real’ and in terms of ‘professional wealth’) out of ideas that you believed to be wrong, as long as two linked conditions were met. First, do not tell other people you believe them to be wrong. On the contrary, talk about them as the next new thing. Second, find others who are behind the curve, and who are willing to buy from you at a price greater than you paid (technical term: fools). At the time, I did not even understand how pensions worked. Finally, William chided me for my sketchy knowledge of biology: he reminded me that in many ecosystems parasites account for much, if not most, of the biomass. He was right; and although my intellectual tastes have changed, the sermon still echoes.

The reason is that corporate tax burdens vary widely depending on where those profits are officially earned. These variations have been exploited by creative problem-solvers at accountancy firms and within large corporations. People who in previous eras might have written symphonies or designed cathedrals have instead saved companies hundreds of billions of dollars in taxes by shifting trillions of dollars of intangible assets across the world over the past two decades. One consequence is that many companies avoid paying any tax on their foreign sales. Another is that many countries’ trade figures are now unusable. [emphasis added].

Trade Wars Are Class Wars: How Rising Inequality Distorts the Global Economy and Threatens International Peace, by Matthew C. Klein and Michael Pettis.

  1. William Bains. Should You Hire an Epistemologist? Nature Biotechnology, 1997.

Go west young man

by reestheskin on 08/10/2020

Comments are disabled

But after completing medical training, Sacks fled the homophobic confines of his nation and family—his mother had called him “an abomination.” Paul Theroux tells Burns that Sacks’s “great luck” was ending up in Los Angeles in 1960, where he found ample “guys, weights, drugs, and hospitals.”

Advance requires those who can imagine new spaces, and medicine is even more hostile to such people today than it was all those years ago. We pretend otherwise, thinking those tick-box courses will suffice, but real diversity of intellect is the touchstone of our future.

A case study of Oliver Sacks | Science

Medicine is awaiting its own Photoshop

by reestheskin on 05/10/2020

Comments are disabled

My experience is limited, but everything I know suggests that much IT in healthcare diminishes medical care. It may serve certain administrative functions (who is attending what clinic and when, etc.), and, of course, there are particular use cases — such as repeat prescription control in primary care — but as a tool to support the active process of managing patients and improving medical decision making, healthcare has no Photoshop.

In the US it is said that an ER physician will click their mouse over 4,000 times per shift, with frustration with IT being a major cause of physician burnout. Published data show that the ratio of patient-facing time to admin time has halved since the introduction of electronic medical records (i.e. things are getting less efficient). We suffer slower and worse care: research shows that once you put a computer in the room, eye contact between patient and physician drops by 20-30%. This is to ignore the crazy extremes, like the hospital that created PDFs of the old legacy paper notes but then — wait for it — ordered them online not as a time-sequential series but randomly, expecting the doc to search each one. A new meaning for the term RAM.

There are many proximate reasons for this mess. There is little competition in the industry and a high degree of lock-in because of a failure to use open standards. Then there is the old AT&T problem of not allowing users to adapt and extend the software (AT&T famously refused to allow users to add answering machines to their handsets). But the ultimate causes are that reducing admin and support staff salaries is viewed as more important than allowing patients meaningful time with their doctor; and that those purchasing IT have no sympathy or insight into how doctors work.

The context is wildly different — it is an exchange about the OLPC project and how to use computers in schools — but here are two quotes from Alan Kay that made me smile.

As far as UI is concerned — I think this is what personal/interactive computing is about, and so I always start with how the synergies between the human and the system would go best. And this includes inventing/designing a programming language or any other kind of facility. i.e. the first word in “Personal Computing” is “Person”. Then I work my way back through everything that is needed, until I get to the power supply. Trying to tack on a UI to “something functional” pretty much doesn’t work well — it shares this with another prime mistake so many computer people make: trying to tack on security after the fact …[emphasis added]

I will say that I lost every large issue on which I had a firm opinion.