“in a mammoth bureaucracy obsessed with its own secrecy, the fault lines are best observed by those who, instead of peering down from the top, stand at the bottom and look up.”
Absolute Friends by John le Carré
True of the NHS.
Digital Minimalism — An Rx for Clinician Burnout | NEJM
It’s easy to see how digital minimalism’s first tenet, “clutter is costly,” applies: the average patient’s EHR has 56% as many words as Shakespeare’s longest play, Hamlet. Moreover, half these words are simply duplicated from previous documentation.
Progress isn’t inevitable. Not even likely in some domains over working lifetimes.
Rotten, with no quick fixes: the state of our mouths reflects the plight of NHS dentistry | George Monbiot | The Guardian
Every child in the UK is entitled to free treatment by a nonexistent dentist. Some people on benefits, pregnant women and those who have recently given birth also have free and full access to an imaginary service. Your rights are guaranteed, up to the point at which you seek to exercise them.
The painfully high price of Humira is patently wrong | Financial Times
But nearly half of new drugs launched in the US in 2020-21 were priced at more than $150,000 a year, so others have followed its lead. An entire industry has moved towards making products that are breathtakingly expensive.
Please, please, start thinking about reining in IPR.
I came across this article by chance while trying to track down some old papers on skin cancer. It was published in the Lancet in 1904. A few days ago I heard a story about a colleague’s problem in trying to order an ambulance for somebody having an MI.
It is remarkable how institutions can fail, and competence be something that now only exists in the past. This is not a difficult issue to solve. We no longer have a functioning health service. We have stepped back in time. But, hey, it only affects other people.
At an inquest held at Lambeth on Nov. 21st Mr. Troutbeck inquired into the death of a girl, named Alice Wood, aged 17 years, of Camborne-road, Southfields, who died in a laundry van as she was being removed to St. Thomas’s Hospital to undergo an operation for perforated gastric ulcer.
On Nov. 16th she complained of severe pain in the chest and about midnight Dr. E. A. Miller of Upper Richmond-road was called. He found the girl collapsed and, having diagnosed the condition, decided that an operation was her only chance. No cab could be obtained but at about 2 A.M. on Nov. 17th a laundry van was procured and in this she was driven to the hospital where on arrival she was found to be dead. Dr. L. Freyberger, who made the post-mortem examination, said that death was due to heart failure following acute peritonitis caused by the rupture of an internal ulcer and that it was accelerated by the jolting of the van. The coroner’s officer said that a horsed ambulance was kept at the Wandsworth Infirmary but that a relieving officer’s order was necessary for its use. The jury returned a verdict of “Death from natural causes,” and added a rider expressing the opinion that there was urgent need for an improvement in the system of providing horsed ambulances for the various metropolitan districts. With this we quite agree and we earnestly hope that the London County Council, which recently received a deputation from the Metropolitan Street Ambulance Association, will see its way to make proper provision.
Salve Lucrum: The Existential Threat of Greed in US Health Care | Health Care Economics, Insurance, Payment | JAMA | JAMA Network
In the mosaic floor of the opulent atrium of a house excavated at Pompeii is a slogan ironic for being buried under 16 feet of volcanic ash: Salve Lucrum, it reads, “Hail, Profit.” That mosaic would be a fitting decoration today in many of health care’s atria.
The grip of financial self-interest in US health care is becoming a stranglehold, with dangerous and pervasive consequences. No sector of US health care is immune from the immoderate pursuit of profit, neither drug companies, nor insurers, nor hospitals, nor investors, nor physician practices.
Avarice is manifest in mergers leading to market concentration, which, despite pleas of “economies of scale,” almost always raise costs.
Yep. Don Berwick on fine form.
China’s Covid patients face medical debt crisis as insurers refuse coverage | Financial Times
Echoing Rudolf Virchow: medicine and politics are frequent bedfellows. The spectrum includes the UK.
A doctor at Shanghai No 10 Hospital said staff had been instructed by the city’s health commission to limit Covid diagnoses. “We are advised to label most cases as respiratory infection,” the doctor said.
“What is certain is that the government can’t afford to treat everyone for free.”
China’s National Healthcare Security Administration said on Saturday that it would fully cover hospitalisation for Covid patients, but continued to exclude complications. Hospitals are also under pressure to reduce medical costs after the national insurance fund was strained by the costs of the sprawling zero-Covid apparatus.
In the eastern city of Hangzhou, Frank Wang, a marketing manager who bought a Covid insurance plan early last year, was refused proof of illness after he developed lung and kidney infections after testing positive for the virus.
“The hospital made it clear that Covid proof is not easy to obtain as the disease diagnosis has been politicised,” said Wang, who paid more than Rmb20,000 for treatment. “That makes patients like me a victim.”
The deserving and the undeserving sick redux; more crony capitalism.
Hugh Pennington | Deadly GAS · LRB 13 December 2022
Another terrific bit of writing by Hugh Pennington in the LRB. It is saturated with insights into a golden age of medical science.
Streptococcus pyogenes is also known as Lancefield Group A [GAS]. In the 1920s and 1930s, at the Rockefeller Institute in New York, Rebecca Lancefield discovered that streptococci could be grouped at species level by a surface polysaccharide, A, and that A strains could be subdivided by another surface antigen, the M protein.
Ronald Hare, a bacteriologist at Queen Charlotte’s Hospital in London, worked on GAS in the 1930s, a time when they regularly killed women who had just given birth and developed puerperal fever. He collaborated with Lancefield to prove that GAS was the killer. On 16 January 1936 he pricked himself with a sliver of glass contaminated with a GAS. After a day or two his survival was in doubt.
His boss, Leonard Colebrook, had started to evaluate Prontosil, a red dye made by I.G. Farben that prevented the death of mice infected with GAS. He gave it to Hare by IV infusion and by mouth. It turned him bright pink. He was visited in hospital by Alexander Fleming, a former colleague. Fleming said to Hare’s wife: ‘Hae ye said your prayers?’ But Hare made a full recovery.
Prontosil also saved women with puerperal fever. The effective component of the molecule wasn’t the dye, but another part of its structure, a sulphonamide. It made Hare redundant. The disease that he had been hired to study, funded by an annual grant from the Medical Research Council, was now on the way out. He moved to Canada where he pioneered influenza vaccines and set up a penicillin factory that produced its first vials on 20 May 1944. [emphasis added]
He returned to London after the war and in the early 1960s gave me a job at St Thomas’s Hospital Medical School. I wasn’t allowed to work on GAS. There wasn’t much left to discover about it in the lab using the techniques of the day, and penicillin was curative.
Getting too deeply into statistics is like trying to quench a thirst with salty water. The angst of facing mortality has no remedy in probability.
Paul Kalanithi, When Breath Becomes Air
Do you really want to live to be 100? | Financial Times
Sarah O’Connor in today’s FT
I’m one of life’s optimists. When I think about living to be 100 years old, I picture a birthday party where I am surrounded by my devoted descendants, perhaps followed by a commercial space flight as a celebratory treat.
But I’m in the minority here. A lot of people would rather be dead. In a recent UK poll by Ipsos, only 35 per cent of people said they wanted to become centenarians.
Henry Marsh is a retired neurosurgeon who has lived an interesting life and who writes with great insight about the NHS and medicine. The following are from an interview in the Guardian.
Marsh retired from the NHS at the age of 65, after growing disenchanted with bureaucratic managers and his reduced surgical schedule. “I just got more and more frustrated,” he says. “Which is very sad because I believe deeply in the NHS. I think straying away from a tax-funded system is a terrible mistake.”
The final straw was a meeting in which Marsh was threatened by a senior manager with disciplinary action for wearing a tie on ward rounds. “That was the end as far as I was concerned,” he says. “Being threatened with disciplinary action by a fellow doctor because I was wearing a tie! That was too much.” He is concerned about the long-term prospects for the health service. “There are a lot of unhappy doctors around,” he says.
Been there: done that. “By a fellow doctor” sticks.
A dog at the master’s gate predicts the ruin of the state.
Very good article by Clare Gerada. Julian Tudor Hart must be turning over in his grave.
‘In my 30 years as a GP, the profession has been horribly eroded’ | GPs | The Guardian
This last day was in many ways symptomatic of the changes I have seen over the course of 30 years. Today, with advances of medicines and technology, patients are living longer, often with three or even four serious long-term conditions, so having one patient with heart failure, chronic respiratory problems, dementia and previous stroke is not at all unusual, whereas 30 years ago the heart failure might have carried them off in their 60s. This makes every patient much more complex, and it can be much harder to manage them and to get the balance of treatments right.
Today, unlike 30 years ago, all patients are strangers and, as my catchment area now extends into different London boroughs, even the places I go are unfamiliar. Gone is the relationship between my community and me. Instead, I am part of a gig economy, as impersonal as the driver delivering a pizza. I ended the shift with a profound sense of loss and sadness.
I cannot help but think that as a society we have lost the ability to do many things that we once did moderately well. Things that often worked and were good.
Rana Foroohar in the FT
I have a dermatologist friend who recently sold his practice to a private equity firm, but couldn’t bear to stay on after because management forced him to cut the amount of time spent with patients in half, and focus more on scale and less on people…
Why does the idea of Leon Black or Stephen Schwarzman focusing on post-Covid health issues make me feel more depressed? Is healthcare going to become the new subprime, with surprise billing, crushing debt, and sub-par treatment? Our system is complicated and patchy as it is. But Peter, the larger issue is what I’d like your take on. Do you agree with folks who say that we’ve never left the great financial crisis? With debt at record levels, and the Federal Reserve about to raise rates significantly, where will you be looking for financial risk?
Paul Taylor LRB 2002
This blog by Chris Bulstrode was written well before Covid. The future needs to be different: it’s the doctors, stupid.
However, both of these issues pale into insignificance compared with the challenge of providing life-long job satisfaction in a career consisting of endless night and weekend shifts. The consequence is that if we don’t do something radical soon, I fear that the emergency department may collapse and bring the whole NHS system down with it. Sometimes, when I am not on duty, I muse about how our society as it is now will be seen in 100 years’ time. Children might be taught in history classes that in 1948 a small island off the west of Europe set up a revolutionary advance in civilization which was the talk of the rest of the world. It was called the NHS and provided free healthcare for all. My fear is that the next sentence from the teacher will be that it collapsed in the year 201..? [emphasis added]
Menthol cigarettes were first promoted to soothe the airways of “health conscious” smokers. Long used as an analgesic, menthol evokes a cooling sensation that masks the harshness of tobacco smoke. In the competition to capitalize on the growing menthol market, the industry’s marketing experts “carved up, segmented, and fractionated” the population, exploiting psychology and social attitudes to shape product preferences.
Stephen Sedley · A Decent Death · LRB 21 October 2021
A sharp pen from Stephen Sedley, a former appeal court judge, in the LRB.
Absurdly and cruelly, until the 1961 Suicide Act was passed it was a crime to kill yourself. While those who succeeded were beyond the law’s reach, those who tried and failed could be sent to jail. In the 1920s the home secretary had to release a Middlesbrough woman with fourteen children who had been given three months in prison for trying to kill herself. There is a Pythonesque sketch waiting to be written about a judge passing a sentence of imprisonment for attempted suicide: ‘Let this be a lesson to you and to any others who may be thinking of killing themselves.’ In fact, by the mid 19th century the law had got itself into such a tangle that a person injured in a failed attempt at suicide could be indicted for wounding with intent to kill, an offence for which Parliament had thoughtfully provided the death penalty.
But the repeated resort by doctrinal opponents of assisted dying to the need for safeguarding tends to be directed not to resolving any difficulties but to amplifying and complicating them to the point of obstruction – the kind of argument which, as Gore Vidal once put it, gives intellectual dishonesty a bad name.
After thirty-five years of teaching dermatology to medical students, I find that the GMC’s 2021 Medical Licensing Assessment (MLA) content map makes for dispiriting reading. The document states that it sets out the core knowledge expected of those entering UK practice. It doesn’t.
My complaint is not the self-serving wish of the specialist who feels that his subject deserves more attention — I would willingly remove much of what the GMC demand. Nor is it that the document elides basic clinical terminology such as acute and chronic (in dermatology, the term refers to morphology rather than just time). Nor, bizarrely, that it omits mention of those acute dermatoses with a case-fatality rate higher than that of stroke or myocardial infarction: bullous pemphigoid, pemphigus, and Stevens-Johnson syndrome/Toxic Epidermal Necrolysis are curiously absent. No, my frustrations lie with the fact that the approach taken by the GMC, whilst superficially attractive, reveals a lack of insight into, and knowledge of, medicine. The whole GMC perspective, based on a lack of domain expertise, is that somehow they can regulate anything. That somehow there is a formula for ‘how to regulate’. This week, medicine; next week, the Civil Aviation Authority. The world is not like that — well, it shouldn’t be.
Making a diagnosis can be considered a categorisation task in which you not only need to know about the positive features of the index diagnosis, but also those features of differential diagnoses that are absent in the index case (for Sherlock Holmes aficionados, the latter correspond to the ‘curious incident of the dog in the night’ issue). It is this characteristic that underpins all the traditional ‘compare and contrast’ questions, or the hallowed ‘list the differentials, and then strike them off one-by-one’.
Take melanoma, which the MLA content guide includes. Melanoma diagnosis requires accounting for both positive and negative features. For the negative features, you have to know about the diagnostic features of the common differentials that are not found in melanomas. This entails knowing something about the differentials, and, as the saying goes, if you can’t name them, you can’t see them. A back-of-the-envelope calculation: for every single case of melanoma there are a quarter of a million lesions, drawn from five to ten diagnostic classes, that are not melanomas. These include melanocytic nevi, solar lentigines, and seborrhoeic keratoses; these lesions are ubiquitous in any adult. But the MLA fails to mention them. What is a student to make of this? Do they need to learn about them or not? Or are they to be left with the impression that a pigmented lesion that has increased in size and changed colour is most likely a melanoma (answer: false)?
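The back-of-the-envelope point is just Bayes’ theorem at work, and it can be made concrete. A minimal sketch — all numbers are illustrative assumptions for the sake of the arithmetic, not real epidemiological figures — showing that even a very accurate diagnostician, faced with benign look-alikes outnumbering melanomas by orders of magnitude, will be wrong most of the time when they call “melanoma”:

```python
# Illustrative base-rate arithmetic for melanoma diagnosis.
# All numbers are assumptions for illustration, not real data.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(disease | positive call), by Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Suppose 1 melanoma per 250,000 pigmented lesions (the rough ratio above),
# and a diagnostician who is 95% sensitive and 95% specific per lesion.
prevalence = 1 / 250_000
ppv = positive_predictive_value(0.95, 0.95, prevalence)
print(f"P(melanoma | called melanoma) = {ppv:.6f}")
# roughly 1 in 13,000 positive calls would be a true melanoma
```

The numbers are made up, but the shape of the result is not: when the base rate is tiny, knowledge of the common differentials does almost all of the diagnostic work.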
Second, the guide essentially provides a list of nouns, with little in the way of modifiers. Students should know about ‘acute rashes’ and ‘chronic rashes’ — terms, I should say, that jar on the ear of any domain expert — but which conditions are we talking about, and exactly what about each of these conditions should students know?
In some domains of knowledge it is indeed possible to define an ability or skill succinctly. For instance, in mathematics, you might want students to be able to solve first-order differential equations. The competence is simply stated, and the examiner can choose from an almost infinite number of permutations. If we were to think about this in information theory terms, we would say we can highly compress in a faithful (lossless) way what we want students to know. But medicine is not like this.
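To make the contrast concrete: the mathematical competence mentioned above really does compress to a line or two. The standard integrating-factor result for a first-order linear ODE is given here purely as an illustration of a compactly specifiable skill:

```latex
% The whole competence compresses to one formula:
% given y'(x) + p(x)\,y(x) = q(x), the integrating factor
% \mu(x) = e^{\int p(x)\,dx} turns the equation into (\mu y)' = \mu q, so
y(x) = e^{-\int p(x)\,dx} \left( \int q(x)\, e^{\int p(x)\,dx}\, dx + C \right)
```

One line specifies the skill; the examiner can then generate endless concrete instances of p and q. There is no analogous one-line specification of ‘knows about psoriasis’.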
Take psoriasis as another example from the MLA. Once we move beyond expecting students to know how to spell the word, watch what happens as you try to define all those features of psoriasis you wish them to know about. By the time you have finished listing exactly what you want a student to know, you have essentially written the textbook chapter. We are unable to match the clever data-compression algorithms that generate MP3 files or compressed photographs. Medical texts do indeed contain lots of annoying details — no E = mc² for us — but it is these details that constitute domain expertise. But we can all agree that we can alter the chapter length as an explicit function of what we want students to know.
Once you move to a national syllabus (and for tests of professional competence, I am a fan) you need to replace what you have lost; namely, the far more explicit ‘local’ guides such as ‘read my lecture notes’ or ‘use this book but skip chapters x, y and z’ that students could once rely on. The most interesting question is whether this is now better done at the level of the individual medical school or, as for many non-medical professional qualifications, at the national level.
Finally, many years ago, Michael Power, in his book The Audit Society: Rituals of Verification, demolished the sort of thinking that characterises the whole GMC mindset. As the BMJ once said, there is little in British medicine that the GMC cannot make worse. Pity the poor students.
The following is from Pulse, a magazine aimed at GPs. My point is not so much about the specifics as about a more general issue.
Headache, runny nose and sore throat top three symptoms of Delta variant, says researcher – Pulse Today
Professor Spector said cases were rising exponentially and people who have only had one vaccine dose should not be complacent.
‘The UK really does now have a problem and we’ll probably be seeing, in a week, 20,000 cases and by 21st June well in excess of that number,’ he said. ‘Most of these infections are occurring in unvaccinated people. We’re only seeing slight increases in the vaccinated group and most of those in the single vaccinated group,’ he said.
He goes on to say:
Covid is also acting differently now. It’s more like a bad cold in this younger population and people don’t realise that and it hasn’t come across in any of the government information. This means that people might think they’ve got some sort of seasonal cold and they still go out to parties and might spread around to six other people and we think this is fuelling a lot of the problem.
The number one symptom is headache, followed by runny nose, sore throat and fever. Not the old classic symptoms. We don’t see loss of smell in the top ten any more, this variant seems to be working slightly differently.
He advised people:
who were feeling unwell to stay at home for a few days, use lateral flow tests with a confirmation PCR test if they get a positive result.
Now comes the boilerplate Orwellian response from the Department of Health and Social Care:
[A] spokesperson said: ‘Everyone in England, regardless of whether they are showing symptoms, can now access rapid testing twice a week for free, in line with clinical guidance.
Experts keep the symptoms of Covid-19 under constant review and anyone experiencing the key symptoms – a high temperature, a new continuous cough, or a loss or change to sense of smell or taste – should get a PCR test as soon as possible and immediately self-isolate along with their household.’ [emphasis added]
It is one of dermatology’s tedious and fun facts that, in contradistinction to, say, scabies or head lice, you treat body lice by treating not the patient (directly) but the clothing. The pharmacological agent is a washing machine. But the excerpt quoted below tells you something wonderful about science: you get things out that you never expected. Spontaneous generation — not in the real world, but in the world of ideas. Well, almost.
How clothing and climate change kickstarted agriculture | Aeon Essays
Scientific efforts to shed light on the prehistory of clothes have received an unexpected boost from another line of research, the study of clothing lice, or body lice. These blood-sucking insects make their home mainly on clothes and they evolved from head lice when people began to use clothes on a regular basis. Research teams in Germany and the United States analysed the genomes of head and clothing lice to estimate when the clothing parasites split from the head ones. One advantage of the lice research is that the results are independent from other sources of evidence about the origin of clothes, such as archaeology and palaeoclimatology. The German team, led by Mark Stoneking at the Max Planck Institute for Evolutionary Anthropology, came up with a date of 70,000 years ago, revised to 100,000 years ago, early in the last ice age. The US team led by David Reed at the University of Florida reported a similar date of around 80,000 years ago, and maybe as early as 170,000 years ago during the previous ice age. These findings from the lice research suggest that our habit of wearing clothes was established quite late in hominin evolution.
Infected blood scandal: government knew of contaminated plasma ‘long before it admitted it’ | Contaminated blood scandal | The Guardian
From the Guardian.
Among the victims of the contaminated blood scandal, which is the subject of a public inquiry, were 1,240 British haemophilia patients, most of whom have since died. They were infected with HIV in the 1980s through an untreated blood product known as Factor VIII.
In 1983, Ken Clarke, then a health minister, denied any threat was posed by Factor VIII. In one instance, on 14 November 1983, he told parliament: “There is no conclusive evidence that Aids is transmitted by blood products.”
However, documents discovered at the national archives by Jason Evans, whose father died after receiving contaminated blood and who founded the Factor 8 campaign, paint a contrasting picture.
In a letter dated 4 May 1983, Hugh Rossi, then a minister in the Department of Health and Social Security (DHSS), told a constituent: “It is an extremely worrying situation, particularly as I read in the weekend press that the disease is now being transmitted by blood plasma which has been imported from the United States.”
(HIV screening for all blood donated in the UK only began on 14 October 1985.)
Rossi’s letter was considered damaging enough for the government to seek to prevent its release in 1990 during legal action over the scandal, by which time Clarke was health secretary.
In another letter uncovered by Evans, dated 22 March 1990, a Department of Health official wrote to government lawyers saying it wanted to withhold Rossi’s letter, despite admitting the legal basis for doing so was “questionable”.
Clarke has a legal background. There is a large logical gap between denying ‘any threat’ and the statement that there is ‘no conclusive evidence’. The Department of Health would be better named the Department without Integrity. Recent events suggest things are no better now. It didn’t all start with Johnson.
Jonathan Flint · Testing Woes · LRB 6 May 2021
Terrific article from Jonathan Flint in the LRB. He is an English psychiatrist and geneticist (mouse models of behaviour) based at UCLA, but, like many, he has put his hand to other domains (beyond depression). He writes about Covid-19:
Perhaps the real problem is hubris. There have been so many things we thought we knew but didn’t. How many people reassured us Covid-19 would be just like flu? Or insisted that the only viable tests were naso-pharyngeal swabs, preferably administered by a trained clinician? Is that really the only way? After all, if Covid-19 is only detectable by sticking a piece of plastic practically into your brain, how can it be so infectious? We still don’t understand the dynamics of virus transmission. We still don’t know why around 80 per cent of transmissions are caused by just 10 per cent of cases, or why 2 per cent of individuals carry 90 per cent of the virus. If you live with someone diagnosed with Covid-19, the chances are that you won’t be infected (60 to 90 per cent of cohabitees don’t contract the virus). Yet in the right setting, a crowded bar for example, one person can infect scores of others. What makes a superspreader? How do we detect them? And what can we learn from the relatively low death rates in African countries, despite their meagre testing and limited access to hospitals?
That we are still scrambling to answer these questions is deeply worrying, not just because it shows we aren’t ready for the next pandemic. The virus has revealed the depth of our ignorance when it comes to the biology of genomes. I’ve written too many grant applications where I’ve stated confidently that we will be able to determine the function of a gene with a DNA sequence much bigger than that of Sars-CoV-2. If we can’t even work out how Sars-CoV-2 works, what chance do we have with the mammalian genome? Let’s hope none of my grant reviewers reads this.
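The superspreading pattern Flint cites (around 80 per cent of transmissions from 10 per cent of cases) is usually modelled as overdispersion in the offspring distribution. A minimal simulation sketch, assuming a negative binomial offspring distribution with a small dispersion parameter k — the parameter values here are illustrative, not fitted to any real outbreak:

```python
# Sketch of transmission overdispersion: with a negative binomial offspring
# distribution (mean R, small dispersion k), a minority of cases account for
# most onward transmission. Parameter values are illustrative assumptions.
import math
import random

random.seed(1)

def secondary_cases(mean, k):
    """One negative binomial draw, as a Poisson with a gamma-distributed rate."""
    rate = random.gammavariate(k, mean / k)   # individual infectiousness
    # Knuth's Poisson sampler (adequate for the modest rates drawn here)
    L, n, p = math.exp(-rate), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return n
        n += 1

cases = sorted((secondary_cases(mean=2.0, k=0.1) for _ in range(10_000)),
               reverse=True)
total = sum(cases)
share_top10 = sum(cases[:1_000]) / total
print(f"Top 10% of cases caused {100 * share_top10:.0f}% of transmissions")
```

With k this small, most simulated cases infect nobody at all while a handful infect dozens, which is exactly the household-versus-crowded-bar asymmetry Flint describes.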
Medicine is always messier than people want to imagine. It is a hotchpotch of kludges. For those who aspire to absolute clarity, it should be a relief that we manage effective action based on such a paucity of insight. Cheap body-hacks sometimes work. But the worry remains.
Wonderful piece by Janan Ganesh in the FT on the life choices made by young bankers and corporate lawyers, and the crazy (work) demands placed on them. I was surprised that he also has junior doctors in his sights.
Yes, the graduates knew the deal when they joined, but the appeal to free will is an argument against almost any labour standards whatever. Nine-year-old Victorian chimney sweeps knew the deal. As for all the talk of character-forging, of battle-hardening: to what end, exactly? The point of a corporate career arc is that work becomes more strategic, less frenzied over time. The early hazing should not be passed off as a kind of Spartan agoge.
The ageing process — as I have lived it, as I have observed it in friends — has convinced me of one thing above all. The deferral of gratification is the easiest life mistake to make. And by definition among the least reversible. A unit of leisure is not worth nearly as much in late or even middle age as it is in one’s twenties. To put it in Goldman-ese, the young should discount the future more sharply than prevailing sentiment suggests.
The first reason should be obvious enough, at least after the past 12 months. There is no certainty at all of being around to savour any hard-won spoils. The career logic of an investment banker (or commercial lawyer, or junior doctor) assumes a normal lifespan, or thereabouts. And even if a much-shortened one is an actuarial improbability, a sheer physical drop-off in the mid-thirties is near-certain. Drink, sex and travel are among the pleasures that call on energies that peak exactly as graduate bankers are wasting them on work.
I don’t know enough to be confident about clinical medicine but I do often wonder how things will look in a decade or so. Many junior medical jobs are awful, the ties and bonds between the beginning, middle and end of medical careers sundered. Many drop out of training, some treading water in warmer climes, but with what proportion returning? Some — a small percentage perhaps — move into other jobs, and the few I know who have done this, I would rate among the best of their cohort. Of those who stick to the straight and narrow, many now wish to work less than full time, although whether this survives the costs of parenthood, I do not know. At the other end all is clear: many get out as soon as they can, the fun long gone, and the fear of more pension changes casting an ever larger shadow, before the final shadow descends.
Medicine remains — in one sense — a narrow and technical career. The training is long, and full confidence in one’s own skills may take twenty years or more to mature. By that time, it is hard to change course. This personal investment in what is a craft, leaves one vulnerable to all those around you who believe success is all about line-managing others and generic skills.
I am unsure how conscious (or how rational) many decisions about careers are, but there may well be an invisible hand at play here, too. I imagine we may see less personal investment in medical careers than we once did. It’s no longer a vocation, just a job, albeit an important one. Less than comforting words, I know — especially if you are ill.
For some reason — COVID of course — I keep coming back to perhaps the greatest vaccine success ever: the eradication of smallpox (here, here and here). But the figures quoted by Scott Galloway make you sit up and notice both the magic — and value — of science.
Values in America – Scott Galloway on recasting American individualism and institutions | By Invitation | The Economist
International bodies are immolated. Consider the World Health Organisation. Mr Trump’s decision to pull America out of the WHO in the midst of a pandemic (reversed under President Joe Biden) was galling, particularly as the WHO is responsible for one of humanity’s greatest public-health accomplishments: the eradication of smallpox in the 1970s. To appreciate the magnitude of this, Google images of “smallpox” and glimpse the horror that once killed millions each year. It was a victory for co-operative, state-funded projects and it cost a mere $300m. By one estimate, America, the largest contributor, recoups that value every 26 days from savings in vaccinations, lost work and other costs. [emphasis added]
Or not, as the case may be.
Smallpox is the greatest success story in the history of medicine. It once took huge numbers of lives — as many as half a billion people in the 20th century alone — and blinded and disfigured many more.
So writes the distinguished historian of science, Steven Shapin, in the LRB (A Pox on the Poor, February 4, 2021). He is reviewing The Great Inoculator: The Untold Story of Daniel Sutton and His Medical Revolution.
In historical times you had a one in three chance of getting smallpox, and, if you got it, the case-fatality was 20%. Some outbreaks, however, had a case-fatality of 50% and, unlike Covid-19, its preferred targets were children.
My exposure to smallpox was (thankfully) limited. My mother told me that there was an outbreak in South Wales and the West of England when I was around five or six. There was widespread vaccination, but kids like me, with bad eczema, were spared, with parents advised to ‘shield’ the child indoors (my sympathies go to my mother). (The risk was from the grandly named, but sometimes fatal, Kaposi’s varicelliform reaction, which was due to the vaccinia virus — not smallpox — running riot on the diseased skin.)
As a med student, I remember learning how to distinguish the cutaneous lesions of smallpox from chicken pox. Smallpox was declared eradicated in 1980, but, as a dermatology registrar, seeing the occasional adult with chickenpox who seemed so ill (in comparison with kids), I often had doubts that I had to reason away. Perhaps those stores held by the US and USSR were not so secure…
Jenner and smallpox vaccination go together in popular accounts, but the history of this particular clinical discovery is much older and richer — at least to me.
As ever, in medicine, and in particular for anything involving the skin, the terminology is confusing. The Latin name for cowpox is Variolae vaccinae, meaning the pox from the cow (vacca). It was Pasteur who subsequently honoured Jenner by deciding that all forms of inoculation be called vaccination.
Edward Jenner took advantage of the already-known fact that whilst milkmaids tended to be afflicted with the far milder cowpox virus, they rarely suffered from the more serious smallpox (the two are different, but related, viruses). In 1796, Jenner inoculated an eight-year-old boy with the pus from a milkmaid’s cowpox sore. The boy appeared immune: he did not suffer adversely when subsequently exposed to smallpox material.
Once Jenner’s method was accepted as safe, Acts of Parliament introduced free vaccination in 1840, and vaccination became obligatory in 1853.
I had never been quite certain of the distinction between inoculation and vaccination, but there is history here too. Shapin writes that the term inoculation was borrowed from horticulture — the grafting of a bud (or ‘eye’) to propagate a plant (something I was taught how to do in horticulture lessons when I was aged 11, in school in Cardiff, by Brother Luke, who, I thought, was so old he might have been a contemporary of Jenner). Why the name is apt is explained below.
Before vaccination, inoculation actually meant giving you a direct form of smallpox (this was also referred to as variolation, after variola, the term for smallpox). The source material, again, was from a lesion of somebody with smallpox. The recipient, it was hoped, would develop a milder version of smallpox. Shapin writes:
The contract with the inoculator was to accept a milder form of the disease, and a lower chance of death, in exchange for a future secured from the naturally occurring disease, which carried a high chance of killing or disfiguring.
Shapin tells us that Lady Mary Wortley Montagu, when holed up with her husband in Constantinople in 1717, heard stories about how such ‘in-grafting’ was in widespread use by the ‘locals’. She was scarred from smallpox, and therefore she had the procedure carried out on her then five-year-old son. The needle was blunt and rusty, but her son suffered just a few spots and the procedure was judged a success. He was now immune to smallpox.
Not surprisingly, the story goes back further: inoculation was folk medicine practice in Pembrokeshire as early as 1600, and the Chinese had been blowing dried, ground-up smallpox material up the nose for many centuries.
The London medical establishment were apparently not too impressed with the non-British origin of such a scientific advance, nor its apparent simplicity (and hence low cost). So they made the procedure much more complicated, with specific diets being required, along with advice on behaviour, and, of course, blood-lettings and laxatives, all in addition to not just a ‘prick’ but a bigger incision (payment by the inch). The physicians’ ‘fees’ no doubt rose in parallel. Not a bad business model, until…
The London physicians’ ‘add-ons’ put the treatment costs of inoculation out of reach of most of the population, restricting it, for decades, to the ‘medical carriage trade’. Along comes Daniel Sutton, a provincial barber-surgeon, with no Latin or Greek, no doubt, who effectively industrialised the whole process, making it both more profitable and cheaper for the customer.
Based in a village near Chelmsford, he inoculated tens of thousands locally, and the method was named after him: the Suttonian Method. On one day he inoculated 700 persons. Incisions (favoured by the physicians) were replaced by the simpler prick, and patients were not confined, but instead told to go out in the fresh air (day-case, anybody?). Product differentiation was of course possible: spa-like pampering in local accommodation was available for the top end of the market, with a set of sliding fees depending on the degree of luxury.
Splitting the market and niche pricing were aspects of Sutton’s business success, but so too was control of supply. The extended Sutton clan could satisfy a significant chunk of provincial demand, but Daniel also worked out a franchising system, which ‘authorised’ more than fifty partners throughout Britain and abroad to advertise their use of the ‘Suttonian System’ — provided they paid fees for Sutton-compounded purgatives, kicked back a slice of their take, and kept the trade secrets. Control was especially important, since practically anyone could, and did, set up as an inoculator. The Suttons themselves had become surgeons through apprenticeship, but apothecaries, clergymen, artisans and farmers were inoculating, and sometimes parents inoculated their own children. The profits of the provincial press were considerably boosted as practitioners advertised their skills at inoculation and their keen prices. Daniel went after competitors — including his own father-in-law and a younger brother — with vigour, putting it about that the Suttonian Method depended on a set of closely held secrets, to which only he and his approved partners had access. His competitors sought to winkle out these secrets, occasionally pouncing on Sutton’s patients to quiz them about their experiences.
Sutton admitted that he had ‘lost’ five out of forty thousand patients (due to smallpox). He offered a prize of 100 guineas to anybody who could prove he had ever lost a patient due to inoculation, or that any patient got smallpox a second time. More confident, and perhaps more generous, than modern Pharma, I think. By 1764, Sutton had an annual income of 6300 guineas — over one million sterling in today’s money.
Metabolic surgery versus conventional therapy in type 2 diabetes – The Lancet
I like the parlour game of inventing collective nouns for doctors — a ganglion of neurologists, a scab of dermatologists, and so on — and I also cannot help but smile when Mr Butcher turns out to be, well, a surgeon, and Lord Brain is a…. You get it.
I saw this article when I was skimming through the Lancet the other week, and something tweaked in my mind from long-back.
Metabolic surgery versus conventional therapy in type 2 diabetes. Alexander D Miras & Carel W le Roux
A few more details: “…report their trial in which they randomly assigned patients to metabolic surgery or medical therapy for type 2 diabetes. 60 white European patients (32 [53%] women) were evaluated 10 years after Roux-en-Y gastric bypass (RYGB), biliopancreatic diversion (BPD), or conventional medical therapy.”
Now, I suspect the name Roux is not rare, but, after checking with Wikipedia, I can remind you:
César Roux (23 March 1857, in Mont-la-Ville – 21 December 1934, in Lausanne) was a Swiss surgeon, who described the Roux-en-Y procedure.
No relation to Ross-on-Wye.
I am probably biased as my mother was Irish, one of a large O’Mahony clan who were born in or around Cork. She moved to Dublin in her early teens, crossed the water in 1941, and, a few years later, after meeting a Welshman from Neath, set up home together in Cardiff. Cardiff had a long-established Irish community, one that was good at replenishing itself with fresh blood from the motherland, in tune with the rhythms of economic cycles. I was sent to a Christian Brothers’ school where many of the brothers were Irish exports. In junior school, at least, I can remember having some green plant pinned to my lapels on March 17. I didn’t stand out, many of the other kids were similarly tattooed. If I count my extended family — including my brother — they mainly reside across the water.
Without any grand theory at hand I have always thought there must be something special about schooling in Ireland, even if the supposed benefits are not intentional, nor shared equally. Historically, there are many bad things; this is well known. But if I cast an eye over medics who I judge to write deeply about medicine, there is a disproportionate number of Irish doctors.
Anthony Clare was the first example I came across. I was a wannabe psychiatrist when I was a medical student in Newcastle, and I spent undoubtedly the most enjoyable three months of my undergraduate medical course, in Edinburgh, on an elective on the unit headed by Prof Bob Kendell. For most of this time, since it was summer, there were no Edinburgh students, and so I was viewed by the staff as useful. One of the tragedies of being a medical student nowadays is that you don’t feel useful simply because for most of the time you are not useful. The thousand-year-old laws that guide apprenticeship learning have not so much been forgotten but well and truly trashed.
Clare wrote a wonderful book Psychiatry in Dissent when he was still a Senior Registrar at the Maudsley. Despite the title, it was a calm and measured critique of medicine and psychiatry. Reading it felt dangerous, but more than that, it was the voice of quiet reason and a call to arms. It stands as an example of the difference between a medical education and a medical training. The GMC don’t do the former, nor does the NHS.
Another Irish psychiatrist whose writings I have admired is David Healy. Healy is now in Canada and, as far as I can see, has been blackballed by the UK medical and psychiatric establishment. His three-volume history of Psychopharmacology (The Psychopharmacologists) is a masterpiece. Sam Shuster was the first person to introduce me — over coffee— to how many of the revolutionary drugs that changed medicine in the middle of the last century were developed, but Healy’s scholarship cast it in a larger and richer framework. Healy has done lots of other things, too, and possesses a well of moral courage that puts to shame most of the so-called leaders of the profession.
James McCormick was a professor of general practice at Trinity, in Dublin. I first came across him when he contributed a chapter to Bruce Charlton’s and Robert Downie’s book on medical education (The Making of a Doctor: Medical Education in Theory and Practice). I have only recently returned to this book, but reading his chapter is disturbing because it makes you shocked that you could ever have been taken in by the pabulum of the modern world of ‘primary care’ and its apologists. The late and singular Petr Skrabanek found his academic home with McCormick in Trinity. Petr was on holiday in Ireland when the Russian tanks rolled into his hometown of Prague in 1968. Yes, not Irish, but if you read his writings about medicine (check out Wikipedia), and have dipped into Havel and Flann O’Brien, you see he is of that place.
Seamus O’Mahony, a ‘Cork-man’, who used to work in Edinburgh before returning to Ireland, has published two books about medicine. The first — which I haven’t read — is named The Way We Die Now. His more recent book, published in 2019, is titled Can Medicine Be Cured? The full title is Can Medicine Be Cured?: The Corruption of a Profession. You get the message, and the jury didn’t take long to realise which clause required an affirmative verdict.
The book covers a lot of ground, yet the pages fly by. There are chapters on how much medical research is dysfunctional — when it is not criminal; on the corruption of the academy; and on the oft-hidden problems of combining the practice of science and medicine. He talks about pharma (of course), the invention of disease (there isn’t enough money in treating the sick, we need to treat the non-sick), the McNamara fallacy (data, just data, dear boy), and the never-ending bloody ‘War on Cancer’. He picks apart the lazy confusion between needs, wants, and consumerism, and highlights the fact that the directionless NHS is run by politicians who want to do everything — except politics: they just want to be loved. Meanwhile, the medics, tired of the ever-faddish evidence-based medicine, have turned to sentimentality-based medicine, allowing ‘empathy’ and superstition to ride roughshod over the ability to reason about the natural world, and about their patients. Among doctors and medical students, a facile sense of feeling good about yourself has overtaken technical mastery, allowing all to wallow in kumbaya around the campfire they now pretend to share with their charges. Not so. If doctors were once labelled priests with a stethoscope, we have cast our tool away.
O’Mahony writes well. I particularly liked his metaphors that are familiar to anybody brought up in Catholicism even if they left the bus before it (and they) returned to the terminus. A few examples:
The decadence of contemporary biomedical science has a historic parallel in the mediaeval pre-Reformation papacy. Both began with high ideals. Both were taken over by careerists who corrupted these ideals, while simultaneously paying lip-service to them. Both saw the trappings of worldly success as more important than the original ideal. Both created a self-serving high priesthood. The agenda for the profession is set by an academic elite (the hierarchy of bishops and cardinals), while all the day-to-day work is done by low-status GPs and hospital doctors (curates, monks). This elite, despite having little to do with actual patient care, is immensely powerful in the appointment of doctors.
The Czech polymath and contrarian Petr Skrabanek (1940–94) taught these skills at Trinity College Dublin medical school during the 1980s and early 1990s, and lamented that “my course on the critical appraisal of evidence for medical students can be compared to a course on miracles by a Humean sceptic for prospective priests in a theological seminary”.
And on attending a consensus conference of medical experts in Salerno (you only have a consensus when there bloody-well isn’t any…).
I found a picture online of the experts gathered at Salerno, and was reminded of a fresco in the Sistine chapel depicting the first Council of Nicea in A.D. 325. The council was convened by the Emperor Constantine to establish doctrinal orthodoxy within the early Christian Church. The industry-sponsored get-together in Salerno had similar aims.… The aim is expansionist: the establishment of a new disease by consensus statement, the Big Science equivalent of a papal bull. Non-coeliac gluten sensitivity has been decreed by this edict, just as papal infallibility was decreed by the first Vatican Council in 1870.
As for the sick and needy
Meanwhile, the sick languish. The population are subjected to more and more screening programs (for breast cancer, cervical cancer, colon cancer, high blood pressure, cholesterol levels etc.), but if they become acutely ill and need to go to hospital, it is likely that they will spend hours on a trolley in an emergency department. When they are finally admitted to a ward, it is often chaotic, squalid and understaffed. Hospices have to rely on charity just to keep going, and have so few beds that ten times as many people die in general hospitals as in hospices.
And, as for David Cameron (lol!) and his plans to fund cancer drugs that were rejected by NICE, well, he was a nice (not NICE) guy, and he was on the side of the people. O’Mahony points out that this money alone would have funded all UK hospices for over a year.
Populism doesn’t cure cancer, but it trumps justice, evidence and fairness every time.
Along with Henry Marsh’s Do No Harm, O’Mahony’s book deserves to become a classic. Buy it and read it. Just don’t turn it into a multiple-choice exam. A brave medical school might even add it to the freshers’ pack — well, I can still dream.
The photo, facing towards Kerry, is from Penglas, a mainly Welsh hamlet in West Cork.
UK COVID-19 public inquiry needed to learn lessons and save lives – The Lancet
It is hard not to be moved, or angered, on reading the editorial in this week’s Lancet, written by three members of the Covid-19 Bereaved Families for Justice group.
The UK Prime Minister Boris Johnson has previously suggested that an immediate public inquiry into the government’s handling of COVID-19 would be a distraction or diversion of resources in the fight against COVID-19. We have long proposed that quite the opposite is true: an effective rapid review phase would be an essential element in combating COVID-19.
An independent and judge-led statutory public inquiry with a swift interim review would yield lessons that can be applied immediately and help prevent deaths in this tough winter period in the UK. Such a rapid review would help to minimise further loss of life now and in the event of future pandemics. In the wake of the Hillsborough football stadium disaster on April 15, 1989, for example, the Inquiry of Lord Justice Taylor delivered interim findings within 11 weeks, allowing life-saving measures to be introduced in stadiums ahead of the next football season.
I will quote Max Hastings, a former editor of the Daily Telegraph and Evening Standard, and a distinguished military historian, writing in the Guardian many years ago. He was describing how he had overruled some of his own journalists who had suspected Peter Mandelson of telling lies.
I say this with regret. I am more instinctively supportive of institutions, less iconoclastic, than most of the people who write for the Guardian, never mind read it. I am a small “c” conservative, who started out as a newspaper editor 18 years ago much influenced by a remark Robin Day once made to me: “Even when I am giving politicians a hard time on camera,” he said, “I try to remember that they are trying to do something very difficult – govern the country.” Yet over the years that followed, I came to believe that for working journalists the late Nicholas Tomalin’s words, offered before I took off for Vietnam for the first time back in 1970, are more relevant: “they lie”, he said. “Never forget that they lie, they lie, they lie.” Max Hastings
Two of Hastings’s journalists at the Evening Standard were investigating the funds Peter Mandelson had used to purchase a house.
One morning, Peter Mandelson rang me at the Evening Standard. “Some of your journalists are investigating my house purchase,” he said. “It really is nonsense. There’s no story about where I got the funds. I’m buying the house with family money.”
I knew nothing about any of this, but went out on the newsroom floor and asked some questions. Two of our writers were indeed probing Mandelson’s house purchase. Forget it, I said. Mandelson assures me there is no story. Our journalists remonstrated: I was mad to believe a word Mandelson said. I responded: “Any politician who makes a private call to an editor has a right to be believed until he is proved a liar.” We dropped the story.
Several months later
…when the Mandelson story hit the headlines, I faced a reproachful morning editorial conference. A few minutes later, the secretary of state for industry called. “What do I have to do to convince you I’m not a crook?” he said.
I answered: “Your problem, Peter, is not to convince me that you are not a crook, but that you are not a liar.”
The default, and most sensible course of action, is to assume that the government and many of those who answer directly to the government have lied and will continue to lie.
Where Health Care Is a Human Right | by Nathan Whitlock | The New York Review of Books
An article discussing Canadian health care with echoes of the UK’s own parochial attitude to health care (and don’t mention Holland, Germany, France, Switzerland…).
How do such gaps and problems persist? Part of the problem, ironically, is the system’s high approval ratings: with such enthusiasm for the existing system, and with responsibility for it shared between federal and provincial or territorial governments, it’s easy for officials to avoid making necessary changes. Picard sees our narrowness of perspective as a big obstacle to reform: “Canadians are also incredibly tolerant of mediocrity because they fear that the alternative to what we have is the evil US system.” Philpott agrees that Canadians’ tendency to judge our system solely against that of the United States can be counterproductive. “If you always compare yourself to the people who pay the most per capita and get some of the worst outcomes,” she told me in a recent Zoom call, “then you’re not looking at the fact that there are a dozen other countries that pay less per capita and have far better outcomes than we do.”
The Holy See is thus viewed as the central government of the Catholic Church. The Catholic Church, in turn, is the largest non-government provider of education and health care in the world. The diplomatic status of the Holy See facilitates the access of its vast international network of charities.[emphasis added]
There is a famous quote (I don’t have a primary source) by the great Rudolf Virchow:
“Medicine is a social science, and politics is nothing more than medicine on a large scale.”
I know what Virchow was getting at, but if only.
I think the quip was from the series Cardiac Arrest: the ITU used to be called the ICU (intensive care unit) until they realised nobody did.
In March, 2019, a doctor informed 78-year-old Ernest Quintana, an inpatient at a hospital in California, USA, that he was going to die. His ravaged lungs could not survive his latest exacerbation of chronic obstructive pulmonary disease, so he would be placed on a morphine drip until, in the next few days, he would inevitably perish. There was a twist. A robot had delivered the bombshell. There, on a portable machine bearing a video screen, crackled the pixelated image of a distant practitioner who had just used cutting-edge technology to give, of all things, a terminal diagnosis. The hospital insisted that earlier conversations with medical staff had occurred in person, but as Mr Quintana’s daughter put it: “I just don’t think that critically ill patients should see a screen”, she said. “It should be a human being with compassion.”
According to a helpful app on my phone that I like to think acts as a brake on my sloth, I retired 313 days ago. One of the reasons I retired was so that I could get some serious work done; I increasingly felt that professional academic life was incompatible with the sort of academic life I signed up for. If you read my previous post, you will see this was not the only reason, but since I have always been more of an academic than clinician, my argument still stands.
Over twenty years ago, my friend and former colleague, Bruce Charlton, observed wryly that academics felt embarrassed — as though they had been caught taking a sly drag round the back of the respiratory ward — if they were surprised in their office and found only to be reading. No grant applications open; no Gantt charts being followed; no QA assessments being written. Whatever next.
I thought about retirement from two frames of reference. The first, was about finding reasons to leave. After all, until I was about 50, I never imagined that I would want to retire. I should therefore be thrilled that I need not be forced out at the old mandatory age of 65. The second, was about finding reasons to stay, or better still, ‘why keep going to work?’. Imagine you had a modest private income (aka a pension), what would belonging to an institution as a paid employee offer beyond that achievable as a private scholar or an emeritus professor? Forget sunk cost, why bother to move from my study?
Many answers straddle both frames of reference, and will be familiar to those within the universities as well as to others outwith them. Indeed, there is a whole new genre of blogging about the problems of academia, and employment prospects within it (see alt-ac or quit-lit for examples). Sadly, many posts are from those who are desperate to the point of infatuation to enter the academy, but where the love is not reciprocated. There are plenty more fish in the sea, as my late mother always advised. But looking back, I cannot help but feel some sadness at the changing wheels of fortune for those who seek the cloister. I think it is an honourable profession.
Many, if not most, universities are very different places to work in from those of the 1980s when I started work within the quad. They are much larger, they are more corporatised and hierarchical and, in a really profound sense, they are no longer communities of scholars or places that cherish scholarly reason. I began to feel much more like an employee than I ever used to, and yes, that bloody term, line manager, got ever more common. I began to find it harder and harder to characterise universities as academic institutions, although from my limited knowledge, in the UK at least, Oxbridge still manage better than most 1. Yes, universities deliver teaching (just as Amazon or DHL deliver content), and yes, some great research is undertaken in universities (easy KPIs, there), but their modus operandi is not that of a corpus of scholars and students, but rather increasingly bends to the ethos of many modern corporations that self-evidently are failing society. Succinctly put, universities have lost their faith in the primacy of reason and truth, and failed to wrestle sufficiently with the constraints such a faith places on action — and on the bottom line.
Derek Bok, one of Harvard’s most successful recent Presidents, wrote words to the effect that universities appear to always choose institutional survival over morality. There is an externality to this, which society ends up paying. Wissenschaft als Beruf is no longer in the job descriptions or the mission statements2.
A few years back, via a circuitous friendship, I attended a graduation ceremony at what is widely considered one of the UK’s finest city universities3. This friend’s son was graduating with a Masters. All the pomp was rolled out and I, and the others present, were given an example of hawking worthy of an East End barrow boy (‘world-beating’ blah blah…). Pure selling, with the market being overseas students: please spread the word. I felt ashamed for the Pro Vice Chancellor, who knew much of what he said was untrue. There is an adage that being an intellectual presupposes a certain attitude to the idea of truth, rather than a contract of employment; that intellectuals should aspire to be protectors of integrity. It is not possible to choose one belief system one day, and act on another the next.
The charge sheet is long. Universities have fed off cheap money — tax subsidised student loans — with promises about social mobility that their own academics have shown to be untrue. The Russell group, in particular, traducing what Humboldt said about the relation between teaching and research, have sought to diminish teaching in order to subsidise research, or, alternatively, claimed a phoney relation between the two. As for the “student experience”, as one seller of bespoke essays argued4, his business model depended on the fact that in many universities no member of staff could recognise the essay style of a particular student. Compare that with tuition in the sixth form. Universities have grown more and more impersonal, and yet claimed a model of enlightenment that depends on personal tuition. Humboldt did indeed say something about this:
“[the] goals of science and scholarship are worked towards most effectively through the synthesis of the teacher’s and the students’ dispositions”.
As the years have passed by, it has seemed to me that universities are playing intellectual whack-a-mole, rather than re-examining their foundational beliefs in the light of what they offer and what others may offer better. In the age of Trump and mini-Trump, more than ever, we need that which universities once nurtured and protected. It’s just that they don’t need to do everything, nor are they for everybody, nor are they suited to solving all of humankind’s problems. As has been said before, ask any bloody question and the universal answer is ‘education, education, education’. It isn’t.
That is a longer (and more cathartic) answer to my questions than I had intended. I have chosen not to describe the awful position that most UK universities have found themselves in at the hands of hostile politicians, nor the general cultural assault by the media and others on learning, rigour and nuance. The stench of money is the accelerant of what seeks to destroy our once-modern world. And for the record, I have never had any interest in, or facility for, management beyond that required to run a small research group, and teaching in my own discipline. I don’t doubt that if I had been in charge the situation would have been far worse.
Sydney Brenner, one of the handful of scientists who made the revolution in biology of the second half of the 20th century, once said words to the effect that scientists no longer read papers, they just Xerox them. The problem he was alluding to was the ever-increasing size of the scientific literature. I was fairly disciplined in the age of photocopying, but with the world of online PDFs I too began to sink. Year after year, this reading debt has increased, and not just with ‘papers’ but with monographs and books too. Many years ago, in parallel with what occupied much of my time — skin cancer biology and the genetics of pigmentation, and computerised skin cancer diagnostic systems — I had started to write about topics related to science and medicine that gradually bugged me more and more. It was an itch I felt compelled to scratch. I wrote a paper in the Lancet on the nature of patents in clinical medicine and the effect intellectual property rights had on the patterns of clinical discovery; several papers on the nature of clinical discovery and the relations between biology and medicine in Science and elsewhere. I also wrote about why you cannot use “spreadsheets to measure suffering” and why there is no universal calculus of suffering or dis-ease for skin disease (here and here); and several papers on the misuse of statistics and evidence by the evidence-based-medicine cult (here and here). Finally, I ventured some thoughts on the industrialisation of medicine, and the relation between teaching and learning, industry, and clinical practice (here), as well as the nature of clinical medicine and clinical academia (here and here). I got invited to the NIH and to a couple of AAAS meetings to talk about some of these topics. But there was no interest on this side of the pond. It is fair to say that the world was not overwhelmed with my efforts.
At one level, most academic careers end in failure, or at least they should if we are doing things right. Some colleagues thought I was losing my marbles, some viewed me as a closet philosopher who was now out, and partying wildly, and some, I suspect, expressed pity for my state. Closer to home — with one notable exception — the work was treated with what I call the Petit-mal phenomenon — there is a brief pause or ‘silence’ in the conversation, before normal life returns after this ‘absence’, with no apparent memory of the offending event. After all, nobody would enter such papers for the RAE/REF — they weren’t science with data and results, and since of course they weren’t supported by external funding, they were considered worthless. Pace Brenner, in terms of research assessment you don’t really need to read papers, just look at the impact factor and the amount and source of funding: sexy, or not?5
You have to continually check-in with your own personal lodestar; dead-reckoning over the course of a career is not wise. I thought there was some merit in what I had written, but I didn’t think I had gone deep enough into the problems I kept seeing all around me (an occupational hazard of a skin biologist, you might say). Lack of time was one issue, another was that I had little experience of the sorts of research methods I needed. The two problems are not totally unrelated; the day-job kept getting in the way.