Late night thoughts

Not leaving your kids alone

UN aid chief Martin Griffiths: ‘The war in Gaza isn’t halfway through’

At least 136 UN workers have been killed; staff bring their children to work, so they might survive or at least die together.

And what to do?

“We have to get much better at pitching into people’s souls.”

This is a new civilization…

Vaclav Smil on the Need to Abandon Growth

Speaking as an old-fashioned scientist, I think the message is kind of a primitive and, again, old-fashioned message. This is a finite planet. There is a finite amount of energy. There is finite efficiency of converting it by animals and crops. And there are certain sensitivities in terms of biogeochemical cycles, which will tolerate only that much. I mean, that should be obvious to anybody who’s ever taken some kind of kindergarten biology.

Unfortunately, this is a society where nobody’s taking kindergarten biology because everybody’s studying what’s communications, writing in code, economics, business administration, liaising the state office, and things like that. This is a new civilization we have. People are totally detached from reality. If you are attached, at least a bit, to reality, all of this is common sense.

End of an error

The SVB debacle has exposed the hypocrisy of Silicon Valley | John Naughton | The Guardian

The first thing to understand is that “Silicon Valley” is actually a reality-distortion field inhabited by people who inhale their own fumes and believe they’re living through Renaissance 2.0, with Palo Alto as the new Florence. The prevailing religion is founder worship, and its elders live on Sand Hill Road in Menlo Park and are called venture capitalists. These elders decide who is to be elevated to the privileged caste of “founders”.

Error? Era? Hope so.

On retirement

I started retirement (of a sort — but isn’t it always like that?) three years ago today, on the first day of the Gaelic spring. People feign surprise when I say I have never regretted the change. I am busy; I have discovered that even without going to work, there is still not enough time in the day, and most frustratingly of all, that creative work is still really hard. But progress and the learnings (a usage for which I used to chastise my daughter) continue. Old dog, new tricks, Lizzy.

TARA over TINA

I hadn’t come across the acronym TARA before, but it seems a hopeful thought for the New Year. Life is indeed more interesting with it set as the default.

TARA: There are real alternatives

TINA: There is no alternative

(I have forgotten the source — apologies)

Kind words: an obituary

Doc Searls Weblog · Remembering Kim Cameron

We all get our closing parentheses. I’ve gone longer without closing mine than Kim did before closing his. That also makes me sad, not that I’m in a hurry. Being old means knowing you’re in the exit line, but okay with others cutting in. I just wish this time it wasn’t Kim.

Britt Blaser says life is like a loaf of bread. It’s one loaf no matter how many slices are in it. Some people get a few slices, others many. For the sake of us all, I wish Kim had more.

I am reminded of what a friend said of Amos Tversky, another genius of seemingly boundless vitality who died too soon: “Death is unrepresentative of him.”

Via John Naughton

Retirement and the Curse of Lord Acton

by reestheskin on 10/12/2020


According to a helpful app on my phone that I like to think acts as a brake on my sloth, I retired 313 days ago. One of the reasons I retired was so that I could get some serious work done; I increasingly felt that professional academic life was incompatible with the sort of academic life I signed up for. If you read my previous post, you will see this was not the only reason, but since I have always been more of an academic than a clinician, my argument still stands.

Over twenty years ago, my friend and former colleague, Bruce Charlton, observed wryly that academics felt embarrassed — as though they had been caught taking a sly drag round the back of the respiratory ward — if they were surprised in their office and found only to be reading. No grant applications open; no Gantt charts being followed; no QA assessments being written. Whatever next.

I thought about retirement from two frames of reference. The first was about finding reasons to leave. After all, until I was about 50, I never imagined that I would want to retire. I should therefore be thrilled that I need not be forced out at the old mandatory age of 65. The second was about finding reasons to stay, or better still, ‘why keep going to work?’. Imagine you had a modest private income (aka a pension): what would belonging to an institution as a paid employee offer beyond that achievable as a private scholar or an emeritus professor? Forget sunk costs — why bother to move from my study?

Many answers straddle both frames of reference, and will be familiar to those within the universities as well as to others outwith them. Indeed, there is a whole new genre of blogging about the problems of academia, and employment prospects within it (see alt-ac or quit-lit for examples). Sadly, many posts are from those who are desperate to the point of infatuation to enter the academy, but where the love is not reciprocated. There are plenty more fish in the sea, as my late mother always advised. But looking back, I cannot help but feel some sadness at the changing wheels of fortune for those who seek the cloister. I think it is an honourable profession.

Many, if not most, universities are very different places to work in from those of the 1980s when I started work within the quad. They are much larger, they are more corporatised and hierarchical and, in a really profound sense, they are no longer communities of scholars or places that cherish scholarly reason. I began to feel much more like an employee than I ever used to, and yes, that bloody term, line manager, got ever more common. I began to find it harder and harder to characterise universities as academic institutions, although from my limited knowledge, in the UK at least, Oxbridge still manage better than most1. Yes, universities deliver teaching (just as Amazon or DHL deliver content), and yes, some great research is undertaken in universities (easy KPIs, there), but their modus operandi is not that of a corpus of scholars and students, but rather increasingly bends to the ethos of many modern corporations that self-evidently are failing society. Succinctly put, universities have lost their faith in the primacy of reason and truth, and failed to wrestle sufficiently with the constraints such a faith places on action — and on the bottom line.

Derek Bok, one of Harvard’s most successful recent Presidents, wrote words to the effect that universities appear always to choose institutional survival over morality. There is an externality to this, which society ends up paying. Wissenschaft als Beruf — science as a vocation — is no longer in the job descriptions or the mission statements2.

A few years back, via a circuitous friendship, I attended a graduation ceremony at what is widely considered one of the UK’s finest city universities3. This friend’s son was graduating with a Masters. All the pomp was rolled out and I, and the others present, were given an example of hawking worthy of an East End barrow boy (‘world-beating’ blah blah…). Pure selling, with the market being overseas students: please spread the word. I felt ashamed for the Pro Vice Chancellor, who knew much of what he said was untrue. There is an adage that being an intellectual presupposes a certain attitude to the idea of truth, rather than a contract of employment; that intellectuals should aspire to be protectors of integrity. It is not possible to choose one belief system one day, and act on another the next.

The charge sheet is long. Universities have fed off cheap money — tax-subsidised student loans — with promises about social mobility that their own academics have shown to be untrue. The Russell Group, in particular, traducing what Humboldt said about the relation between teaching and research, has sought to diminish teaching in order to subsidise research, or, alternatively, claimed a phoney relation between the two. As for the “student experience”, as one seller of bespoke essays argued4, his business model depended on the fact that in many universities no member of staff could recognise the essay style of a particular student. Compare that with tuition in the sixth form. Universities have grown more and more impersonal, and yet claimed a model of enlightenment that depends on personal tuition. Humboldt did indeed say something about this:

“[the] goals of science and scholarship are worked towards most effectively through the synthesis of the teacher’s and the students’ dispositions”.

As the years have passed by, it has seemed to me that universities are playing intellectual whack-a-mole, rather than re-examining their foundational beliefs in the light of what they offer and what others may offer better. In the age of Trump and mini-Trump, more than ever, we need that which universities once nurtured and protected. It’s just that they don’t need to do everything, nor are they for everybody, nor are they suited to solving all of humankind’s problems. As has been said before, ask any bloody question and the universal answer is ‘education, education, education’. It isn’t.

That is a longer (and more cathartic) answer to my questions than I had intended. I have chosen not to describe the awful position that most UK universities have found themselves in at the hands of hostile politicians, nor the general cultural assault by the media and others on learning, rigour and nuance. The stench of money is the accelerant of what seeks to destroy our once-modern world. And for the record, I have never had any interest in, or facility for, management beyond that required to run a small research group, and teaching in my own discipline. I don’t doubt that if I had been in charge the situation would have been far worse.

 

Reading debt

 

Sydney Brenner, one of the handful of scientists who made the revolution in biology of the second half of the 20th century, once said words to the effect that scientists no longer read papers, they just Xerox them. The problem he was alluding to was the ever-increasing size of the scientific literature. I was fairly disciplined in the age of photocopying, but with the world of online PDFs I too began to sink. Year after year, this reading debt has increased, and not just with ‘papers’ but with monographs and books too. Many years ago, in parallel with what occupied much of my time — skin cancer biology and the genetics of pigmentation, and computerised skin cancer diagnostic systems — I had started to write about topics related to science and medicine that gradually bugged me more and more. It was an itch I felt compelled to scratch. I wrote a paper in the Lancet on the nature of patents in clinical medicine and the effect intellectual property rights had on the patterns of clinical discovery; several papers on the nature of clinical discovery and the relations between biology and medicine in Science and elsewhere. I also wrote about why you cannot use “spreadsheets to measure suffering” and why there is no universal calculus of suffering or dis-ease for skin disease (here and here); and several papers on the misuse of statistics and evidence by the evidence-based-medicine cult (here and here). Finally, I ventured some thoughts on the industrialisation of medicine, and the relation between teaching and learning, industry, and clinical practice (here), as well as the nature of clinical medicine and clinical academia (here and here). I got invited to the NIH and to a couple of AAAS meetings to talk about some of these topics. But there was no interest on this side of the pond. It is fair to say that the world was not overwhelmed with my efforts.

At one level, most academic careers end in failure, or at least they should if we are doing things right. Some colleagues thought I was losing my marbles, some viewed me as a closet philosopher who was now out, and partying wildly, and some, I suspect, expressed pity for my state. Closer to home — with one notable exception — the work was treated with what I call the petit-mal phenomenon: there is a brief pause or ‘silence’ in the conversation, before normal life returns after this ‘absence’, with no apparent memory of the offending event. After all, nobody would enter such papers for the RAE/REF — they weren’t science with data and results, and since of course they weren’t supported by external funding, they were considered worthless. Pace Brenner, in terms of research assessment you don’t really need to read papers, just look at the impact factor and the amount and source of funding: sexy, or not?5

You have to continually check in with your own personal lodestar; dead reckoning over the course of a career is not wise. I thought there was some merit in what I had written, but I didn’t think I had gone deep enough into the problems I kept seeing all around me (an occupational hazard of a skin biologist, you might say). Lack of time was one issue; another was that I had little experience of the sorts of research methods I needed. The two problems are not totally unrelated; the day-job kept getting in the way.

 

 


Finding your way in your world

by reestheskin on 19/08/2020


I read Educated by Tara Westover earlier this year (it was published in 2018 and was a best seller). It is both frightening and inspiring. And important. Her story is remarkable, and it says more about real education than all the government-subjugated institutions like schools and universities can cobble together in their mission statements. WikiP provides some background on her.

Westover was the youngest of seven children born in Clifton, Idaho (population 259) to Mormon survivalist parents. She has five older brothers and an older sister. Her parents were suspicious of doctors, hospitals, public schools, and the federal government. Westover was born at home, delivered by a midwife, and was never taken to a doctor or nurse. She was not registered for a birth certificate until she was nine years old. Their father resisted getting formal medical treatment for any of the family. Even when seriously injured, the children were treated only by their mother, who had studied herbalism and other methods of alternative healing.

All the siblings were loosely homeschooled by their mother. Westover has said an older brother taught her to read, and she studied the scriptures of The Church of Jesus Christ of Latter-day Saints to which her family belonged. But she never attended a lecture, wrote an essay, or took an exam. There were few textbooks in their house.

As a teenager, Westover began to want to enter the larger world and attend college.

The last sentence above has it, as The Speaker of the House of Commons might say.

She gained entry to Brigham Young University (BYU), Utah, without a high school diploma and her career there was deeply influenced by a few individuals who saw something in her. She was awarded a Gates scholarship to the University of Cambridge to undertake a Masters and was tutored there by Professor Jonathan Steinberg. Some of their exchanges attest to the qualities of both individuals, and not a little about a genuine education.

‘I am Professor Steinberg,’ he said. ‘What would you like to read?’

‘For two months I had weekly meetings with Professor Steinberg. I was never assigned readings. We read only what I asked to read, whether it was a book or a page. None of my professors at BYU had examined my writing the way Professor Steinberg did. No comma, no period, no adjective or adverb was beneath his interest. He made no distinction between grammar and content, between form and substance. A poorly written sentence was a poorly conceived idea, and in his view the grammatical logic was as much in need of correction.’

‘After I’d been meeting with Professor Steinberg for a month, he suggested I write an essay comparing Edmund Burke with Publius, the persona under which James Madison, Alexander Hamilton and John Jay had written the Federalist papers.’

‘I finished the essay and sent it to Professor Steinberg. Two days later, when I arrived for our meeting, he was subdued. He peered at me from across the room. I waited for him to say the essay was a disaster, the product of an ignorant mind, that it had overreached, drawn too many conclusions from too little material.’

“I have been teaching in Cambridge for 30 years,” he said. “And this is one of the best essays I’ve read.” I was prepared for insults but not for this.

At my next supervision, Professor Steinberg said that when I applied for graduate school, he would make sure I was accepted to whatever institution I chose. “Have you visited Harvard?” he said. “Or perhaps you prefer Cambridge?”…

“I can’t go,” I said. “I can’t pay the fees.” “Let me worry about the fees,” Professor Steinberg said.

You can read her book and feel what it says about the value of education on many levels, but I want to pick out a passage that echoed something else I was reading at the same time. Tara Westover writes of her time as a child teaching herself at home despite the best attempts of most of her family.

In retrospect, I see that this was my education, the one that would matter: the hours I spent sitting at the borrowed desk, struggling to parse narrow strands of Mormon doctrine in mimicry of a brother who’d deserted me. The skill I was learning was a crucial one, the patience to read things I could not yet understand [emphasis added].

At the same time as I was reading Educated, I was looking at English Grammar: A Student’s Introduction by Huddleston & Pullum (the latter of the University of Edinburgh). This is a textbook, and early on the authors set out a problem that crops up in many areas of learning but which I have not seen described so succinctly and bluntly.

We may give that explanation just before we first used the term, or immediately following it, or you may need to set the term aside for a few paragraphs until we can get to a full explanation of it. This happens fairly often, because the vocabulary of grammar can’t all be explained at once, and the meanings of grammatical terms are very tightly connected to each other; sometimes neither member of a pair of terms can be properly understood unless you also understand the other, which makes it impossible to define every term before it first appears, no matter what order is chosen [emphasis added].


Late night thoughts of a clinical scientist.

by reestheskin on 30/07/2020


More accurately, late night thoughts from 26 years ago. I have no written record of my Edinburgh inaugural, but my Newcastle inaugural, given in 1994, was edited and published by Bruce Charlton in the Northern Review. As I continue to sift through the detritus of a lifetime of work, I have just come across it. I haven’t looked at it for over 20 years, and it is interesting to reread it and muse over some of the (for me) familiar themes. There is plenty to criticise. I am not certain all the metaphors should survive, and I fear some examples I quote from outwith my field may not be as sound as I imply. But it is a product of its time, a time when there was some unity of purpose in being a clinical academic, when teaching, research and praxis were of a piece. No more. Feinstein was right. It is probably for the best, but I couldn’t see this at the time.

 

Late night thoughts of a clinical scientist

The practice of medicine is made up of two elements. The first is an ability to identify with the patient: a sense of a common humanity, of compassion. The second is intellectual, and is based on an ethic that states you must make a clear judgement of what is at stake before acting. That, without a trace of deception, you must know the result of your actions. In Leo Szilard’s words, you must “recognise the connections of things and the laws and conduct of men so that you may know what you are doing”.

This is the ethic of science. William Clifford, the 19th century mathematician, described scientific thought as “the guide of action”: “that the truth at which it arrives is not that which we can ideally contemplate without error, but that which we may act upon without fear”.

Late last year, when I was starting to think what I wanted to say in my inaugural lecture, the BBC Late Show devoted a few programmes to science. One of these concerned itself with medical practice and the opportunities offered by advances in medical science. On the one side, Professor Lewis Wolpert, a developmental biologist, and Dr Marcus Pembrey, a clinical geneticist, described how they went about their work. How, they asked, can you decide whether novel treatments are appropriate for a patient except by a judgement based on your assessment of the patient’s wishes, and imperfect knowledge? Science always comes with confidence limits attached.

On the opposing side were two academic ethicists, including the barrister and former Reith Lecturer Professor Ian Kennedy. You may remember it was Kennedy in his Reith lectures who, quoting Ivan Illich, described medicine itself as the biggest threat to people’s health. The debate, or at least the lack of it, clearly showed that we haven’t moved on very far from when C P Snow (in the year I was born) gave his Two Cultures lecture. What do I mean by two cultures? Is it that people are not aware of the facts of science or new techniques?… It was recently reported in the journal Science that over half the graduates of Harvard University were unable to explain why it is warmer in summer than winter. A third of the British population still believe that the sun goes round the earth.

But, in a really crucial way, this absence of cultural knowledge is not nearly so depressing as the failure to understand the activity rather than the artefacts of science. Kennedy in a memorable phrase described knowledge as a ‘tyranny’1. It is as though he wanted us back with Galen and Aristotle, safe in our dogma, our knowledge fossilised and therefore ethically safe and neutered. There is, however, with any practical knowledge always a sense of uncertainty. When you lift your foot off the ground you never quite know where it is going to come down. And, as in Alice in Wonderland, “it takes all the running you can do to stay in the same place”.

It is this relationship between practice and knowledge, and how it affects my subject, that I want to talk about. And in turn, I shall talk about clinical teaching and diagnosis, research and the treatment of skin disease.

 

Mathiness, scientism and give me the money.

by reestheskin on 06/07/2020


Two nice quotes from Paul Romer about his paper Mathiness in the Theory of Economic Growth

The alternative to science is academic politics, where persistent disagreement is encouraged as a way to create distinctive sub-group identities.

The usual way to protect a scientific discussion from the factionalism of academic politics is to exclude people who opt out of the norms of science. The challenge lies in knowing how to identify them.

I can go along with both, but it is in the details that the daemons feast. It appears to me that the ‘norms of science’ argument is itself problematic, reminding me of those silly things you learn at school about the scientific method 1. The historical origin of the concept of the scientific method owed more to attempts to brand certain activities in the eyes of those who were not practising scientists 2. As a rough approximation, the people who talk about the scientific method tend not to do science. Of course, in more recent times, the use of the term ‘science’ itself has been a flag for obtaining funding, status or approval. Dermatology is now dermatological sciences; pharmacology is now pharmacological sciences. Even more absurd, in the medical literature I see the term delivery science (and I don’t mean Amazon), or reproducibility science. The demarcation of science from non-science is a hard philosophical problem going back way before Popper; I will not solve it. The danger is that we might end up exiling all those meaningful areas of human rationality that we once — rightly — considered outwith science, but still valued. There is indeed a subject that we might reasonably call medical science(s). It is just not synonymous with the principles and practice of medicine. It is also why political economy is a more useful subject than economics (or worse still, economic sciences).

  1. I guess this depends on how you interpret the ‘they’ in Romer’s second quote. Is it the people or the norms that are the problem? I tend to think of both.
  2. There is an excellent recent article in the New York Review of Books that touches upon this issue: Just Use Your Thinking Pump! by Jessica Riskin.

Advice for graduating doctors…

by reestheskin on 03/07/2020


I have forgotten who asked me to write the following. I think it was from a couple of years ago and was meant for graduating medics here in Edinburgh. (I am still sifting through the detritus of academic droppings)

As Rudolf Virchow was reported to say: sometimes the young are more right than the old. So, beware. This is my — and not his — triad.

First, when you do not know, ask for help. And sometimes ask for help when you do know (for how else would you check the circumference of your competence?).

Second, much though science and technology change, the organisation of care will change faster. Think on this in any quiet moments you have, for it may be the biggest influence on your career — for good and bad (sadly).

Third, look around you and do not be afraid to stray. The future is always on the periphery along a rocky path to nowhere in particular.

Teach yourself medicine

by reestheskin on 10/06/2020


Being an emeritus professor has lots of advantages. You have time to follow your thoughts and allow your reading to take you where it goes. Bruce Charlton pointed out to me many years ago that increasingly academics were embarrassed if you caught them just reading in their office (worse than having a sly fag…). It was looked upon as a form of daydreaming. Much better to fire up the Excel spreadsheet or scour the web for funding opportunities. Best of all, you should be grant writing or ensuring that the once wonderful idea that only produced some not-so-shiny results can be veneered into a glossy journal.

Of course, being retired means you don’t have to go to management meetings. For most of my career I could reasonably avoid meetings simply because if you spend most of your time researching (as I did), all you care about is publishing and getting funded. The university is just a little bit like WeWork — only the finances are stronger.

One aspect of teaching-related meetings particularly irked me: student representatives, and how people misunderstand what representatives should and shouldn’t contribute. This is not specific to meetings — the same problem exists in the ‘happy sheets’ that pass for feedback — but is what I see as a problem of inference. Humans are very capable of telling you how they feel about something, especially if they are asked at the time of, or soon after, a particular event. What is much harder is to imagine what the results will be if a change is made in how a particular event is undertaken, and how this will relate to underlying goals. This is a problem of inference. It needs some theory and data. So, if students say Professor Rees doesn’t turn up for teaching sessions, or doesn’t use a microphone, or uses slides with minuscule text in lectures, this is useful knowledge. What is less helpful is when you wish to appear to be empathetic (‘student centred’) and allow students to demand that you accept their views on pedagogy. This is akin to the patient telling the surgeon how to perform the operation. Contrary to what many believe, a lot is known about learning and expertise acquisition, and much of it is very definitely not common sense. And do not get me started on bloody focus groups.

Having got that bitching out of the way, I will add that one of my jobs over the last few years was to read virtually all the formal feedback that students produced for the medical school. Contrary to what you might think, it was an enjoyable task and I learned a lot. The biggest surprise was how restrained and polite students were (I wished they would get a little more angry about some things), and often how thoughtful they were. There were the occasional gems, too; my favourite being a comment about a clinical attachment: ‘I am sure the teaching would have been of a high standard — if we had had any.’ Still makes me smile (and the latter clause was accurate, but I am not so sure about the rest).

Now, I don’t want to feign any humblebragging, but a few weeks back I received this comment from a former (anonymous) student (yes, the university is efficient at stopping your pay-cheque but thankfully is not good at terminating staff, and in any case I still do some teaching…).

“Honestly you just need to look through the website he has built (http://reestheskin.me/teaching/). Who else has created an open-access textbook, lord knows how many videos (that are all engaging and detailed enough without being overwhelmingly complex) and entire Soundcloud playlists that I listen to while I’m driving for revision. I bet you could learn to spot-diagnose skin cancers without even being medical, just learn from his websites.”

Now of course this is the sort of feedback I like. But it’s the last sentence that pleases and impresses me most. The student has grasped the ‘meta’ of what I spent about seven years trying to do. There is an old aphorism that medical students turn into good doctors despite the best attempts of their medical school. Like many such aphorisms, it is deeper than it seems. One of the foundation myths of medical schools is that undergraduate medicine really is as it was portrayed in Doctor in the House, with just a smattering of modern political correctness thrown in. Sadly, no. Even without covid-19, universities — and medical schools in particular — are weaker than they seem. Demarcating what they can do well from things that others might do better needs to be much higher up the agenda. This particular student wasn’t taught that but learned it herself. Good universities can get that bit right occasionally.

A working life

by reestheskin on 17/01/2020


I don’t like the work-life balance meme. I know what it means, but I never wanted it. Medicine was once talked of as a vocation, and when I was a medical student I can remember many doctors who clearly believed so as well. Neonatologists who appeared to live on the special care baby unit; surgeons whose idea of Christmas day was to do a ward round and bring their children with them; and ‘be a paediatrician and bring up other people’s children’. The job was not just any job. I remember the wife of one professor who appeared on the ward very late one night when I was a houseman. “Had I seen the professor, her husband?” I had: I saw him there most evenings when I was on call. On this night, for whatever reason, she had accompanied him. Sadly for her, he had forgotten, and gone home without her. Thales and the well.

For me being an academic was a ‘calling’. A grand phrase, I know. But it has for most of my career been a way of life beyond the paycheque. I believe in the academic ideal, but increasingly fear the institutions no longer do. For me, home and office were not distinct. I vaguely remember — and it is quite possible I am mistaken here — that my first Professorial contract at the University of Newcastle stated ‘that by the nature of the work no hours of work are stipulated’. As my children would testify, weekend mornings were spent in the (work) office, and the afternoon in the gym and pool with them.

I retire* in the near future, and I face a practical problem. Much of my ‘work’ is at home — books and papers of course, but also the high-spec iMac Pro that I have used to produce videos, alongside video cameras and lights. On the other hand, my office is full of things that strictly speaking are personal, in that I bought them with my own money rather than with a university purchase order. But my work space — measured in square metres if not mental capacity, I hope — is diminishing. A domestic negotiation is required.

*From paid employment, not from my work.

Late night thoughts #9

by reestheskin on 06/06/2019


Late night thoughts on medical education #9: The Great Schism

Our present pattern of medical education is only one of several that are operating more or less successfully at the present time: good medicine can be taught and practised under widely varying conditions.

Henry Miller. ‘Fifty Years after Flexner’, 1966.

In my last post, I used a familiar quote (Plato’s, originally): ‘the job of science is to cleave nature at the joints’. We can never understand the entirety of the universe; all we can do is to fragment it, in order to make it amenable to experimentation or rational scrutiny. Before you can build anything you have to have taken other familiar things apart. Understanding always does violence to the natural world.

In this series of posts I have already listed some of the many things that confound attempts to improve medical education. But I don’t think we now need just a series of bug fixes. On the contrary, we need radical change — as in a new operating system — but radical change we have had before, and there are plenty of examples that we can use to model what we want. And as I hinted at in my last post, medical exceptionalism (and in truth pride) blinds medical educationalists to how other domains of professional life operate. This soul searching about professional schools is not confined to medicine. There are debates taking place about law schools [1] and engineering schools [2], and corresponding debates about the role of the professions in society more generally (do the professions have a future? professional work has, but who is going to do it?) [3][4].

Where to wield the scalpel

The conventional medical degree has two components: the preclinical years (which I used to call the prescientific years, simply because rote learning is so favoured in them); and the clinical years. This divide has been blurred a little, but that does not seriously alter my argument — the blurring has in any case been a mistake IMHO. The preclinical years have some similarities with other university courses, for good and bad. The clinical years are simply a mess. They aspire to a model of apprenticeship learning that is impossible to deliver on.

A positive

All is not lost, however. We know we can do some things well. Let me consider the ‘clinical’ first, before moving back to the ‘preclinical’.

Registrar training in any speciality can work well. We know how to do it. There is a combination of private study, formal courses, and day-to-day supervised and graded practice. Classic apprenticeship. This doesn’t mean it is always done well — it isn’t — but in practice we know how to put it together. Let me use dermatology as an example.

In the UK and a few other countries, you enter dermatology after having done FY (foundation years 1 & 2) and a few years of internal medicine, having passed the MRCP exams along the way (the College tax). I refer to this as pre-dermatology training. At this stage, you compete nationally for training slots in dermatology.

This pre-dermatology training is unnecessary. We know this to be the case because most of the world does not follow this pattern, and seems to manage OK in terms of the quality of their dermatologists. (This ‘wasted years’ period was painfully pointed out to me when I started dermatology training in Vienna: ‘you have wasted four years of your life’, I was told. I wasn’t pleased, but they were right and I was wrong)[5]. Why, you ask, does the UK persist? Three explanations come to mind. First, the need for cheap labour to staff hospitals. Second, the failure to understand that staff on a training path need to supplement those who provide ‘core service’: much as senior registrars were supernumerary in some parts of the UK at one time. Finally, an inability to realise that we might learn from others.

Providing good apprenticeship training in dermatology is (in theory) very straightforward. Book learning is required, formal courses online can supplement this book learning, and since trainees are grouped in centres, interpersonal learning and discussion is easy to organise. Most importantly, trainees work, over extended periods of time, with consultants who know what they are trying to achieve: the purpose of the apprenticeship is to produce somebody who can replace them in a few years’ time. You do not need to be deep into educational theory to work well within this sort of environment; indeed, you should keep any ‘educationalists’ at arm’s length.

Where this model does not work well is in the pre-dermatology training. The obvious point is that much of this pre-dermatology work is not necessary and, where it is, it should be carried out by those who are embarking on a particular career or by non-training staff (who may or may not be doctors). In the UK, if you have a FY doctor attached to a dermatology ward, they will rotate every few months through a range of specialties, and it is likely that they will have no affinity for most of them. Such jobs are educationally worthless, as dermatology is an outpatient specialty. Ironically, the only value of such jobs is for those who have already committed to a career in dermatology. I will return to the all too familiar objections to what I propose in another blog post, but for training in many areas of medicine, including GP, radiology, pathology and psychiatry, what I have said of dermatology holds.

We could frame my argument in another way. If you cannot hold onto the tenets of apprenticeship learning — extended periods of graded practice under the close supervision of a small group of masters and novices — it is not a training post.

University and the function of medical schools

I am now going to jump to the other end of medical education: what are medical schools for?

Current undergraduate medical education is a hybrid of ‘education’ and ‘training’. Universities can deliver high class education (I said can, not do), but they cannot deliver high class clinical training. They do not have the staff to do it, and they do not own the ‘means of production’. Apprenticeship learning does not work given the number of students and, in any case, teaching of medical students is a low priority for NHS hospitals, which have been in a ‘subsistence’ mode for decades. Things will only get worse.

Other professions

Some (but not all) other professional schools or professions organise things differently. A degree may be necessary, but the bond between degree and subsequent training is loose. Unlike medicine, it is not the job of the university to produce somebody who is ‘safe’ and ‘certified’ on the day of graduation.

What I propose is that virtually all the non-foundational learning is shifted into the early years of apprenticeship learning, where the individuals are paid employees of the NHS (or other employer). I talked about what foundational learning is in an earlier post, and here I am arguing that it is the foundational learning which universities should deliver. Just as professional service firms, law firms or engineering firms may prefer graduates with particular degrees, they know that they need to train their apprentices in a work environment, an environment in which they are paid (as with all apprenticeships, the training salary reflects the market value to the individual of the professional training they receive). What becomes of medical schools?

Schools of health

The corpus of knowledge of the determinants of health and how to promote health, as well as how to diagnose and care for those who are sick, is vast. Looked at in financial terms, or numbers of workers, it is a large part of the modern economy, and is of interest way beyond the narrow craft of clinical medicine. The fundamental knowledge underpinning ‘health’ includes sciences and arts. Although modern medicine likes to ride on the coat-tails of science, it is, in terms of practice, a professional domain that draws eclectically from a broad scholarship and habits of mind. Medical science has indeed grown, but as a proportion of the domains of knowledge that make up ‘health’ it has shrunk.

Simply put, we might expect many students to study ‘health’, and for the subset of those who want to become doctors we need to think about the domains that are most suitable for ‘practising doctors’. Not all who study ‘health’ will want to be ‘practising doctors’, but of those who do, there may be constraints on what modules they should take. The goal is to produce individuals who can be admitted into a medical apprenticeship when they leave university.

Wrap up

I will write more about ‘health’ in the next post, and contrast it with what we currently teach (and how we teach it). The later part of training (genuine apprenticeship), as in the dermatology example, I would leave alone. But what I am suggesting is that we totally change the demands put on medical schools, and place apprenticeship learning back where it belongs.

[1] Stolker C. Rethinking the Law School. Cambridge University Press; 2014

[2] Goldberg DE, Somerville M, Whitney C. A Whole New Engineer: The Coming Revolution in Engineering Education. Threejoy Associates; 2014

[3] Susskind RE. The End of Lawyers? Rethinking the Nature of Legal Services. Oxford University Press; 2010

[4] Susskind R, Susskind D. The Future of the Professions. Oxford University Press, USA; 2015

[5] Rees J. The UK needs office dermatologists. BMJ. 2012;345:35.

Late night thoughts #8

by reestheskin on 31/05/2019


Late night thoughts on medical education #8: Where to draw the line?

In the previous post, I talked about some of the details of how undergraduate clinical teaching is organised. It is not an apprenticeship, but rather an alienating series of short attachments characterised by a lack of continuity of teacher-pupil contact. This is not something easily fixed because the structure is geared around the needs of the NHS staff who deliver the bulk of student teaching, rather than what we know makes sense pedagogically. I likened it to the need to put up with getting through security when you travel by plane: you want to get somewhere, but just have to grin and bear the humiliation. This is not a university education. I am not saying that individual teachers are to blame — far from it — as many enjoy teaching students. It is a system problem.

The interdependence of undergraduate education and postgraduate medical training

It is not possible to make sense of either undergraduate medical education or postgraduate training without looking at the forces that act on the other. It is also far too easy to assume that ‘the system’ in the UK is the only way to organise things, or indeed, to think it is anywhere near optimal. A damning critique of medicine (and much else in society) in the UK is our inability to learn from what others do.

The formative influences on (undergraduate) medical education are those conditions that were operating over half a century ago. At that time, a medical degree qualified you to enter clinical practice with — for many students — no further formal study. And much clinical practice was in a group size of n=1.

In the 1950s the house year (usually 6 months surgery and 6 months medicine) was introduced. Theoretically this was under the supervision of the university, but in practice this supervision was poor, and the reality was that this was never going to work in the ‘modern NHS’. How can the University of Edinburgh supervise its graduates who work at the other end of the country? In any case, as has been remarked on many occasions, although the rationale for the house year was ‘education’, the NHS has never taken this seriously. Instead, housepersons became general dogsbodies, working under conditions that could have come from a Dickens novel. In my own health board, the link between master and pupil has been entirely broken: apprenticeship is not only absent from the undergraduate course, but has been exiled from a lot of postgraduate training (sic). House doctors are referred to as ‘ward resources’, not tied to any group of supervising doctors. Like toilet cisterns, or worse…

Nonetheless, the changes in the 1950s and other reforms in the 1960s established the conventional wisdom that the aim of undergraduate medical education was not to produce a ‘final product’ fit to travel the world with their duffel-shaped leather satchel in hand. Rather, there would be a period of postgraduate training leading to specialist certification.

Training versus education

This change should have been momentous. The goal was to refashion the undergraduate component, and allow the postgraduate period to produce the finished product (either in a specialty, or in what was once called general practice). It is worth emphasising what this should have meant.

From the point of view of the public, the key time for certification for medical practice was not graduation, but being placed on the specialist register. The ability to practise independently was something granted to those with higher medical qualifications (MRCP, MRCPsych etc) and who were appointed to a consultant post. All other posts were training posts, and practice within such roles was not independent but under supervision. Within an apprenticeship system — which higher professional training largely should be — supervision comes with lots of constraints, constraints that are implicit in the relation between master and pupil, and which have stayed largely unchanged across many guilds and crafts for near on a thousand years.

What went wrong was no surprise. The hospitals needed a cadre of generic dogsbodies to staff them, given the 24 hour working conditions necessary in health care. Rather than new graduates choosing their final career destination (to keep with my airport metaphor), they were consigned to a holding pattern for 2-7 years of their life. In this service mode, the main function was ‘service’, not supervised training. As one of my former tutees in Edinburgh correctly told me at graduation: (of course!) he was returning to Singapore, because if he stayed in the NHS he would just be exploited until he could start higher professional training. The UK remains an outlier worldwide in this pattern of enforced servitude[1].

What has all this to do with undergraduate education?

The driving force in virtually all decision making within the UK health systems is getting through to the year-end. The systems live hand-to-mouth. They share a subsistence culture, in which it almost appears that their primary role is not to deliver health care, but to reflect an ideology that might prove attractive to voters. As with much UK capitalism, the long term always loses out to the short term. What happened after the realisation that a graduating medical student was neither beast nor fowl was predictable.

The pressure to produce generic trainees with little meaningful supervision in their day-to-day job meant that more and more of undergraduate education was sacrificed to the goal of producing ‘safe and competent’ FY (foundation years 1 & 2) doctors, doctors who again work as dogsbodies and cannot learn within a genuine apprenticeship model. The mantra became that you needed five years at medical school to adopt a transitory role that you would willingly escape from as soon as possible. Furthermore, the undergraduate course was a sitting duck for any failings of the NHS: students should know more about eating disorders, resilience, primary care, terminal care, obesity, drug use… the list is infinite, the students sitting ducks, and the medical schools politically ineffective.

What we now see is an undergraduate degree effectively trying to emulate a hospital (as learning outside an inpatient setting is rare). The problem is simply stated: it is not possible to do this within a university that does not — and I apologise if I sound like an unreconstructed Marxist — control the means of production. Nor is it sensible to try and meld the whole of a university education in order to produce doctors suitable for a particular time-limited period of medical practice, that all will gladly leave within a few years of vassalage.

Medical exceptionalism

Medicine is an old profession (I will pass on GBS’ comments about the oldest profession). In medicine the traditional status of both ‘profession’ and ‘this profession’ in particular has been used to imagine that medicine can stand aloof from other changes in society. There are three points I want to make on this issue: two are germane to my argument, whilst the third I will return to in another post.

The first is that from the immediate post-Flexner period to the changes in medical education in the 1950s and 1960s, few people in the UK went to university. Doctors did go to university, even if the course was deemed heavily vocational, with a guaranteed job at the end of it. Learning lots of senseless anatomy may not have compared well with a liberal arts education, but there was time for maturing, and exposure to the culture of higher learning. Grand phrases indeed, but many of us have been spoiled by their ubiquity. Our current medical students are bright and mostly capable of hard work, but many lack the breadth, and the ability to think abstractly, of the better students in some other faculties. (It would, for instance, be interesting to look at secular changes in degree awards of medical students who have intercalated.) No doubt, medical students are still sought after by non-medical employers, but I suspect this is a highly self-selected group and, in any case, reflects intrinsic abilities and characteristics as much as what the university has provided them with.

The second point is that all the professions are undergoing change. The specialist roles that were formalised and developed in the 19th century are under attack from the forces that Max Weber identified a century ago. The ‘terminally differentiated’ individual is treated less kindly in the modern corporate state. Anybody who has practised medicine in the last half century is aware of the increasing industrialisation of medical practice, in which the battle between professional judgment and the impersonal corporate bureaucracy is being won by the latter [2][3].

My third point is more positive. Although there have been lots of different models of ‘professional training’, the most prevalent today is a degree in a relevant domain (which can be interpreted widely) followed by selection for on-the-job training. Not all those who do a particular degree go on to the same career, and nor have employers expected the university to make their graduates ‘fit for practice’ on day 1 of their employment. Medicine has shunned this approach, still pretending that universities can deliver apprenticeship training, whilst the GMC and hospitals have assumed that you can deliver a safe level of care by offloading to others core training that has to be learned in the workplace. No professional services firm that relies on return custom and is subject to the market would behave in this cavalier way. Patients should not be so trusting.

In the next post, I will expand on how — following Plato’s injunction — we should cleave nature at the joints in order to reorganise medical education (and training).

[1] Re: the enforced servitude. I am not saying this work is not necessary, nor that those within a discipline do not need to know what goes on on the shop floor. But to put it bluntly, the budding dermatologist should not be wasting time admitting patients with IHD or COPD, or inserting central lines or doing lumbar punctures. Nor do I think you can ethically defend a ‘learning curve’ on patients given that the learner has committed not to pursue a career using that procedure. The solution is obvious, and has been discussed for over half a century: most health care workers need not be medically qualified.

[2] Which of course raises the issue of whether certification at an individual rather than an organisational level makes sense. In the UK the government pressure will be to emphasise the former at the expense of the latter: as they say, the beatings will continue until morale improves.

[3] Rewards in modern corporations like the NHS or many universities are directed at generic management skills, not domain expertise. University vice-chancellors get paid more than Nobel prize winners at the LMB. In the NHS there is a real misalignment of rewards for those clinicians whom their peers recognise as outstanding, versus those who are medical managers (sic). If we think of some of the traditional crafts — say painting or sculpture — I doubt we can match the technical mastery of Florence. Leonardo would no doubt now be handling Excel spreadsheets as a manager (see this piece on Brian Randell’s homepage on this very topic).

Late night thoughts #7

by reestheskin on 24/05/2019


Late night thoughts on medical education #7: Carousels

In the previous post I laid out some of the basic structures of the ‘clinical years’ of undergraduate medical degrees. In this post I want to delve a little deeper and highlight how things have gone wrong. I do not imagine it was ever wonderful, but it is certainly possible to argue that things have got a lot worse. I think things are indeed bad.

When I was a medical student in Newcastle in 1976-1982, the structure of the first two clinical years (years 3 and 4) was similar, whereas the final year (year 5) was distinct. The final year was made up of several long attachments — say ten weeks medicine and ten weeks surgery — and there were no lectures or any demands on your time except that you effectively worked as an unpaid houseman, attached to a firm of two or three consultants. The apprenticeship system could work well during these attachments. The reasons for this partly reflected the fact that all parties had something to gain. Many if not most students chose where they did their attachments (‘if you like fellwalking, choose Carlisle’, etc.), and had an eye on these units as a place to do their house jobs the following year. The consultants also had skin in the game. Instead of relying on interviews, or just exam results, they and all their staff (junior docs, nurses etc) got a chance to see close up what an individual student was like, and they could use this as a basis for appointing their houseperson the following year. If a houseman was away, you acted up, and got paid a small amount for this. At any time if you didn’t turn up, all hell would break loose. You were essential to the functioning of the unit. No doubt there was some variation between units and centres, but this is how it was for me. So, for at least half of final year, you were on trial, immersed in learning by doing / learning on the job / workplace learning etc. All the right buzzwords were in place.

Carousels

As I have said, years 3 and 4 were different from final year, but similar to each other. The mornings would be spent on the ward and the afternoons — apart from Wednesdays — were for lectures. I didn’t like lectures (or at least those sorts of lectures) so I skipped them, apart from making sure that I collected any handouts, which were provided on the first day (see some comments from Henry Miller on lectures below [1]).

The mornings were ‘on the wards’. Four year 3 students might be attached to two 30-bedded wards (one female, one male), and for most of the longer attachments you would be given a patient to go and see, starting at 9:30, breaking for coffee at 10:30 and returning for an hour or more in which one or more of you had to present your findings before visiting the bedside and being taught how to examine the patient. The number of students was small, and there was nowhere to hide if you didn’t know anything.

For the longer attachments (10 weeks for each of paediatrics, medicine and surgery) this clinical exposure could work well. But the shorter attachments, especially in year 4, were a problem, chiefly because you were not there long enough to get to know anybody.

The design problem was of course that the lectures were completely out of synchrony with the clinical attachments. You might be doing surgery in the morning, but listening to lectures on cardiology in the afternoon. Given my lack of love for lectures, I used the afternoons to read about patients I had seen in the morning, and to cover the subjects of the afternoon lectures by reading books.

I don’t want to pretend that all was well. It wasn’t. You might turn up to find that nobody was available to teach you, in which case we would retreat to the nurses canteen to eat the most bacon-rich bacon sandwiches I have ever had the pleasure of meeting (the women in the canteen thought all these young people needed building up with motherly love and food 🙂 ).

The knowledge of what you were supposed to learn was, to say the least, ‘informal’; at worst, anarchic. Some staff were amazingly helpful, but others — how shall I say — not so.

Year 5 envy

In reality, everybody knew that years 3 and 4 were pale imitations of year 5. The students wanted to be in year 5, because year 5 students — or at least most year 5 students — were useful. The problem was that the numbers (students and patients) and the staffing were not available. Years 3 and 4 were something to get through, but with occasional moments of hope and pleasure. Like going through security at airports: the holiday might be good, but you pay a price.

Present day

The easiest way to summarise what happens now is to provide a snapshot of teaching in my own subject at Edinburgh.

Year 4 (called year 5 now, but the penultimate year of undergraduate medicine) students spend two weeks in dermatology. Each group is made up of 12-15 students. At the beginning of a block of rotations lasting say 18 weeks in total, the students have 2.5 hours of lectures on dermatology. During the two-week dermatology rotation, most teaching takes place in the mornings. On the first morning the students have an orientation session, work in groups to answer questions based on videos they have had to watch (along with bespoke reading matter), and then there is an interactive ‘seminar’ going through some of the preparatory work in the videos and text material.

For the rest of the attachment students will attend a daily ‘teaching clinic’, in which they are taught on ‘index’ patients who attend the dermatology outpatients. These patients are selected from those attending the clinic and, if they agree, they pass through to the ‘teaching clinic’. The ‘teacher’ will be a consultant or registrar, and this person is there to teach — not to provide clinical care during this session.

Students will also sit in one ‘normal’ outpatient clinic as a ‘fly on the wall’, and attend one surgical session. At the end of the attachment, there is a quiz in which students attempt to answer questions in small groups of two or three. They also get an opportunity to provide oral feedback as well as anonymous written feedback. Our students rate dermatology highly in comparison with most other disciplines, and our NHS staff are motivated and like teaching.

The problems

When I read through the above it all sounds sort of reasonable, except that…

Students will pass through lots of these individual attachments. Some are four weeks long, but many last only one or two weeks. It is demanding to organise such timetables, and stressful for both students and staff:

  • each day a different staff member will teach the students
  • it is unlikely that staff will know the names of most of the students. Students will usually not remember the name of the staff member who taught them in a previous week
  • most teaching is delivered by non-university employed staff. Most of these staff have little detailed knowledge of what students are (now) expected to know. The majority will not be involved in any formal assessments, and reasonably view the teaching as a break from doing clinic after clinic.
  • there is little opportunity to provide meaningful feedback on student performance, or to see student knowledge grow. Students find it easy to ‘hide’; absenteeism is high, and the rate of ‘illness’ seems higher than amongst hospital doctors.
  • teaching the students plays second fiddle to service delivery. The terminology within NHS job plans is telling. When you see a patient it is called ‘direct clinical care’ (DCC). For maybe the remaining 10-20% of your time you have sessions allocated as ‘supporting professional activities’ (SPA). SPA time includes work relating to revalidation, CPD, hospital admin, teaching of registrars, and delivery of undergraduate teaching. Our overseas students pay in excess of £50K per year in fees, and each UK student attracts perhaps £50K from fees and government monies. Teaching undergraduates is merely a ‘supporting activity’ even when £50K is changing hands. Fettes or Winchester might be more careful with their terminology.

My critique is not concerned with individuals, but with the system. It is simply hard to believe that this whole edifice is coherent or designed in the students’ interest. It is, as Flexner described UK medical school teaching a century ago, wonderfully amateur. Pedagogically it makes little sense. Nor in all truthfulness is it enjoyable for many staff or many students. Every two weeks a new batch arrives and groundhog day begins. Again. And again. And if you believe the figures bandied about for the cost of medical education, the value proposition seems poor. We could do better: we should do better.

[1] Lectures. Henry Miller, who was successively Dean of Medicine and Vice Chancellor at Newcastle, described how…

“Afternoon lectures were often avoided in favour of the cinema. The medical school was conveniently placed for at least three large cinemas… in one particularly dull week of lectures we saw the Marx brothers in ‘A Day at the Races’ three times.”

Late night thoughts #6

by reestheskin on 09/05/2019

Comments are disabled

Late night thoughts on medical education #6: Structures

In the previous post in this series (Late night thoughts #5: Foundations) I wrote about the content or material of medical education, hinting at some of the foundational problems (pardon the meta). We have problems distinguishing between knowledge that is essential for some particular domain of medical practice, and knowledge that is genuinely foundational. The latter is largely specialty independent, less immediate than essential knowledge, and is rightly situated within the university. The expertise necessary to teach foundational knowledge lies within universities.

What I have not made explicit so far in this essay is also important. The best place to learn much essential knowledge is within the hospital, and during a genuine apprenticeship. There are various ways we can hone a meaningful definition or description of apprenticeship, but the key is that you are an employee, that you get paid, and that you are useful to your employer. Our current structures do not meet any of these criteria.

How we got here

Kenneth Calman, in the introduction to his book ‘Medical Education’, points out that medical education varies enormously between countries, and that there is little evidence showing the superiority of any particular form or system of organisation. It is one of the facts that encourage scepticism about any particular form, and furthermore — especially in the UK — leads to questions about the exorbitant costs of medical education. It also provides some support for the aphorism that most medical students turn into excellent doctors despite the best attempts of their medical schools.

Across Europe there have been two main models of clinical training (I am referring to undergraduate medical student training, not graduate / junior doctor training). One model relies on large lectures with occasional clinical demonstrations, whereas the UK system — more particularly the traditional English system — relies on ‘clerkships’ on the wards.

At Newcastle when I was a junior doctor we used to receive a handful of German medical students who studied with us for a year. They were astonished to find that the ‘real clinical material’ was available for them to learn from, with few barriers. They could go and see patients at any time, the patients were willing, and — key point — the clinical material was germane to what they wanted to learn. The shock of discovering this veritable sweetshop put some of our students to shame.

The English (and now UK) system reflects the original guiding influence of the teaching hospitals that were, as the name suggests, hospitals where teaching took place. These hospitals, for good and bad, were proud of their arm’s length relationship with the universities and medical schools. The signature pedagogy was the same as for junior doctors. These doctors were paid (poorly), were essential (the place collapsed if they were ill), and of course they were employees. Such doctors learned by doing, supplemented by private study using textbooks, or informal teaching provided locally within the hospital or via the ‘Colleges’ or other medical organisations. Whatever the fees, most learning was within a not-for-profit culture.

Scale and specialisation

It was natural to imagine or pretend that what worked at the postgraduate level would work at the undergraduate level, too. After all, until the 1950s, medical education for most doctors ended at graduation where, as the phrase goes, a surgeon with his bag full of instruments ventured forth to the four corners of the world.

This system may have worked well at one stage, but I think it fair to say it has been failing for nearer a century than half a century. At present, it is not a system of education that should be accepted. There are two reasons for this.

First, medicine has (rightly) splintered into multiple domains of practice. Most of the advances we have seen over the last century in clinical medicine reflect specialisation: specialisation as a response to the growth of explicit knowledge, and the realisation that high level performance in any craft relies not solely on initial certification, but on daily practice (as in the ‘practice of medicine’). Second, what might have worked well when students and teachers were members of one small community fails within the modern environment. As one physician at Harvard / Mass General Hospital commented a few years back in the New England Journal of Medicine: things started to go awry when the staff and students no longer ate lunch together.

Unpicking the ‘how’ of what has happened (rather than the ‘why’, which is, I think, obvious) I will leave to the next post. But here is a warning. I first came across the word meliorism in Peter Medawar’s writing. How could it not be so, I naively thought? But of course, historians or political scientists would lecture me otherwise. It is possible for human affairs to get worse, even when all the humans are ‘good’ or at least have good intentions. The dismal science sees reality even more clearly: we should rely only on institutions that we have designed to work well — even with bad actors.

Late night thoughts #5

by reestheskin on 02/05/2019

Comments are disabled

Late night thoughts on medical education #5: Foundations

We sought out an examiner who would understand that anatomy was being taught as an educational subject and not simply for the practice of surgery. I thought I had found such a man in an old colleague. I listened while he asked the student to name the successive branches emerging from the abdominal aorta in a cadaver. When we got to the inferior mesenteric he asked what viscera were supplied by that vessel. The student gave a complete and correct answer but did not know the exact amount of the rectum supplied. The examiner asked me what I thought and I said that I thought he was very good, that the only question he had missed was the last one, which in my opinion, was trivial. No, said the anatomist, by no means trivial. You have to know that before you can excise the rectum safely.

My mind still boggles at the thought of a newly graduated doctor undertaking the total excision of the rectum on the faint remembrance of the anatomy he learned as a student.

George Pickering, “Quest for Excellence in Medical Education: A Personal Survey”

When I was a medical student I read this book by Sir George Pickering. It was published in 1978, and I suspect I read it soon after the Newcastle university library acquired it. Why I came across it I do not know, but at the time ‘new volumes’ were placed for a week or two on a shelf adjacent to the entrance, before being assigned their proper home (or ‘final resting place’). It was a way to find things you didn’t know you might enjoy. I liked this book greatly, and have returned to it on many occasions. Parts of it are wonderfully dated (and charming), but it remains a wonderful young man’s book written by an old man. Now I am an old man, who read it first as a young man.

Roger Schank summarises the problems of education this way:

There are only two things wrong with the education system:

  1. What we teach, and
  2. How we teach it

George Pickering’s quote relates to ‘what we teach’ — or at least what we expect students to know — but in clinical medicine ‘what we teach’ and ‘how we teach’ are intimately bound together. This may be true for much education, but the nature of clinical exposure and tuition in clinical medicine imposes a boundary on what options we can explore. The other limit is the nature of what we expect of graduates. People may think this is a given, but it is not. If you look worldwide, the roles a newly qualified doctor is asked to fill vary enormously (something I discovered when I worked in Vienna).

Here is another quote, this time from the philosopher, Ian Hacking, who has written widely on epistemology, the nature of causality and the basis of statistics (and much else).

Syphilis is signed by the market place where it is caught; the planet Mercury has signed the market place; the metal mercury, which bears the same name, is therefore the cure for syphilis.

Ian Hacking | The Emergence of Probability

Well, of course, this makes absolutely no sense to the modern mind. We simply do not accept the validity of the concept of entities being ‘signed’ as a legitimate form of evidence. But no doubt medical students of the time would have been taught this stuff. Please note, those priests of Evidence Based Medicine (EBM), that doctors have always practised Evidence Based Medicine; it is just that opinions on what constitutes evidence change. Hacking adds:

He [Paracelsus] had established medical practice for three centuries. And his colleagues carried on killing patients.

I am using these quotes to make two points. The first is that there is content that is correct, relevant to some clinical practice, and which medical students do not need to know. This may seem so obvious that it is not necessary to say it. But it is necessary to say it. Pickering’s example has lots of modern counterparts. We could say this knowledge is foundational for some medical practice, but foundational is a loaded term, although to be fair I do not know a better one. The problem with ‘foundational’ is that it is widely used by academic rent seekers and future employers. Students must know this, students ‘must’ know X, Y and Z. I once started to keep a list of such demands, but Excel spreadsheets have limits. You know the sort of thing: ethics, resilience, obesity, child abuse, climate change, oral health, team building, management, leadership, research, EBM, professionalism, heuristics and biases etc. Indeed, there is open season on the poor undergraduate, for much of which we can lay the blame at the doors of the specialist societies and the General Medical Council (GMC).

My second point, stemming from the second quote, is to remind us that much of what we teach, or at least ask students to know, is wrong. There is a feigned ignorance on this issue, as though people in the past were stupid, whereas we are smart. Yes, anatomy has not changed much, and I am not chucking out all the biochemistry, but pace Hacking, our understanding of the relation between ‘how doctors work’ and ‘what underpins that knowledge’ is opaque. We can — and do — tell lots of ‘just-so’ stories that we think explain clinical behaviour, but that have little rational or experimental foundation. Clinicians often hold strong opinions on how they arrive at particular decisions: there is a lot of data to suggest that whilst you can objectively demonstrate clinical expertise, clinicians often have little insight into how they actually arrive at the (correct) diagnosis (beyond dustbin concepts such as ‘pattern recognition’ or ‘clinical reasoning’).

What is foundational knowledge?

If you are a dermatologist, and you wish to excise a basal cell carcinoma (BCC, a common skin cancer) from the temple, you need to be aware of certain important anatomical structures (specifically the superficial temporal artery, and the temporal branch of the facial nerve). This knowledge is essential for clinical practice. It is simple to demonstrate this: ask any surgeon who operates in this area. Of course, if you are a lower GI surgeon, this knowledge may not be at your fingertips. Looked at the other way, this knowledge is in large part specialty specific (or at least necessary for a subset of all medical specialties). What happens if you damage these structures is important to know, but the level of explanation is not very deep (pardon the pun). If you cut any nerve, you may get a motor or sensory defect, and in this example you may therefore get a failure in frontalis muscle action.

This knowledge is not foundational because it is local to certain areas of practice, and it does not form the basis or foundation of any higher level concepts (more on this below). The Pickering example tells us about what a GI surgeon might need to know, but not the dermatologist. Their world views remain unrelated, although I prefer the view of the latter. There is however another point. We should be very careful about asking medical students to know such things. So what do we expect of them?

Beyond essential

I find the example of anatomical knowledge as essential compelling, but only in terms of particular domains of activity. Now, you may say you want students to know about ‘joints’ in general, and there may well be merit in this (Pickering, I suspect, thought so), but knowing the names of all the bones in the hand or foot is not essential for most doctors. If we move beyond ‘essential’, what is left?

At one time anatomy was both essential and foundational. And I am using the term foundational here to mean those concepts that underpin not just specialty specific medicine, but medicine in the round. A few examples may help.

Whatever branch of medicine you practise, it is hard to do so without some knowledge of pharmacology. How deep you venture is subject to debate, but we do not think knowing the doses and the drug names in the BNF is the same as knowing some pharmacology.

Another example. I would find it very hard to converse with a dermatologist colleague without a (somewhat) shared view of immunology or carcinogenesis. Every sentence we use to discuss a patient will refer to, and make use of, concepts that we use to argue about and cast light on clinical decisions. If you want to explain to a patient with a squamous cell carcinoma (SCC) who has had an organ transplant why they are at such increased risk of tumours, it is simply not possible to have a meaningful conversation without immunology or carcinogenesis (and in turn, genetics, virology, and histopathology). And for brevity, I am putting to one side other key domains such as behaviour and behaviour modification, ethics, economics and statistics.

To return to my simple anatomical example of the excision of the BCC: the local anatomy is essential knowledge, but it is not foundational. What is foundational is knowing what might happen if you cut any nerve.

Sequencing of learning

Let me try and put the above in the context of how we might think about medical education and medical training.

Foundational knowledge is specialty (and hence career) independent. Its function is to provide the conceptual frameworks that underpin much clinical practice. This is not to say that the exact mix of such knowledge applies to all clinical domains, but we might expect most of it to be familiar to most doctors. But none of it will, years later, have the same day-to-day immediacy as ‘essential knowledge’ — think of my example of the temporal branch of the facial nerve for the dermatologist excising facial tumours on a weekly basis.

In this formulation, the core purpose of undergraduate medical education is to educate students in such knowledge. The purpose is not therefore to produce doctors at graduation who are ‘just not very good doctors’ but graduates who are able to pursue specialty training and make sense of the clinical world around them. The job of a medical school is to produce graduates who can start clinical training in an area of their choice. They are now in a position to — literally — understand the language of the practising doctors that surround them. They are not mini-doctors, but graduates, embarking on a professional career.

By contrast most specialty knowledge is not foundational, but essential for those within that specialty — not medical students. If you learn dermatology, you might come across things that help you learn respiratory medicine or cardiology, but, to be blunt, not very often. Specialties are not foundational domains of knowledge. You do not need to know dermatology to understand cardiology, or vice versa.

Place of learning

The best places to learn the ‘foundations’ are universities. Anatomy, again, may be an exception, but if you want to learn immunology, genetics, statistics or psychology you have, I think, no alternative. Hospitals simply cannot provide this.

On the other hand, using Seymour Papert’s metaphor, if you want to learn French you should go to Frenchland, if you want to learn maths you should go to Mathland, and if you want to learn doctoring you need to go to doctorland. Medical schools are not the place to learn how to find your way around doctorland — how could they be?

NB: I will use the epithet TIJABP, but as subsequent posts will confirm, I am serious.

Late night thoughts #4

by reestheskin on 11/04/2019

Comments are disabled

Late night thoughts on medical education #4: Maps and scheming over schemas

One of the problems in learning clinical medicine is the relation between an overall schema of what you have to learn and the detail of the various components that make up the schema. I can remember, very early in my first clinical year, seeing a child with Crohn’s disease, and subsequently trying to read a little about this disorder. My difficulty was that much of what I read contrasted Crohn’s with various other conditions — ulcerative colitis, coeliac disease and so on. The problem was that I didn’t know much about these conditions either. Where was I to start? A wood and the trees issue.

I have, pace Borges, written about maps and learning before. This is my current riff on that theme. I am going to use learning how to find your way around Edinburgh as my example. There is a simple map here.

That fine city

The centre of Edinburgh is laid out west to east, with three key roads north of the railway station. You can imagine a simple line map — like a London underground map — with three parallel main roads: Princes Street, George Street and Queen Street. You can then add in a greater level of detail, and some arterial routes in and out of the city centre.

If you were visiting Edinburgh for the first time, you could use this simple schema to try and locate places of interest. If you were lost and asked for help, it might prove useful. You could of course remember this simple plan — which is the most northerly of these three streets and so on — or perhaps use a simple cognitive prosthesis such as a paper map.

Students learn lots of these maps when they study medicine, because they are asked to find their way around lots of cities. They also forget many of them. The more complete the map, the harder it is to recall. If they have to navigate the same terrain most days, their recall is better. No surprises there. If you challenge a student you can literally see them reproducing the ‘map tool’ as they try to answer your question. It is just like asking them the causes of erythema nodosum: you can literally see them counting their list on their fingers.

Novices versus experts

There are obvious differences between novices and experts. Experts don’t need to recall the maps for multiple cities; instead they reside in the city of their specialty. Experts also tend not to be good at recalling long lists of the causes of erythema nodosum; rather, they just seem to recall a few that are relevant in any particular context. The map metaphor provides clues to this process.

If you challenge experts they can redraw the simple line figure that I started this piece with. They can reproduce it, although as the area of coverage is increased I suspect their map may begin to break the rules of 2D geometry: they move through the city professionally, but they are not professional cartographers.

The reason for this failure is that experts do not see the ‘line map’ in the mind’s eye, but actually see the images of the real geography in their mind as they move through it. They can deduce the simple line map, but this is not what they use diagnostically to find their way around. Instead, they see the images of the roads and buildings and can navigate based on those images. They have their own simulation, which they can usually navigate without effort. Of course, when they first visited Edinburgh, they too probably crammed a simple line map, but as they spent time in the city this simple cognitive tool was replaced by experience.

This way of thinking was, AFAIK, first highlighted by the US philosophers Dreyfus and Dreyfus. They pointed out that novices use ‘rule based’ formal structures, whereas experts do not. This is obvious in craft-based perceptual subjects such as dermatology (or radiology or histopathology). Experts don’t use checklists to diagnose basal cell carcinomas or melanomas; they just compare what they see with a personal library of exemplars. The cognitive basis for this ability, taking advantage of the idea of ‘familial likeness’, has been studied for a long time, although I do not think the problem is solved in any formal way. It is usually very fast — too fast for the explicit scoring methods promoted by most clinicians and educators.

Although this way of thinking is easiest to appreciate in perceptual subjects such as dermatology, most clinicians do not view things this way — even when the experimental evidence is compelling. Some believe the explicit rules they use to teach students are how they do it themselves. Others believe that experts are fluent in some high-level reasoning that students do not possess. They like to think that their exams can test this higher-level ‘deep’ reasoning. I think they may be mistaken.

Finding the takeaway

There are some ideas that follow from my story.

  1. Without wishing to encourage the delusion that factual recall is not critical to expertise, experts and novices do not use the same methods for working out what is going on. This means that we might promote simple structures that are placeholders for the expert knowledge that will come through experience. These placeholders are temporary and meant to be replaced. We should be very careful about making them play a central role in assessment. To me this is akin to the way that some written Asian languages have different systems for children and adults.
  2. Some of these placeholders might need to be learned, but some can be external cognitive prostheses, such as a paper map or a BNF.
  3. Having to memorise lots of simple line-maps for lots of different cities imposes a heavy cognitive load on students. Long-term memorisation of meaningful concepts works best when you don’t know you are trying to memorise things, but rather are trying to understand them. Our students are all too often held hostage to getting on by ‘reproducing’ concepts rather than understanding things.
  4. Becoming expert means minimising the distance between rote learning of line-maps and building up your library of exemplars. Distance here refers to time. In other words, the purpose of prior learning is to give you the ability to try and navigate around the city so that you can start the ‘real’ learning. Some cities are safer than others — especially if you might get lost. Better to start in Edinburgh than Jo’burg (the ITU is not the place to be a novice).
  5. If you look at the process of moving from being a student to acquiring high professional domain expertise (as a registrar), it would seem better to focus on a limited number of cities. What we should not do is expect students to be at home in lots of different places. Better to find your feet, and then, when they get itchy, move on.

Late night thoughts #3

by reestheskin on 05/04/2019

Comments are disabled

Late night thoughts on medical education #3: Touching the void

Clayton Christensen gets mixed press: he cannot be accused of not pushing his ideas on ‘disruption’ to — well — disruption. So his long history of predicting that a large number of universities will be bankrupt in a few years due to ‘innovation’ and ‘digital disruption’ I take with a pinch of salt (except I would add: an awful lot should be bankrupt). But I am glad I have read what he writes, and what he says in the following excerpts from an interview makes sense to me:

Fortunately, Christensen says that there is one thing that online education will not be able to replace. In his research, he found that most of the successful alumni who gave generous donations to their alma maters did so because a specific professor or coach inspired them.

Among all of these donors, “Their connection wasn’t their discipline, it wasn’t even the college,” says Christensen. “It was an individual member of the faculty who had changed their lives.”

“Maybe the most important thing that we add value to our students is the ability to change their lives,” he explained. “It’s not clear that that can be disrupted.”

Half of US colleges will be bankrupt in 10 to 15 years.

We know several factors that are dramatically important in promoting learning in university students: the correct sort of feedback; students who understand what feedback is about (and hence can use it); and close contact. Implicit in the last is continued contact with full-time staff. When stated like this, it is easy to understand why the student experience and faculty-guided learning are so poor in most UK medical schools. The traditional way of giving timely feedback has collapsed as the ward / bedside model of teaching has almost disappeared, and teaching is horribly fragmented because we have organised it around the working lives of full-time clinicians, rather than around what students need (or what they pay for). When waiting times are out of control, when ‘bodies’ are queued up on trolleys, and when for many people getting a timely appointment to see an NHS doctor is impossible, it is self-evident that a tweak here and there will achieve very little. Without major change things will get much worse.

When MIT under Chuck Vest put all of their courseware online, it merely served to illustrate that the benefits of MIT were not just in the materials, but in ‘being there’. And ‘being there’ is made up of other students, staff, and the interactions between these two groups.

Medical schools were much smaller when I was a medical student (1976-1982). Nevertheless, there was remarkably little personal contact, even then. Lectures were to 130+ students, and occasional seminars were with groups of 10-12. Changing perspective, students did recognise the Dean of Medicine, and could name many of the lecturers who taught them. Integration of the curriculum had not totally disrupted the need for a course of lectures from a single person, and the whole environment for learning was within a physical space that was — appropriately enough — called a medical school: something obvious to the students was that research and teaching took place in the same location. For the first two years, with one possible exception, I was fairly confident that nobody knew my name. If a student passed a lecturer in the street, I doubt if the lecturer would recognise the student, let alone be able to identify them by name.

Two members of staff got to know me in the first term of my opening clinical year (year 3): Nigel Speight, a ‘first assistant’ (senior registrar / lecturer) in paediatrics; and Sam Shuster, the Professor of Dermatology in Newcastle, with whom I started a research project. For paediatrics, I was one of four junior students attached to two 30-bedded wards for ten weeks. It was very clear that Nigel Speight was in charge of us, and the four of us were invited around to his house to meet his kids and his wife. It was interesting in all sorts of ways — ‘home visits’, as we discovered in general practice, often are — but I will not go into detail here.

Sam invited me around for an early evening dinner and I met his wife (Bobby), and we talked science, and never stopped — except to slag off Margaret Thatcher and Milton Friedman. Meeting Sam was — to use Christensen’s phrase — my ‘change of life’ moment. As I have written elsewhere, being around Sam was electric: my pulse rate stepped up a few gears, and in one sense my cortical bradycardia was cured.

There are those who say that meaningful personal contact is impossible in the modern ‘bums on seats’ research university. I do not agree, although it is not going to happen unless we create the necessary structures, and this does not involve bloody spreadsheets and targets. First, even in mega-universities like the Open University, with distance learners, it was shown to be possible. Second, in some collegial systems, close personal contact (and rapid verbal feedback!) is used to leverage a lot of private study from students. In the two years I did research under Sam’s supervision (as an undergraduate — not later when I worked for him as a full time researcher), I doubt that I spent more than six hours one-to-one with him.

How you leverage staff time to promote engagement and learning is the single most important factor in giving students what they need (and often what they want, once they know what that is). We will continue to fail students until we realise what we have lost.

Late night thoughts #2

by reestheskin on 12/03/2019

Comments are disabled

Late night thoughts on medical education #2: Apprenticeship

We have a very clear idea of how apprenticeship has worked over the last nine hundred years or so within Europe. The core ideas are of course much older, and the geography wider. But we have written records of the creation of the various social structures that drove the rapid changes in society leading, via the Renaissance, to the Enlightenment and modern capitalism. We can trace so many of the norms that have guided my professional life: Royal Colleges, corporations, guilds, “masters and apprentices”, universities, certification and the granting of monopoly, and ‘professionalism’, to name but a few.

Apprenticeship is a powerful pedagogical model, but one that can only take place when a number of conditions are met. In medicine the ‘apprentice’ model is widely discussed, assumed, and contrasted with the ‘bums on seats’ lecture, the latter now the signature pedagogy of the modern ‘massified’ university. It is also used to justify the high costs of education and training in medicine and in some craft university courses.

At the level of higher professional training in medicine (or in the training of research academics) apprenticeship can still work well. There is an asymmetry between master and pupil (the master does know best, but cannot always justify why he knows best); long-term interaction between both parties is required; and, at its best, the pupils will model their behaviours on the master. Apprenticeship is not passive — it is not ‘shadowing’ (although a period of shadowing may be required); it requires the pupil to undertake tasks that can be observed and critiqued — you cannot learn complicated tasks through passive observation. Chimps are highly intelligent, and yet learning to crack nuts using stones takes years and years, not because the young chimps do not watch their mothers, but because the mothers never watch (and hence correct) the young chimps. This requirement applies not just to motor tasks but to any complicated set of ‘thinking’ procedures that demand accuracy and fluency. In medicine, surgeons are ahead of physicians on this, and have been for a long time.

In medieval times, becoming a master meant more than being a ‘journeyman’ — the level of professional expertise was greater, and it was recognised that teaching required another level of competence and breadth. The master is not one step ahead on the way to perfection, but several. We prefer those teaching ‘A’ level physics to have more than an ‘A’ level in physics themselves. And whatever domain expertise a master possesses, we know that experience of the problems or difficulties learners face is important.

Still, in comparison with, say, school teaching, the demands on the master (with regard to being a ‘professional’ educator) are modest. They know the job — they do not need to check the syllabus — as they are effectively training people to do the same job they do day-to-day. They probably also have little need of theory and, in a sensible system, their reputation may be accurate.

In higher professional training in medicine, apprenticeship is still possible — it is just harder than it once was (as to why, that is for another day). Similarly, at one time higher education was in large part viewed as a type of apprenticeship. Students were not staff, but they were not treated as schoolchildren; rather they were — at best — viewed as co-producers of knowledge within a university. If you were studying physics, the goal was to get you to approach the world like a physicist might. This may persist in a few institutions for a minority of students, but it is not the norm anymore.

In undergraduate medicine apprenticeship died a long time ago, although its previous health may well have been exaggerated. There is little long-term personal interaction, with students passed around from one attachment to another, and many of them feeling unwanted (‘burden of teaching’, ‘teaching load‘ etc). Staff and students can walk past each other in the street, none the wiser. Apprentices are — by definition — useful. It is this utility that underpinned the business model that formalised training and acceptance or rejection into the guild. But sadly — through no fault of their own — medical students are rarely useful. If they were useful they would be paid: they are not. Historically, students might have got paid to cover house officer absences (I did), but that world no longer exists. Nor are we able to return to it.

Whereas the master has an implicit model of the goals of training, that is no longer the case in undergraduate education, in which literally 500 or so individuals are engaged in educating students for roles of which they individually have little knowledge. Instead of personal interaction over a long time period, based on a common world view, medical schools create complicated management systems to process students, with a predictable lack of buy-in from those who are doing the educating.

There is a deeper point here. Though much UK postgraduate medical training is poor, it is possible to improve it within a framework that is known to work. Many doctors know how to do it (although the same cannot be said of the NHS). Undergraduate medical education is in a different place (like much of university education). At graduation, you step from one world into another, but just as with caterpillars and butterflies, the structures and environment we need to create are very different.

Late night thoughts #1

by reestheskin on 05/03/2019

Comments are disabled

Late night thoughts on medical education #1: we have no doctors

Today’s (Scottish) Daily Telegraph ran with a story about the shortage of paediatricians in Scotland. The Herald had a similar story, too. It is not just paediatrics that has major shortages. The same can be said about dermatology, radiology and a host of other areas of medicine. And that is not to mention GP land, which normally seems to attract most ‘government’ attention.

I find none of this surprising. The NHS has long been in subsistence mode, eating the seed corn (or, to use that other phrase, ‘eating its young’), spending its moral and cultural capital at an alarming rate. Management is notable by its absence, whereas the administrators think they are ‘managers’, in part because they can’t administrate and stay sane. By lack of management I mean those functions of management we see in most corporations or freestanding institutions. Changes in demography have not happened suddenly; the relation between age and health care provision has been well known for a century or more; the impact of family structure and geography on the care of elderly relatives has been evident since the early 1960s; changes in the workforce have been building for at least 40 years; and UK medicine has a long history of ignoring why people wish to leave the UK (or leave the NHS). The attempt to run health care as a Taylor-like post-industrial service industry using staff who value their autonomy and professionalism may not end well for doctors — or patients.

All the above, management should have been grappling with over the last quarter century: instead they have been AWOL. Meanwhile, politicians engage in speculative future-selling, where electoral vapourware is often a vehicle for the maintenance of political power. Given the state of UK politics (as in the BxxxxT word), it seems reasonable not to give politicians the benefit of the doubt any more. As individuals, no doubt, most of them mean well, and love their kids etc, but the system they have helped co-create cannot command respect (that is now electorally obvious).

There are however some aspects of this that bear on what keeps me awake at night: how we educate — and to a lesser extent — how we train doctors.

  • The UK has not been self-sufficient in physicians since the birth of the NHS, choosing instead to import staff from the rest of the world. Despite this, doctor numbers are low in comparison with many other advanced economies. More dermatologists in the city of Vienna than in the UK…
  • Manpower estimates, AFAIK, seem to reflect realpolitik rather than bottom-up data. Whatever is estimated is decided by the ‘realistic medicine’ of available cash. Our politicians and the commissars of the medical establishment do not dissect animals in order to learn how the world works; they sacrifice them to the gods of political power, hoping some of the blood runs off on them.
  • The idea that you can plan on the basis that ‘x’ is the number of doctors you need in 10, 20, or 40 years seems foolish. Hayek was not wrong about everything, even if Uncle Joe didn’t read him. Dead reckoning usually loses out to a good GPS system.
  • Most importantly of all, you must overproduce doctors. There are various ways you can think about this, but the current system of taking a bunch of 18-year-olds into medical school and assuming that attrition will be low will breed complacency. You cannot build any organisation that is worth working for when the ratio of applicants to vacant posts is less than one. And to imagine you can miraculously get the figure right over a score of years is, well… (and no, the running mean isn’t the right figure here).
  • Medical education is claimed to be expensive and rate limiting (a Mr Hunt line, I think). There are various comments to make. First, the figures are inflated for political effect and possibly for accounting reasons. Claims that it costs £x to produce a consultant write off all the ‘work’ the individual has done on the way to that position. By contrast, the NHS subverts market rates for many jobs done by ‘juniors’. And, as for undergraduate medical education, we know most of the money is an accounting sleight of hand. If you ask whether we could do it better, for less money, I will tell you for free.
  • The comments in the last point notwithstanding, it must be an immediate goal to reduce the cost of medical education, and to think how the workforce of non-physicians can piggyback on what we know about training doctors. These conversations were alive half a century ago, and we have made little progress. The key issue is straightforward enough: without national accreditation, these posts will not encourage candidates to undergo training — you don’t need to read Gary Becker to get this, just talk to those who leave nursing (there are enough of them).
  • Even with more fluidity within professional careers, you need to allow for sideways movement and retraining of many middle-aged doctors. You need to encourage staff, and to move the focus from competencies (ugh!) to skills.
  • Without funnelling a lot more students into medicine as a career, little of what I have said above will make much difference. There are ways, but that is for another day.