When I was teaching — and I taught a lot towards the end of my paid career — there were many opportunities to talk to medical students off the record. It takes time, and some trust from both parties, but many students know what talking off the record means. To my surprise — yes, I am that paranoid — some of the online feedback they provide is also informative. My favourite was a comment about specialty X, saying that they were certain that the teaching would have been of a high standard if they had actually had any. If you scour the BMJ online responses for comments from students, you can find similar views.
For many students, undergraduate medicine resembles flying in the pre-Covid days: the journey’s end is worth it, but you have to put up with all the crap that passing through airport security entails. There is just no other practical way to get from A to B. Getting uptight about it as you pass through may come back to bite you.
I think medicine is worse than many other degrees, but there is plenty of misery to go around. The following is from an article in the Times Higher:
Leaning forward, he takes a deep breath and says: “Well, it’s like we’re running some kind of gauntlet, course after course, semester after semester, one year to the next, working hard, but our real selves are asleep. ‘Get good grades, good internships. Do lots of activities. Build an impressive résumé.’ That’s all we hear. We’re so busy proving ourselves that there’s no time to breathe, let alone think or reflect, and the stuff we have to do for classes mostly feels meaningless — to me, anyway. So we just go to sleep to get through it and hope it’s all worth it when the grind is over.”
But my student wonders out loud why learning in college must be a forced march and not a playful adventure — and I silently wonder the same about the process of tenure and promotion.
I have previously commented on Abraham Flexner on this site. The Flexner report is the most influential review of US medical education ever published, although some would argue that the changes it recommended were already working their way through the system. For a long time I was unaware of another project of his, an article with the title The Usefulness of Useless Knowledge 1. For me, there are echoes of Bertrand Russell’s In Praise of Idleness, and the fact that Flexner’s essay was published at the onset of World War 2 adds another dimension to the topic.
As for medical education, the ever-growing pressure is to teach so much that many students don’t have time to learn anything. I wish some other comments from Flexner opened any GMC dicta on what a university medical education should be all about.
“Now I sometimes wonder,” he wrote, “whether there would be sufficient opportunity for a full life if the world were emptied of some of the useless things that give it spiritual significance; in other words, whether our conception of what is useful may not have become too narrow to be adequate to the roaming and capricious possibilities of the human spirit.”
The following is from Scott Galloway at NYU Stern. He shoots from the hip, and sometimes only thinks afterwards. But he is interesting, brave, and more often right than most. I think I would have hated what he said when I was ready (sic) to go to university. But now, I think I wasn’t, and for medicine in particular, allowing 17-year-olds to fall into the clutches of the GMC and their ilk should be a crime against….
Gap years should be the norm, not the exception. An increasingly ugly secret of campus life is that a mix of helicopter parenting and social media has rendered many 18-year-olds unfit for college. Parents drop them off at school, where university administrators have become mental health counselors. The structure of the Corona Corps would give kids (and let’s be honest, they are still kids) a chance to marinate and mature. The data supports this. 90% of kids who defer and take a gap year return to college and are more likely to graduate, with better grades. The Corps should be an option for non-college-bound youth as well.
A few months back, I was walking past the entrance of the old Edinburgh Medical School, founded in 1726. A not-so-crazy thought came into my head, one that I could not dismiss: we need to move on from the idea that a Medical School must be situated within a University (and of course, it wasn’t always, anyway). The founding set of ideas that we have struggled with ever since Flexner, we should now recast for a very different world. We need to create something new, something that makes sense in terms of a university and something that puts professional training within a professional context. At present, we fail on both of these counts. Rather than integrate, we should fracture. We need to search out our own new world.
I read Educated by Tara Westover earlier this year (it was published in 2018 and was a best seller). It is both frightening and inspiring. And important. Her story is remarkable, and it says more about real education than all the government-subjugated institutions like schools and universities can cobble together in their mission statements. WikiP provides some background on her.
Westover was the youngest of seven children born in Clifton, Idaho (population 259) to Mormon survivalist parents. She has five older brothers and an older sister. Her parents were suspicious of doctors, hospitals, public schools, and the federal government. Westover was born at home, delivered by a midwife, and was never taken to a doctor or nurse. She was not registered for a birth certificate until she was nine years old. Her father resisted getting formal medical treatment for any of the family. Even when seriously injured, the children were treated only by their mother, who had studied herbalism and other methods of alternative healing.
All the siblings were loosely homeschooled by their mother. Westover has said an older brother taught her to read, and she studied the scriptures of The Church of Jesus Christ of Latter-day Saints to which her family belonged. But she never attended a lecture, wrote an essay, or took an exam. There were few textbooks in their house.
As a teenager, Westover began to want to enter the larger world and attend college.
The last sentence above has it, as The Speaker of the House of Commons might say.
She gained entry to Brigham Young University (BYU), Utah, without a high school diploma and her career there was deeply influenced by a few individuals who saw something in her. She was awarded a Gates scholarship to the University of Cambridge to undertake a Masters and was tutored there by Professor Jonathan Steinberg. Some of their exchanges attest to the qualities of both individuals, and not a little about a genuine education.
‘I am Professor Steinberg,’ he said. ‘What would you like to read?’
‘For two months I had weekly meetings with Professor Steinberg. I was never assigned readings. We read only what I asked to read, whether it was a book or a page. None of my professors at BYU had examined my writing the way Professor Steinberg did. No comma, no period, no adjective or adverb was beneath his interest. He made no distinction between grammar and content, between form and substance. A poorly written sentence was a poorly conceived idea, and in his view the grammatical logic was as much in need of correction.’
‘After I’d been meeting with Professor Steinberg for a month, he suggested I write an essay comparing Edmund Burke with Publius, the persona under which James Madison, Alexander Hamilton and John Jay had written the Federalist papers.’
‘I finished the essay and sent it to Professor Steinberg. Two days later, when I arrived for our meeting, he was subdued. He peered at me from across the room. I waited for him to say the essay was a disaster, the product of an ignorant mind, that it had overreached, drawn too many conclusions from too little material.’
“I have been teaching in Cambridge for 30 years,” he said. “And this is one of the best essays I’ve read.” I was prepared for insults but not for this.
At my next supervision, Professor Steinberg said that when I apply for graduate school, he would make sure I was accepted to whatever institution I chose. “Have you visited Harvard?” he said. “Or perhaps you prefer Cambridge?”…
“I can’t go,” I said. “I can’t pay the fees.” “Let me worry about the fees,” Professor Steinberg said.
You can read her book and feel what it says about the value of education on many levels, but I want to pick out a passage that echoed something else I was reading at the same time. Tara Westover writes of her time as a child teaching herself at home, despite the best attempts of most of her family.
In retrospect, I see that this was my education, the one that would matter: the hours I spent sitting at the borrowed desk, struggling to parse narrow strands of Mormon doctrine in mimicry of a brother who’d deserted me. The skill I was learning was a crucial one, the patience to read things I could not yet understand [emphasis added].
At the same time as I was reading Educated I was looking at English Grammar: A Student’s Introduction by Huddleston & Pullum (the latter of the University of Edinburgh). This is a textbook, and early on the authors set out to state a problem that crops up in many areas of learning but which I have not seen described so succinctly and bluntly.
We may give that explanation just before we first used the term, or immediately following it, or you may need to set the term aside for a few paragraphs until we can get to a full explanation of it. This happens fairly often, because the vocabulary of grammar can’t all be explained at once, and the meanings of grammatical terms are very tightly connected to each other; sometimes neither member of a pair of terms can be properly understood unless you also understand the other, which makes it impossible to define every term before it first appears, no matter what order is chosen [emphasis added].
Whenever I have looked at the CVs of young doctors or medical students, I have often felt saddened at the hurdles that many of them have had to jump through to get into medical school. I don’t mean the exams — although there is lots of empty signalling there too — but the enforced attempts to demonstrate that you are a caring person, committed to the NHS or the charity sector. I had none of that; nor do I believe it counts for much when you actually become a doctor1. I think it enforces a certain conformity and limits the social breadth of the intake to medical school.
However, I did work outside school before going to university, in a variety of jobs from the age of 14 upwards: a greengrocer’s shop on Saturdays, a chip shop (4-11pm on Sundays), a pub (living in for a while 😃), a few weeks on a pig-farm (awful) and, my favourite, a couple of petrol stations (7am-10pm). These jobs were a great introduction to the black economy and to how wonderfully inventive humanity — criminal humanity — can be. Naturally, I was not tempted 😇. Those in the know would even tell you about other types of fraud in different industries, and people actually got awarded PhDs for studying and documenting the sociology of these structures (‘Is that why you are going to uni?’, I was once asked).
On the theme of that newest of crime genres — cybercrime — there is a wonderful podcast reminding you that if much capitalism is criminal, there is criminal and there is criminal. But many of the iconic structures of modern capitalism — specialisation, outsourcing and the importance of the boundaries between firm and non-firm — are there. Well worth a listen.
I think there is a danger in exaggerating the role of caring and compassion in medicine. I am not saying you do not need them, but rather that they are less important than the technical (or professional) skills that are essential for modern medical practice. I want to be treated by people who know how to assess a situation and who can judge with cold reason the results of administering or withholding an intervention. If doctors were once labelled priests with stethoscopes, I want less of the priest bit. Where I think there are faults is in the idea that you can contribute most to humanity by ‘just caring’. The Economist a while back reported on an initiative from the Centre for Effective Altruism in Oxford. The project, labelled the 80,000 Hours initiative, advises people on which careers they should choose in order to maximise their impact on the world. Impact should be judged not on how much a particular profession does, but on how much a person can do as an individual. Here is a quote relating to medicine:
Medicine is another obvious profession for do-gooders. It is not one, however, on which 80,000 Hours is very keen. Rich countries have plenty of doctors, and even the best clinicians can see only one patient at a time. So the impact that a single doctor will have is minimal. Gregory Lewis, a public-health researcher, estimates that adding an additional doctor to America’s labour supply would yield health benefits equivalent to only around four lives saved.
The typical medical student, however, should expect to save closer to no lives at all. Entrance to medical school is competitive. So a student who is accepted would not increase a given country’s total stock of doctors. Instead, she would merely be taking the place of someone who is slightly less qualified. Doctors, though, do make good money, especially in America. A plastic surgeon who donates half of her earnings to charity will probably have much bigger social impact on the margin than an emergency-room doctor who donates none.
Yes, the ‘slightly less qualified’ makes me nervous.
Henry Miller died a few months before I started medical school in Newcastle in 1976. At the time of his death he was VC of the university, having been Dean of Medicine and Professor of Neurology. By today’s standards he was a larger-than-life figure. I like reading what he said about medical education, although with hindsight I think he was wrong about many if not most things. But there was a freshness and sense of spirited independence of mind in his writing that we no longer see in those who run our universities (with some notable exceptions such as Louise Richardson). In the time of COVID we should remember the costs of conformity and patronage.
It would be naive to express surprise at the equanimity with which successive governments have regarded the deteriorating hospital service, since it is in the nature of governments to ignore inconvenient situations until they become scandalous enough to excite powerful public pressure. Nor, perhaps, should one expect patients to be more demanding: their uncomplaining stoicism springs from ignorance and fear rather than fortitude; they are mostly grateful for what they receive and do not know how far it falls short of what is possible. It is less easy to forgive ourselves… Indeed, election as president of a college, a vice chancellor, or a member of the University Grants Committee usually spells an inevitable preoccupation with the politically practicable, an insidious identification with central authority, and a change of role from informed critic to uncomfortable apologist.
Originally published in the Lancet 1966; 2: 647-54. (This version from ‘Remembering Henry’, edited by Stephen Lock and Heather Windle.)
I have forgotten who asked me to write the following. I think it was from a couple of years ago and was meant for graduating medics here in Edinburgh. (I am still sifting through the detritus of academic droppings)
As Rudolf Virchow was reported to say: sometimes the young are more right than the old. So, beware. This is my — and not his — triad.
First, when you do not know, ask for help. And sometimes ask for help when you do know (for how else would you check the circumference of your competence?).
Second, much as science and technology change, the organisation of care will change faster. Think on this in any quiet moments you have, for it may be the biggest influence on your career — for good and bad (sadly).
Third, look around you and do not be afraid to stray. The future is always on the periphery along a rocky path to nowhere in particular.
One thing that sticks with me from medical school onwards (both as student and faculty) is the partisan nature of specialties. Most of this is harmless fun: my organ (skin, liver, kidney etc) is bigger than your organ; the brain is more complicated than any other organ and therefore neurologists must be smarter than everybody else (although curiously this doesn’t seem to stretch to neurosurgeons — at least when neurologists are talking). Let’s call it organ imperialism. The humour of little boys judging their vitality by how high they can p*** up the wall. There are more vital things to get angry about.
There are however some darker sides to this professional ethnicity. Doctors indulging in advocacy for particular patient groups can often seem like doctors wishing their own unit or disease of interest receives more resources. A salient example in dermatology is the way that NHS resources for cancer (or children) frequently trump other demands. It is easier to lobby successfully for skin cancer1 than acne or hair loss in the absence of any meaningful attempt to weigh patient suffering (or just to assume it is self-evident)2. The contrast between paediatrics and geriatrics is often informative about underlying values.
One area that does worry me more is the encroachment of politics on medical education. I am thinking in particular of a priori claims about the superiority of certain models of care, or of attempts to subvert students’ choice of career in the name of what the ‘NHS needs’.
Undergraduate medical education should be both scholarly and intellectually neutral as to how health care is organised. We should of course introduce students to the various systems, and encourage them to criticise them. We should teach them to be analytical, and to understand the various reasons why people have chosen different systems (or how their views are manipulated). But we should be neutral in the sense that judgments need to be based on rational argument rather than slogans, and that students must be able to argue based on evidence.
I would say the same about career choice. Our primary duty in a university is to students. If a university were to demand that its graduates in computing were only to work for a British computing company and confine themselves to topics of ‘national importance’, or that its graduates in economics were only to work for the public rather than the private sector, it would no longer be taken seriously as an educational institution. And rightly so. Medicine should be no different.
Once there was General Practice, medicine in the image of the late and great Julian Tudor-Hart. Then there was Primary Care. The following article from Pulse made me sit up and wonder whether we have got it right.
Under the five-year contract announced last year, networks were to receive 70% of the funding to employ a pharmacist, a paramedic, a physiotherapist and a physician associate, and 100% of the funding for hiring a social prescriber, by 2023/24… Six more roles will now be added to the scheme from April ‘at the request of PCN clinical directors’ – pharmacy technicians, care co-ordinators, health coaches, dietitians, podiatrists and occupational therapists…PCNs can choose to recruit from the expanded list to ‘make up the workforce they need’…The document added that mental health professionals, including Improving Access to Psychological Therapy (IAPT) therapists, will be added from April 2021 following current pilots…NHS England will also explore the feasibility of adding advanced nurse practitioners (ANPs) to the scheme [emphasis added].
Adam Smith, among others, pointed out the advantages of specialisation. We owe virtually all of the modern capitalist world to the power of this insight. But we also know that there are opposing forces — and not just those of the Luddites. Just think back to Ronald Coase and the theory of the firm. Why do companies not outsource everything? Why are there companies at all? Simply because under some circumstances transaction costs and the formalisation of roles and contracts limit outsourcing 1. Contrast the English approach with that of Buurtzorg (links here, here and here) in the Netherlands, where it is explicit that many of the tasks undertaken by highly skilled staff do not require high-level skills. But — so the argument goes — the approach is more successful, robust and rewarding for both patients and staff. This is closer to the Tudor-Hart model. It really does depend on what sort of widgets you are dealing with, and on whether fragmentation of activity improves outcomes, or merely diminishes costs in situations where outcomes are hard to define in an Excel spreadsheet.
I started my dermatological career in Vienna in the mid-1980s as a guest (I am deliberately not using the cognate German term) of Prof Klaus Wolff. Vienna, for close to two hundred years, has been a Mecca for all things dermatological, and Sam Shuster, in Newcastle, thought it wise that I go somewhere else for up to a year — before returning to Newcastle. The plan was to learn some clinical dermatology and see how others worked. I had a great time — Vienna is a wonderful European city — and I didn’t work too hard. I learned some clinical basics, enjoyed the music (more ECM than opera) and spent some of my time doing a little lab work, more as a technician than anything else. I knew that when I returned to Newcastle I would spend a year or so as a registrar before applying for an MRC or Wellcome Training Fellowship (and for the medics amongst you, no, I never registered for higher training). In the meantime, as well as learning some clinical dermatology, I needed to learn some cell biology.
I went to medical school in 1976 and qualified in 1982, having taken a year out to study medical statistics (with an emphasis on the medical) and epidemiology, so I hadn’t any lab or cell biological experience. It was now 1986-87 and the preceding decade had seen a revolution in what we now call molecular cell biology — or just biology(?). I needed to teach myself some. Luckily, the best textbook I have ever read — Molecular Biology of the Cell — had been published by James Watson and a bunch of other wonderful scientists in 1983, and my memory is that it was this first edition that I bought. The book had attitude. The authors clearly loved their subject, and thought science was to do not so much with facts as with the activity of designing and implementing experiments that whispered to you how the biological universe worked. They wanted to share that feeling with you, because one day, just perhaps, you might… On the back cover there was a picture of the authors pretending to be real superstars, like the ‘fab four’ on that most famous of pedestrian crossings in the world. (There is more on this here and here.)
In the company of a good companion (a book in this case) there is little in biology that is very difficult. If you are motivated, even the absence of a personal teacher is not too serious a drawback. You would be better off with a teacher — if the cost of a teacher were zero — but it would be wasteful to imagine that you need one for a significant fraction of the time you need to spend studying. For some areas of biology, say quantitative genetics, the above statements may need tweaking a little, but the general point holds.
Almost a quarter of a century ago, I read a paper in PNAS by Peter Donnelly and David Balding on the statistics of interpreting DNA forensic evidence. I had studied a little statistics in my intercalated degree, but a sentence from this paper made me sit up:
We argue that the mode of statistical inference which seems to underlie the arguments of some authors, based on a hypothesis testing framework, is not appropriate for forensic identification.
The paper itself was remarkably clear even to somebody with little mathematics, and unpicking it signalled that I knew even less than I thought I knew. Several years later, it prompted me to go back and try and re-learn what little mathematics I had grasped at school, so that I might appreciate some modern genetics (and medical statistics) a little better.
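Donnelly and Balding’s point can be made concrete with a toy calculation of my own (an illustration, not their analysis). In the likelihood-ratio framing they favour, the evidence is weighed as the probability of a DNA match under the prosecution hypothesis versus under the defence hypothesis, and then combined with prior odds via Bayes’ rule. The numbers below are hypothetical:

```python
# Toy sketch (my own, hypothetical numbers): why a likelihood ratio,
# not a significance test, is the natural currency for forensic identification.

def likelihood_ratio(match_probability: float) -> float:
    """LR = P(match | suspect is source) / P(match | random person is source),
    assuming the profile always matches when the suspect really is the source."""
    return 1.0 / match_probability

def posterior_odds(prior_odds: float, lr: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * lr

# Hypothetical figures: a random match probability of 1 in a million,
# but prior odds of 1 in 10 million (any adult in a large population).
lr = likelihood_ratio(1e-6)
post = posterior_odds(1 / 1e7, lr)
prob = post / (1 + post)  # convert odds back to a probability

print(f"LR = {lr:.0f}, posterior probability that the suspect is the source = {prob:.2f}")
```

The point of the sketch: a match probability of one in a million sounds damning in a hypothesis-testing frame (“reject the hypothesis that the suspect is not the source”), yet with diffuse prior odds the posterior probability here comes out under 10%, which is the prosecutor’s fallacy the likelihood-ratio approach guards against.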
Learning mathematics is different from learning biology. The absence of a teacher is more of an issue, but there are lots of historical examples showing that a good ‘primer’ with questions and answers allows many children to develop, if not high-level skills, then a facility with numbers. (I am talking here about using mathematics as a toolbox to solve well defined problems — not about pushing back the frontiers.) A key aspect of this is the nature of mathematical proof, and how well you can obtain feedback on your abilities by submitting to the discipline of simple exercises with unambiguous answers. I don’t think there is a direct equivalent to this in most of biology, but in the process of writing this today I see there are workbooks for the Molecular Biology of the Cell textbook. No doubt they help, but the uniqueness of the correct answer in maths is a wonderful guide and fillip.
I retired earlier this year (yes, thanks for asking, it’s wonderful), and one of the projects I had lined up was to learn a little more about a domain of human knowledge in which my ignorance had been bugging me for years. I had made some attempts in this area before — bought some books as an excuse for lack of effort — but had failed. I had found an excellent primer (in fact I bought it ten or so years ago), but speaking of the present, I have to say that I find the task hard, very hard. For me, it’s tougher than intermediate mathematics, and although there are questions at the end of each chapter there are no given answers. This is not a criticism of the book, but rather reflects the nature of the subject. A teacher, or even a bunch of fellow masochists (students, I mean), would help greatly. I make progress, but some more pedagogical infrastructure would, I feel, push me along the winding path a little faster. So, for several months I have been plodding away, mostly being disciplined, but because I have other things to do, occasionally falling off the wagon (indeed I note that I can multitask by falling off several wagons simultaneously).
All three stories are germane to how I think about undergraduate medical education, and to why it is far too wasteful and expensive. As for the how, that I must leave for another day very soon. Even without an exam in sight, I have to get some studying done. Spaced recall and immersion are the student’s friends.
Being an emeritus professor has lots of advantages. You have time to follow your thoughts and allow your reading to take you where it goes. Bruce Charlton pointed out to me many years ago that, increasingly, academics were embarrassed if you caught them just reading in their office (worse than having a sly fag…). It was looked upon as a form of daydreaming. Much better to fire up the Excel spreadsheet or scour the web for funding opportunities. Best of all, you should be grant-writing, or ensuring that the once wonderful idea that only produced some not-so-shiny results can be veneered into a glossy journal.
Of course, being retired means you don’t have to go to management meetings. For most of my career I could reasonably avoid meetings simply because, if you spend most of your time researching (as I did), all you care about is publishing and getting funded. The university is just a little bit like WeWork — only the finances were stronger.
One aspect of teaching-related meetings particularly irked me: student representatives, and how people misunderstand what representatives should and shouldn’t contribute. This is not specific to meetings — the same problem exists in the ‘happy sheets’ that pass for feedback — but it reflects what I see as a problem of inference. Humans are very capable of telling you how they feel about something, especially if they are asked at the time of, or soon after, a particular event. What is much harder is to imagine what the results will be if a change is made in how a particular event is undertaken, and how this will relate to underlying goals. This is a problem of inference. It needs some theory and data. So, if students say Professor Rees doesn’t turn up for teaching sessions, or doesn’t use a microphone, or uses slides with minuscule text in lectures, this is useful knowledge. What is less helpful is when you wish to appear empathetic (‘student centred’) and allow students to demand that you accept their views on pedagogy. This is akin to the patient telling the surgeon how to perform the operation. Contrary to what many believe, a lot is known about learning and the acquisition of expertise, and much of it is very definitely not common sense. And do not get me started on bloody focus groups.
Having got that bitching out of the way, I will add that one of my jobs over the last few years was to read virtually all the formal feedback that students produced for the medical school. Contrary to what you might think, it was an enjoyable task and I learned a lot. The biggest surprise was how restrained and polite students were (I wished they would get a little more angry about some things), and often how thoughtful they were. There were the occasional gems, too; my favourite being a comment about a clinical attachment: ‘I am sure the teaching would have been of a high standard — if we had had any.’ Still makes me smile (and the latter clause was accurate, but I am not so sure about the rest).
Now, I don’t want to feign any humblebragging but a few weeks back I received this comment from a former (anonymous) student (yes, the university is efficient at stopping your pay-cheque but thankfully is not good at terminating staff and in any case I still do some teaching..).
“Honestly you just need to look through the website he has built (http://reestheskin.me/teaching/). Who else has created an open-access textbook, lord knows how many videos (that are all engaging and detailed enough without being overwhelmingly complex) and entire Soundcloud playlists that I listen to while I’m driving for revision. I bet you could learn to spot-diagnose skin cancers without even being medical, just learn from his websites.”
Now of course this is the sort of feedback I like 😂. But it’s the last sentence that pleases and impresses me most. The student has grasped the ‘meta’ of what I spent about seven years trying to do. There is an old aphorism that medical students turn into good doctors despite the best attempts of their medical school. Like many such aphorisms, it is deeper than it seems. One of the foundation myths of medical schools is that undergraduate medicine really is as it was portrayed in Doctor in the House, with just a smattering of modern political correctness thrown in. Sadly, no. Even without covid-19, universities, and medical schools in particular, are weaker than they seem. Demarcating what they can do well from things that others might do better needs to be much higher up the agenda. This particular student wasn’t taught that but learned it herself. Good universities can get that bit right occasionally.
Schools will undoubtedly still exist, but a good schoolteacher can do no better than to inspire curiosity which an interested student can then satisfy at home at the console of his computer outlet. There will be an opportunity finally for every youngster, and indeed, every person, to learn what he or she wants to learn in his or her own time, at his or her own speed, in his or her own way. Education will become fun because it will bubble up from within and not be forced in from without.
Not in this world, I would add, or at least not yet. Many — possibly most — medical students view university as akin to clearing airport security: a painful necessity if you want to go somewhere. They are no more generous about their schooling.
Original link. Via Stephen Downes.
People are always demanding that medical students must learn this or that (obesity, psychiatry, dermatology, ID, eating disorders). The result is curriculum overload, a default in favour of rote learning by many students, and the inhibition of curiosity. It was not meant to be like this, but the GMC, the NHS, and others have pushed a vision of university medical education that shortchanges both the students and medical practice over the long term. Short-termism rules. Instead of producing graduates who are ready to learn clinical medicine in an area of their choice, we expect them to somehow come out oven-ready at graduation. I do not believe it is possible to do this to the level of safety that many other professions demand, nor is this the primary job of a university. Sadly, universities have given up on arguing, intimidated by the government and their regulatory commissars, and nervous of losing their monopoly on producing doctors.
But I will make a plea that one area really does deserve more attention within a university: the history of how medical advance occurs. No, I do not mean MCQs asking for the date of birth of Robert Koch or Lord Lister, but a feel for the historical interplay of convention and novelty. Without this, our students and our graduates are almost confined to living in the present, unaware of the past, and unable to imagine how different the future will be. Below is one example.
“In 1938 Albert Hofmann, a chemist at the Sandoz Laboratories in Basel, created a series of new compounds from lysergic acid. One of them, later marketed as Hydergine, showed great potential for the treatment of cerebral arteriosclerosis. Another salt, the diethylamide (LSD), he put to one side, but he had “a peculiar presentiment,” as he put it in his memoir LSD: My Problem Child (1980), “that this substance could possess properties other than those established in the first investigations.”
In 1943 he prepared a fresh batch of LSD. In the final process of its crystallization, he started to experience strange sensations. He described his first inadvertent “trip” in a letter to his supervisor:
At home I lay down and sank into a not unpleasant, intoxicated-like condition, characterized by extremely stimulated imagination. In a dream-like state, with eyes closed (I found the daylight to be unpleasantly glaring), I perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.
After eliminating chloroform fumes as a possible cause, he concluded that a tiny quantity of LSD absorbed through the skin of his fingertips must have been responsible. Three days later he began a program of unsanctioned research and deliberately ingested 250 micrograms of LSD at 4:20 PM. Forty minutes later, he wrote in his lab journal, “Beginning dizziness, feeling of anxiety, visual distortions, symptoms of paralysis, desire to laugh.” He set off home on his bicycle, accompanied by his laboratory assistant. This formal trial of what Hofmann considered a minute dose of LSD had more distressing effects than his first chance exposure:
Every exertion of my will, every attempt to put an end to the disintegration of the outer world and the dissolution of my ego, seemed to be wasted effort. A demon had invaded me, had taken possession of my body, mind, and soul. I jumped up and screamed, trying to free myself from him, but then sank down again and lay helpless on the sofa…. I was taken to another world, another place, another time.
A doctor was summoned but found nothing amiss apart from a marked dilation of his pupils. A fear of impending death gradually faded as the drug’s effect lessened, and after some hours Hofmann was seeing surreal colors and enjoying the play of shapes before his eyes.
Many editors of learned medical journals now automatically turn down publications describing the sort of scientific investigation that Albert Hofmann carried out on himself. Institutional review boards are often scathing in their criticism of self-experimentation, despite its hallowed tradition in medicine, because they consider it subjective and biased. But the human desire to alter consciousness and enrich self-awareness shows no sign of receding, and someone must always go first. As long as care and diligence accompany the sort of personal research conducted by Pollan and Lin, it has the potential to be as revealing and informative as any work on psychedelic drugs conducted within the rigid confines of universities.
I titled a recent post musing over my career as ‘The Thrill is Gone’. But I ended on an optimistic note:
‘The baton gets handed on. The thrill goes on. And on’
But there are good reasons to think otherwise. Below is a quote from a recent letter in the Lancet by Gagan Bhatnagar. You can argue all you like about definitions of ‘burnout’, but good young people are leaving medicine. The numbers who leave for ever may not be large, but I think some of the best are going. What worries me as much are those who stay behind.
The consequences of physician burnout have been clearly observed in the English National Health Service (NHS). F2 doctors (those who are in their second foundation year after medical school) can traditionally go on to apply to higher specialist training. Recent years have seen an astounding drop in F2 doctors willing to continue NHS training, with just over a third (37·7%) of F2 doctors applying to continue training in 2018, a decrease from 71·3% in 2011. Those taking a career break from medicine increased almost 3-fold from 4·6% to 14·6%. With the NHS already 10 000 doctors short, the consequences of not recruiting and retaining our junior workforce will be devastating.
He characterised the less attractive teaching rounds as examples of ‘shifting dullness’.
Henry Miller (apologies, a medic joke)
Woodrow Wilson once remarked that it is easier to change the location of a cemetery than it is to change a curriculum.
Via Jon Talbot, commenting on an article on the failures of online learning. I would only add the comment made by Henry Miller (in the context of medicine):
Curriculum reform, a disease of Deans.
The government has instructed Health Education England to consult patients and the public on what they need from “21st century” medical graduates
It won’t end well.
The quote below is from a paper in PNAS on how students misjudge their learning and what strategies maximise learning. The findings are not surprising (IMHO) but will, I guess, continue to be overlooked (NSS anybody?). As I mention below, it is the general point that concerns me.
Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom.
In this report, we identify an inherent student bias against active learning that can limit its effectiveness and may hinder the wide adoption of these methods. Compared with students in traditional lectures, students in active classes perceived that they learned less, while in reality they learned more. Students rated the quality of instruction in passive lectures more highly, and they expressed a preference to have “all of their physics classes taught this way,” even though their scores on independent tests of learning were lower than those in actively taught classrooms. These findings are consistent with the observations that novices in a subject are poor judges of their own competence (27–29), and the cognitive fluency of lectures can be misleading (30, 31). Our findings also suggest that novice students may not accurately assess the changes in their own learning that follow from their experience in a class.
The authors go on:
These results also suggest that student evaluations of teaching should be used with caution as they rely on students’ perceptions of learning and could inadvertently favor inferior passive teaching methods over research-based active pedagogical approaches….
As I say above, it is the general rather than the particular that concerns me. Experience and feeling are often poor guides to action. We are, after all, creatures that represent biology’s attempt to see whether contemplation can triumph over reflex. There remains a fundamental asymmetry between expert and novice, and if there isn’t, there is little worth learning (or indeed worth paying for).
At the same time, there has been a growing “pull” from the UK and other richer nations for doctors and nurses from Africa, as their own health systems have struggled to train and retain sufficient local healthcare workers while demand from ageing populations continues to rise.
I am aware of the issue but keep being pulled back to the claims about how expensive it is to train doctors (in the UK or other similar countries). Yes, I know the oft wheeled out figures, but I am suspicious of them.
A while back I was sitting in a cafe close to the university campus. I couldn’t help but listen in on the conversation of a few students who were discussing various aspects of university life and their own involvement in student politics. I couldn’t warm to them: they were boorish and reminded me of a certain Prime Minister. But I did find myself in agreement on one point: many UK universities are too big, and if you are really serious about undergraduate education, you need smaller institutions than is the norm in the Russell Group. You can have large institutions and teach well — the Open University is the classic historical example — but Russell Group universities are not designed for that purpose.
A few months back there was an interview in the Guardian with Michael Arthur, the Vice-Chancellor of University College London (UCL). In it he said some extraordinary things. Not extraordinary in the sense that you might not have heard them before, or that they were difficult to grasp. Just extraordinary in their banality of purpose.
UCL, like many universities in the UK, has rapidly expanded its undergraduate student numbers and will continue to do so. The interviewer asked him whether UCL was not already too big. Arthur replied:
“We want to be a global player,” says Arthur. “Round the world, you’re seeing universities of 90,000, 100,000 students. If you have critical mass, you can create outstanding cross-disciplinary research on things like climate change. You can do research that makes a difference.” He mentions a treatment recently developed at UCL that makes HIV, the virus that causes Aids, untransmittable. If UCL didn’t increase student numbers, thus maximising fee revenue, such research would have to be cut back. “To me,” Arthur says, “that is unthinkable.”
The tropes are familiar to those who have given up serious thinking and have short attention spans: ‘global player’, ‘critical mass’, ‘cross disciplinary’, ‘make a difference’, and so on. Then there is the ‘maximising fee revenue’ so that research is not cut back — “that is unthinkable”.
Within the sector it is widely recognised that universities lose money on research. In the US in the Ivy League, endowments buffer research and in some institutions, teaching. In the UK, endowments outwith Oxbridge are modest, and student fees fund much research. As research volume and intensity increases, the need for cross subsidy becomes ever greater. This is of course not just within subjects, but across the university and faculties.
That universities lose money on research is a real problem. For instance, in medicine much research is funded by charities who do not pay the full costs of that research. Governments pretend they fill this gap, but I doubt that is now the case. Gaps in research funding are therefore being made up out of the funds that are allocated to educate doctors, or students in other subjects. And anybody who has been around UK universities for a while knows that a lot of the research — especially in medicine — would at one time have been classed as the D of R&D. This sort of work is not what universities are about: it is just that the numbers are so large that they flatter the ‘research figures’ for the REF (research excellence framework).
Pace the students in the cafe, few can mount any argument against the view that once you have grown beyond several thousand students the student experience and student learning worsen. Phrases such as ‘research-led teaching’ and ‘exposure to cutting edge research’ are common, but the reality is that there is little evidence to support them in the modern university. They are intended as fig leaves to mask some deeper stirrings. Arthur states that it ‘would be unthinkable’ to cut back on research. He may believe that, but I doubt if his self-righteousness is shared by the majority of students who spend much of their lives paying off student debts.
A few years ago, whilst on a flight to Amsterdam, I chatted with a physicist from a Dutch university. We talked about teaching and research. He was keen on the idea of situating institutions that resembled US liberal arts colleges (as in small colleges) within bigger and more devolved institutions. I doubt that would be practical in the UK — the temptation for the centre to steal the funds is something VCs (Vice Chancellors not Venture Capitalists, that is) would not be able to resist. The late Roger Needham, a distinguished Professor of Computing at Cambridge, and former head of Microsoft Research in Cambridge, pointed out that most IP generated by universities was trivial and that the most important IP we produced were educated and smart students. He was perhaps talking about PhDs and within certain domains of knowledge, but I will push beyond that. Educating students matters.
And contrary to what Arthur thinks, many of the world’s best universities have far fewer students than UCL, even before its recent metastatic spread.
Medical students have higher rates of depression, suicidal ideation, and burnout than the general population and greater concerns about the stigma of mental illness. The nature of medical education seems to contribute to this disparity, since students entering medical school score better on indicators of mental health than similarly aged college graduates. Roughly half of students experience burnout, and 10% report suicidal ideation during medical school.
This is from the US, and I do not know the comparable figures for the UK. Nor am I really certain what is going on in a way that sheds light on causation or on what has changed. By way of comparison, for early postgraduate training in the UK, I am staggered by how many doctors come through it unscathed. I don’t blame those who want to bail out.
An economist may have strong views on the benefits of vaccination, for example, but is still no expert on the subject. And I often cringe when I hear a doctor trying to prove a point by using statistics.
There were some critical comments about this phrase used by Wolfgang Münchau in a FT article. The article is about how ‘experts’ lose their power as they lose their independence. This is rightly a big story, one that is not going away, and one the universities with their love of mammon and ‘impact’ seem to wish was otherwise. But there is a more specific point too.
Various commentators argued that because medicine took advantage of statistical ideas, doctors talked sense about statistics. The literature is fairly decisive on this point: most doctors are lousy at statistics, and the medical literature may or (frequently) may not be sound on various statistical issues.
Whenever I hear people talk up the need for better ‘communication skills’ or ‘communication training’ for our medical students, I ask what level of advanced statistical training they are referring to. Blank stares result. Statistics is hard, and communicating statistics even harder. Our students tend to be great at communicating or signalling empathy, but those with an empathy for numbers often end up elsewhere in the university.
The main story is about an ‘anti-vaxxer’ who had informed the university that he or she was opposed to receiving any vaccinations, but the university had not noticed or acted upon this advice until after the student had started university. Cardiff University was ordered to pay £9K to the anti-vaxxer healthcare student.
But this caught my eye even more.
In a separate case summary, also published on 1 July, the OIA said that it had told Wrexham Glyndwr University to compensate eight students who had complained about the quality of a healthcare-related course.
The watchdog said that the students had complained that a key part of the course had not been taught as promised, meaning that they were not given the necessary skills to practise safely. Some teaching hours were cancelled for some modules, and the group also complained about the behaviour of a staff member, who they said was “unapproachable and aggressive”.
The OIA, which ruled that the complaint was partly justified, said that Glyndwr should refund tuition fees of £2,140 to each student, and pay an additional £1,500 compensation to each of them for the inconvenience caused.
Our present pattern of medical education is only one of several that are operating more or less successfully at the present time: good medicine can be taught and practised under widely varying conditions.
Henry Miller. ‘Fifty Years after Flexner’, 1966.
In my last post, I used a familiar Newton quote: ‘the job of science is to cleave nature at the joints’. We can never understand the entirety of the universe, all we can do is to fragment it, in order to make it amenable to experimentation or rational scrutiny. Before you can build anything you have to have taken other familiar things apart. Understanding always does violence to the natural world.
In this series of posts I have already listed some of the many things that confound attempts to improve medical education. But I don’t think we now need just a series of bug fixes. On the contrary, we need radical change — as in a new operating system — but radical change we have had before, and there are plenty of examples that we can use to model what we want. And as I hinted at in my last post, medical exceptionalism (and in truth pride) blinds medical educationalists to how other domains of professional life operate. This soul-searching about professional schools is not confined to medicine. There are debates taking place about law schools and engineering schools, and corresponding debates about the role of the professions in society more generally (have the professions a future — professional work has, but who is going to do it?).
The conventional medical degree has two components: the preclinical years (which I used to call the prescientific years, simply because rote learning is so favoured in them) and the clinical years. This divide has been blurred a little, but that does not seriously alter my argument — the blurring has in any case been a mistake IMHO. The preclinical years have some similarities with other university courses, for good and bad. The clinical years are simply a mess. They aspire to a model of apprenticeship learning that is impossible to deliver on.
All is not lost, however. We know we can do some things well. Let me consider the ‘clinical’ first, before moving back to the ‘preclinical’.
Registrar training in any specialty can work well. We know how to do it. There is a combination of private study, formal courses, and day-to-day supervised and graded practice: classic apprenticeship. This doesn’t mean it is always done well — it isn’t — but in practice we know how to put it together. Let me use dermatology as an example.
In the UK and a few other countries, you enter dermatology after having done FY (foundation years 1 & 2) and a few years of internal medicine, having passed the MRCP exams along the way (the College tax). I refer to this as pre-dermatology training. At this stage, you compete nationally for training slots in dermatology.
This pre-dermatology training is unnecessary. We know this to be the case because most of the world does not follow this pattern, and seems to manage OK in terms of the quality of its dermatologists. (This ‘wasted years’ period was painfully pointed out to me when I started dermatology training in Vienna: ‘you have wasted four years of your life’, I was told. I wasn’t pleased, but they were right and I was wrong.) Why, you ask, does the UK persist? Three explanations come to mind. First, the need for cheap labour to staff hospitals. Second, the failure to understand that staff on a training path need to supplement those who provide ‘core service’: much as senior registrars were supernumerary in some parts of the UK at one time. Finally, an inability to realise that we might learn from others.
Providing good apprenticeship training in dermatology is (in theory) very straightforward. Book learning is required, formal courses online can supplement this book learning, and since trainees are grouped in centres, interpersonal learning and discussion is easy to organise. Most importantly, trainees work over extended periods of time with consultants who know what they are trying to achieve: the purpose of the apprenticeship is to produce somebody who can replace them in a few years’ time. You do not need to be deep into educational theory to work well within this sort of environment; indeed, you should keep any ‘educationalists’ at arm’s length.
Where this model does not work well is in the pre-dermatology training. The obvious point is that much of this pre-dermatology work is not necessary, and where it is, it should be carried out by those who are embarking on a particular career or by non-training staff (who may or may not be doctors). In the UK, if you have an FY doctor attached to a dermatology ward, they will rotate every few months through a range of specialties, and it is likely that they will have no affinity for most of them. Such jobs are educationally worthless, as dermatology is an outpatient specialty. Ironically, the only value of such jobs is for those who have already committed to a career in dermatology. I will return to the all too familiar objections to what I propose in another blog post, but for training in many areas of medicine, including general practice, radiology, pathology, and psychiatry, what I have said of dermatology holds.
We could frame my argument in another way: if you cannot hold on to the tenets of apprenticeship learning (extended periods of graded practice under the close supervision of a small group of masters and novices), it is not a training post.
I am now going to jump to the other end of medical education: what are medical schools for?
Current undergraduate medical education is a hybrid of ‘education’ and ‘training’. Universities can deliver high-class education (I said can, not do), but they cannot deliver high-class clinical training. They do not have the staff to do it, and they do not own the ‘means of production’. Apprenticeship learning does not work given the number of students, and in any case, the teaching of medical students is a low priority for NHS hospitals, which have been in ‘subsistence’ mode for decades. Things will only get worse.
Some (but not all) other professional schools or professions organise things differently. A degree may be necessary, but the bond between degree and subsequent training is loose. Unlike medicine, it is not the job of the university to produce somebody who is ‘safe’ and ‘certified’ on the day of graduation.
What I propose is that virtually all the non-foundational learning be shifted into the early years of apprenticeship learning, where the individuals are paid employees of the NHS (or other employer). I talked about what foundational learning is in an earlier post, and here I am arguing that it is the foundational learning which universities should deliver. Just as professional service firms, law firms, or engineering companies may prefer graduates with particular degrees, they know that they need to train their apprentices in a work environment, an environment in which they are paid (as with all apprenticeships, the training salary reflects the market value to the individual of the professional training they receive). What becomes of medical schools?
The corpus of knowledge of the determinants of health and how to promote health, as well as how to diagnose and care for those who are sick, is vast. Looked at in financial terms, or in numbers of workers, it is a large part of the modern economy, and is of interest way beyond the narrow craft of clinical medicine. The fundamental knowledge underpinning ‘health’ includes both sciences and arts. Although modern medicine likes to ride on the coat-tails of science, it is, in terms of practice, a professional domain that draws eclectically from a broad scholarship and habits of mind. Medical science has indeed grown, but as a proportion of the domains of knowledge that make up ‘health’ it has shrunk.
Simply put, we might expect many students to study ‘health’, and for the subset of those who want to become doctors we need to think about the domains that are most suitable for ‘practising doctors’. Not all who study ‘health’ will want to be ‘practising doctors’, but for those who do, there may be constraints on what modules they should take. The goal is to produce individuals who can be admitted into a medical apprenticeship when they leave university.
I will write more about ‘health’ in the next post, and contrast it with what we currently teach (and how we teach it). The later part of training (genuine apprenticeship), as in the dermatology example, I would leave alone. But what I am suggesting is that we totally change the demands put on medical schools, and place apprenticeship learning back where it belongs.
Stolker C. Rethinking the Law School. Cambridge: Cambridge University Press; 2014.
Goldberg DE, Somerville M, Whitney C. A Whole New Engineer: The Coming Revolution in Engineering Education. Threejoy Associates; 2014.
Susskind RE. The End of Lawyers? Rethinking the Nature of Legal Services. Oxford: Oxford University Press; 2010.
Susskind R, Susskind D. The Future of the Professions. Oxford University Press; 2015.
Rees J. The UK needs office dermatologists. BMJ. 2012;345:35.
In the previous post, I talked about some of the details of how undergraduate clinical teaching is organised. It is not an apprenticeship, but rather an alienating series of short attachments characterised by a lack of continuity of teacher-pupil contact. This is not something easily fixed because the structure is geared around the needs of the NHS staff who deliver the bulk of student teaching, rather than what we know makes sense pedagogically. I likened it to the need to put up with getting through security when you travel by plane: you want to get somewhere, but just have to grin and bear the humiliation. This is not a university education. I am not saying that individual teachers are to blame — far from it — as many enjoy teaching students. It is a system problem.
It is not possible to make sense of either undergraduate medical education or postgraduate training without looking at the forces that act on the other. It is also far too easy to assume that ‘the system’ in the UK is the only way to organise things, or indeed, to think it is anywhere near optimal. A damning critique of medicine (and much else in society) in the UK is our inability to learn from what others do.
The formative influences on (undergraduate) medical education are those conditions that were operating over half a century ago. At that time, a medical degree qualified you to enter clinical practice with — for many students — no further formal study. And much clinical practice was in a group size of n=1.
In the 1950s the house year (usually 6 months surgery and 6 months medicine) was introduced. Theoretically this was under the supervision of the university, but in practice this supervision was poor, and the reality was that this was never going to work in the ‘modern NHS’. How can the University of Edinburgh supervise its graduates who work at the other end of the country? In any case, as has been remarked on many occasions, although the rationale for the house year was ‘education’, the NHS has never taken this seriously. Instead, housepersons became general dogsbodies, working under conditions that could have come from a Dickens novel. In my own health board, the link between master and pupil has been entirely broken: apprenticeship is not only absent from the undergraduate course, but has been exiled from a lot of postgraduate training (sic). House doctors are referred to as ‘ward resources’, not tied to any group of supervising doctors. Like toilet cisterns, or worse…
Nonetheless, the changes in the 1950s and other reforms in the 1960s established the conventional wisdom that the aim of undergraduate medical education was not to produce a ‘final product’ fit to travel the world with a duffel-shaped leather satchel in hand. Rather, there would be a period of postgraduate training leading to specialist certification.
This change should have been momentous. The goal was to refashion the undergraduate component; and allow the postgraduate period to produce the finished product (either in a specialty, or in what was once called general practice). It is worth emphasising what this should have meant.
From the point of view of the public, the key time for certification for medical practice was not graduation, but being placed on the specialist register. The ability to practise independently was something granted to those with higher medical qualifications (MRCP, MRCPsych, etc.) who were appointed to a consultant post. All other posts were training posts, and practice within such roles was not independent but under supervision. Within an apprenticeship system — which higher professional training largely should be — supervision comes with lots of constraints, constraints that are implicit in the relation between master and pupil, and which have stayed largely unchanged across many guilds and crafts for near on a thousand years.
What went wrong was no surprise. The hospitals needed a cadre of generic dogsbodies to staff them, given the 24-hour working conditions necessary in health care. Rather than new graduates choosing their final career destination (to keep with my airport metaphor), they were consigned to a holding pattern for two to seven years of their lives. In this service mode, the main function was ‘service’, not supervised training. As one of my former tutees in Edinburgh correctly told me at graduation: he was (of course!) returning to Singapore, because if he stayed in the NHS he would just be exploited until he could start higher professional training. The UK remains an outlier worldwide in this pattern of enforced servitude.
The driving force in virtually all decision making within the UK health systems is getting through to the year-end. The systems live hand-to-mouth. They share a subsistence culture, in which it almost appears that their primary role is not to deliver health care, but to reflect an ideology that might prove attractive to voters. As with much UK capitalism, the long term always loses out to the short term. What happened after the realisation that a graduating medical student was neither fish nor fowl was predictable.
The pressure to produce generic trainees with little meaningful supervision in their day-to-day job meant that more and more of undergraduate education was sacrificed to the goal of producing ‘safe and competent’ FY (foundation years 1 & 2) doctors, doctors who again work as dogsbodies and cannot learn within a genuine apprenticeship model. The mantra became that you needed five years at medical school to adopt a transitory role that you would willingly escape from as soon as possible. Furthermore, the undergraduate course was a sitting duck for any failings of the NHS: students should know more about eating disorders, resilience, primary care, terminal care, obesity, drug use… the list is infinite, the students easy targets, and the medical schools politically ineffective.
What we now see is an undergraduate degree effectively trying to emulate a hospital (as learning outside an inpatient setting is rare). The problem is simply stated: it is not possible to do this within a university that does not — and I apologise if I sound like an unreconstructed Marxist — control the means of production. Nor is it sensible to try and meld the whole of a university education in order to produce doctors suitable for a particular time-limited period of medical practice that all will gladly leave within a few years of vassalage.
Medicine is an old profession (I will pass on GBS’s comments about the oldest profession). In medicine, the traditional status of both ‘profession’ in general and ‘this profession’ in particular has been used to imagine that medicine can stand aloof from other changes in society. There are three points I want to make on this issue: two are germane to my argument, whilst the third I will return to in another post.
The first is that, from the immediate post-Flexner period until the changes in medical education in the 1950s and 1960s, few people in the UK went to university. Doctors did go to university, even if the course was deemed heavily vocational, with a guaranteed job at the end of it. Learning lots of senseless anatomy may not have compared well with a liberal arts education, but there was time for maturing, and exposure to the culture of higher learning. Grand phrases indeed, but many of us have been spoiled by their ubiquity. Our current medical students are bright and mostly capable of hard work, but many lack the breadth and ability to think abstractly of the better students in some other faculties. (It would, for instance, be interesting to look at secular changes in the degree awards of medical students who have intercalated.) No doubt medical students are still sought after by non-medical employers, but I suspect this is a highly self-selected group and, in any case, reflects intrinsic abilities and characteristics as much as what the university has provided them with.
The second point is that all the professions are undergoing change. The specialist roles that were formalised and developed in the 19th century are under attack from the forces that Max Weber identified a century ago. The ‘terminally differentiated’ individual is treated less kindly in the modern corporate state. Anybody who has practised medicine in the last half century is aware of the increasing industrialisation of medical practice, in which the battle between professional judgment and the impersonal corporate bureaucracy is being won by the latter.
My third point is more positive. Although there have been many different models of ‘professional training’, the most prevalent today is a degree in a relevant domain (which can be interpreted widely) followed by selection for on-the-job training. Not all those who do a particular degree go on to the same career, nor have employers expected the university to make their graduates ‘fit for practice’ on day 1 of their employment. Medicine has shunned this approach, still pretending that universities can deliver apprenticeship training, whilst the GMC and hospitals have assumed that you can deliver a safe level of care by offloading core training that has to be learned in the workplace to others. No professional services firm that relies on return custom and is subject to the market would behave in this cavalier way. Patients should not be so trusting.
In the next post, I will expand on how — what was said of Newton — we should cleave nature at the joints in order to reorganise medical education (and training).
 Re: the enforced servitude. I am not saying this work is not necessary, nor that those within a discipline do not need to know what goes on on the shop floor. But to put it bluntly, the budding dermatologist should not be wasting time admitting patients with IHD or COPD, or inserting central lines or doing lumbar punctures. Nor do I think you can ethically defend a ‘learning curve’ on patients when the learner has committed not to pursue a career using that procedure. The solution is obvious, and has been discussed for over half a century: most health care workers need not be medically qualified.
 Which of course raises the issue of whether certification at an individual rather than an organisational level makes sense. In the UK, government pressure will be to emphasise the former at the expense of the latter: as they say, the beatings will continue until morale improves.
 Rewards in modern corporations like the NHS or many universities are directed at generic management skills, not domain expertise. University vice-chancellors get paid more than Nobel prize winners at the LMB. In the NHS there is a real misalignment of rewards between those clinicians whom their peers recognise as outstanding and those who are medical managers (sic). If we think of some of the traditional crafts — say painting or sculpture — I doubt we can match the technical mastery of Florence. Leonardo would no doubt now be handling Excel spreadsheets as a manager (see this piece on Brian Randell’s homepage on this very topic).
In the previous post I laid out some of the basic structures of the ‘clinical years’ of undergraduate medical degrees. In this post I want to delve a little deeper and highlight how things have gone wrong. I do not imagine it was ever wonderful, but it is certainly possible to argue that things have got a lot worse. I think things are indeed bad.
When I was a medical student in Newcastle in 1976-1982, the structures of the first two clinical years (years 3 and 4) were similar, whereas the final year (year 5) was distinct. The final year was made up of several long attachments — say ten weeks of medicine and ten weeks of surgery — and there were no lectures or any demands on your time, except that you effectively worked as an unpaid houseman, attached to a firm of two or three consultants. The apprenticeship system could work well during these attachments, partly because all parties had something to gain. Many if not most students chose where they did their attachments (‘if you like fellwalking, choose Carlisle’, etc.), and had an eye on these units as a place to do their house jobs the following year. The consultants also had skin in the game. Instead of relying on interviews, or just exam results, they and all their staff (junior docs, nurses etc) got a chance to see close up what an individual student was like, and they could use this as a basis for appointing their houseperson the following year. If a houseman was away, you acted up, and got paid a small amount for this. If at any time you didn’t turn up, all hell would break loose. You were essential to the functioning of the unit. No doubt there was some variation between units and centres, but this is how it was for me. So, for at least half of final year, you were on trial, immersed in learning by doing / learning on the job / workplace learning etc. All the right buzzwords were in place.
As I have said, years 3 and 4 were different from final year, but similar to each other. The mornings would be spent on the ward and the afternoons — apart from Wednesdays — were for lectures. I didn’t like lectures (or at least that sort of lecture), so I skipped them, apart from making sure that I collected any handouts provided on the first day (see some comments from Henry Miller on lectures below).
The mornings were ‘on the wards’. Four year 3 students might be attached to two 30-bed wards (one female, one male), and for most of the longer attachments you would be given a patient to go and see, starting at 9:30, breaking for coffee at 10:30, and returning for an hour or more in which one or more of you had to present your findings before visiting the bedside and being taught how to examine the patient. The number of students was small, and there was nowhere to hide if you didn’t know anything.
For the longer attachments (10 weeks for each of paediatrics, medicine and surgery) this clinical exposure could work well. But the shorter attachments especially in year 4 were a problem, chiefly because you were not there long enough to get to know anybody.
The design problem was of course that the lectures were completely out of synchrony with the clinical attachments. You might be doing surgery in the morning, but listening to lectures on cardiology in the afternoon. Given my lack of love for lectures, I used the afternoons to read about patients I had seen in the morning, and to cover the subject of the afternoon lectures by reading books.
I don’t want to pretend that all was well. It wasn’t. You might turn up to find that nobody was available to teach you, in which case we would retreat to the nurses’ canteen to eat the most bacon-rich bacon sandwiches I have ever had the pleasure of meeting (the women in the canteen thought all these young people needed building up with motherly love and food 🙂 ).
The knowledge of what you were supposed to learn was, to say the least, ‘informal’; at worst, anarchic. Some staff were amazingly helpful, but others — how shall I say — not so.
In reality, everybody knew that years 3 and 4 were pale imitations of year 5. The students wanted to be in year 5, because year 5 students — or at least most year 5 students — were useful. The problem was that the numbers (students and patients) and the staffing were not available. It was something to get through, but with occasional moments of hope and pleasure. Like going through security at airports: the holiday might be good, but you pay a price.
The easiest way to summarise what happens now is to provide a snapshot of teaching in my own subject at Edinburgh.
Year 4 students (called year 5 now, but the penultimate year of undergraduate medicine) spend two weeks in dermatology. Each group is made up of 12-15 students. At the beginning of a block of rotations lasting, say, 18 weeks in total, the students will have 2.5 hours of lectures on dermatology. During the two-week dermatology rotation, most teaching will take place in the mornings. On the first morning the students have an orientation session, have to work in groups to answer some questions based on videos they have had to watch along with bespoke reading matter, and then there is an interactive ‘seminar’ going through some of the preparatory work in the videos and text material.
For the rest of the attachment students will attend a daily ‘teaching clinic’, in which they are taught on ‘index’ patients who attend the dermatology outpatients. These patients are selected from those attending the clinic and, if they agree, they pass through to the ‘teaching clinic’. The ‘teacher’ will be a consultant or registrar, and this person is there to teach — not to provide clinical care during this session.
Students will also sit in one ‘normal’ outpatient clinic as a ‘fly on the wall’, and attend one surgical session. At the end of the attachment, there is a quiz in which students attempt to answer questions in small groups of two or three. They also get an opportunity to provide oral feedback as well as anonymous written feedback. Our students rate dermatology highly in comparison with most other disciplines, and our NHS staff are motivated and like teaching.
When I read through the above it all sounds sort of reasonable, except that…
Students will pass through lots of these individual attachments. Some are four weeks long, but many are only one or two weeks in duration. It is demanding to organise such timetables, and stressful for both students and staff.
My critique is not concerned with the individuals, but with the system. It is simply hard to believe that this whole edifice is coherent or designed in the students’ interest. It is, as Flexner described UK medical school teaching a century ago, wonderfully amateur. Pedagogically it makes little sense. Nor in all truthfulness is it enjoyable for many staff or many students. Every two weeks a new batch arrives and groundhog day begins. Again. And again. And if you believe the figures bandied about for the cost of medical education, the value proposition seems poor. We could do better: we should do better.
 Lectures. Henry Miller, who was successively Dean of Medicine and Vice-Chancellor at Newcastle, described how…
“Afternoon lectures were often avoided in favour of the cinema. The medical school was conveniently placed for at least three large cinemas… In one particularly dull week of lectures we saw the Marx brothers in ‘A Day at the Races’ three times.”
In the previous post in this series (Late night thoughts #5: Foundations) I wrote about the content or material of medical education, hinting at some of the foundational problems (pardon the meta). We have problems distinguishing between knowledge that is essential for some particular domain of medical practice, and knowledge that is genuinely foundational. The latter is largely speciality-independent, less immediate than essential knowledge, and is rightly situated within the university. The expertise necessary to teach foundational knowledge lies within universities.
What I have not made explicit so far in this essay is also important. The best place to learn much essential knowledge is within the hospital, and during a genuine apprenticeship. There are various ways we can hone a meaningful definition or description of apprenticeship, but key is that you are an employee, that you get paid, and that you are useful to your employer. Our current structures meet none of these criteria.
Kenneth Calman in the introduction to his book ‘Medical Education’ points out that medical education varies enormously between countries, and that there is little evidence showing the superiority of any particular form or system of organisation. It is one of the facts that encourages scepticism about any particular form, and furthermore — especially in the UK — leads to questioning about the exorbitant costs of medical education. It also provides some support for the aphorism that most medical students turn into excellent doctors despite the best attempts of their medical schools.
Across Europe there have been two main models of clinical training (I am referring to undergraduate medical student training, not graduate / junior doctor training). One model relies on large lectures with occasional clinical demonstrations, whereas the UK system — more particularly the traditional English system — relies on ‘clerkships’ on the wards.
At Newcastle, when I was a junior doctor, we used to receive a handful of German medical students who studied with us for a year. They were astonished to find that the ‘real clinical material’ was available for them to learn from, with few barriers. They could go and see patients at any time, the patients were willing, and — key point — the clinical material was germane to what they wanted to learn. The shock of discovering this veritable sweetshop put some of our students to shame.
The English (and now UK) system reflects the original guiding influence of the teaching hospitals, which were, as the name suggests, hospitals where teaching took place. These hospitals, for good and bad, were proud of their arm’s-length relationship with the universities and medical schools. The signature pedagogy was the same as for junior doctors. These doctors were paid (poorly), were essential (the place collapsed if they were ill), and of course they were employees. Such doctors learned by doing, supplemented by private study using textbooks, or informal teaching provided locally within the hospital or via the ‘Colleges’ or other medical organisations. Whatever the fees, most learning was within a not-for-profit culture.
It was natural to imagine or pretend that what worked at the postgraduate level would work at the undergraduate level, too. After all, until the 1950s, medical education for most doctors ended at graduation where, as the phrase goes, a surgeon with his bag full of instruments ventured forth to the four corners of the world.
This system may have worked well at one stage, but I think it fair to say it has been failing for nearer a century than half a century. At present, it is not a system of education that should be accepted. There are two reasons for this.
First, medicine has (rightly) splintered into multiple domains of practice. Most of the advances we have seen over the last century in clinical medicine reflect specialisation: specialisation as a response to the growth of explicit knowledge, and the realisation that high-level performance in any craft relies not solely on initial certification, but on daily practice (as in the ‘practice of medicine’). Second, what might have worked well when students and teachers were members of one small community fails within the modern environment. As one physician at Harvard / Mass General Hospital commented a few years back in the New England Journal of Medicine: things started to go awry when the staff and students no longer ate lunch together.
Unpicking the ‘how’ of what has happened (rather than the ‘why’, which is, I think, obvious) I will leave to the next post. But here is a warning. I first came across the word meliorism in Peter Medawar’s writing. How could it not be so, I naively thought? But of course, historians or political scientists would lecture me otherwise. It is possible for human affairs to get worse, even when all the humans are ‘good’ or at least have good intentions. The dismal science sees reality even more clearly: we should rely only on institutions that we have designed to work well — even with bad actors.