
by reestheskin on 31/05/2019


Late night thoughts on medical education #8: Where to draw the line?

In the previous post, I talked about some of the details of how undergraduate clinical teaching is organised. It is not an apprenticeship, but rather an alienating series of short attachments characterised by a lack of continuity of teacher-pupil contact. This is not easily fixed, because the structure is geared around the needs of the NHS staff who deliver the bulk of student teaching, rather than around what we know makes sense pedagogically. I likened it to putting up with airport security: you want to get somewhere, but just have to grin and bear the humiliation. This is not a university education. I am not saying that individual teachers are to blame — far from it — as many enjoy teaching students. It is a system problem.

The interdependence of undergraduate education and postgraduate medical training

It is not possible to make sense of either undergraduate medical education or postgraduate training without looking at the forces that each exerts on the other. It is also far too easy to assume that ‘the system’ in the UK is the only way to organise things, or indeed to think it anywhere near optimal. A damning critique of medicine (and much else in society) in the UK is our inability to learn from what others do.

The formative influences on (undergraduate) medical education are those conditions that were operating over half a century ago. At that time, a medical degree qualified you to enter clinical practice with — for many students — no further formal study. And much clinical practice was in a group size of n=1.

In the 1950s the house year (usually 6 months of surgery and 6 months of medicine) was introduced. Theoretically this was under the supervision of the university, but in practice supervision was poor, and the reality was that it was never going to work in the ‘modern NHS’. How can the University of Edinburgh supervise its graduates who work at the other end of the country? In any case, as has been remarked on many occasions, although the rationale for the house year was ‘education’, the NHS has never taken this seriously. Instead, housepersons became general dogsbodies, working under conditions that could have come from a Dickens novel. In my own health board, the link between master and pupil has been entirely broken: apprenticeship is not only absent from the undergraduate course, but has been exiled from much of postgraduate training (sic). House doctors are referred to as ‘ward resources’, not tied to any group of supervising doctors. Like toilet cisterns, or worse…

Nonetheless, the changes of the 1950s and other reforms in the 1960s established the conventional wisdom that the aim of undergraduate medical education was not to produce a ‘final product’ fit to travel the world, duffel-shaped leather satchel in hand. Rather, there would be a period of postgraduate training leading to specialist certification.

Training versus education

This change should have been momentous. The goal was to refashion the undergraduate component, and to allow the postgraduate period to produce the finished product (either in a specialty, or in what was once called general practice). It is worth emphasising what this should have meant.

From the point of view of the public, the key moment of certification for medical practice was not graduation, but being placed on the specialist register. The ability to practise independently was granted to those with higher medical qualifications (MRCP, MRCPsych etc) who were appointed to a consultant post. All other posts were training posts, and practice within such roles was not independent but under supervision. Within an apprenticeship system — which is what higher professional training largely should be — supervision comes with many constraints, constraints that are implicit in the relation between master and pupil, and which have remained largely unchanged across many guilds and crafts for nigh on a thousand years.

What went wrong was no surprise. The hospitals needed a cadre of generic dogsbodies to staff them, given the 24-hour working that health care requires. Rather than new graduates choosing their final career destination (to keep with my airport metaphor), they were consigned to a holding pattern for 2-7 years of their lives. In this mode, the main function was ‘service’, not supervised training. As one of my former tutees in Edinburgh correctly told me at graduation: he was (of course!) returning to Singapore, because if he stayed in the NHS he would just be exploited until he could start higher professional training. The UK remains an outlier worldwide in this pattern of enforced servitude [1].

What has all this to do with undergraduate education?

The driving force in virtually all decision making within the UK health systems is getting through to the year-end. The systems live hand-to-mouth. They share a subsistence culture, in which it almost appears that their primary role is not to deliver health care, but to reflect an ideology that might prove attractive to voters. As with much UK capitalism, the long term always loses out to the short term. What happened after the realisation that a graduating medical student was neither fish nor fowl was predictable.

The pressure to produce generic trainees with little meaningful supervision in their day-to-day job meant that more and more of undergraduate education was sacrificed to the goal of producing ‘safe and competent’ FY (foundation years 1 & 2) doctors, doctors who again work as dogsbodies and cannot learn within a genuine apprenticeship model. The mantra became that you needed five years at medical school to adopt a transitory role that you would willingly escape as soon as possible. Furthermore, the undergraduate course was a sitting duck for any failings of the NHS: students should know more about eating disorders, resilience, primary care, terminal care, obesity, drug use… the list is infinite, the students sitting ducks, and the medical schools politically ineffective.

What we now see is an undergraduate degree effectively trying to emulate a hospital (as learning outside an inpatient setting is rare). The problem is simply stated: it is not possible to do this within a university that does not — and I apologise if I sound like an unreconstructed Marxist — control the means of production. Nor is it sensible to try to meld the whole of a university education in order to produce doctors suitable for a particular, time-limited period of medical practice that all will gladly leave within a few years of vassalage.

Medical exceptionalism

Medicine is an old profession (I will pass on GBS’s comments about the oldest profession). In medicine, the traditional status of the ‘professions’ in general, and of this profession in particular, has been used to imagine that medicine can stand aloof from changes elsewhere in society. There are three points I want to make on this issue: two are germane to my argument, whilst the third I will return to in another post.

The first is that from the immediate post-Flexner period to the changes in medical education in the 1950s and 1960s, few people in the UK went to university. Doctors did go to university, even if the course was deemed heavily vocational, with a guaranteed job at the end of it. Learning lots of senseless anatomy may not have compared well with a liberal arts education, but there was time for maturing, and for exposure to the culture of higher learning. Grand phrases indeed, but many of us have been spoiled by their ubiquity. Our current medical students are bright and mostly capable of hard work, but many lack the breadth and the capacity for abstract thought of the better students in some other faculties. (It would, for instance, be interesting to look at secular changes in the degree awards of medical students who have intercalated.) No doubt medical students are still sought after by non-medical employers, but I suspect this is a highly self-selected group and, in any case, reflects intrinsic abilities and characteristics as much as anything the university has provided.

The second point is that all the professions are undergoing change. The specialist roles that were formalised and developed in the 19th century are under attack from the forces that Max Weber identified a century ago. The ‘terminally differentiated’ individual is treated less kindly in the modern corporate state. Anybody who has practised medicine in the last half century is aware of the increasing industrialisation of medical practice, in which the battle between professional judgment and the impersonal corporate bureaucracy is being won by the latter [2][3].

My third point is more positive. Although there have been many different models of ‘professional training’, the most prevalent today is a degree in a relevant domain (which can be interpreted widely), followed by selection for on-the-job training. Not all those who do a particular degree go on to the same career, nor have employers expected the university to make their graduates ‘fit for practice’ on day 1 of their employment. Medicine has shunned this approach, still pretending that universities can deliver apprenticeship training, whilst the GMC and the hospitals have assumed that you can deliver a safe level of care by offloading to others core training that has to be learned in the workplace. No professional services firm that relies on return custom and is subject to the market would behave in this cavalier way. Patients should not be so trusting.

In the next post, I will expand on how — as was said of Newton — we should cleave nature at the joints in order to reorganise medical education (and training).

[1] Re: the enforced servitude. I am not saying this work is not necessary, nor that those within a discipline do not need to know what goes on on the shop floor. But, to put it bluntly, the budding dermatologist should not be wasting time admitting patients with IHD or COPD, or inserting central lines or doing lumbar punctures. Nor do I think you can ethically defend a ‘learning curve’ on patients when the learner has committed not to pursue a career using that procedure. The solution is obvious, and has been discussed for over half a century: most health care workers need not be medically qualified.

[2] Which of course raises the issue of whether certification at an individual rather than an organisational level makes sense. In the UK, government pressure will be to emphasise the former at the expense of the latter: as they say, the beatings will continue until morale improves.

[3] Rewards in modern corporations like the NHS or many universities are directed at generic management skills, not domain expertise. University vice-chancellors get paid more than Nobel prize winners at the LMB. In the NHS there is a real misalignment of rewards between those clinicians whom their peers recognise as outstanding and those who are medical managers (sic). If we think of some of the traditional crafts — say painting or sculpture — I doubt we can match the technical mastery of Florence. Leonardo would no doubt now be handling Excel spreadsheets as a manager (see this piece on Brian Randell’s homepage on this very topic).