“Imagine if we taught baseball the way we teach science. Until they were twelve, children would read about baseball technique and history, and occasionally hear inspirational stories of the great baseball players. They would fill out quizzes about baseball rules. College undergraduates might be allowed, under strict supervision, to reproduce famous historic baseball plays. But only in the second or third year of graduate school, would they, at last, actually get to play a game. If we taught baseball this way, we might expect about the same degree of success in the Little League World Series that we currently see in our children’s science scores.”
No, it doesn’t: pure clickbait. But how many does it need? The headline was taken from a comment by Eric Schmidt, the former CEO of Google, that the ‘UK needs 10,000 computer science academics’. When I saw the headline, I initially read it as saying the UK needed another 10,000 computer science graduates. Oops. He means staff, not students.
But then I wondered, as I often have, how many academics in medicine we need, and how we might go about working out what the number should be. And I should add, I am sceptical we can know how many doctors we need; only those untouched by reality, like Jeremy Hunt, know the answers to questions like that. But there are some numbers that are relevant, even if I cannot match Enrico Fermi’s ability to perform back-of-the-envelope calculations (how many piano tuners are there in Chicago?).
Depending on how you parse the data, skin disease is said to be the commonest reason to visit a GP in the UK. Estimates suggest there are 15 million visits to GPs with a skin problem each year. In many countries all these patients would go direct to an office dermatologist (this distinction is important, but marginal to my argument here).
Each year about one million people with skin disease are referred from primary care to secondary care. New-to-follow-up ratios are falling — being forced down without any clinical reason because of money — but assume 1:1.5. In terms of visits, the true ratio is much higher, because we have to include surgery and phototherapy: at a guess, 1:4. This would mean 4 million visits. This seems frighteningly high.
There are around 70,000 GPs on the register, and around 600 consultant dermatologists in the UK. GP recruitment problems are well known, and estimates are that close to one third of all dermatologist posts are vacant (‘no suitable candidates’). There are juniors (sic) on top, and other miscellaneous doctors too. In terms of new patients, I see 26 per week, and I am clinically part time, so around 1,000 per year, plus some on-call work, which is light. If we divide the 1,000,000 new referrals by the roughly 400 consultants actually in post (600 posts, a third vacant), we get each consultant seeing around 2,500 a year. But if we add in juniors, staff grades and locums, the numbers *feel* about right.
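A minimal Python sketch of these back-of-the-envelope sums. All the inputs are the rough figures quoted above — the 1:4 ratio and one-third vacancy rate are guesses from the text, not official statistics:

```python
# Fermi-style estimate of UK dermatology workload.
# All inputs are rough figures from the text, not official data.

new_referrals = 1_000_000      # new referrals from primary care per year
follow_up_ratio = 4            # guessed new:follow-up ratio of 1:4

follow_up_visits = new_referrals * follow_up_ratio
print(follow_up_visits)        # 4000000 -- the 'frighteningly high' figure

consultant_posts = 600
vacancy_rate = 1 / 3           # 'no suitable candidates'
consultants_in_post = round(consultant_posts * (1 - vacancy_rate))
print(consultants_in_post)     # 400

new_per_consultant = new_referrals / consultants_in_post
print(round(new_per_consultant))  # 2500 new patients per consultant per year
```

The point of writing it down is only that each step is explicit, so anyone who dislikes a guess can change one line and rerun the sum.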
If we were to look at academic staffing, we have about 30-35 clinical academics in dermatology in the UK. They spend their time between clinical practice, research and teaching. Most UK students are taught for most of their time by people who are not ‘academics’ or at least by people without what in most subjects and in most advanced countries would be recognised as an academic apprenticeship. Skin biology or skin science is notable by its almost complete absence in many — possibly the majority— of medical schools. If we argue — and I would — that those who run and organise teaching in higher education need to view this task as a *professional* task, we are running with say 15 FTE providing the undergraduate teaching resource that underpins clinical practice and early training / education. Note: my argument is about undergraduate education, and not specialist training; and I believe that teaching is not a ‘bolt-on’ activity at the undergraduate level (if you don’t agree with this view, I suggest you could largely dispense with university medical schools).
There is a simple way to frame any answer to my question. Do you think it is possible to produce and maintain a culture of learning and clinical expertise given the numbers above?
Great post by Bryan Alexander, titled ‘A Devil’s Dictionary of Educational Technology’. If you work in the field, you will recognise it. I will start off with my favourites:
Powerpoint, n. 1. A popular and low cost narcotic, mysteriously decriminalized.
YouTube, n. The ideal educational technology: everyone likes and uses it, it’s reliable and free, and neither you nor anyone you know has to support it.
Here are a few more:
Active learning, n. 1. The opposite of obedience lessons.
Asynchronous, adj. The delightful state of being able to engage with someone online without their seeing you, while allowing you to make a sandwich.
Best practice, n. “An educational approach that someone heard worked well somewhere. See also ‘transformative,’ ‘game changer,’ and ‘disruptive.’” (by Jim Julius)
LMS, n. 1) A document management system, whereby a faculty member can transfer a single document to his or her students. Curiously overpowered for this purpose, nevertheless universally deployed.
2) A good way to avoid legal notices about copyright.
3) The graveyard of pedagogical intentions. A sump for IT budgets.
Nice procrastination piece, or reality check, depending on whether you teach or you deliver teaching.
I try to avoid writing on this topic, finding it too depressing — although not as depressing as I once did, as I am closer to the end rather than the beginning. And there are signs of hope, just not where they once were.
There is an editorial in Nature titled ‘Early-career researchers need fewer burdens and more support’. It makes depressing reading. The contrast is with a talk on YouTube I listened to a few days back, by the legendary computer engineer (and Turing award winner and much else) Alan Kay, in which he points out that things were really much better in the 1960s and people at the time knew they were much better. Even within my short career, things were much better in 1990 than 2000, 2000 than 2010 and so on. When people ask me, is it sensible to pursue a career in science, I am nervous about offering advice. Science is great. Academia, in many places, is great. But you can only do most science or academia in a particular environment, and there are few places that I would want to work in if I were starting out. And I might not get into any of them, anyway (Michael Eisen’s comment: never a better time to do science, never a worse time to be a scientist’). I will share a few anecdotes.
Maybe 10-15 years ago I was talking to somebody who — with no exaggeration — I would describe as one of the UK’s leading biologists. This person described how one of their offspring was at university and had, for the first few years, not taken their studies too seriously. Then things changed, and they wondered about doing a PhD and following a ‘classical’ scientific career. The senior biologist expressed concern, worried that there was now no sensible career in science, and that much as they had enjoyed their own career, they could no longer recommend it. There was some guilt, but your children are your children.
The second was a brief conversation with the late physicist John Ziman. I had read some of Ziman’s work — his ‘Real Science’ is for me essential reading for anybody who wants to understand what has happened to the Mertonian norms, and why science is often increasingly dysfunctional — but he shared a bit of his life history with me. When he was appointed as a lecturer in physics at Cambridge, the topic of his lectures was ‘new’ and there were no established books. So he set out to remedy the situation and spent the first two years writing such a book (still available, I think), and after that turned his attention back to physics research, and later much more (‘you have to retire to have the time to do serious work’). He commented that this would simply be impossible now.
With respect to medicine, there have been attempts for most of my life to develop schemes to encourage and support young trainees. I benefited from them, but I question whether they target the real problem. There are a number of issues.
First, the model of training of clinical academics in medicine is unusual. Universities tend to want external funders to support the research training of clinical academics (Fellowships), but that is a model with severe limitations. Nurturing talent is a core business of the universities, and they need to devote resource to it. It is their responsibility. Of course, they need to train and support academics, not just researchers. This is what career progression within academia is about: lecturer, reader, professor and so on. What medical schools want to do is to offload the risk on to the person, and then only buy when the goods have been tasted. In a competitive world, where other career options are open, this might not work well. Worst of all, it funnels a large number of institutions — institutions that should show diversity of approaches — into the lowest common denominator of what is likely to be funded by the few central funders. Unless you have independence of mind and action, you cut your chances of changing the world. (Yes, I hear you say, there is not enough money, but most universities need to cut back on ‘volume’.)
The second issue is about whether the focus should be on schemes encouraging young people into science. I know I may sound rather curmudgeonly, but I worry that much activity relating to pursuing certain careers is reminiscent of ‘Wonga-like’ business models. I think we should do better. If youngsters look at what life is like at 40, 50 and 60 or beyond, and like it, they might move in that direction. You would not need to encourage them — we are dealing with bright people. A real problem for science funding is that for many individuals it resembles a subsistence society, with little confidence about long-term secure funding, and little resilience against changes in political will. Just look at Brexit. I remember once hearing somebody who had considered a science career telling me that it seemed to him that most academics spent their life writing grants, and feeling uncomfortable about replacing what they wanted to do with what might be funded. Conversations about funding occupied more time than serious thinking. I listened nervously.
Finally, I take no pleasure in making the point, but I do not see any reason to imagine that things will get better over a ten or twenty year period. One of my favourite quotes from the economist John Kenneth Galbraith is to the effect that the denigration of value judgement is one of the ways the scientific establishment maintains its irrelevance. I think there is a lot in that phrase. If we were to ask the question, what is more critical: understanding genetics, or understanding how institutions work, I know where my focus would be. I suspect there is more fun there too, just that much of the intellectual work might not be within academia’s walls.
Note: After writing this I worried that people would think that I was opposing schemes to encourage young people, or that I failed to understand that we have to treat those with new ideas differently. That was not my intention. Elsewhere I have quoted Christos Papadimitriou, and he gets my world view, too.
“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”
‘You Americans have the best high school education in the world. What a pity you have to go to college to get it.’ Via Alan Kay.
‘People pay a lot for a great education now, but you can become expert level on most things by looking at your phone.’
It is just nice to see it in black and white. So simple. BTW, the quote came via the wiser (not ‘smarter’) Nick Carr, who commented: ‘By “a really good life,” Altman means a virtual reality headset and an opioid prescription’.
Attention: Some slipping of the causal nexus is evident.
An article in the Economist reviewing, or at least discussing, a couple of books about the rate of innovation caught my eye, in particular a snippet that I will expand on below. The books were “The Rise and Fall of American Growth” by Robert Gordon, and “The Innovation Illusion” by Fredrik Erixon and Bjorn Weigel. I haven’t read either, but enjoyed a review of the Gordon book in the NYRB by William Nordhaus. Based on my reading of Nordhaus’s review, the issue is that the rate of innovation and productivity is declining — we are not hurtling towards any singularity — and that the century of out-of-the-ordinary innovation was 1870 to 1970. Here is Nordhaus:
Gordon focuses on growth in the United States. Living standards, as measured by GDP per capita or real wages, accelerated after 1870. The growth rate looks like an inverted U. Productivity growth rose from the late nineteenth century and peaked in the 1950s, but has slowed to a crawl since 1970. In designating 1870–1970 as the special century, Gordon emphasizes that the period since 1970 has been less special. He argues that the pace of innovation has slowed since 1970 (a point that will surprise many people), and furthermore that the gains from technological improvement have been shared less broadly (a point that is widely appreciated and true).
In the Economist article, we read:
The figures from recent years are truly dismal. Karim Foda, of the Brookings Institution, calculates that labour productivity in the rich world is growing at its slowest rate since 1950. Total factor productivity (which tries to measure innovation) has grown at just 0.1% in advanced economies since 2004, well below its historical average.
I do not find this view strange. Medical advance is slowing, not accelerating. Medicine was transformed between 1940 and 1970, but the rate of new discovery has slowed. There is more data, more activity, and more scientists, of course. And a lot more hype and university press officers. Just less advance in comparison with what went before. The same is true about university education, too.
Criticisms of these views include questions about the data used to support the various arguments. In the Economist piece, the ‘techno optimists’ make two criticisms. The second is that the ‘techno’ revolution hasn’t really started yet, but it is the first one that caught my eye:
The first is that there must be something wrong with the figures. One possibility is that they fail to count the huge consumer surplus given away free of charge on the internet. But this is unconvincing. The official figures may well be understating the impact of the internet revolution, just as they downplayed the impact of electricity and cars in the past, but they are not understating it enough to explain the recent decline in productivity growth.
Paul Mason elsewhere uses the example of Wikipedia:
Wikipedia is a non-market form of activity—it’s a $3bn hole in the advertising world.
Now, bringing this back to my own little world, I am intrigued by how the battle between free or OER material on the one hand, and books or content you have to pay for on the other, will work out. I touched on this in an article on teaching and learning several years ago, and one of the reasons I wrote the freely accessible textbook of skin cancer, www.skincancer909.com*, was out of frustration at the poor quality of dermatology textbooks targeted at medical students. When I surveyed medical students, a large fraction did not buy a dermatology textbook, yet it is clear that the university did not provide suitable alternatives, nor was it able to provide reasonable online ones. Now, I do not believe that free is always best, nor do I think that the endgame is anytime soon. But I do believe content is critical, and despair at how the med ed (medical education) world largely ignores it. There are amazing commercial books out there — think Molecular Biology of the Cell, for instance — and there is a battle to be waged over whether you invest large amounts of money in producing material used by many, or continue with the traditional approach taken by universities (those ‘bloody PowerPoints’ and dull lectures, all done on a shoestring budget).
Woodie Flowers touched on cognate issues in a critique of MOOCs and MITx:
In the United States, our “education” system is choking to death on a failed training system. Each year, 600,000 first-year college students take calculus; 250,000 fail. At $2000/failed-course, that is half-a-billion dollars. That happens to be the approximate cost of the movie Avatar, a movie that took a thousand people four years to make. Many of those involved in the movie were the best in their field. The present worth of losses of $500 million/year, especially at current discount rates, is an enormous number. I believe even a $100 million investment could cut the calculus failure rate in half.
The criticism stings because Flowers is an educational legend (it also speaks to MIT that they broadcast such critiques of their own activities). Here is Flowers again:
Properly designed new media materials can improve K–12, residential, distance, and life-long learning. In their highly developed form, these learning materials would be as elegantly produced as movies and video games and would be as engaging as a great novel.
I do not know how all of this will work out. I am intrigued by the view that we might be underestimating ‘production’ because much of it is free, but I think we are seeing real market failure, both from the commercial world and from the universities.
* Skincancer909 is due an update, and I am aiming for early 2017.
Cartoon characters not infrequently run off the edge of a cliff. Pause. They then realise there is nothing there to support their running. Time lags are awkward to deal with in any analysis, but since most things do not happen overnight, they are ubiquitous. In analysis, we replace with a fudge factor. Or we ignore them.
I haven’t seen much comment on what I think is the most interesting aspect of the science news over the last few weeks. Here is a line from THE.
Five out of nine laureates in the core prizes for physics, chemistry, medicine and economic sciences were born in the UK. All crossed the pond as rather valuable immigrants to the US.
UK science, and many UK universities, have been in profit-harvesting mode for a long time now. Over the edge of that cliff. Things are going to fall apart. OK, what the hell:
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
I have been busy updating some teaching stuff. It is never finished, but there is time for a little pause. I have completed all the SoundCloud audio answers to the questions in ed.derm.101 (Part C) and there is a ‘completed’ version of ed.derm.101 Part C halfway down the linked page. Not all the links have been checked, and a lot had to be redone because the superb New Zealand Dermnet site changed their design (the best source of dermatology images, IMHO). An example of the sort of audio material is below.
“The immune system is unknowable, dynamic, complicated, and it always surprises you.” Stephen Deeks quoted in Science. And yet, useful discoveries are made, and have been made for a long time.
First of all a link to an interview with Joi Ito and Barack Obama. The latter you may have heard about, but Ito is the head of the MIT Media Lab. Interesting, in that he has no higher ed qualifications. But then again, Jacob Bronowski was deemed ineligible for a Chair in the UK because he appeared on radio and TV. But can you really imagine this sort of thing from the bunch of tyros we call a UK government?
The second, an example of how sometimes it seems that scholarship is more in evidence on the web than within the walls of the academy. You think you understand the scurvy story, or how medical progress happens. Read on.
Apart from money, that is.
HigherEd is awash with rankings. Governments like them, and so do publishers. Just look at the THE, with its myriad of bullshit scores. The allure of bogus numbers, over judgment. A feel-good frenzy of metrics. When rankings of US colleges first came in, the assessors used to actually live on campus for a while, go to lectures, and talk to students. There was an attempt at face validity. Not any more. All you need is GIGO data, and you can sell it, or use it to buy power and kickbacks like the politicians. There was even a time when the notion of common sense mattered, but then came the RAE/REF. Then the TEF. Larry Lessig’s comment is worth repeating, again.
The best example of this — I am sure many of you are familiar with it — is the tyranny of counting in the British educational system for academics, where everything is a function of how many pages you produce that get published by journals. So your whole scholarship is around this metric which is about counting something which is relatively easy to count. All of us have the sense that this can’t be right. That can’t be the way to think about what is contributing to good scholarship.
Anyway, at the back of last week’s Economist I came across a single page advert in the ‘Courses’ section, about IMD (shown below).
I suspect I would have passed over it, except that I used to meet up from time to time with a professor of finance who lived in Edinburgh and worked at IMD. We had many conversations about teaching, and what impressed me was the focus on ‘education and teaching’ and thinking hard about how to do it better. There is a lot about MBA programmes that I do not understand, and a fair bit I am suspicious of, but I have little doubt they offer something medicine could learn from.
It piqued my interest enough to follow it up, so here is some more text from the IMD page.
At the end of September, we were informed that IMD would be included in the 2016 MBA ranking, despite the Economist’s initial agreement. This is surprising as we had not supplied any information and our participants and alumni had not been surveyed for this ranking. This contradicts the paper’s statistical method, which requires a minimum 25% survey response rate to be ranked.
Needless to say, IMD has serious reservations regarding the Economist’s methodology and its outcomes. In 2015, relative to the previous year’s ranking, LBS & IESE fell 9 places, IMD fell 11, and ESMT fell 23. Meanwhile, IE, Warwick and Macquarie all jumped up 19 places. As a result, Queensland, Warwick and Henley were ranked better than Cornell, London Business School, Carnegie Mellon and IMD!
They then go on to argue that the Economist ranking is scale dependent, and will discriminate against small schools (and, as we know, small universities like Caltech are terrible…). They finish off with:
Again, IMD was not surveyed for the 2016 ranking and did not actively participate. Given this, we do not know which data the Economist will use to establish our position. What we do know is that the Economist ranking has just lost its last bit of credibility. Unfortunately, there is little IMD can do to stop the Economist from proceeding.
Says it all. Wake up. At least the Economist accepted the money (for the advert).
“In the early 1800s, slate blackboards represented change. For centuries, students had used handheld tablets of wood or slate. Teachers moved about their classrooms, writing instructions and inspecting students’ work on individual slates. When the Scottish educational reformer James Pillans became the rector of Edinburgh High School, in 1810, his use of a blackboard was revolutionary. He explains in an 1856 memoir, Contributions to the Cause of Education:
I placed before my pupils, instead of a crowded and perplexing map, a large black board, having an unpolished non-reflecting surface, on which was inscribed in bold relief a delineation of the country, with its mountains, rivers, lakes, cities, and towns of note. The delineation was executed with chalks of different colours.
Widely recognized as the inventor of the blackboard, Pillans doesn’t specify how he constructed the apparatus… Pillans used his innovation to teach Greek as well as geography, noting,
The very novelty of all looking on one board, instead of each on his own book, had its effect in sustaining attention.”
“the state of most learning management systems is verging on embarrassing in the face of the smartphone generation” and that “grainy footage of an hour-long lecture, filmed from the back of the lecture hall, just won’t cut it”.
Simon Nelson of FutureLearn in the THE
I was sat in a meeting recently. We were discussing teaching, amongst other things. And I pencilled out what we in dermatology deliver each year, every year (subtext: I doubted that people realise how much effort and resource we need to teach clinical medicine).
We provide clinical placements for 36 weeks per year, with 12-14 students attached for each two week period, in 18 ‘cohorts’. Over the year we provide just under 400 hours of clinical seminars, in which patients appear, but are there for teaching purposes only, with the students in groups of around ten, and with the teacher having no other responsibility (they are not managing patients). In addition we provide around 1500 hours of clinical experience — timetabled events in which a student attends a session in which they are not the focus of attention. These latter sessions are real: they are timetabled by person and time, start and finish on time, and are rarely cancelled or changed.
Students like what we do, they like the online stuff, too, and the staff are enthusiastic. But this system does not run itself and, in the long term, I fear might not be sustainable, even though the funding is said to be there. It is certainly not optimal, even though our students get a better deal than students at most other UK medical schools. We need to do something else, building on what we do well. Just thinking.
In my ‘online’ textbook of ‘rashes’ (ed.derm.101), that is the non-cancer bits of dermatology, I have a chapter called ragbag. I used to have two ragbag chapters, but now by combining them, the subject has been made simpler and easier, even though the content is unchanged. I am sure students agree. I put in this chapter all the things I do not put elsewhere. I thought I should now fess up a little, as Gödel would not have said.
In ‘John Wilkins’ Analytical Language’, Borges refers to Franz Kuhn’s work on the Chinese encyclopaedia, the ‘Heavenly Emporium of Benevolent Knowledge’ (Jorge Luis Borges, Selected Non-fictions, ISBN 978-0-14-029011-0). This work is a classification of the world. For instance, you can classify the world into various groupings or categories. For animals we have:
All very straightforward stuff, as any medical student will agree. I also like the system proposed by the Bibliographical Institute of Brussels (after Borges). It parcelled the universe into 1000 subdivisions with number 262 corresponding to the Pope, 268 to Sunday Schools, 298 to Mormonism, and 179 to cruelty to animals, duelling and suicide.
We have lots of similar systems in medicine, some of which seem less fit for purpose than those described above. My incomplete categorisation of categorisation (the grant was turned down) is as follows:
Dreyfus and Dreyfus, the US philosophers and students of AI, pointed out that although we like to use rule-based systems in teaching, experts quickly forget them, and do not appear to use them (From Socrates to Expert Systems: The Limits of Calculative Rationality, 1984). We just inflict them on the young, either because that is the only way we know how to encourage learning, or because we repeat what happened to us earlier in our career without good reason. Quoting Dreyfus and Dreyfus:
The beginning student wants to do a good job, but lacking any coherent sense of the overall task, he judges his performance mainly by how well he follows his learned rules. After he has acquired more than just a few rules, so much concentration is required that his capacity to talk or listen to advice is severely limited.
They were not writing about medical students. But I recognise what is going on. I can remember it too. Classifications may or may not be useful in learning a subject, and in chunking, may provide a guide to action. But the more experience you get, the less the clinician uses them. Academics, of course, play by different rules, and get to write the rules.
They are qualified practitioners called on to make life-and-death decisions in conditions often far from ideal; simultaneously, they are treated rather as children at school, obliged to tick boxes to show progression, document feedback on performance, demonstrate written evidence of reflection, and comply with burdensome bureaucracy. Their protest is both an expression of breaking point frustration with their training and a clarion call to the country to wake up and recognise the true state of the nation’s health services.
From Neena Modi, President of the UK Royal College of Paediatrics and Child Health.
None of the above is surprising, and none of it is unusual thinking. It is just that untruth has been deemed more important than truth in postgraduate medical education. Politics reigns. BTW, written reflection continues to be the bullshit canary in the colliery of medical education.
“If we want our health care practitioners to be more humanistic, perhaps we should begin by treating them as human beings.” Sarab Sodhi, Academic Medicine.
Michael Feldstein writes about ed tech companies and Pearson:
1 Never, ever, say that you want your company to be the Uber of education, the Airbnb of education, the Pokemon GO of education, or the [insert name of tech darling] of education unless you really enjoy being a recluse (or you are secretly a double agent for your employer’s direct competitor).
2 If you absolutely must say something like the above, then do not say you are the Netflix of education. Honestly, Netflix isn’t even great at being the Netflix of movies. The last time they recommended a movie that I actually wanted to watch was…uh…never.
There is a recurring cultural fantasy that “solving” the education “problem” consists of creating a customized playlist of little content bits. So really, more like the Spotify of education, if you want to play that game. This idea enrages educators because it trivializes what they do. Nobody who has taught believes that proper sequencing of content chunks is the hard part.
“We’re starting to meta-tag all the content so it can be chopped up into component parts and reassembled on the fly for each user,” explained Hitchcock…
We’ve gone from Netflix to Google Ads. Out of the frying pan, into the fire. Can we get a spamming algorithm analogy just for good measure?
Worth a full read.
Marvin Minsky once quipped “Every educational reform is doomed to succeed”. He meant “with some students”.
“If I had to reduce all of educational psychology to just one principle, I would say this: The most important single factor influencing learning is what the learner already knows. Ascertain this and teach him accordingly.”(Ausubel, 1968 p. vi). Via Dylan Wiliam.
When I was a child, growing up in Wales, my father would express puzzlement that I didn’t seem to know how to pronounce certain words. He didn’t get that, because Welsh was his equal first tongue — but not mine — how you pronounce Welsh words was obvious to him, but not to me. For my part, it was only scores of years later that I realised some of his verbal mannerisms were not just odd idiosyncratic English or slang, but Welsh, although the meaning was clear to me. I had just not realised these were Welsh words or phrases, and of course I too would use them.
I have noted in the past that when students mispronounce some of these dreadful dermatological terms, it is a signal that they have read about a disease but have never been taught it. It signalled to me how much they were acquiring on their own. English is like that, certainly in comparison with German: until you hear a word spoken, guessing how you say it is tricky. More so when you chuck in the various languages that contribute to the dermatological lexicon — and when they are then spoken / bastardised by English speakers.
But today a student pointed out that it would be helpful to include how words are pronounced in our course material. I am not certain how to do this yet, but I can believe that not knowing how to pronounce a term might ‘inhibit’ thinking and ‘silent talking’ about the topic (I do not know whether there is any research to back this opinion up).
I wrote a post on this topic a while back, trying to map out the territory of funds going in and out of undergraduate medical education. It was a bit too rambling, but I feel it was better thought through than Jeremy Hunt’s latest slant on statistics that the Guardian (and others) reported. So here are some bullet-like points on this issue, together with some questions. The backdrop is the article I wrote before, and the issue of ‘we paid for their training so we can seize their passports’ (Phil Hammond’s ‘Hotel California clause’: ‘you can check in but you can never leave’). And because somebody asked me to spell things out a little more.
In England medical students pay 9K in fees. HEFCE (the Higher Education Funding Council for England) adds another 10K. Let’s round up and call it 20K. HEFCE also adds money beyond fees for other expensive degrees (engineering, for example), although I do not know whether it is 10K or less. This 20K goes through the universities.
Medical students do not pay their final-year fees in England, but must meet all their living costs, and 9K fees for the other 4 or 5 years. Government loans attract interest and, as others have commented, the government alters the conditions in a way that would be illegal for any bank (Gee! The government makes even the bankers look like saints). So say 40–45K in fees, plus living costs. I doubt there is much change from 100–120K. The money attracts interest and will be much larger by the time it is paid back, and will also feed into the debt of students who do not earn enough to pay back their fees.
The other funding stream is via the NHS. In England this is called SIFT (Service Increment for Teaching); in Scotland it is called ACT (Additional Cost of Teaching). This is probably in the region of 20K per student per clinical year, and is designed to meet the costs of ‘students on the wards’ and pay for the time of all the NHS staff involved in teaching. This money stays within the NHS, and the universities have essentially no access to it.
If you add these two streams together you are talking about close to 30K per year of ‘state funding’, plus 10K from the students. Living expenses are on top, and I will ignore the opportunity cost of the earnings students defer.
The problem with the 30K state-funding figure is that it fails the reality test. These sums add up to a figure (40K) close to what Stanford charges its small medical student cohort, and yet it is clear that our UK medical students get a much worse deal. Or just compare what this sort of money buys you at an expensive private school. There is a (fat) rabbit off somewhere. Nobody with any knowledge of medical education, and who isn’t playing politics, believes that is what we spend on each of our students.
Above, I said 20K goes through the universities, but I did not say that universities spend that 20K on delivering undergraduate teaching. The obvious issue is that medical research is big business, and most research loses an institution money — this is especially true of charity-funded research, the main source of medical research funding in the UK (there is an attempt to make up this deficit from QR funds, but it is grossly inadequate). Peacock’s tail, and all that. So teaching fees are used to subsidise this loss. There are good costings for this in some US schools, but they use endowments to meet the costs; in the UK we get students to pay for it. To what extent? I do not know. Do not ask, is the mantra. This will run and run. And then unwind.
What about the NHS money? Well, nothing is transparent in the NHS, but we know most of this money is not used to support teaching, but siphoned off to pay for clinical care. What proportion? I would start by saying 70% (i.e. only 30% goes on what it is intended for). So I think 18K over the whole course. But I know of no convincing data in this area, just the sort of bumph Hansard repeats, which is not reality based. Do not ask, is again the mantra.
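As a toy sanity check, the back-of-envelope arithmetic above can be tallied in a few lines. All figures are the rough guesses from this post (in £K), not official data, and the assumption of three clinical years is mine:

```python
# Back-of-envelope tally of the funding figures in the post.
# All numbers are the post's own rough estimates (in £K), not official data.

STUDENT_FEE = 9               # annual tuition fee paid by the student
HEFCE_TOP_UP = 10             # HEFCE money per student per year
SIFT_PER_CLINICAL_YEAR = 20   # NHS (SIFT/ACT) money per clinical year
CLINICAL_YEARS = 3            # assumption: 3 clinical years of a 5-year course
TEACHING_SHARE = 0.30         # the post's guess at the SIFT fraction reaching teaching

university_stream = STUDENT_FEE + HEFCE_TOP_UP         # ~19-20K per year
nhs_stream = SIFT_PER_CLINICAL_YEAR * CLINICAL_YEARS   # 60K over the course
nhs_on_teaching = nhs_stream * TEACHING_SHARE          # 18K over the course

print(f"Per-year university stream: ~£{university_stream}K")
print(f"NHS stream over the course: £{nhs_stream}K")
print(f"NHS money actually reaching teaching (at 30%): £{nhs_on_teaching:.0f}K")
```

Rounding 19K up to 20K and taking 30% of the 60K NHS stream reproduces the 18K-over-the-course figure above; change any of the guesses and the totals move accordingly.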
My previous post added in some complexities. And there are more that I have not mentioned.
The key points are:
Anyway you can still listen….
I have always thought two Irish writers great guides to life: Samuel Beckett, and Brian O’Nolan (aka Brian Ó Nualláin, Flann O’Brien, Myles na gCopaleen). Beckett’s line, on a postcard of the lithograph of him by Tom Phillips, hangs on my wall: “No matter. Try again. Fail again. Fail better.” But now I have come across something more fitting for my state, and with optimism:
“I work on, with failing mind, in other words, improved possibilities.”
Review of The Letters of Samuel Beckett, Volume IV, in the FT.
‘Innovative’ educational practice is more fashion-driven than those who attend the catwalks are.
Interesting post from Tony Bates on the history of distance learning, and the University of London External Programme, which started in 1828.
Unfortunately I have no knowledge of the individuals who originally created the University of London External Programme back in 1828. It’s a worthy research project for anyone interested in the history of distance education.
I was once (mid-1960s) a correspondence tutor for students taking undergraduate psychology courses in the External Programme. In those days, the university would publish a curriculum (a list of topics) and provide a reading list. Students could sit an exam when they felt they were ready. Students paid tutors such as myself to help them with their studies. I would find old exam papers for the course, and set questions for individual students, and they would send me their answers and I would mark them. Many students were in British Commonwealth countries and it could take weeks after students sent in their essays before my feedback eventually got back to them. Not surprisingly, in those days completion rates in the programme were very low…
But I am fascinated by (and was ignorant of) the following:
Note though that teaching and examining in the original External Programme were disaggregated (those teaching it were different from those examining it), contract tutors separate from the main faculty were used, and students studied individually and took exams when ready. So many of the ‘new’ developments in distance education, such as disaggregation, self-directed learning, and many of the elements of competency-based learning, are in fact over 150 years old.
“The original Apple Watch has 450 nits of brightness. The new one has 1,000 nits. That’s a lot of nits. In case you were wondering, a “nit” is a unit of luminance equal to one candela per square meter.”
My father used to call me a nit on a daily basis. Now I know he was being flattering. Tricky for us dermatologists, too.