Use of the term world-class. Usually means one of the following: you are lazy, corrupt, or deluded. Rarely, it means something else that in almost all instances does not need saying.
Biology is short of theory compared with physics, and medicine more so. More dull trials, and less and less insight. Busyness and project management, directed by chief executives, wielding Excel spreadsheets. Alfred G Knudson has just died and Nature’s obituary tells the story of somebody who could play at natural history and then form a majestic and testable hypothesis. The penultimate sentence reads: “[his] lack of patience for science that merely repeated the work of others kept everyone in his sphere striving for the new”
“Physicists studying sport have established that many fieldsmen are very good at catching balls, but bad at answering the question: “Where in the park will the ball land?” Good players don’t forecast the future, but adapt to it. That is the origin of the saying “keep your eye on the ball”.
As complex systems go, the interaction between the ball in flight and the moving fieldsman is still relatively simple. In principle, most of the knowledge needed to compute trajectories and devise an optimal strategy is available: we just don’t have the instruments or the time for analysis and computation. More often, the relevant information is not even potentially knowable. The skill of the sports player is not the result of superior knowledge of the future, but of an ability to employ and execute good strategies for making decisions in a complex and changing world. The same qualities are characteristic of the successful executive. Managers who know the future are more often dangerous fools than great visionaries.”
I think you could say the same about education and medicine: you can say less than you know.
Donald “D.A.” Henderson, an American epidemiologist who led the international war on smallpox that resulted in its eradication in 1980, has died.
“But it was in the fight on smallpox — perhaps the most lethal disease in history and one that killed an estimated 300 million people in the 20th century alone — that he became known around the world…”
“I think it can be fairly said that the smallpox eradication was the single greatest achievement in the history of medicine,” Richard Preston, the best-selling author of volumes including “The Hot Zone,” about the Ebola virus, and “The Demon in the Freezer,” about smallpox, said in an interview. He described Dr. Henderson as a “Sherman tank of a human being — he simply rolled over bureaucrats who got in his way.”
“Investment firm GSV Advisors recently estimated the annual global outlay on education at $5.5 trillion and growing rapidly. Let that number sink in for a second—it’s a doozy. The figure is nearly on par with the global health care industry, but there is no Big Pharma yet in education. Most of that money circulates within government bureaucracies.”
A nice dose of uncommon common sense:
“The world of learning is a failure factory, not in the positive sense of learning from failure, second chances and progress but one of selection, road blocks, disappointment, discouragement and real failure. As professionals, we seem to have lost our critical faculties, stuck in a time warp of old theory and models that were never verified in the first place; lectures, hands-up anyone, Maslow, Myers-Briggs, Learning Styles, Piaget, NLP, Kirkpatrick. This is not good enough. It introduces certainty where there is nothing but ideological belief and unverified theory and practice. We need to think critically and see failure as part of what it is to learn.”
“For a couple of months now I’ve been gathering and trudging my way through many higher education institutions’ strategic plans to think through how students’ unions might best influence such strategies in the interests of students. Even before Brexit this was a miserable task. The level of similarity is such that it would cause a Turnitin server to billow with smoke, and the daft jargon gives the overall impression that they were written by the sort of bots that are now taking the graduate level jobs previously promised to the students that fill these places.”
Two articles, from different areas. The first is from an interview with Paul Greengrass (he of ‘Bloody Sunday’ and the Bourne films).
“Youngsters starting out probably aren’t going to be supported and developed like I was in my early career, they’re much more likely to be chewed up,” he said. “This places a greater weight on universities like Kingston, which is a breeding ground for talent, to educate kids about the importance of point of view – it’s the easiest thing to lose but the most important thing to hold on to.”
The second in Science, about a likely Nobel prize winner, Rainer Weiss.
Then, in his junior year, Weiss flunked out of school entirely. He fell for a woman he met on a ferry from Nantucket to Boston. “She taught me about folk dancing and playing the piano,” he says. Weiss followed her when she moved to Evanston, Illinois, abandoning his classes in midterm. But the affair fizzled. “I fell in love and went crazy,” he says, “and of course she couldn’t stand to be around a crazy man.” Weiss returned to MIT hoping to take his finals only to find he’d flunked out.
Weiss says he was unfazed. “People say, ‘I failed out of college! My life is over!’ Well, it’s not over. It depends on what you do with it.” He took a job as a technician in MIT’s legendary Building 20, a temporary structure erected during the war, working for Jerrold Zacharias, who studied beams of atoms and molecules with light and microwaves and developed the first commercial atomic clock. Under Zacharias’s tutelage, Weiss finished his bachelor’s degree in 1955 and earned his Ph.D. in 1962.
A later quote from the same article:
After a postdoc at Princeton University developing experimental tests of gravity under physicist Robert Dicke, Weiss returned to MIT in 1964. As a junior faculty member, he says, he published little and didn’t worry about advancing his career. MIT’s Shoemaker says Weiss probably got tenure only for his teaching—and wouldn’t get it today. Bernard Burke, an emeritus physicist at MIT, agrees that early on Weiss was a “happy gadgeteer” who “wasn’t likely to get tenure unless he did something that did something.”
The echo of how he has lived some of his life is provided by one of his protégés, David Shoemaker.
Shoemaker adds that Weiss’s foremost quality is empathy. A college dropout, Shoemaker credits Weiss with getting him into graduate school at MIT without an undergraduate degree. “He sought ways to bring out the best in me,” Shoemaker says. “He also took a rather irregular path, and I think because of that and just his nature, he is really interested in helping people.”
Now, none of this is too surprising. Science and any serious intellectual or cultural endeavour is a way of constructively catching dissent. And dissent clusters: it is not uniform across society, but found on the fringes or boundaries of good sense. But we are no longer focussed on diversity or providing a garden for play. Instead, we are obsessed with homogeneity and forcing all to the mean.
Blake got it right:
The Enquiry in England is not whether a Man has Talents & Genius, But whether he is Passive & Polite & a Virtuous Ass & obedient to Noblemen’s Opinions in Art & Science. If he is, he is a Good Man. If not he must be Starved.
I think sequencing — the order — of how we put teaching together is a big issue in medical education. Not the only big issue, just one of a handful. Historically medical education hid behind the idea of education as a form of apprenticeship. There is a lot to be said for postgraduate medical education as an apprenticeship (despite the attempts by the NHS and HR (aka postgraduate deans) to kill it off). But at the undergraduate level it fails: class sizes are too large; the sense of belonging is gone; specialisation has led to lots of small attachments; and nobody has been quite certain how to deliver teaching when the responsible body is a university, but the patients are physically located in the NHS.
In some mythical far distant past, students would be lectured to, and then appear on a ward where they would be supervised, mentored, and where their progress would be monitored in real time (as in real feedback). And many of those delivering the teaching would know exactly what standards would be expected of the students. This is not how it works now. No surprises here then. Like much of modern education: it doesn’t work.
Even when I was a student the clinical attachments through years 3 and 4 would be in the mornings, with lectures in the afternoon. The problem was that the two activities were out of sync: the mornings might be spent in paediatrics, but the lectures could have been on geriatrics. The time-hallowed linkage between seeing patients and reading about them was rendered problematic. My solution was to not attend lectures. It worked for me 🙂
There were attempts to get round this problem. Dermatology teaching in Newcastle in those days was made up of 4 weeks of clinical mornings, but the lectures were delivered in another ‘out-of-phase’ period, and each afternoon, after say a lecture on psoriasis, 10–15 patients with psoriasis would appear, along with 10–15 staff who would demonstrate physical signs and patient stories to students. You needed a lot of staff, and a lot of seminar rooms, and seminar rooms close to each other (i.e. a medical school). When I was in charge, I kept that system going for a few years, but eventually we had to abandon it due to a lack of resource and central support (‘who is paying the patients’ travel expenses?’).
The prompt for all of this is merely to remark how badly we organise and deliver what we want students to know before they appear on the ‘wards’. Tech allows us to think of ways to do this that were simply impossible 20 or even 10 years ago. But it needs a sea-change in how we view medical education. And much as I fear the expropriation of medical education from the ‘ward’ (simply because bedside teaching is so expensive), we have to think hard about allowing our students to make the most of the clinical exposure we can provide.
So, we started a new academic year this morning. The first group of students — there will be another 17 groups this academic year — who will spend two weeks with us throughout the year. And what surprised me, and cheered me up enormously, was how, when medical students are given firm and coherent guides as to what to cover by themselves, they can, with little interaction, achieve so much (connectivists and social constructivists, please note). And when you then interrogate them interactively on these topics, you can feel and see them struggling (successfully) to make sense of so much new material. And with an evident sense of pleasure and achievement.
“It’s no accident that only 11% of the US workforce is passionate about their work. This is a sign of great success. This is exactly what these institutions were designed to do – suppress passion. It starts with our schools that seek to prepare us for all the other institutional environments seeking people who can reliably follow instructions and execute in a predictable manner. Think of our current institutions as powerful chisels, relentlessly chipping away at our edges until we fit neatly into the tightly defined roles that our institutions have created.”
“If we’re really going to save money in health care, it means that somebody’s going to get paid less,”
Of course, this is not always true; but then, you should never say never in medicine.
Austin Frakt, quoted in the Boston Globe
“No one should use comic sans for lectures in a university.” Student feedback. It wasn’t me!
A comment in Science
A well-stated hypothesis describes a state of nature. It is either true or not true, not subject to probability. The phrase “probability the hypothesis is true” is meaningless. One can only say, “likelihood that the observed data came from a population characterized by the hypothesis.”
I only post because I seem to spend my life trying to argue that the dismal null hypothesis is a tool for doing one type of statistics, and has a limited role in science. It has little to do with what we mean by a scientific hypothesis. There is not an infinite number of scientific hypotheses: there is not a probability distribution in the way we use this term in statistics.
The most important thing in learning is copying how other people think. I don’t think learning by doing really gets one to emulate how other people think. Marvin Minsky
“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”
This is a quote from a review of Alison Gopnik’s most recent book, ‘The Gardener and the Carpenter’. I have enjoyed Gopnik’s previous work, and this quote could, with some latitude, be applied to medical school and medical education (where students become competent despite the best intentions of the medical educators…).
It assumes that the ‘right’ parenting techniques or expertise will sculpt your child into a successful adult. But using a scheme to shape material into a product is the modus operandi of a carpenter, whose job it is to make the chair steady or the door true. There is very little empirical evidence, Gopnik says, that “small variations” in what parents do (such as whether they sleep-train) “have reliable and predictable long-term effects on who those children become”. Raising and caring for children is more like tending a garden: it involves “a lot of exhausted digging and wallowing in manure” to create a safe, nurturing space in which innovation, adaptability and resilience can thrive.
We can only grasp reality by metaphor.
The Socratic slogan, “If you understand it, you can explain it”, should be reversed: “Anyone who thinks he can fully explain his skill does not have expert understanding.” Hubert Dreyfus.
There is sometimes a prejudice in medical education that somehow teaching at the bedside is always best. Of course most medical encounters are not at the bedside (any more) simply because most clinical encounters are not on wards, but in offices, whether the offices are in hospitals or elsewhere. The arguments for the bedside include tradition, but also reflect the fear that medical education will be expropriated from the clinical context. I have a lot of sympathy with the latter view, but it will sometimes lead to error.
Yesterday, I talked about the Dermofit App, to which I contributed. One of the rationales for this whole approach, almost a dozen years ago now, was my belated realisation that clinical exposure — however intense — in dermatology might not be as efficient as a learning environment in a virtual world. In dermatology, simulation is over one and a half centuries old, and the history of this simulation tracks the development of technology. It is just that this simulation relies on something we have got used to because it is all around us: high-quality graphics. Pictures of lesions.
Several years later we published a paper, exploring this. We wrote:
“The overwhelming majority of students 82% (n = 41) did not see an example of each of the three major skin cancers (BCC, SCC, melanoma) and only a single student (2%) witnessed two examples of each. The percentage of students witnessing 1, >3 and >5 examples is given for each of the 16 lesions and demonstrates that there was not only a lack of breadth but also of depth to the students’ exposure.”
In one sense this is all very obvious. We know that (perceptual) classification tasks require practice, and that practice requires multiple training examples. The training signal-to-noise ratio can be higher in the virtual world, and it is easier to manipulate events in the virtual world. If the quip is that technology is everything that gets invented after your teenage years, we don’t recognise the obvious technology here simply because it has been around so long. It is just that silicon really allows it to be done so much better. The caveat is whether the business model allows this.
Students will prefer the clinic, for reasons I understand. But they will often be wrong to do so.
The App version of Dermofit is on the App store. Here is a link to a link. I have only just started to play around with it. The App was a commercialisation of some work we did here in Edinburgh, between myself and Bob Fisher in Informatics. If you search my main site reestheskin.me using the keyword ‘dermofit’ you will find a little more about it and the work that led up to it. It is for iPad only. [Yes, I stand to benefit from any sales, but I do not think I will be giving up the day job anytime soon]. I will write another time about the ‘why’ and rationale behind this whole approach.
The (medical) future is here, just unevenly distributed
The lessons from Glybera, the first gene therapy to be sold in Europe, still loom large. It cures a genetic condition that causes a dangerously high amount of fat to build up in the blood system. Priced at $1m, the product has only been bought once since 2012 and stands out as a commercial disaster. Economist
Roger Schank is speaking at OEB 2016, and there is a post from him here on the OEB site.
His second sentence is:
Sorry to be a downer, but technology will change nothing if what is meant by technology is that we have new ways of delivering the same old material.
He makes a number of suggestions.
He goes on to argue how AI / tech can help.
I can agree with much, if not all of this (excepting the ten year old bit, if taken too literally). And building simulators of the ‘real world’ is where we need to be. But I still wrestle with what is foundational and how much preloaded material students need in order to allow them to make sense of the real world (‘preloaded material’, yes, I can hear the hackles…)
You might divide learning into ‘just-in-time’ and ‘foundational’. Foundational rightly has a bad name because most foundational learning is not foundational at all, but often reflects the prejudices of those who benefit from selling particular content. Medical degrees are stuffed full of foundational stuff that is nothing of the kind.
In medicine, you learn your craft by doing it, by seeing patients and diagnosing their ailments in the company of experts, and seeing what happens to them (supplemented by learning about what has happened to others); and by going back to the books and foundational concepts continually. But there is a framework that creates the clinical worldview, and that worldview has a language, that requires immersion.
One (and only one) key goal of medical school is to enable you to function in a clinical environment such that you can make sense of it, and learn from it. But that is not available to a novice, however bright, without a lot of ‘preloaded’ baggage. The question is about what the balance is between ‘preloaded’ and ‘just-in-time’. I think we obsess over the former and need to shift much more to the latter. But I do not know exactly how it will look in the end, although I know the direction of travel that is needed.
And yes, tech can help this, but not when PowerPoint is involved. He is right there, and has been for a long time.
[Alison Gopnik, in one of the John Brockman edited books (I think), remarked that although she had spent much of her professional life studying how babies make sense of the world, little of these learning insights made it into how she delivered material to her university students or how they learned. Three years of lectures on cracking eggs, and then in the fourth, you get to do it.]