Another terrific bit of writing by Hugh Pennington in the LRB. It is saturated with insights into a golden age of medical science.
Streptococcus pyogenes is also known as Lancefield Group A (GAS). In the 1920s and 1930s, at the Rockefeller Institute in New York, Rebecca Lancefield discovered that streptococci could be grouped at species level by a surface polysaccharide, A, and that A strains could be subdivided by another surface antigen, the M protein.
Ronald Hare, a bacteriologist at Queen Charlotte’s Hospital in London, worked on GAS in the 1930s, a time when they regularly killed women who had just given birth and developed puerperal fever. He collaborated with Lancefield to prove that GAS was the killer. On 16 January 1936 he pricked himself with a sliver of glass contaminated with a GAS. After a day or two his survival was in doubt.
His boss, Leonard Colebrook, had started to evaluate Prontosil, a red dye made by I.G. Farben that prevented the death of mice infected with GAS. He gave it to Hare by IV infusion and by mouth. It turned him bright pink. He was visited in hospital by Alexander Fleming, a former colleague. Fleming said to Hare’s wife: ‘Hae ye said your prayers?’ But Hare made a full recovery.
Prontosil also saved women with puerperal fever. The effective component of the molecule wasn’t the dye, but another part of its structure, a sulphonamide. It made Hare redundant. The disease that he had been hired to study, funded by an annual grant from the Medical Research Council, was now on the way out. He moved to Canada where he pioneered influenza vaccines and set up a penicillin factory that produced its first vials on 20 May 1944. [emphasis added]
He returned to London after the war and in the early 1960s gave me a job at St Thomas’s Hospital Medical School. I wasn’t allowed to work on GAS. There wasn’t much left to discover about it in the lab using the techniques of the day, and penicillin was curative.
John Naughton’s Quote of the Day
“If people don’t believe mathematics is simple, it is only because they don’t realise how complicated life is.”
John von Neumann
Nature today revisits the publication fifty years ago of The Limits to Growth by the System Dynamics group at the Massachusetts Institute of Technology. The journal (then) referred to it as another whiff of doomsday.
Zoologist Solly Zuckerman, a former chief scientific adviser to the UK government, said: “Whatever computers may say about the future, there is nothing in the past which gives any credence whatever to the view that human ingenuity cannot in time circumvent material human difficulties.”
Which is surely akin to saying, there is nothing in the past to suggest we are already extinct. As for computers, there are three branches of science: theory, experiment, and computation. The song goes:
You’re obsolete, my baby
My poor old fashioned baby
I said, baby, baby, baby, you’re out of time
Well, baby, baby, baby, you’re out of time
I said, baby, baby, baby, you’re out of time
Yes you are left out, out of there without a doubt
‘Cause baby, baby, baby, you’re out of time
I came across this article by Zadie Smith via John Naughton.
Magical thinking is a disorder of thought. It sees causality where there is none, confuses private emotion with general reality, imposes—as Didion has it, perfectly, in “The White Album”—“a narrative line upon disparate images.” But the extremity of mourning aside, it was not a condition from which she generally suffered. Didion’s watchword was watchword. She was exceptionally alert to the words or phrases we use to express our core aims or beliefs. Alert in the sense of suspicious. Radically upgrading Hemingway’s “bullshit detector,” she probed the public discourse, the better to determine how much truth was in it and how much delusion. She did that with her own sentences, too. [emphasis added]
I wasn’t familiar with this word although even with my smattering of German (as in, I do violence to the language) I could hazard a guess.
The source was an article in Der Spiegel about Germans who have moved to Bulgaria to get away from Covid restrictions.
The apartment complex in the town of Aheloy is considered a stronghold of German-speaking corona truthers and so-called “Querdenker,” that hodgepodge of anti-government conspiracy theorists who have waged an ongoing campaign against all measures aimed at combatting the pandemic.
If you look it up in the Collins online dictionary you find:
Querdenker MASCULINE NOUN, Querdenkerin FEMININE NOUN: lateral thinker
Sounds like a great opener for an essay on epistemology 101.
Before it comes to that, we have another question: Does the unofficial Château boss describe himself as a Querdenker? The term, which, pre-COVID, used to be reserved in Germany for those who think outside the box, “has lost its original meaning,” Gelbrecht says. “True Querdenker were people like Albert Einstein or Stephen Hawking.” He primarily views himself as a savior for the desperate. “Many Germans are growing increasingly concerned that they will be excluded if they don’t get vaccinated, that they will no longer be able to take part in society and that they will be forced to have their children vaccinated.”
From a review of “The Ascent of Information” by Caleb Scharf.
Every cat GIF shared on social media, credit card swiped, video watched on a streaming platform, and website visited add more data to the mind-bending 2.5 quintillion bytes of information that humans produce every single day. All of that information has a cost: Data centers alone consume about 47 billion watts, equivalent to the resting metabolism of more than a tenth of all the humans on the planet.
Scharf begins by invoking William Shakespeare, whose legacy permeates the public consciousness more than four centuries after his death, to show just how powerful the dataome can be. On the basis of the average physical weight of one of his plays, “it is possible that altogether the simple act of human arms raising and lowering copies of Shakespeare’s writings has expended over 4 trillion joules of energy,” he writes. These calculations do not even account for the energy expended as the neurons in our brains fire to make sense of the Bard’s language.
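Scharf's number is hard to verify, but a rough back-of-envelope check shows it is at least plausible. Every figure below (book weight, lift height, number of copies and handlings) is my own illustrative assumption, not Scharf's:

```python
# Back-of-envelope check of the "over 4 trillion joules" claim.
# All inputs are illustrative assumptions, not Scharf's actual figures.
g = 9.81               # m/s^2, gravitational acceleration
book_mass_kg = 0.5     # assumed mass of a printed volume of the plays
lift_height_m = 0.5    # assumed average height a copy is raised
copies = 4e9           # assumed copies printed over four centuries
lifts_per_copy = 500   # assumed times each copy is picked up

energy_per_lift = book_mass_kg * g * lift_height_m   # ~2.45 J per lift
total_joules = energy_per_lift * copies * lifts_per_copy
print(f"{total_joules:.2e} J")
```

With these (generous) guesses the total lands in the 10^12 to 10^13 joule range, the same order of magnitude as Scharf's estimate.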
From time to time, I vow not to read any more comments on the FT website. Trolls aside, I clearly live in a different universe. But then I return. It is indeed a signal-to-noise problem, but one in which the weighting has to be such that the fresh shoots are not overlooked. I know nothing about Paul A Myers, and I assume he lives in the US, but over the years he has given me pause for thought on many occasions. One recent example below.
Science-based innovation largely comes out of the base of 90 research universities. One can risk an over-generalization and say there are no “universities” in a non-constitutional democratic country, or authoritarian regime. Engineering institutes maybe, but not research universities. Research is serendipity and quirky; engineering is regular and reliable. Engineering loves rules; research loves breaking them. The two fields are similar but worship at different altars.
This contrast is also true of medicine and science. Medicine is regulated to hell and back — badly, often — but I like my planes that way too. But, in John Naughton’s words, if you want great research, buy Aeron chairs, and hide the costs off the balance sheet lest the accountants start discounting all the possible futures.
It is one of dermatology’s tedious and fun facts that, in contradistinction to say scabies or head lice, you treat body lice by treating not the patient (directly) but the clothing. The pharmacological agent is a washing machine. But the excerpt quoted below tells you something wonderful about science: you get things out that you never expected. Spontaneous generation — not in the real world — but in the world of ideas. Well, almost.
Scientific efforts to shed light on the prehistory of clothes have received an unexpected boost from another line of research, the study of clothing lice, or body lice. These blood-sucking insects make their home mainly on clothes and they evolved from head lice when people began to use clothes on a regular basis. Research teams in Germany and the United States analysed the genomes of head and clothing lice to estimate when the clothing parasites split from the head ones. One advantage of the lice research is that the results are independent from other sources of evidence about the origin of clothes, such as archaeology and palaeoclimatology. The German team, led by Mark Stoneking at the Max Planck Institute for Evolutionary Anthropology, came up with a date of 70,000 years ago, revised to 100,000 years ago, early in the last ice age. The US team led by David Reed at the University of Florida reported a similar date of around 80,000 years ago, and maybe as early as 170,000 years ago during the previous ice age. These findings from the lice research suggest that our habit of wearing clothes was established quite late in hominin evolution.
Ever since I read of how Gödel’s work has rendered decades of work by Bertrand Russell and others void, Gödel has fascinated me. Not that I can follow the raw proofs. But his work speaks of a wonderful Platonic world that is hard not to fall in love with. Two quotes: the first is new to me, the second, sadly not.
Einstein sponsored his US citizenship, which Gödel almost torpedoed by telling the judge that he had found a logical inconsistency in the constitution that would allow a person to establish a dictatorship in America.
His end, when it came, was tragic. His paranoia grew and he became convinced that his food was being poisoned. When this had happened earlier in his life, his wife had managed to taste test and spoon feed him to health but this time she too was ill and in January 1978, he died in hospital, curled into a foetal position and weighing only 65 pounds.
Terrific article from Jonathan Flint in the LRB. He is an English psychiatrist and geneticist (mouse models of behaviour) based at UCLA, but, like many, has turned his hand to other domains (beyond depression). He writes about Covid-19:
Perhaps the real problem is hubris. There have been so many things we thought we knew but didn’t. How many people reassured us Covid-19 would be just like flu? Or insisted that the only viable tests were naso-pharyngeal swabs, preferably administered by a trained clinician? Is that really the only way? After all, if Covid-19 is only detectable by sticking a piece of plastic practically into your brain, how can it be so infectious? We still don’t understand the dynamics of virus transmission. We still don’t know why around 80 per cent of transmissions are caused by just 10 per cent of cases, or why 2 per cent of individuals carry 90 per cent of the virus. If you live with someone diagnosed with Covid-19, the chances are that you won’t be infected (60 to 90 per cent of cohabitees don’t contract the virus). Yet in the right setting, a crowded bar for example, one person can infect scores of others. What makes a superspreader? How do we detect them? And what can we learn from the relatively low death rates in African countries, despite their meagre testing and limited access to hospitals?
That we are still scrambling to answer these questions is deeply worrying, not just because it shows we aren’t ready for the next pandemic. The virus has revealed the depth of our ignorance when it comes to the biology of genomes. I’ve written too many grant applications where I’ve stated confidently that we will be able to determine the function of a gene with a DNA sequence much bigger than that of Sars-CoV-2. If we can’t even work out how Sars-CoV-2 works, what chance do we have with the mammalian genome? Let’s hope none of my grant reviewers reads this.
Medicine is always messier than people want to imagine. It is a hotchpotch of kludges. For those who aspire to absolute clarity, it should be a relief that we manage effective action based on such a paucity of insight. Cheap body-hacks sometimes work. But the worry remains.
Tyler Cowen says some interesting things in a Bloomberg article, Why Economics Is Failing Us. I don’t think his comments are limited to the economics domain.
Economics is one of the better-funded and more scientific social sciences, but in some critical ways it is failing us. The main problem, as I see it, is standards: They are either too high or too low. In both cases, the result is less daring and creativity.
Consider academic research. In the 1980s, the ideal journal submission was widely thought to be 17 pages, maybe 30 pages for a top journal. The result was a lot of new ideas, albeit with a lower quality of execution. Nowadays it is more common for submissions to top economics journals to be 90 pages, with appendices, robustness checks, multiple methods, numerous co-authors and every possible criticism addressed along the way.
There is little doubt that the current method yields more reliable results. But at what cost? The economists who have changed the world, such as Adam Smith, John Maynard Keynes or Friedrich Hayek, typically had brilliant ideas with highly imperfect execution. It is now harder for this kind of originality to gain traction. Technique stands supreme and must be mastered at an early age, with some undergraduates pursuing “pre-docs” to get into a top graduate school.
Sam Shuster, before I departed to Strasbourg, warned me in a similar vein, with reference to the Art of War by Sun Tzu:
Even the mystique of wisdom turns out to be technique. But if today must be learning technique, don’t leave the tomorrow of discovery too long.
I would say I heard the message but didn’t listen carefully enough. As befits an economist, Cowen warns us that there is no free lunch.
For some reason — COVID of course — I keep coming back to perhaps the greatest vaccine success ever: the eradication of smallpox (here, here and here). But the figures quoted by Scott Galloway make you sit up and notice both the magic — and value — of science.
International bodies are immolated. Consider the World Health Organisation. Mr Trump’s decision to pull America out of the WHO in the midst of a pandemic (reversed under President Joe Biden) was galling, particularly as the WHO is responsible for one of humanity’s greatest public-health accomplishments: the eradication of smallpox in the 1970s. To appreciate the magnitude of this, Google images of “smallpox” and glimpse the horror that once killed millions each year. It was a victory for co-operative, state-funded projects and it cost a mere $300m. By one estimate, America, the largest contributor, recoups that value every 26 days from savings in vaccinations, lost work and other costs. [emphasis added]
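The “recoups that value every 26 days” figure implies a striking annual return, which is easy to check. The arithmetic below simply restates the quoted numbers; the 365-day year is my only assumption:

```python
# What annual saving does "recoups $300m every 26 days" imply?
cost_usd = 300e6          # one-off cost of eradication, as quoted
payback_days = 26         # quoted payback period
daily_saving = cost_usd / payback_days   # ~ $11.5 million per day
annual_saving = daily_saving * 365       # ~ $4.2 billion per year
print(f"${annual_saving / 1e9:.1f}bn per year")  # prints $4.2bn per year
```

In other words, by this estimate the one-off $300m investment pays for itself roughly fourteen times over every single year.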
Thirty years ago [now 40], scientists who studied climate change, and I am one of them, tended to have long hair and very colourful socks. We were regarded as harmless but irrelevant. But the serendipitous investment in their work revealed processes that we now recognise as threatening the future of human society, and the successors to those scientists are playing a crucial role in assessing how we need to adapt.
I think you could see the same dress sense in the golden ages of molecular biology and computing.
Another snippet from a wonderful article (and previous aside).
Or not, as the case may be.
Smallpox is the greatest success story in the history of medicine. It once took huge numbers of lives — as many as half a billion people in the 20th century alone — and blinded and disfigured many more.
So writes the distinguished historian of science, Steven Shapin, in the LRB (A Pox on the Poor, February 4, 2021). He is reviewing The Great Inoculator: The Untold Story of Daniel Sutton and His Medical Revolution.
In historical times you had a one in three chance of getting smallpox, and, if you got it, the case-fatality was 20%. Some outbreaks, however, had a case-fatality of 50% and, unlike Covid-19, its preferred targets were children.
My exposure to smallpox was (thankfully) limited. My mother told me that there was an outbreak in South Wales and the West of England when I was around five or six. There was widespread vaccination, but kids like me with bad eczema, were spared, with the parent advised to ‘shield’ the child indoors (my sympathies go to my mother). (The risk was from the grandly named, but sometimes fatal, Kaposi’s varicelliform reaction, which was due to the vaccinia virus — not smallpox — running riot on the diseased skin).
As a med student, I remember learning how to distinguish the cutaneous lesions of smallpox from chicken pox. Smallpox was declared eradicated in 1980, but, as a dermatology registrar, seeing the occasional adult with chickenpox who seemed so ill (in comparison with kids), I often had doubts that I had to reason away. Perhaps those stores held by the US and USSR were not so secure…
Jenner and smallpox vaccination go together in popular accounts, but the history of this particular clinical discovery is much older and richer — at least to me.
As ever, in medicine, and in particular for anything involving the skin, the terminology is confusing. The Latin name for cowpox is Variolae vaccinae, meaning the pox from the cow (vacca). It was Pasteur who subsequently honoured Jenner by deciding that all forms of inoculation be called vaccination.
Edward Jenner took advantage of the already-known fact that whilst milkmaids tended to be afflicted with the far milder cowpox virus, they rarely suffered from the more serious smallpox (they are different, but related, viruses). Jenner, in 1796, inoculated an eight-year-old boy with the pus from a milkmaid’s cowpox sore. After being exposed to smallpox material the boy appeared immune, in that he did not suffer adversely when subsequently exposed to smallpox.
Once Jenner’s method was accepted as safe, Acts of Parliament introduced free vaccination in 1840, and vaccination became obligatory in 1853.
I had never been quite certain of the distinction between inoculation and vaccination, but there is history here too. Shapin writes that the term inoculation was borrowed from horticulture — the grafting of a bud (or ‘eye’) to propagate a plant (something I was taught how to do in horticulture lessons when I was aged 11, in school in Cardiff, by Brother Luke, who, I thought, was so old that he might have been a contemporary of Jenner). Why the name is apt is explained below.
Before vaccination, inoculation was actually meant to give you a direct form of smallpox (this was also referred to as variolation, after variola, the term for smallpox). The source material, again, was from a lesion of somebody with smallpox. The recipient, it was hoped, would develop a milder version of smallpox. Shapin writes:
The contract with the inoculator was to accept a milder form of the disease, and a lower chance of death, in exchange for a future secured from the naturally occurring disease, which carried a high chance of killing or disfiguring.
Shapin tells us that Lady Mary Wortley Montagu, when holed up with her husband in Constantinople in 1717, heard stories about how such ‘in-grafting’ was in widespread use by the ‘locals’. She was scarred from smallpox, and therefore she had the procedure carried out on her then five-year-old son. The needle was blunt and rusty, but her son suffered just a few spots and the procedure was judged a success. He was now immune to smallpox.
Not surprisingly, the story goes back further: inoculation was folk medicine practice in Pembrokeshire as early as 1600, and the Chinese had been blowing dried, ground-up smallpox material up the nose for many centuries.
The London medical establishment were apparently not too impressed with the non-British origin of such scientific advance, nor its apparent simplicity (and hence low cost). So they made the procedure much more complicated, with specific diets required, along with advice on behaviour, and, of course, blood-lettings and laxatives, all in addition to not just a ‘prick’ but a bigger incision (payment by the inch). The physicians’ fees no doubt rose in parallel. Not a bad business model, until…
The London physicians’ ‘add-ons’ put the treatment costs of inoculation out of reach of most of the population, restricting it, for decades, to the ‘medical carriage trade’. Along comes Daniel Sutton, a provincial barber-surgeon, with no Latin or Greek, no doubt, who effectively industrialised the whole process, making it both more profitable and cheaper for the customer.
Based in a village near Chelmsford, he inoculated tens of thousands locally. The method was named after him, the Suttonian Method. On one day he inoculated 700 persons. Incisions (favoured by the physicians) were replaced by the simpler prick, and patients were not confined, but instead told to go out in the fresh air (day-case, anybody?). Product differentiation was of course possible: spa-like pampering in local accommodation was available for the top end of the market, with a set of sliding fees depending on the degree of luxury.
Splitting the market and niche pricing were aspects of Sutton’s business success, but so too was control of supply. The extended Sutton clan could satisfy a significant chunk of provincial demand, but Daniel also worked out a franchising system, which ‘authorised’ more than fifty partners throughout Britain and abroad to advertise their use of the ‘Suttonian System’ — provided they paid fees for Sutton-compounded purgatives, kicked back a slice of their take, and kept the trade secrets. Control was especially important, since practically anyone could, and did, set up as an inoculator. The Suttons themselves had become surgeons through apprenticeship, but apothecaries, clergymen, artisans and farmers were inoculating, and sometimes parents inoculated their own children. The profits of the provincial press were considerably boosted as practitioners advertised their skills at inoculation and their keen prices. Daniel went after competitors — including his own father-in-law and a younger brother — with vigour, putting it about that the Suttonian Method depended on a set of closely held secrets, to which only he and his approved partners had access. His competitors sought to winkle out these secrets, occasionally pouncing on Sutton’s patients to quiz them about their experiences.
Sutton admitted that he had ‘lost’ five out of forty thousand patients (due to smallpox). He offered a prize of 100 guineas to anybody who could prove he ever lost a patient due to inoculation, or that any patient got smallpox a second time. More confident, and perhaps more generous, than modern Pharma, I think. By 1764, Sutton had an annual income of 6300 guineas — over one million sterling in today’s money.
In the middle of the pandemic, I got an e-mail asking whether I had access to data from the experiments behind a paper I’d published in 2014. Three months later, I requested that the paper be retracted. The experience has not left me bitter: if anything, it brought me back to my original motivation for doing research.
This is a disarmingly honest piece (in the journal Nature) about how mistakes in the analysis of complicated data sets caused inappropriate conclusions, leading, in this case, to a retraction of a manuscript.
As a student, I was even told never to attempt to replicate before I publish. That is not a career I would want — luckily, my PhD adviser taught me the opposite.
John Ziman warned over 20 years ago in Real Science that we were entering post-academic science. Here are some words of his from an article in Nature.
What is more, science is no longer what it was when [Robert] Merton first wrote about it. The bureaucratic engine of policy is shattering the traditional normative frame. Big science has become a novel way of life, with its own conventions and practices. What price now those noble norms? Tied without tenure into a system of projects and proposals, budgets and assessments, how open, how disinterested, how self-critical, how riskily original can one afford to be?
As the economists are fond of saying: institutions matter. As do incentives. Precious metals can be corrupted, and money — in the short term — made.
Mark Zuckerberg is what happens when you replace civics with computer science.
In the shower, all ideas look good.
A comment about the above article:
As a full professor in a similar situation, a humanities department in a British teaching factory (sorry, major research university), I completely agree with Musidorus.
Ironically, some old gender stereotypes may now be helping girls. When girls are toddlers they are read to more than boys. Their fathers are five times more likely to sing or whistle to them and are more likely to speak to them about emotions, including sadness. Their mothers are more likely to use complex vocabulary with them. Most of this gives girls a leg up in a world that increasingly prizes “soft skills”. Girls still have less leisure time than boys, but nowadays that is primarily because they spend more time on homework and grooming, rather than an unfair division of chores. And in the time left for themselves they have far more freedom.
“The one good thing about COVID-19 is that it’s good for nature and the environment and dolphins,” says Sarah, “but I wish it wouldn’t kill so many people in the process.”
He had little respect for the professors of his time, telling a friend in 1735 that “there is nothing to be learnt from a Professor, which is not to be met with in Books”. He did not graduate.
Nor did I ever submit my PhD. As David Hubel once said, the great advantage of an MD degree was (then) being able to avoid having to gain a PhD credential.
For decades, Katalin Karikó’s work into mRNA therapeutics was overlooked by her colleagues. Now it’s at the heart of the two leading coronavirus vaccines
By the mid-1990s, Karikó’s bosses at UPenn had run out of patience. Frustrated with the lack of funding she was generating for her research, they offered the scientist a bleak choice: leave or be demoted. It was a demeaning prospect for someone who had once been on the path to a full professorship. For Karikó’s dreams of using mRNA to create new vaccines and drugs for many chronic illnesses, it seemed to be the end of the road… “It was particularly horrible as that same week, I had just been diagnosed with cancer,” said Karikó. “I was facing two operations, and my husband, who had gone back to Hungary to pick up his green card, had got stranded there because of some visa issue, meaning he couldn’t come back for six months. I was really struggling, and then they told me this.”
Karikó has been at the helm of BioNTech’s Covid-19 vaccine development. In 2013, she accepted an offer to become Senior Vice President at BioNTech after UPenn refused to reinstate her to the faculty position she had been demoted from in 1995. “They told me that they’d had a meeting and concluded that I was not of faculty quality,” she said. “When I told them I was leaving, they laughed at me and said, ‘BioNTech doesn’t even have a website.’”
Donald Knuth is a legend amongst computer scientists.
I have been a happy man ever since January 1, 1990, when I no longer had an email address. I’d used email since about 1975, and it seems to me that 15 years of email is plenty for one lifetime. Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration. I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don’t have time for such study. [emphasis added]
I retired early because I realized that I would need at least 20 years of full-time work to complete The Art of Computer Programming (TAOCP), which I have always viewed as the most important project of my life.
Being a retired professor is a lot like being an ordinary professor, except that you don’t have to write research proposals, administer grants, or sit in committee meetings. Also, you don’t get paid.
My full-time writing schedule means that I have to be pretty much a hermit. The only way to gain enough efficiency to complete The Art of Computer Programming is to operate in batch mode, concentrating intensively and uninterruptedly on one subject at a time, rather than swapping a number of topics in and out of my head. I’m unable to schedule appointments with visitors, travel to conferences or accept speaking engagements, or undertake any new responsibilities of any kind.
John Baez is indeed a relative of that other famous J(oan) Baez. I used to read his blog avidly.
The great challenge at the beginning of one’s career in academia is to get tenure at a decent university. Personally I got tenure before I started messing with quantum gravity, and this approach has some real advantages. Before you have tenure, you have to please people. After you have tenure, you can do whatever the hell you want — so long as it’s legal and you teach well, your department doesn’t put a lot of pressure on you to get grants. (This is one reason I’m happier in a math department than I would be in a physics department. Mathematicians have more trouble getting grants, so there’s a bit less pressure to get them.)
The great thing about tenure is that it means your research can be driven by your actual interests instead of the ever-changing winds of fashion. The problem is, by the time many people get tenure, they’ve become such slaves of fashion that they no longer know what it means to follow their own interests. They’ve spent the best years of their life trying to keep up with the Joneses instead of developing their own personal style! So, bear in mind that getting tenure is only half the battle: getting tenure while keeping your soul is the really hard part. [emphasis added]
Scientists straying from their field of expertise in this way is an example of what Nathan Ballantyne, a philosopher at Fordham University in New York City, calls “epistemic trespassing”. Although scientists might romanticize the role and occasional genuine insight of an outsider — such as the writings of physicist Erwin Schrödinger on biology — in most cases, he says, such academic off-piste manoeuvrings dump non-experts head-first in deep snow. [emphasis added]
But I do love the language…
Susan Haack is a wonderfully independent English-born philosopher who loves to roam, casting light wherever her interest takes her.
Better ostracism than ostrichism
Moreover, I have learned over the years that I am temperamentally resistant to bandwagons, philosophical and otherwise; hopeless at “networking,” the tit-for-tat exchange of academic favors, “going along to get along,” and at self-promotion
That I have very low tolerance for meetings where nothing I say ever makes any difference to what happens; and that I am unmoved by the kind of institutional loyalty that apparently enables many to believe in the wonderfulness of “our” students or “our” department or “our” school or “our” university simply because they’re ours.
Nor do I feel what I think of as gender loyalty, a sense that I must ally myself with other women in my profession simply because they are women—any more than I feel I must ally myself with any and every British philosopher simply because he or she is British. And I am, frankly, repelled by the grubby scrambling after those wretched “rankings” that is now so common in philosophy departments. In short, I’ve never been any good at academic politicking, in any of its myriad forms.
And on top of all this, I have the deplorable habit of saying what I mean, with neither talent for nor inclination to fudge over disagreements or muffle criticism with flattering tact, and an infuriating way of seeing the funny side of philosophers’ egregiously absurd or outrageously pretentious claims — that there are no such things as beliefs, that it’s just superstitious to care whether your beliefs are true, that feminism obliges us to “reinvent science and theorizing,” and so forth.
From a wonderful article in the Economist
As Michael Massing shows vividly in “Fatal Discord: Erasmus, Luther and the Fight for the Western Mind” (2018), the growing religious battle destroyed Erasmianism as a movement. Princes had no choice but to choose sides in the 16th-century equivalent of the cold war. Some of Erasmus’s followers reinvented themselves as champions of orthodoxy. The “citizen of the world” could no longer roam across Europe, pouring honeyed words into the ears of kings. He spent his final years holed up in the free city of Basel. The champion of the Middle Way looked like a ditherer who was incapable of making up his mind, or a coward who was unwilling to stand up to Luther (if you were Catholic) or the pope (if you were Protestant).
The test of being a good Christian ceased to be decent behaviour. It became fanaticism: who could shout most loudly? Or persecute heresy most vigorously? Or apply fuel to the flames most enthusiastically?
And in case there is any doubt about what I am talking about.
In Britain, Brexiteers denounce “citizens of the world” as “citizens of nowhere” and cast out moderate politicians with more talent than they possess, while anti-Brexiteers are blind to the excesses of establishment liberalism. In America “woke” extremists try to get people sacked for slips of the tongue or campaign against the thought crimes of “unconscious bias”. Intellectuals who refuse to join one camp or another must stand by, as mediocrities are rewarded with university chairs and editorial thrones. [emphasis added]
As Erasmus might have said: ‘Amen’.
The history of science is the history of rejected ideas (and manuscripts). One example I always come back to is the original work of John Wennberg and colleagues on spatial differences in ‘medical procedures’ and the idea that it is not so much medical need that dictates the number of procedures, but the supply of medical services. Simply put: the more surgeons there are, the more procedures are carried out1. The deeper implication is that many of these procedures are not medically required — it is just the billing that is needed: surgeons have mortgages and tuition loans to pay off. Wennberg and colleagues at Dartmouth have subsequently shown that a large proportion of the medical procedures or treatments that doctors undertake are unnecessary2.
Wennberg’s original manuscript was rejected by the New England Journal of Medicine (NEJM) but subsequently published in Science. Many of us would rate Science above the NEJM, but there is a lesson here about signal and noise, and how many medical journals in particular obsess over procedure and status at the expense of nurturing originality.
Angus Deaton and Anne Case, two economists, the former with a Nobel Prize to his name, tell a similar story. Their recent work has been on the so-called Deaths of Despair — where mortality rates for subgroups of the US population have increased3. They relate this to educational levels (the effects are largely on those without a college degree) and other social factors. The observation is striking for an advanced economy (although Russia had historically seen increased mortality rates after the collapse of communism).
Coming back to my opening statement, Deaton is quoted in the THE
The work on “deaths of despair” was so important to them that they [Deaton and Case] joined forces again as research collaborators. However, despite their huge excitement about it, their initial paper, sent to medical journals because of its health focus, met with rejections — a tale to warm the heart of any academic whose most cherished research has been knocked back.
When the paper was first submitted it was rejected so quickly that “I thought I had put the wrong email address. You get this ping right back…‘Your paper has been rejected’.” The paper was eventually published in Proceedings of the National Academy of Sciences, to a glowing reception. The editor of the first journal to reject the paper subsequently “took us for a very nice lunch”, adds Deaton.
Another medical journal rejected it within three days with the following justification:
The editor, he says, told them: “You’re clearly intrigued by this finding. But you have no causal story for it. And without a causal story this journal has no interest whatsoever.”
(‘no interest whatsoever’ — the arrogance of some editors).
Deaton points out that this is a problem not just for medical journals but in economics journals, too; he thinks the top five economics journals would have rejected the work for the same reason.
“That’s the sort of thing you get in economics all the time,” Deaton goes on, “this sort of causal fetish… I’ve compared that to calling out the fire brigade and saying ‘Our house is on fire, send an engine.’ And they say, ‘Well, what caused the fire? We’re not sending an engine unless you know what caused the fire.’”
It is not difficult to see the reasons for the fetish on causality. Science is not just a loose-leaf book of facts about the natural or unnatural world, nor is it just about A/B testing or theory-free RCTs, or even just ‘estimation of effect sizes’. Science is about constructing models of how things work. But sometimes the facts are indeed so bizarre in the light of previous knowledge that you cannot ignore them, because without these ‘new facts’ you cannot build subsequent theories. Darwin and much of natural history stands as an example here, but my personal favourite is that provided by the great biochemist Erwin Chargaff in the late 1940s. Wikipedia describes the first of his ‘rules’.
The first parity rule was that in DNA the number of guanine units is equal to the number of cytosine units, and the number of adenine units is equal to the number of thymine units.
Now, this is in one sense a simple observation (C=G and A=T), with no causal theory. But run the clock on to Watson and Crick (and others), and see how this ‘fact’ gestated an idea that changed the world.
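Chargaff’s first parity rule follows directly from base pairing, which is part of why the bare ‘fact’ was so pregnant with theory. A minimal sketch (the sequence here is an arbitrary example of my own, not real data) shows that for any duplex, counting both strands, A must equal T and C must equal G:

```python
from collections import Counter

# Watson-Crick complement: A<->T, C<->G
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def duplex_base_counts(strand: str) -> Counter:
    """Count bases across both strands of a DNA duplex.

    Because every A on one strand pairs with a T on the other,
    and every C with a G, the duplex as a whole always satisfies
    #A == #T and #C == #G, whatever the strand's sequence is.
    """
    other = strand.translate(COMPLEMENT)
    return Counter(strand) + Counter(other)

counts = duplex_base_counts("ATTACGGGCAT")  # arbitrary toy sequence
assert counts["A"] == counts["T"]
assert counts["C"] == counts["G"]
```

The parity holds for any input strand, which is exactly what Chargaff measured chemically, without yet knowing the pairing mechanism that explains it.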
From this week’s Economist | Breaking through
Yet far too little capital is being channelled into innovation. Spending on R&D has three main sources: venture capital, governments and energy companies. Their combined annual investment into technology and innovative companies focused on the climate is over $80bn. For comparison, that is a bit more than twice the R&D spending of a single tech firm, Amazon.
Market and state failure may go together. Which brings me back to Stewart Brand’s idea of Pace Layering
Education is intellectual infrastructure. So is science. They have very high yield, but delayed payback. Hasty societies that can’t span those delays will lose out over time to societies that can. On the other hand, cultures too hidebound to allow education to advance at infrastructural pace also lose out.
I won’t even mention COVID-19.
I came across a note in my diary from around fifteen years ago. It was (I assume) after receiving a grant rejection. For once, I sort of agreed with the funder’s decision1. I wrote:
My grant was trivial, at least in one sense. Niels Bohr always said (or words to the effect) that the job of science was to reduce the profound to the trivial. The ‘magical’ would be made the ordinary of the everyday. My problem was that I started with the trivial.
As for the merits of review: It’s the exception that proves the rule.
Nice article in the Economist on how our ideas about speciation have been revised and updated. And not just for those animals but for humans too. In their words:
To be human, then, is to be a multispecies mongrel.
That “scientific management” bungled the algorithm for children’s exam results verifies a maxim attributed to J.R. Searle, an American philosopher: if you have to add “scientific” to a field, it probably ain’t.
A.D. Pellegrini in a letter to the Economist.
I have written elsewhere about this in medicine and science. We used to have physiology, but now some say physiological sciences; we used to have pharmacology, but now often see pharmacological sciences1. And as for medicine, neurology and neurosurgery used to be just fine, but then the PR and money grabbing started so we now have ‘clinical neuroscience’ — except it isn’t. As Herb Simon pointed out many years ago, the professions and professional practice always lose out in the academy.
Alas, there will be no more new ones of these, as arguably the greatest of modern biology’s experimentalists, Sydney Brenner, passed away last year. One of his earlier quotes — the source I cannot find at hand — was that it is important in science to be out of phase. You can be ahead of the curve of fashion or possibly, better still, be behind it. But stay out of phase. So, no apologies for being behind the curve on these ones, which I have just come across.
Sydney Brenner remarked in 2008, “We don’t have to look for a model organism anymore. Because we are the model organisms.”
Sydney Brenner has said that systems biology is “low input, high throughput, no output” biology.
Image source and credits via WikiCommons
In my ignorance I had always assumed that the ‘Haldane’ of the Haldane Principle1 referred to the great and singular geneticist and physiologist JBS Haldane. Not true. JBS once remarked that God must have been inordinately fond of beetles because there are so many species of beetles; so with the Haldanes: (good) fortune is, it appears, inordinately fond of the Haldane clan. A relative of JBS, Richard Burdon Haldane — who did indeed come up with the Haldane principle — is the subject of a new biography by John Campbell, and a witty and sharp review in the FT by Philip Stephens.
Watching today’s politicians fall over their own mistakes as they fumble with the Covid-19 pandemic, it is easy to forget that securing high office once required more than a few years of dashing off political columns for a national newspaper. So the life and political times of Richard Haldane, the subject of John Campbell’s engaging biography, offers a fitting rebuke to the trivial mendacity and downright incompetence of the nation’s present administration.
Exaggeration, it is not. Haldane…
…an Edinburgh lawyer and philosopher-politician before becoming a minister in Herbert Asquith’s Liberal administrations, was an important champion of universal education and one of the founding fathers of the UK university system. He also found time to create the Territorial Army, and to have a hand in the foundation of the London School of Economics, the Medical Research Council and the Secret Intelligence Service…
As Asquith’s minister for war, he created the expeditionary force that saved Britain from defeat in the opening stages of the first world war. As Lord Chancellor, his judgments did much to set in place the federalist tilt of the Canadian constitution.
And if there is any doubt about his intellectual gravitas, the review is headed by an image of Haldane with Albert Einstein whom he hosted on the latter’s first visit to the UK in the 1920s. Just conjure up BoJo or Patel or Hancock when you read the above, or when you step on something unpleasant and slimy.
It also seems that Haldane might have performed slightly better across the dispatch box than some of the current irregulars. Clark McGinn writes
He [Haldane] is also one of the few men to have beaten Winston Churchill by riposte. Haldane was a portly figure and Churchill remarked on his girth by asking when the baby was due and what it would be called. Haldane retorted: “If it’s a boy it will be George after the King, a girl will be Mary after the Queen. But if it is just wind I shall call it Winston.”
I procured me a Triangular glass-Prisme, to try therewith the celebrated Phenomena of Colours. And in order thereto having darkened my chamber and made a small hole in my window-shuts, to let in a convenient quantity of the Suns light…but I was surprised to see them in an oblong form; which, according to the received laws of Refraction, I expected should have been circular.
Well, Isaac Newton obviously had better functioning shutters than my bedroom blackout blinds. Most summer mornings (which at this latitude last much of the night…) I receive — without choice — lessons in physics 101. Here are some iPhone shots from this morning. The quote says a lot about science: note the terms, convenient quantity, surprised and expected.
Today many scientists describe their research as apolitical, but Haldane knew that was impossible: ‘I began to realise that even if the professors leave politics alone, politics won’t leave the professors alone.’
From a review in the Economist of a biography of JBS Haldane by Samanth Subramanian.
Two things to add. Haldane’s paper A Defense of Beanbag Genetics is a personal favourite, but sticking with the genetics theme, I think of politics, and many politicians, as examples of dominant negative mutations.
“I hope the lesson will really be that we can’t afford as a society to create the fire brigade once the house is on fire. We need that fire brigade ready all the time hoping that it never has to be deployed.”
Peter Piot 1
No ‘just in time’ here. It’s in the statistical tails that dragons lurk and reputations are shattered. Chimes with a quote from Stewart Brand that I posted a short while back.
Education is intellectual infrastructure. So is science. They have very high yield, but delayed payback. Hasty societies that can’t span those delays will lose out over time to societies that can. On the other hand, cultures too hidebound to allow education to advance at infrastructural pace also lose out.
The alternative to science is academic politics, where persistent disagreement is encouraged as a way to create distinctive sub-group identities.
The usual way to protect a scientific discussion from the factionalism of academic politics is to exclude people who opt out of the norms of science. The challenge lies in knowing how to identify them.
I can go along with both, but it is in the details that the daemons feast. It appears to me that the ‘norms of science’ argument is itself problematic, reminding me of those silly things you learn at school about the scientific method 1. The historical origin of the concept of the scientific method owed more to attempts to brand certain activities in the eyes of those who were not practicing scientists 2. As a rough approximation, the people who talk about the scientific method tend not to do science. Of course, in more recent times, the use of the term ‘science’ itself has been a flag for obtaining funding, status or approval. Dermatology is now dermatological sciences; pharmacology is now pharmacological sciences. Even more absurd, in the medical literature I see the term delivery science (and I don’t mean Amazon), or reproducibility science. The demarcation of science from non-science is a hard philosophical problem going back way before Popper; I will not solve it. The danger is that we might end up exiling all those meaningful areas of human rationality that we once — rightly — considered outwith science, but still valued. There is indeed a subject that we might reasonably call medical science(s). It is just not synonymous with the principles and practice of medicine. It is also why political economy is a more useful subject than economics (or worse still, economic sciences).
Freeman Dyson died February 28th this year. There are many obituaries of this great mind and eternal rebel. His book, Disturbing the Universe, is for me one of a handful that gets the fundamental nature of discovery in science and how science interacts with other modes of being human. His intellectual bravery and honesty shine through most of his writings. John Naughton had a nice quote from him a short while back.
Some mathematicians are birds, others are frogs. Birds fly high in the air and survey broad vistas of mathematics out to the far horizon. They delight in concepts that unify our thinking and bring together diverse problems from different parts of the landscape. Frogs live in the mud below and see only the flowers that grow nearby. They delight in the details of particular objects, and they solve problems one at a time. I happen to be a frog, but many of my best friends are birds. The main theme of my talk tonight is this. Mathematics needs both birds and frogs.
In truth he was both frog and albatross. Here are some words from his obituary in PNAS.
During the Second World War, Dyson worked as a civilian scientist for the Royal Air Force’s Bomber Command, an experience that made him a life-long pacifist. In 1941, as an undergraduate at Trinity College, Cambridge, United Kingdom, he found an intellectual role model in the famed mathematician G. H. Hardy, who shared two ideas that came to define Dyson’s trajectory: “A mathematician, like a painter or a poet, is a maker of patterns,” and “Young men should prove theorems; old men should write books.”
Heeding the advice of his undergraduate mentor, Dyson returned to his first love of writing. He became well-known to a wide audience by his books Disturbing the Universe (1979) (1) and Infinite in All Directions (1988) (2), and his many beautiful essays for The New Yorker and The New York Review of Books. In 2018, he published his autobiography, Maker of Patterns (3), largely composed of letters that he sent to his parents from an early age on.
And as for us eternal students, at least I have one thing in common.
…Dyson never obtained an official doctorate of philosophy. As an eternal graduate student, a “rebel” in his own words, Dyson was unafraid to question everything and everybody. It is not surprising that his young colleagues inspired him the most.
It’s natural, Jim. From an obituary of Julian Perry Robinson in Nature
In 1981, the US government publicly accused Soviet-backed forces in southeast Asia of waging toxin warfare and violating their legal obligations under the 1925 Geneva Protocol and 1972 Biological Weapons Convention. It alleged that aircraft dispersed ‘yellow rain’ containing mycotoxins that were “not indigenous to the region”. Julian Perry Robinson, working alongside biologist Matthew Meselson at Harvard University in Cambridge, Massachusetts, established that what actually fell was wild-honeybee faeces containing naturally occurring toxins. He died on 22 April, aged 78.