Science

Arithmetic matters

Summer reading 2021 | Science

From a review of “The Ascent of Information” by Caleb Scharf.

Every cat GIF shared on social media, credit card swiped, video watched on a streaming platform, and website visited adds more data to the mind-bending 2.5 quintillion bytes of information that humans produce every single day. All of that information has a cost: Data centers alone consume about 47 billion watts, equivalent to the resting metabolism of more than a tenth of all the humans on the planet.
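Arithmetic matters, so the comparison invites a back-of-envelope check. The 47 billion watts is the figure quoted above; the per-person resting wattage and world population are my own rough assumptions, not numbers from the review:

```python
# Rough check of the data-center vs. human-metabolism comparison.
DATA_CENTER_WATTS = 47e9          # figure quoted in the review
RESTING_WATTS_PER_PERSON = 75     # approximate adult basal metabolic rate (assumption)
WORLD_POPULATION = 7.9e9          # approximate 2021 population (assumption)

equivalent_people = DATA_CENTER_WATTS / RESTING_WATTS_PER_PERSON
fraction_of_humanity = equivalent_people / WORLD_POPULATION
print(f"{equivalent_people:.2e} people, {fraction_of_humanity:.1%} of humanity")
```

With these assumptions the answer comes out at roughly 8 per cent, the same order of magnitude as the review's 'more than a tenth'; a somewhat lower per-person wattage (children included) would close the gap.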

Scharf begins by invoking William Shakespeare, whose legacy permeates the public consciousness more than four centuries after his death, to show just how powerful the dataome can be. On the basis of the average physical weight of one of his plays, “it is possible that altogether the simple act of human arms raising and lowering copies of Shakespeare’s writings has expended over 4 trillion joules of energy,” he writes. These calculations do not even account for the energy expended as the neurons in our brains fire to make sense of the Bard’s language.

Innovation, and rule breaking

by reestheskin on 23/06/2021

Comments are disabled

How ‘creative destruction’ drives innovation and prosperity | Financial Times

From time to time, I vow not to read any more comments on the FT website. Trolls aside, I clearly live in a different universe. But then I return. It is indeed a signal-noise problem, but one in which the weighting has to be such that the fresh shoots are not overlooked. I know nothing about Paul A Myers, and I assume he lives in the US, but over the years he has given me pause for thought on many occasions. One recent example below.

Comment from Paul A Myers

Science-based innovation largely comes out of the base of 90 research universities. One can risk an over-generalization and say there are no “universities” in a non-constitutional democratic country, or authoritarian regime. Engineering institutes maybe, but not research universities. Research is serendipity and quirky; engineering is regular and reliable. Engineering loves rules; research loves breaking them. The two fields are similar but worship at different altars.

This contrast is also true of medicine and science. Medicine is regulated to hell and back — badly, often — but I like my planes that way too. But, in John Naughton’s words, if you want great research, buy Aeron chairs, and hide the costs off the balance sheet lest the accountants start discounting all the possible futures.

Body lice

by reestheskin on 14/06/2021

Comments are disabled

It is one of dermatology’s tedious and fun facts that, in contradistinction to, say, scabies or head lice, you treat body lice by treating not the patient (directly) but the clothing. The pharmacological agent is a washing machine. But the excerpt quoted below tells you something wonderful about science: you get things out that you never expected. Spontaneous generation — not in the real world, but in the world of ideas. Well, almost.

How clothing and climate change kickstarted agriculture | Aeon Essays

Scientific efforts to shed light on the prehistory of clothes have received an unexpected boost from another line of research, the study of clothing lice, or body lice. These blood-sucking insects make their home mainly on clothes and they evolved from head lice when people began to use clothes on a regular basis. Research teams in Germany and the United States analysed the genomes of head and clothing lice to estimate when the clothing parasites split from the head ones. One advantage of the lice research is that the results are independent from other sources of evidence about the origin of clothes, such as archaeology and palaeoclimatology. The German team, led by Mark Stoneking at the Max Planck Institute for Evolutionary Anthropology, came up with a date of 70,000 years ago, revised to 100,000 years ago, early in the last ice age. The US team led by David Reed at the University of Florida reported a similar date of around 80,000 years ago, and maybe as early as 170,000 years ago during the previous ice age. These findings from the lice research suggest that our habit of wearing clothes was established quite late in hominin evolution.

Gödel

Journey to the Edge of Reason by Stephen Budiansky — ruthless logic | Financial Times

Ever since I read of how Gödel’s work has rendered decades of work by Bertrand Russell and others void, Gödel has fascinated me. Not that I can follow the raw proofs. But his work speaks of a wonderful Platonic world that is hard not to fall in love with. Two quotes: the first is new to me, the second, sadly not.

Einstein sponsored his US citizenship, which Gödel almost torpedoed by telling the judge that he had found a logical inconsistency in the constitution that would allow a person to establish a dictatorship in America.

His end, when it came, was tragic. His paranoia grew and he became convinced that his food was being poisoned. When this had happened earlier in his life, his wife had managed to taste-test and spoon-feed him to health, but this time she too was ill, and in January 1978 he died in hospital, curled into a foetal position and weighing only 65 pounds.

We know less than we pretend

by reestheskin on 27/05/2021

Comments are disabled

Jonathan Flint · Testing Woes · LRB 6 May 2021

Terrific article from Jonathan Flint in the LRB. He is an English psychiatrist and geneticist (mouse models of behaviour) based at UCLA, but, like many, has put his hand to other domains (beyond depression). He writes about Covid-19:

Perhaps the real problem is hubris. There have been so many things we thought we knew but didn’t. How many people reassured us Covid-19 would be just like flu? Or insisted that the only viable tests were naso-pharyngeal swabs, preferably administered by a trained clinician? Is that really the only way? After all, if Covid-19 is only detectable by sticking a piece of plastic practically into your brain, how can it be so infectious? We still don’t understand the dynamics of virus transmission. We still don’t know why around 80 per cent of transmissions are caused by just 10 per cent of cases, or why 2 per cent of individuals carry 90 per cent of the virus. If you live with someone diagnosed with Covid-19, the chances are that you won’t be infected (60 to 90 per cent of cohabitees don’t contract the virus). Yet in the right setting, a crowded bar for example, one person can infect scores of others. What makes a superspreader? How do we detect them? And what can we learn from the relatively low death rates in African countries, despite their meagre testing and limited access to hospitals?

That we are still scrambling to answer these questions is deeply worrying, not just because it shows we aren’t ready for the next pandemic. The virus has revealed the depth of our ignorance when it comes to the biology of genomes. I’ve written too many grant applications where I’ve stated confidently that we will be able to determine the function of a gene with a DNA sequence much bigger than that of Sars-CoV-2. If we can’t even work out how Sars-CoV-2 works, what chance do we have with the mammalian genome? Let’s hope none of my grant reviewers reads this.

Medicine is always messier than people want to imagine. It is a hotchpotch of kludges. For those who aspire to absolute clarity, it should be a relief that we manage effective action based on such a paucity of insight. Cheap body-hacks sometimes work. But the worry remains.

On the patina of technique

by reestheskin on 27/05/2021

Comments are disabled

Tyler Cowen says some interesting things in an article, Why Economics Is Failing Us, on Bloomberg. I don’t think his comments are limited to the economics domain.

Why Economics Is Failing Us – Bloomberg

Economics is one of the better-funded and more scientific social sciences, but in some critical ways it is failing us. The main problem, as I see it, is standards: They are either too high or too low. In both cases, the result is less daring and creativity.

Consider academic research. In the 1980s, the ideal journal submission was widely thought to be 17 pages, maybe 30 pages for a top journal. The result was a lot of new ideas, albeit with a lower quality of execution. Nowadays it is more common for submissions to top economics journals to be 90 pages, with appendices, robustness checks, multiple methods, numerous co-authors and every possible criticism addressed along the way.

There is little doubt that the current method yields more reliable results. But at what cost? The economists who have changed the world, such as Adam Smith, John Maynard Keynes or Friedrich Hayek, typically had brilliant ideas with highly imperfect execution. It is now harder for this kind of originality to gain traction. Technique stands supreme and must be mastered at an early age, with some undergraduates pursuing “pre-docs” to get into a top graduate school.

Sam Shuster, before I departed to Strasbourg, warned me in a similar vein, with reference to The Art of War by Sun Tzu:

Even the mystique of wisdom turns out to be technique. But if today must be learning technique, don’t leave the tomorrow of discovery too long.

I would say I heard the message but didn’t listen carefully enough. As befits an economist, Cowen warns us that there is no free lunch.

Smallpox again

by reestheskin on 16/03/2021

Comments are disabled

For some reason — COVID of course — I keep coming back to perhaps the greatest vaccine success ever: the eradication of smallpox (here, here and here). But the figures quoted by Scott Galloway make you sit up and notice both the magic — and value — of science.

Values in America – Scott Galloway on recasting American individualism and institutions | By Invitation | The Economist

International bodies are immolated. Consider the World Health Organisation. Mr Trump’s decision to pull America out of the WHO in the midst of a pandemic (reversed under President Joe Biden) was galling, particularly as the WHO is responsible for one of humanity’s greatest public-health accomplishments: the eradication of smallpox in the 1970s. To appreciate the magnitude of this, Google images of “smallpox” and glimpse the horror that once killed millions each year. It was a victory for co-operative, state-funded projects and it cost a mere $300m. By one estimate, America, the largest contributor, recoups that value every 26 days from savings in vaccinations, lost work and other costs. [emphasis added]

On long hair and being irrelevant

Thirty years ago [now 40], scientists who studied climate change, and I am one of them, tended to have long hair and very colourful socks. We were regarded as harmless but irrelevant. But the serendipitous investment in their work revealed processes that we now recognise as threatening the future of human society, and the successors to those scientists are playing a crucial role in assessing how we need to adapt.

Geoffrey Boulton

I think you could see the same dress sense in the golden ages of molecular biology and computing.

Another snippet from a wonderful article (and previous aside).

A Pox on You All

by reestheskin on 26/02/2021

Comments are disabled

Or not, as the case may be.

Smallpox is the greatest success story in the history of medicine. It once took huge numbers of lives — as many as half a billion people in the 20th century alone — and blinded and disfigured many more.

So writes the distinguished historian of science, Steven Shapin, in the LRB (A Pox on the Poor, February 4, 2021). He is reviewing The Great Inoculator: The Untold Story of Daniel Sutton and His Medical Revolution.

In historical times you had a one in three chance of getting smallpox, and, if you got it, the case-fatality was 20%. Some outbreaks, however, had a case-fatality of 50% and, unlike Covid-19, its preferred targets were children.
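Multiplying the two quoted figures gives the implied lifetime odds of dying of the disease (my own arithmetic, not Shapin's):

```python
# Lifetime risk of dying of smallpox, from the two figures quoted above.
p_infection = 1 / 3      # one-in-three chance of catching smallpox
case_fatality = 0.20     # typical case-fatality rate
lifetime_death_risk = p_infection * case_fatality
print(f"{lifetime_death_risk:.1%}")  # about 6.7%, roughly one person in fifteen
```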

My exposure to smallpox was (thankfully) limited. My mother told me that there was an outbreak in South Wales and the West of England when I was around five or six. There was widespread vaccination, but kids like me, with bad eczema, were spared, with the parent advised to ‘shield’ the child indoors (my sympathies go to my mother). (The risk was from the grandly named, but sometimes fatal, Kaposi’s varicelliform reaction, which was due to the vaccinia virus — not smallpox — running riot on the diseased skin.)

As a med student, I remember learning how to distinguish the cutaneous lesions of smallpox from chicken pox. Smallpox was declared eradicated in 1980, but, as a dermatology registrar, seeing the occasional adult with chickenpox who seemed so ill (in comparison with kids), I often had doubts that I had to reason away. Perhaps those stores held by the US and USSR were not so secure…

Before Jenner

Jenner and smallpox vaccination go together in popular accounts, but the history of this particular clinical discovery is much older and richer — at least to me.

As ever, in medicine, and in particular for anything involving the skin, the terminology is confusing. The Latin name for cowpox is Variolae vaccinae, meaning the pox from the cow (vacca). It was Pasteur who subsequently honoured Jenner by deciding that all forms of inoculation be called vaccination.

Edward Jenner took advantage of the already-known fact that whilst milkmaids tended to be afflicted with the far milder cowpox virus, they rarely suffered from the more serious smallpox (the two are different, but related, viruses). Jenner, in 1796, inoculated an eight-year-old boy with the pus from a milkmaid’s cowpox sore. After being exposed to smallpox material the boy appeared immune, in that he did not suffer adversely when subsequently exposed to smallpox.

Once Jenner’s method was accepted as safe, Acts of Parliament introduced free vaccination in 1840, and vaccination became obligatory in 1853.

I had never been quite certain of the distinction between inoculation and vaccination, but there is history here too. Shapin writes that the term inoculation was borrowed from horticulture — the grafting of a bud (or ‘eye’) to propagate a plant (something I was taught how to do in horticulture lessons aged 11, at school in Cardiff, by Brother Luke, who, I thought, was so old that he might have been a contemporary of Jenner). Why the name is apt is explained below.

Before vaccination, inoculation was actually meant to give you a direct form of smallpox (this was also referred to as variolation, after variola, the term for smallpox). The source material, again, was from a lesion of somebody with smallpox. The recipient, it was hoped, would develop a milder version of smallpox. Shapin writes:

The contract with the inoculator was to accept a milder form of the disease, and a lower chance of death, in exchange for a future secured from the naturally occurring disease, which carried a high chance of killing or disfiguring.

Shapin tells us that Lady Mary Wortley Montagu, when holed up with her husband in Constantinople in 1717, heard stories about how such ‘in-grafting’ was in widespread use by the ‘locals’. She was scarred from smallpox, and therefore she had the procedure carried out on her then five-year-old son. The needle was blunt and rusty, but her son suffered just a few spots and the procedure was judged a success. He was now immune to smallpox.

Not surprisingly, the story goes back further: inoculation was folk medicine practice in Pembrokeshire as early as 1600, and the Chinese had been blowing dried, ground-up smallpox material up the nose for many centuries.

There is capitalism and then there is capitalism.

The London medical establishment were apparently not too impressed with the non-British origin of such a scientific advance, nor its apparent simplicity (and hence low cost). So they made the procedure much more complicated: specific diets were required, along with advice on behaviour and, of course, blood-lettings and laxatives, all in addition to not just a ‘prick’ but a bigger incision (payment by the inch). The physicians’ fees no doubt rose in parallel. Not a bad business model, until…

There is plenty of room at the bottom.

The London physicians’ ‘add-ons’ put the treatment costs of inoculation out of reach of most of the population, restricting it, for decades, to the ‘medical carriage trade’. Along comes Daniel Sutton, a provincial barber-surgeon with no Latin or Greek, no doubt, who effectively industrialised the whole process, making it both more profitable and cheaper for the customer.

Based in a village near Chelmsford, he inoculated tens of thousands locally. The method was named after him: the Suttonian Method. On one day he inoculated 700 persons. Incisions (favoured by the physicians) were replaced by the simpler prick, and patients were not confined, but instead told to go out in the fresh air (day-case, anybody?). Product differentiation was of course possible: spa-like pampering in local accommodation was available for the top end of the market, with a sliding scale of fees depending on the degree of luxury.

Shapin writes:

Splitting the market and niche pricing were aspects of Sutton’s business success, but so too was control of supply. The extended Sutton clan could satisfy a significant chunk of provincial demand, but Daniel also worked out a franchising system, which ‘authorised’ more than fifty partners throughout Britain and abroad to advertise their use of the ‘Suttonian System’ — provided they paid fees for Sutton-compounded purgatives, kicked back a slice of their take, and kept the trade secrets. Control was especially important, since practically anyone could, and did, set up as an inoculator. The Suttons themselves had become surgeons through apprenticeship, but apothecaries, clergymen, artisans and farmers were inoculating, and sometimes parents inoculated their own children. The profits of the provincial press were considerably boosted as practitioners advertised their skills at inoculation and their keen prices. Daniel went after competitors — including his own father-in-law and a younger brother — with vigour, putting it about that the Suttonian Method depended on a set of closely held secrets, to which only he and his approved partners had access. His competitors sought to winkle out these secrets, occasionally pouncing on Sutton’s patients to quiz them about their experiences.

Sutton admitted that he had ‘lost’ five out of forty thousand patients (due to smallpox). He offered a prize of 100 guineas to anybody who could prove he had ever lost a patient due to inoculation, or that any patient got smallpox a second time. More confident, and perhaps more generous, than modern Pharma, I think. By 1764, Sutton had an annual income of 6300 guineas — over one million sterling in today’s money.
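The sterling conversion is easy to reproduce. The 21-shilling guinea is standard; the roughly 160-fold price-inflation multiplier from 1764 to the present is my own round-number assumption, not a figure from Shapin:

```python
# Rough modern value of Sutton's 1764 annual income of 6,300 guineas.
GUINEAS = 6300
POUNDS_PER_GUINEA = 21 / 20      # a guinea was 21 shillings; a pound, 20
INFLATION_MULTIPLIER = 160       # approximate 1764 -> present price level (assumption)

income_1764 = GUINEAS * POUNDS_PER_GUINEA          # £6,615 at the time
income_today = income_1764 * INFLATION_MULTIPLIER
print(f"£{income_today:,.0f}")
```

Which indeed lands a little over one million pounds.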

To err is human

by reestheskin on 01/02/2021

Comments are disabled

What my retraction taught me

In the middle of the pandemic, I got an e-mail asking whether I had access to data from the experiments behind a paper I’d published in 2014. Three months later, I requested that the paper be retracted. The experience has not left me bitter: if anything, it brought me back to my original motivation for doing research.

This is a disarmingly honest piece (in the journal Nature) about how mistakes in the analysis of complicated data sets caused inappropriate conclusions, leading, in this case, to a retraction of a manuscript.

As a student, I was even told never to attempt to replicate before I publish. That is not a career I would want — luckily, my PhD adviser taught me the opposite.

John Ziman warned over 20 years ago in Real Science that we were entering post-academic science. Here are some words of his from an article in Nature.

What is more, science is no longer what it was when [Robert] Merton first wrote about it. The bureaucratic engine of policy is shattering the traditional normative frame. Big science has become a novel way of life, with its own conventions and practices. What price now those noble norms? Tied without tenure into a system of projects and proposals, budgets and assessments, how open, how disinterested, how self-critical, how riskily original can one afford to be?

As the economists are fond of saying: institutions matter. As do incentives. Precious metals can be corrupted, and money — in the short term — made.

Winnowing MMXXI

by reestheskin on 21/01/2021

Comments are disabled

STEM stupidity

Stupid | No Mercy / No Malice

Mark Zuckerberg is what happens when you replace civics with computer science.


Eureka… if not the bath

New Year Monday Note. | Jean-Louis Gassée | Jan, 2021

In the shower, all ideas look good.


Post-modem

Universities challenged: critical theory and culture wars | Financial Times

A comment about the above article:

As a full professor in a similar situation, a humanities department in a British teaching factory (sorry, major research university), I completely agree with Musidorus.


Girl-talk

The Economist | Awesome, weird and everything else

Ironically, some old gender stereotypes may now be helping girls. When girls are toddlers they are read to more than boys. Their fathers are five times more likely to sing or whistle to them and are more likely to speak to them about emotions, including sadness. Their mothers are more likely to use complex vocabulary with them. Most of this gives girls a leg up in a world that increasingly prizes “soft skills”. Girls still have less leisure time than boys, but nowadays that is primarily because they spend more time on homework and grooming, rather than an unfair division of chores. And in the time left for themselves they have far more freedom.

“The one good thing about COVID-19 is that it’s good for nature and the environment and dolphins,” says Sarah, “but I wish it wouldn’t kill so many people in the process.”


COVID-19 aware

David Hume — Wikipedia

He had little respect for the professors of his time, telling a friend in 1735 that “there is nothing to be learnt from a Professor, which is not to be met with in Books”. He did not graduate.

Nor did I ever submit my PhD. As David Hubel once said, the great advantage of an MD degree was (then) being able to avoid having to gain a PhD credential.

Academic lives

by reestheskin on 11/01/2021

Comments are disabled

Originality is usually off track

How mRNA went from a scientific backwater to a pandemic crusher | WIRED UK

For decades, Katalin Karikó’s work into mRNA therapeutics was overlooked by her colleagues. Now it’s at the heart of the two leading coronavirus vaccines

By the mid 1990s, Karikó’s bosses at UPenn had run out of patience. Frustrated with the lack of funding she was generating for her research, they offered the scientist a bleak choice: leave or be demoted. It was a demeaning prospect for someone who had once been on the path to a full professorship. For Karikó’s dreams of using mRNA to create new vaccines and drugs for many chronic illnesses, it seemed to be the end of the road… ”It was particularly horrible as that same week, I had just been diagnosed with cancer,” said Karikó. “I was facing two operations, and my husband, who had gone back to Hungary to pick up his green card, had got stranded there because of some visa issue, meaning he couldn’t come back for six months. I was really struggling, and then they told me this.”

Karikó has been at the helm of BioNTech’s Covid-19 vaccine development. In 2013, she accepted an offer to become Senior Vice President at BioNTech after UPenn refused to reinstate her to the faculty position she had been demoted from in 1995. “They told me that they’d had a meeting and concluded that I was not of faculty quality,” she said. ”When I told them I was leaving, they laughed at me and said, ‘BioNTech doesn’t even have a website.’”


Being at the bottom of things

Knuth versus Email

Donald Knuth is a legend amongst computer scientists.

I have been a happy man ever since January 1, 1990, when I no longer had an email address. I’d used email since about 1975, and it seems to me that 15 years of email is plenty for one lifetime. Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration. I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don’t have time for such study. [emphasis added]

On retirement:

I retired early because I realized that I would need at least 20 years of full-time work to complete The Art of Computer Programming (TAOCP), which I have always viewed as the most important project of my life.

Being a retired professor is a lot like being an ordinary professor, except that you don’t have to write research proposals, administer grants, or sit in committee meetings. Also, you don’t get paid.

My full-time writing schedule means that I have to be pretty much a hermit. The only way to gain enough efficiency to complete The Art of Computer Programming is to operate in batch mode, concentrating intensively and uninterruptedly on one subject at a time, rather than swapping a number of topics in and out of my head. I’m unable to schedule appointments with visitors, travel to conferences or accept speaking engagements, or undertake any new responsibilities of any kind.


On Keeping Your Soul

John Baez is indeed a relative of that other famous J(oan) Baez. I used to read his blog avidly.

The great challenge at the beginning of one’s career in academia is to get tenure at a decent university. Personally I got tenure before I started messing with quantum gravity, and this approach has some real advantages. Before you have tenure, you have to please people. After you have tenure, you can do whatever the hell you want — so long as it’s legal, and you teach well, your department doesn’t put a lot of pressure on you to get grants. (This is one reason I’m happier in a math department than I would be in a physics department. Mathematicians have more trouble getting grants, so there’s a bit less pressure to get them.)

The great thing about tenure is that it means your research can be driven by your actual interests instead of the ever-changing winds of fashion. The problem is, by the time many people get tenure, they’ve become such slaves of fashion that they no longer know what it means to follow their own interests. They’ve spent the best years of their life trying to keep up with the Joneses instead of developing their own personal style! So, bear in mind that getting tenure is only half the battle: getting tenure while keeping your soul is the really hard part. [emphasis added]


On the hazards of Epistemic trespassing

Scientists fear that ‘covidization’ is distorting research

Scientists straying from their field of expertise in this way is an example of what Nathan Ballantyne, a philosopher at Fordham University in New York City, calls “epistemic trespassing”. Although scientists might romanticize the role and occasional genuine insight of an outsider — such as the writings of physicist Erwin Schrödinger on biology — in most cases, he says, such academic off-piste manoeuvrings dump non-experts head-first in deep snow. [emphasis added]

But I do love the language…


On the need for Epistemic trespassing

Haack, Susan, Not One of the Boys: Memoir of an Academic Misfit

Susan Haack is a wonderfully independent English-born philosopher who loves to roam, casting light wherever her interest takes her.

Better ostracism than ostrichism

Moreover, I have learned over the years that I am temperamentally resistant to bandwagons, philosophical and otherwise; hopeless at “networking,” the tit-for-tat exchange of academic favors, “going along to get along,” and at self-promotion.

That I have very low tolerance for meetings where nothing I say ever makes any difference to what happens; and that I am unmoved by the kind of institutional loyalty that apparently enables many to believe in the wonderfulness of “our” students or “our” department or “our” school or “our” university simply because they’re ours.

Nor do I feel what I think of as gender loyalty, a sense that I must ally myself with other women in my profession simply because they are women—any more than I feel I must ally myself with any and every British philosopher simply because he or she is British. And I am, frankly, repelled by the grubby scrambling after those wretched “rankings” that is now so common in philosophy departments. In short, I’ve never been any good at academic politicking, in any of its myriad forms.

And on top of all this, I have the deplorable habit of saying what I mean, with neither talent for nor inclination to fudge over disagreements or muffle criticism with flattering tact, and an infuriating way of seeing the funny side of philosophers’ egregiously absurd or outrageously pretentious claims — that there are no such things as beliefs, that it’s just superstitious to care whether your beliefs are true, that feminism obliges us to “reinvent science and theorizing,” and so forth.



Citizens of nowhere trespassing…

The Economist | Citizen of the world

From a wonderful article in the Economist

As Michael Massing shows vividly in “Fatal Discord: Erasmus, Luther and the Fight for the Western Mind” (2018), the growing religious battle destroyed Erasmianism as a movement. Princes had no choice but to choose sides in the 16th-century equivalent of the cold war. Some of Erasmus’s followers reinvented themselves as champions of orthodoxy. The “citizen of the world” could no longer roam across Europe, pouring honeyed words into the ears of kings. He spent his final years holed up in the free city of Basel. The champion of the Middle Way looked like a ditherer who was incapable of making up his mind, or a coward who was unwilling to stand up to Luther (if you were Catholic) or the pope (if you were Protestant).

The test of being a good Christian ceased to be decent behaviour. It became fanaticism: who could shout most loudly? Or persecute heresy most vigorously? Or apply fuel to the flames most enthusiastically?

And in case there is any doubt about what I am talking about.

In Britain, Brexiteers denounce “citizens of the world” as “citizens of nowhere” and cast out moderate politicians with more talent than they possess, while anti-Brexiteers are blind to the excesses of establishment liberalism. In America “woke” extremists try to get people sacked for slips of the tongue or campaign against the thought crimes of “unconscious bias”. Intellectuals who refuse to join one camp or another must stand by, as mediocrities are rewarded with university chairs and editorial thrones. [emphasis added]

As Erasmus might have said: ‘Amen’.

On rejection by editors and society

by reestheskin on 16/11/2020

Comments are disabled

The history of science is the history of rejected ideas (and manuscripts). One example I always come back to is the original work of John Wennberg and colleagues on spatial differences in ‘medical procedures’ and the idea that it is not so much medical need that dictates the number of procedures, but the supply of medical services. Simply put: the more surgeons there are, the more procedures are carried out [1]. The deeper implication is that many of these procedures are not medically required — it is just the billing that is needed: surgeons have mortgages and tuition loans to pay off. Wennberg and colleagues at Dartmouth have subsequently shown that a large proportion of the medical procedures or treatments that doctors undertake are unnecessary [2].

Wennberg’s original manuscript was rejected by the New England Journal of Medicine (NEJM) but subsequently published in Science. Many of us would rate Science above the NEJM, but there is a lesson here about signal and noise, and how many medical journals in particular obsess over procedure and status at the expense of nurturing originality.

Angus Deaton and Anne Case, two economists, the former with a Nobel Prize to his name, tell a similar story. Their recent work has been on the so-called Deaths of Despair — where mortality rates for subgroups of the US population have increased3. They relate this to educational levels (the effects are largely on those without a college degree) and other social factors. The observation is striking for an advanced economy (although Russia had historically seen increased mortality rates after the collapse of communism).

Coming back to my opening statement, Deaton is quoted in the THE

The work on “deaths of despair” was so important to them that they [Deaton and Case] joined forces again as research collaborators. However, despite their huge excitement about it, their initial paper, sent to medical journals because of its health focus, met with rejections — a tale to warm the heart of any academic whose most cherished research has been knocked back.

When the paper was first submitted it was rejected so quickly that “I thought I had put the wrong email address. You get this ping right back…‘Your paper has been rejected’.” The paper was eventually published in Proceedings of the National Academy of Sciences, to a glowing reception. The editor of the first journal to reject the paper subsequently “took us for a very nice lunch”, adds Deaton.

Another medical journal rejected it within three days with the following justification:

The editor, he says, told them: “You’re clearly intrigued by this finding. But you have no causal story for it. And without a causal story this journal has no interest whatsoever.”

(‘no interest whatsoever’ — the arrogance of some editors).

Deaton points out that this is a problem not just for medical journals but in economics journals, too; he thinks the top five economics journals would have rejected the work for the same reason.

“That’s the sort of thing you get in economics all the time,” Deaton goes on, “this sort of causal fetish… I’ve compared that to calling out the fire brigade and saying ‘Our house is on fire, send an engine.’ And they say, ‘Well, what caused the fire? We’re not sending an engine unless you know what caused the fire.’

It is not difficult to see the reasons for the fetish on causality. Science is not just a loose-leaf book of facts about the natural or unnatural world, nor is it just about A/B testing or theory-free RCTs, or even just ‘estimation of effect sizes’. Science is about constructing models of how things work. But sometimes the facts are indeed so bizarre in the light of previous knowledge that you cannot ignore them, because without these ‘new facts’ you can’t build subsequent theories. Darwin and much of natural history stand as examples here, but my personal favourite is that provided by the great biochemist Erwin Chargaff in the late 1940s. Wikipedia describes the first of his ‘rules’.

The first parity rule was that in DNA the number of guanine units is equal to the number of cytosine units, and the number of adenine units is equal to the number of thymine units.

Now, in one sense a simple observation (C=G and A=T), with no causal theory. But run the clock on to Watson and Crick (and others), and see how this ‘fact’ gestated an idea that changed the world.
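The parity rule drops straight out of base pairing: in a duplex, every A on one strand faces a T on the other, and every C faces a G. A minimal sketch (with a made-up random sequence, assuming only standard Watson-Crick pairing) shows the counts balancing:

```python
import random

# Watson-Crick base pairing: A with T, C with G (standard assumption)
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def duplex_base_counts(strand: str) -> dict:
    """Count each base across both strands of a DNA duplex."""
    complement = "".join(COMPLEMENT[b] for b in strand)
    duplex = strand + complement
    return {base: duplex.count(base) for base in "ACGT"}

# A random single strand has arbitrary base ratios...
random.seed(0)
strand = "".join(random.choice("ACGT") for _ in range(1000))
counts = duplex_base_counts(strand)

# ...but complementarity forces A = T and C = G in the duplex:
# Chargaff's first parity rule, with no causal theory required.
assert counts["A"] == counts["T"]
assert counts["C"] == counts["G"]
```

The observation alone says nothing about mechanism, which is exactly the point: the equality is a brute fact until a structural model (the double helix) explains it.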

  1. The original work was on surgical procedures undertaken by surgeons. Medicine has changed, and now physicians undertake many invasive procedures, and I suspect the same trends would be evident.
  2. Yes, you can go a lot deeper on this topic and add in more nuance.
  3. Their book on this topic is Deaths of Despair and the Future of Capitalism published by Princeton University Press.

Breaking bad

by reestheskin on 02/11/2020

Comments are disabled

From this week’s Economist | Breaking through

Yet too little capital is being channelled into innovation. Spending on R&D has three main sources: venture capital, governments and energy companies. Their combined annual investment into technology and innovative companies focused on the climate is over $80bn. For comparison, that is a bit more than twice the R&D spending of a single tech firm, Amazon.

Market and state failure may go together. Which brings me back to Stewart Brand’s idea of Pace Layering:

Education is intellectual infrastructure. So is science. They have very high yield, but delayed payback. Hasty societies that can’t span those delays will lose out over time to societies that can. On the other hand, cultures too hidebound to allow education to advance at infrastructural pace also lose out.

Pace Layering: How Complex Systems Learn and Keep Learning

I won’t even mention COVID-19.

Fail. Fail again. Fail better.

by reestheskin on 28/10/2020

Comments are disabled

I came across a note in my diary from around fifteen years ago. It was (I assume) after receiving a grant rejection. For once, I sort of agreed with the funder’s decision1. I wrote:

My grant was trivial, at least in one sense. Niels Bohr always said (or words to the effect) that the job of science was to reduce the profound to the trivial. The ‘magical’ would be made the ordinary of the everyday. My problem was that I started with the trivial.

As for the merits of review: It’s the exception that proves the rule.

  1. Bert Vogelstein, who I collaborated with briefly in the 1990s, after seeing our paper initially rejected by the glossy of the day, informed me that the only sensible personal strategy was to believe that reviewers are always wrong.

Human origins: just a little messy

by reestheskin on 06/10/2020

Comments are disabled

Nice article in the Economist on how our ideas about speciation have been revised and updated. And not just for those animals but for humans too. In their words:

To be human, then, is to be a multispecies mongrel.

Following the science

by reestheskin on 03/10/2020

Comments are disabled

That “scientific management” bungled the algorithm for children’s exam results, verifies a maxim attributed to J.R. Searle, an American philosopher: if you have to add “scientific” to a field, it probably ain’t.

AD.Pellegrini in a letter to the Economist.

I have written elsewhere about this in medicine and science. We used to have physiology, but now some say physiological sciences; we used to have pharmacology, but now often see pharmacological sciences1. And as for medicine, neurology and neurosurgery used to be just fine, but then the PR and money grabbing started so we now have ‘clinical neuroscience’ — except it isn’t. As Herb Simon pointed out many years ago, the professions and professional practice always lose out in the academy.

  1. Sadly, my old department in Newcastle became Dermatological Sciences, and my most recent work address is Deanery of Clinical Sciences — which means both nouns are misplaced.

The way Brenner sees the world

by reestheskin on 29/09/2020

Comments are disabled

Alas, there will be no more new ones of these, as arguably the greatest of modern biology’s experimentalists, Sydney Brenner, passed away last year. One of his earlier quotes — the source I cannot find at hand — was that it is important in science to be out of phase. You can be ahead of the curve of fashion or possibly, better still, be behind it. But stay out of phase. So, no apologies for being behind the curve on these ones which I have just come across.

Sydney Brenner remarked in 2008, “We don’t have to look for a model organism anymore. Because we are the model organisms.”

Sydney Brenner has said that systems biology is “low input, high throughput, no output” biology.

Quoted in The science and medicine of human immunology | Science

Image source and credits via WikiCommons

As light as a bag of wind

by reestheskin on 25/09/2020

Comments are disabled

In my ignorance I had always assumed that the ‘Haldane’ of the Haldane Principle1 referred to the great and singular geneticist and physiologist JBS Haldane. Not true. JBS once remarked that God must have been inordinately fond of beetles because there are so many species of beetles; so with the Haldanes: (good) fortune is, it appears, inordinately fond of the Haldane clan. A relative of JBS, Richard Burdon Haldane — who did indeed come up with the Haldane principle — is the subject of a new biography by John Campbell, and a witty and sharp review in the FT by Philip Stephens.

Watching today’s politicians fall over their own mistakes as they fumble with the Covid-19 pandemic, it is easy to forget that securing high office once required more than a few years of dashing off political columns for a national newspaper. So the life and political times of Richard Haldane, the subject of John Campbell’s engaging biography, offers a fitting rebuke to the trivial mendacity and downright incompetence of the nation’s present administration.

Exaggeration, it is not. Haldane…

…an Edinburgh lawyer and philosopher-politician before becoming a minister in Herbert Asquith’s Liberal administrations, was an important champion of universal education and one of the founding fathers of the UK university system. He also found time to create the Territorial Army, and to have a hand in the foundation of the London School of Economics, the Medical Research Council and the Secret Intelligence Service…

As Asquith’s minister for war, he created the expeditionary force that saved Britain from defeat in the opening stages of the first world war. As Lord Chancellor, his judgments did much to set in place the federalist tilt of the Canadian constitution.

And if there is any doubt about his intellectual gravitas, the review is headed by an image of Haldane with Albert Einstein whom he hosted on the latter’s first visit to the UK in the 1920s. Just conjure up BoJo or Patel or Hancock when you read the above, or when you step on something unpleasant and slimy.

It also seems that Haldane might have performed slightly better across the dispatch box than some of the current irregulars. Clark McGinn writes

He [Haldane] is also one of the few men to have beaten Winston Churchill by riposte. Haldane was a portly figure and Churchill remarked on his girth by asking when the baby was due and what it would be called. Haldane retorted: “If it’s a boy it will be George after the King, a girl will be Mary after the Queen. But if it is just wind I shall call it Winston.”

  1. The Haldane Principle is the idea that decisions about what to spend research funds on should be made by researchers instead of politicians. It is named after Richard Burdon Haldane. For a recent take on the Haldane Principle see David Edgerton, The ‘Haldane Principle’ and other invented traditions in science policy here.

In the year of the plague 1666

by reestheskin on 03/08/2020

Comments are disabled

I procured me a Triangular glass-Prisme, to try therewith the celebrated Phenomena of Colours. And in order thereto having darkened my chamber and made a small hole in my window-shuts, to let in a convenient quantity of the Suns light…but I was surprised to see them in an oblong form; which, according to the received laws of Refraction, I expected should have been circular.

Well, Isaac Newton obviously had better functioning shutters than my bedroom blackout blinds. Most summer mornings (which at this latitude last much of the night…) I receive — without choice — lessons in physics 101. Here are some iPhone shots from this morning. The quote says a lot about science: note the terms, convenient quantity, surprised and expected.

Politics as a class of dominant negative mutation

by reestheskin on 29/07/2020

Comments are disabled

Today many scientists describe their research as apolitical, but Haldane knew that was impossible: ‘I began to realise that even if the professors leave politics alone, politics won’t leave the professors alone.’

From a review in the Economist of a biography of JBS Haldane by Samanth Subramanian.

Two things to add. Haldane’s paper A Defense of Beanbag Genetics is a personal favourite, but sticking with the genetics theme, I think of politics, and many politicians, as examples of dominant negative mutations.

No just in time here

by reestheskin on 16/07/2020

Comments are disabled

“I hope the lesson will really be that we can’t afford as a society to create the fire brigade once the house is on fire. We need that fire brigade ready all the time hoping that it never has to be deployed.”

Peter Piot 1

No just in time here. It’s in the statistical tails that dragons lurk and reputations are shattered. Chimes with a quote from Stewart Brand that I posted a short while back.

Education is intellectual infrastructure. So is science. They have very high yield, but delayed payback. Hasty societies that can’t span those delays will lose out over time to societies that can. On the other hand, cultures too hidebound to allow education to advance at infrastructural pace also lose out.

  1. Virologist Peter Piot, co-discoverer of Ebola, who worked on treating and preventing HIV, talking about getting COVID-19 on his institution’s podcast (London School of Hygiene & Tropical Medicine podcast).

Mathiness, scientism and give me the money.

by reestheskin on 06/07/2020

Comments are disabled

Two nice quotes from Paul Romer about his paper Mathiness in the Theory of Economic Growth

The alternative to science is academic politics, where persistent disagreement is encouraged as a way to create distinctive sub-group identities.

The usual way to protect a scientific discussion from the factionalism of academic politics is to exclude people who opt out of the norms of science. The challenge lies in knowing how to identify them.

I can go along with both, but it is in the details that the daemons feast. It appears to me that the ‘norms of science’ argument is itself problematic, reminding me of those silly things you learn at school about the scientific method 1. The historical origin of the concept of the scientific method owed more to attempts to brand certain activities in the eyes of those who were not practicing scientists 2. As a rough approximation, the people who talk about the scientific method tend not to do science. Of course, in more recent times, the use of the term ‘science’ itself has been a flag for obtaining funding, status or approval. Dermatology is now dermatological sciences; pharmacology is now pharmacological sciences. Even more absurd, in the medical literature I see the term delivery science (and I don’t mean Amazon), or reproducibility science. The demarcation of science from non-science is a hard philosophical problem going back way before Popper; I will not solve it. The danger is that we might end up exiling all those meaningful areas of human rationality that we once — rightly — considered outwith science, but still valued. There is indeed a subject that we might reasonably call medical science(s). It is just not synonymous with the principles and practice of medicine. It is also why political economy is a more useful subject than economics (or worse still, economic sciences).

  1. I guess this depends on how you interpret the ‘they’ in Romer’s second quote. Is it the people or the norms that are the problem? I tend to think of both.
  2. There is an excellent recent article in the New York Review of Books that touches upon this issue Just Use Your Thinking Pump! by Jessica Riskin.

On being an eternal student.

by reestheskin on 22/06/2020

Comments are disabled

Freeman Dyson died February 28th this year. There are many obituaries of this great mind and eternal rebel. His book, Disturbing the Universe, is for me one of a handful that gets the fundamental nature of discovery in science and how science interacts with other modes of being human. His intellectual bravery and honesty shine through most of his writings. John Naughton had a nice quote from him a short while back.

Some mathematicians are birds, others are frogs. Birds fly high in the air and survey broad vistas of mathematics out to the far horizon. They delight in concepts that unify our thinking and bring together diverse problems from different parts of the landscape. Frogs live in the mud below and see only the flowers that grow nearby. They delight in the details of particular objects, and they solve problems one at a time. I happen to be a frog, but many of my best friends are birds. The main theme of my talk tonight is this. Mathematics needs both birds and frogs.

In truth he was both frog and albatross. Here are some words from his obituary in PNAS.

During the Second World War, Dyson worked as a civilian scientist for the Royal Air Force’s Bomber Command, an experience that made him a life-long pacifist. In 1941, as an undergraduate at Trinity College, Cambridge, United Kingdom, he found an intellectual role model in the famed mathematician G. H. Hardy, who shared two ideas that came to define Dyson’s trajectory: “A mathematician, like a painter or a poet, is a maker of patterns,” and “Young men should prove theorems; old men should write books.”

Heeding the advice of his undergraduate mentor, Dyson returned to his first love of writing. He became well-known to a wide audience by his books Disturbing the Universe (1979) (1) and Infinite in All Directions (1988) (2), and his many beautiful essays for The New Yorker and The New York Review of Books. In 2018, he published his autobiography, Maker of Patterns (3), largely composed of letters that he sent to his parents from an early age on.

And as for us eternal students, at least I have one thing in common.

…Dyson never obtained an official doctorate of philosophy. As an eternal graduate student, a “rebel” in his own words, Dyson was unafraid to question everything and everybody. It is not surprising that his young colleagues inspired him the most.

Freeman J. Dyson 1923–2020: Legendary physicist, writer, and fearless intellectual explorer | PNAS

Queerer that I can imagine

by reestheskin on 16/06/2020

Comments are disabled

It’s natural, Jim. From an obituary of Julian Perry Robinson in Nature

Julian Perry Robinson (1941–2020)

In 1981, the US government publicly accused Soviet-backed forces in southeast Asia of waging toxin warfare and violating their legal obligations under the 1925 Geneva Protocol and 1972 Biological Weapons Convention. It alleged that aircraft dispersed ‘yellow rain’ containing mycotoxins that were “not indigenous to the region”. Julian Perry Robinson, working alongside biologist Matthew Meselson at Harvard University in Cambridge, Massachusetts, established that what actually fell was wild-honeybee faeces containing naturally occurring toxins. He died on 22 April, aged 78.

Prose

by reestheskin on 09/06/2020

Comments are disabled

The Nobel laureate David Hubel commented somewhere that reading most modern scientific papers was like chewing sawdust. Certainly it is rare nowadays to see the naked honesty of Watson and Crick’s classic opening paragraphs, or the melody not being drowned out by the metrical percussion.

WE wish to suggest a structure for the salt of deoxyribose nucleic acid (D.N.A.). This structure has novel features which are of considerable biological interest [emphasis added].

A structure for nucleic acid has already been proposed by Pauling and Corey1. They kindly made their manuscript available to us in advance of publication. Their model consists of three intertwined chains, with the phosphates near the fibre axis, and the bases on the outside. In our opinion, this structure is unsatisfactory for two reasons: (1) We believe that the material which gives the X-ray diagrams is the salt, not the free acid. Without the acidic hydrogen atoms it is not clear what forces would hold the structure together, especially as the negatively charged phosphates near the axis will repel each other. (2) Some of the van der Waals distances appear to be too small.

And then there is that immortal understated penultimate paragraph.

It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.

Well here is another one that impresses me even if I can claim no expertise in this domain. It is from the prestigious journal Physical Review, is 27 words in length, with one number, one equation and one reference. Via Fermat’s Library @fermatslibrary

The slow now

by reestheskin on 18/05/2020

Comments are disabled

Education is intellectual infrastructure. So is science. They have very high yield, but delayed payback. Hasty societies that can’t span those delays will lose out over time to societies that can. On the other hand, cultures too hidebound to allow education to advance at infrastructural pace also lose out.

Stewart Brand.

Almost queerer than we can imagine

by reestheskin on 24/04/2020

Comments are disabled

Some non-covid-19 recreational reading. Although the bees might be here longer than us…

Hive Mentalities | by Tim Flannery | The New York Review of Books

According to Thor Hanson’s Buzz, the relationship between bees and the human lineage goes back three million years, to a time when our ancestors shared the African savannah with a small, brownish, robin-sized bird—the first honeyguide. Honeyguides are very good at locating beehives, but they are unable to break into them to feed on the bee larvae and beeswax they eat. So they recruit humans to help, attracting them with a call and leading them to the hive. In return for the service, Africans leave a small gift of honey and wax: not enough that the bird is uninterested in locating another hive, but sufficient to make it feel that its efforts have been worthwhile. Honeyguides may have been critical to our evolution: today, honey contributes about 15 percent of the calories consumed by the Hadza people—Africa’s last hunter-gatherers—and because brains run on glucose, honey located by honeyguides may have helped increase our brain size, and thus intelligence.

Review of Buzz: The Nature and Necessity of Bees

by Thor Hanson. Basic Books.

Dan Greenberg RIP

by reestheskin on 17/04/2020

Comments are disabled

Daniel S. Greenberg (1931–2020) has died. Nice obituary about him and why he mattered in this week’s Science.

Daniel S. Greenberg (1931–2020) | Science

At the time, the idea of a journalist-written section in a publication devoted to publishing research papers was highly unusual, and so was the approach that Dan and his team took. They covered basic research policy in much the same way a business reporter would cover development of economic policy: as a set of competing interests…[emphasis added].

However, it was not greeted with universal enthusiasm. In a preface to the second edition, Dan noted that it sparked “reactions that flowed from the belief that the scientific community should be exempt from the types of journalistic inquiries that are commonplace to other segments of our society.” He called that attitude “nonsense.”

No old (nor broken) records here

by reestheskin on 21/03/2020

Comments are disabled

Richard Horton in the Lancet writes:

Imagine if the entire edifice of knowledge in medicine was built upon a falsehood. Systematic reviews are said to be the highest standard of evidence-based health care. Regularly updated to ensure that treatment decisions are built on the most up-to-date and reliable science, systematic reviews and meta-analyses are widely used to inform clinical guidelines and decision making. Powerful organisations have emerged to construct a knowledge base in medicine underpinned by the results of systematic reviews. One such organisation is Cochrane, with 11 000 members in over 130 countries. This extraordinary movement of people is justifiably passionate about the idea that it is contributing to better health outcomes for everyone, everywhere. The industry that drives the production of systematic reviews today is financed by some of the most influential agencies in medical research. Cochrane, for example, points to three funders providing over £1 million each—the UK’s National Institute for Health Research (NIHR), the US National Institutes of Health (NIH), and Australia’s National Health and Medical Research Council (NHMRC).

Well, it really is a bit late for all this soul searching. See my earlier post here ‘Mega-silliness’ (commenting on what others had already pointed out); or my Evidence Based Medicine: the Epistemology That Isn’t, written over 20 years ago; and my contribution to the wake (even if I didn’t put my hand in my pocket), Why we should let “evidence-based medicine” rest in peace. The genesis of EBM was as a cult whose foundational myth was that P values could act as a truth machine. Those followers who had originally hoped for a place in the promised afterlife soon settled for paying the bills, and EBM morphed into a career opportunity for those who found accountancy too daring. So, pace John Mayall on Jazz Blues Fusion, don’t come here to listen to an old record. I promise.

Offline: The gravy train of systematic reviews – The Lancet