Frans de Waal taught the world that animals had emotions
The young male chimps at Burgers’ Zoo in Arnhem were fighting again. They were running round their island, teeth bared, screaming. Two in particular were battling until one definitively won, and the other lost. They ended up, apparently sulking, high in widely separate branches of the same tree. Then young Frans de Waal, who was observing their wars for his dissertation, saw something astonishing. One held out his hand to the other, as if to say “Let’s make it up.” In a minute they had swung down to a main fork of the tree, where they embraced and kissed.
He did not hesitate to call this what it was: reconciliation. What was more, it was essential if the group was to cohere and survive. The word, though, scandalised his tutors. Studying primates in those days, the mid-1970s, was mostly a matter of recording violence, aggression and selfishness.
Frans de Waal has died. It brings to mind the wonderful photo of the chimp and Jane Goodall eyeing each other up in the “Think Different” series.
Citation cartels help some mathematicians—and their universities—climb the rankings | Science | AAAS
Cliques of mathematicians at institutions in China, Saudi Arabia, and elsewhere have been artificially boosting their colleagues’ citation counts by churning out low-quality papers that repeatedly reference their work, according to an unpublished analysis seen by Science. As a result, their universities—some of which do not appear to have math departments—now produce a greater number of highly cited math papers each year than schools with a strong track record in the field, such as Stanford and Princeton universities.
These so-called “citation cartels” appear to be trying to improve their universities’ rankings, according to experts in publication practices. “The stakes are high—movements in the rankings can cost or make universities tens of millions of dollars,” says Cameron Neylon, a professor of research communication at Curtin University. “It is inevitable that people will bend and break the rules to improve their standing.” In response to such practices, the publishing analytics company Clarivate has excluded the entire field of math from the most recent edition of its influential list of authors of highly cited papers, released in November 2023.
Corporations tend to choose survival over morality in the absence of countervailing power.
The 2023 Nobel prizes – What they mean for higher education
Dr Katalin Karikó, joint winner of the physiology-medicine award, has received much comment in the media. Born and educated in Hungary, she has spent most of her career in the United States. But she has also held appointments in three other countries at a variety of institutions, and has most recently been senior vice president at BioNTech, a biotech company in Germany.
The debate stems from her time at the University of Pennsylvania, where she worked from 1989 to 2001, in positions ranging from scientific assistant professor to senior head of research to adjunct associate professor.
During that period, she was demoted from a tenure-track position in 1995, refused the possibility of reinstatement to the tenure track and eventually ushered into retirement in 2013.
Meanwhile, her close collaborator and fellow prize winner, Dr Drew Weissman, whom she met in 1997, remains at the University of Pennsylvania as professor of medicine, as well as being co-director of the immunology core of the Penn Center for AIDS Research and director of vaccine research in the infectious diseases division.
Some have pointed out that Karikó was working on risky or unconventional scientific themes, and that the usual funding agencies and senior academics were unable to see the promise in her work until recently; she and her colleague Weissman have since been recipients of multiple prizes. The fact that she received her doctorate from the University of Szeged in Hungary, and not from a prestigious institution in a major country, may not have helped.
None of this story is surprising or strange.
My main theme in the book, which is something I’ve discussed for a number of years in other fora, is that we are in a state where science has greater potential benefits, but greater potential downsides. And indeed, in our ever more interconnected world, there’s a genuine risk of global catastrophes, which could arise through our collective actions, as we’re seeing in the concerns about climate change and loss of biodiversity. But it could also arise from an engineered pandemic, for instance, which could be generated by ill-intended applications of biology.
I’m talking really here in the book about what I’m trying to do, that is, to measure up how much progress we have made against how much progress could be made, or is ever likely to be made.
Martin Rees explains how science might save us – Bulletin of the Atomic Scientists
Roger Searle Payne (1935–2023) | Science
He changed the world. Humanity takes longer to come aboard.
Roger Searle Payne, the biologist who pioneered studies of whale behavior and communication and advocated for their protection, died on 10 June. He was 88. Payne was widely known to both scientists and the public for his groundbreaking discovery of the songs of humpback whales.
Born on 29 January 1935 in New York City, Payne received a BA in biology from Harvard University in 1956 and a PhD in animal behavior from Cornell University in 1961. From 1966 to 1984, he served as a biology and physiology professor at The Rockefeller University in New York. In 1971, Payne founded Ocean Alliance, an organization established to study and protect whales and their environment, and he remained its director until 2021.
Initially, Payne’s research focused on auditory localization in moths, owls, and bats, but he changed course to focus on conservation and selected whales for their status as a keystone species. In 1967, he and his then-wife Katharine (Katy) first heard the distinctive sounds of the humpback whales on a secret military recording intended to detect Russian submarines off the coast of Bermuda. Payne and his collaborators, including Scott McVay and Frank Watlington, were the first to discover that male humpback whales produce complex and varied calls. Mesmerized by the recordings, Payne realized that the recurring pattern and rhythmicity constituted a song. He published his findings in a seminal Science paper in 1971. After many years and many additional recordings, he and Katy further realized that the songs varied and changed seasonally.
These hauntingly beautiful whale songs captured the public’s attention thanks to Payne’s extraordinary vision. He released an album, Songs of the Humpback Whale, in 1970 that included a booklet in English and Japanese about whale behavior and the dire situation that many species of whales faced. He recognized the power of juxtaposing the plaintive and ethereal songs of humpbacks with images of whaling.
The album remains the most popular nature recording in history, with more than two million copies sold. Humpback whale songs are now carried aboard the Voyager spacecraft as part of the signature of our planet.
The UK accounts for 2 per cent of global manufacturing and 2 per cent of global R&D. You’re not a science superpower if you do 2 per cent… You can’t go around claiming that in seven years’ time the UK is going to be a climate leader or leader in green tech, it just doesn’t make sense
The British economy needs to follow a policy of improvement, not a policy of chest-beating and claiming to be on the cusp of transformative breakthroughs.
David Edgerton, the historian of science and technology, quoted in The New Statesman, 14-20 July 2023, page 143
The great joy of science and technology is that they are, together, the one part of human culture which is genuinely and continuously progressive.
This is from the science and tech editor at the Economist, who is standing down from the role after three decades. Science: stops and starts, not continuous. And that is just for starters. Discuss!
Hugh Pennington | Deadly GAS · LRB 13 December 2022
Another terrific bit of writing by Hugh Pennington in the LRB. It is saturated with insights into a golden age of medical science.
Streptococcus pyogenes is also known as Lancefield Group A [GAS]. In the 1920s and 1930s, at the Rockefeller Institute in New York, Rebecca Lancefield discovered that streptococci could be grouped at species level by a surface polysaccharide, A, and that A strains could be subdivided by another surface antigen, the M protein.
Ronald Hare, a bacteriologist at Queen Charlotte’s Hospital in London, worked on GAS in the 1930s, a time when they regularly killed women who had just given birth and developed puerperal fever. He collaborated with Lancefield to prove that GAS was the killer. On 16 January 1936 he pricked himself with a sliver of glass contaminated with a GAS. After a day or two his survival was in doubt.
His boss, Leonard Colebrook, had started to evaluate Prontosil, a red dye made by I.G. Farben that prevented the death of mice infected with GAS. He gave it to Hare by IV infusion and by mouth. It turned him bright pink. He was visited in hospital by Alexander Fleming, a former colleague. Fleming said to Hare’s wife: ‘Hae ye said your prayers?’ But Hare made a full recovery.
Prontosil also saved women with puerperal fever. The effective component of the molecule wasn’t the dye, but another part of its structure, a sulphonamide. It made Hare redundant. The disease that he had been hired to study, funded by an annual grant from the Medical Research Council, was now on the way out. He moved to Canada where he pioneered influenza vaccines and set up a penicillin factory that produced its first vials on 20 May 1944. [emphasis added]
He returned to London after the war and in the early 1960s gave me a job at St Thomas’s Hospital Medical School. I wasn’t allowed to work on GAS. There wasn’t much left to discover about it in the lab using the techniques of the day, and penicillin was curative. [emphasis added]
Monday 17 May, 2022 – by John Naughton – Memex 1.1
John Naughton’s Quote of the Day
“If people don’t believe mathematics is simple, it is only because they don’t realise how complicated life is.”
John von Neumann
Are there limits to economic growth? It’s time to call time on a 50-year argument
Nature today revisits the publication, fifty years ago, of The Limits to Growth by the System Dynamics group at the Massachusetts Institute of Technology. The journal (then) referred to it as another whiff of doomsday.
Zoologist Solly Zuckerman, a former chief scientific adviser to the UK government, said: “Whatever computers may say about the future, there is nothing in the past which gives any credence whatever to the view that human ingenuity cannot in time circumvent material human difficulties.”
Which is surely akin to saying, there is nothing in the past to suggest we are already extinct. As for computers, there are three branches of science: theory, experiment, and computation. The song goes:
You’re obsolete, my baby
My poor old fashioned baby
I said, baby, baby, baby, you’re out of time
Well, baby, baby, baby, you’re out of time
I said, baby, baby, baby, you’re out of time
Yes you are left out, out of there without a doubt
‘Cause baby, baby, baby, you’re out of time
(Rolling Stones)
Monday 3 January, 2021 – by John Naughton – Memex 1.1
I came across this article by Zadie Smith via John Naughton.
Magical thinking is a disorder of thought. It sees causality where there is none, confuses private emotion with general reality, imposes—as Didion has it, perfectly, in “The White Album”—“a narrative line upon disparate images.” But the extremity of mourning aside, it was not a condition from which she generally suffered. Didion’s watchword was watchword. She was exceptionally alert to the words or phrases we use to express our core aims or beliefs. Alert in the sense of suspicious. Radically upgrading Hemingway’s “bullshit detector,” she probed the public discourse, the better to determine how much truth was in it and how much delusion. She did that with her own sentences, too. [emphasis added]
Escaping Corona: A Community of German Anti-Vaxxers on the Black Sea Coast – DER SPIEGEL
I wasn’t familiar with this word although even with my smattering of German (as in, I do violence to the language) I could hazard a guess.
The source was an article in Der Spiegel about Germans who have moved to Bulgaria to get away from Covid restrictions.
The apartment complex in the town of Aheloy is considered a stronghold of German-speaking corona truthers and so-called “Querdenker,” that hodgepodge of anti-government conspiracy theorists who have waged an ongoing campaign against all measures aimed at combatting the pandemic.
If you look it up in the Collins online dictionary, you find:
English Translation of “Querdenker” | Collins German-English Dictionary
Querdenker (masculine noun; feminine Querdenkerin): lateral thinker
Sounds like a great opener for an essay on epistemology 101.
Before it comes to that, we have another question: Does the unofficial Château boss describe himself as a Querdenker? The term, which, pre-COVID, used to be reserved in Germany for those who think outside the box, “has lost its original meaning,” Gelbrecht says. “True Querdenker were people like Albert Einstein or Stephen Hawking.” He primarily views himself as a savior for the desperate. “Many Germans are growing increasingly concerned that they will be excluded if they don’t get vaccinated, that they will no longer be able to take part in society and that they will be forced to have their children vaccinated.”
From a review of “The Ascent of Information” by Caleb Scharf.
Every cat GIF shared on social media, credit card swiped, video watched on a streaming platform, and website visited adds more data to the mind-bending 2.5 quintillion bytes of information that humans produce every single day. All of that information has a cost: Data centers alone consume about 47 billion watts, equivalent to the resting metabolism of more than a tenth of all the humans on the planet.
Scharf begins by invoking William Shakespeare, whose legacy permeates the public consciousness more than four centuries after his death, to show just how powerful the dataome can be. On the basis of the average physical weight of one of his plays, “it is possible that altogether the simple act of human arms raising and lowering copies of Shakespeare’s writings has expended over 4 trillion joules of energy,” he writes. These calculations do not even account for the energy expended as the neurons in our brains fire to make sense of the Bard’s language.
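The data-centre comparison is easy to sanity-check. A back-of-envelope version, taking a resting metabolic power of roughly 60 W per person and a world population of about 7.8 billion (both round figures of my own, not from the book):

```latex
\frac{4.7\times10^{10}\,\mathrm{W}}{60\,\mathrm{W\ per\ person}}
  \approx 7.8\times10^{8}\ \text{people}
  = 10\%\ \text{of}\ 7.8\times10^{9}
```

which is where the “more than a tenth of all the humans on the planet” comes from.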
How ‘creative destruction’ drives innovation and prosperity | Financial Times
From time to time, I vow not to read any more comments on the FT website. Trolls aside, I clearly live in a different universe. But then I return. It is indeed a signal-noise problem, but one in which the weighting has to be such that the fresh shoots are not overlooked. I know nothing about Paul A Myers, and I assume he lives in the US, but over the years he has given me pause for thought on many occasions. One recent example below.
Science-based innovation largely comes out of the base of 90 research universities. One can risk an over-generalization and say there are no “universities” in a non-constitutional democratic country, or authoritarian regime. Engineering institutes maybe, but not research universities. Research is serendipity and quirky; engineering is regular and reliable. Engineering loves rules; research loves breaking them. The two fields are similar but worship at different altars.
This contrast is also true of medicine and science. Medicine is regulated to hell and back — badly, often — but I like my planes that way too. But, in John Naughton’s words, if you want great research, buy Aeron chairs, and hide the costs off the balance sheet lest the accountants start discounting all the possible futures.
It is one of dermatology’s tedious and fun facts that, in contradistinction to, say, scabies or head lice, you treat body lice by treating not the patient (directly) but the clothing. The pharmacological agent is a washing machine. But the excerpt quoted below tells you something wonderful about science: you get things out that you never expected. Spontaneous generation — not in the real world — but in the world of ideas. Well, almost.
How clothing and climate change kickstarted agriculture | Aeon Essays
Scientific efforts to shed light on the prehistory of clothes have received an unexpected boost from another line of research, the study of clothing lice, or body lice. These blood-sucking insects make their home mainly on clothes and they evolved from head lice when people began to use clothes on a regular basis. Research teams in Germany and the United States analysed the genomes of head and clothing lice to estimate when the clothing parasites split from the head ones. One advantage of the lice research is that the results are independent from other sources of evidence about the origin of clothes, such as archaeology and palaeoclimatology. The German team, led by Mark Stoneking at the Max Planck Institute for Evolutionary Anthropology, came up with a date of 70,000 years ago, revised to 100,000 years ago, early in the last ice age. The US team led by David Reed at the University of Florida reported a similar date of around 80,000 years ago, and maybe as early as 170,000 years ago during the previous ice age. These findings from the lice research suggest that our habit of wearing clothes was established quite late in hominin evolution.
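The logic of such dating is, at root, the molecular clock. In its simplest textbook form (the actual studies used far more sophisticated likelihood and coalescent methods), the time since two lineages split follows from the observed per-site sequence divergence d and an assumed per-site mutation rate μ per year:

```latex
t \approx \frac{d}{2\mu}
```

The factor of two appears because mutations accumulate independently along both lineages after the split; most of the uncertainty in the quoted dates comes from calibrating μ.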
Journey to the Edge of Reason by Stephen Budiansky — ruthless logic | Financial Times
Ever since I read of how Gödel’s work has rendered decades of work by Bertrand Russell and others void, Gödel has fascinated me. Not that I can follow the raw proofs. But his work speaks of a wonderful Platonic world that is hard not to fall in love with. Two quotes: the first is new to me, the second, sadly not.
Einstein sponsored his US citizenship, which Gödel almost torpedoed by telling the judge that he had found a logical inconsistency in the constitution that would allow a person to establish a dictatorship in America.
His end, when it came, was tragic. His paranoia grew and he became convinced that his food was being poisoned. When this had happened earlier in his life, his wife had managed to taste-test and spoon-feed him back to health, but this time she too was ill, and in January 1978 he died in hospital, curled into a foetal position and weighing only 65 pounds.
Jonathan Flint · Testing Woes · LRB 6 May 2021
Terrific article from Jonathan Flint in the LRB. He is an English psychiatrist and geneticist (mouse models of behaviour) based at UCLA but, like many, has put his hand to other domains (beyond depression). He writes about Covid-19:
Perhaps the real problem is hubris. There have been so many things we thought we knew but didn’t. How many people reassured us Covid-19 would be just like flu? Or insisted that the only viable tests were naso-pharyngeal swabs, preferably administered by a trained clinician? Is that really the only way? After all, if Covid-19 is only detectable by sticking a piece of plastic practically into your brain, how can it be so infectious? We still don’t understand the dynamics of virus transmission. We still don’t know why around 80 per cent of transmissions are caused by just 10 per cent of cases, or why 2 per cent of individuals carry 90 per cent of the virus. If you live with someone diagnosed with Covid-19, the chances are that you won’t be infected (60 to 90 per cent of cohabitees don’t contract the virus). Yet in the right setting, a crowded bar for example, one person can infect scores of others. What makes a superspreader? How do we detect them? And what can we learn from the relatively low death rates in African countries, despite their meagre testing and limited access to hospitals?
That we are still scrambling to answer these questions is deeply worrying, not just because it shows we aren’t ready for the next pandemic. The virus has revealed the depth of our ignorance when it comes to the biology of genomes. I’ve written too many grant applications where I’ve stated confidently that we will be able to determine the function of a gene with a DNA sequence much bigger than that of Sars-CoV-2. If we can’t even work out how Sars-CoV-2 works, what chance do we have with the mammalian genome? Let’s hope none of my grant reviewers reads this.
Medicine is always messier than people want to imagine. It is a hotchpotch of kludges. For those who aspire to absolute clarity, it should be a relief that we manage effective action based on such a paucity of insight. Cheap body-hacks sometimes work. But the worry remains.
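Flint’s 80/10 figure is the signature of overdispersed transmission, standardly modelled with a negative-binomial offspring distribution (the approach of Lloyd-Smith and colleagues’ superspreading work). A minimal simulation sketch; the values R0 = 2.5 and dispersion k = 0.1 are illustrative assumptions of mine, not estimates from the article:

```python
import numpy as np

rng = np.random.default_rng(42)

# Each case infects a negative-binomial number of secondary cases with
# mean R0 and dispersion k; small k means transmission is concentrated
# in a few superspreaders. NumPy's parametrisation has mean n*(1-p)/p,
# so n = k and p = k / (k + R0) give mean R0 with dispersion k.
R0, k = 2.5, 0.1          # illustrative values, not fitted estimates
n_cases = 100_000
secondary = rng.negative_binomial(n=k, p=k / (k + R0), size=n_cases)

# Share of all transmissions caused by the most infectious 10% of cases.
top_decile = np.sort(secondary)[::-1][: n_cases // 10]
share = top_decile.sum() / secondary.sum()
print(f"mean secondary infections: {secondary.mean():.2f}")
print(f"transmissions from top 10% of cases: {share:.0%}")
```

With k this small, the top decile of cases typically accounts for roughly 80 per cent of onward transmission; push k towards 1 and transmission spreads out far more evenly.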
Tyler Cowen says some interesting things in a Bloomberg article, Why Economics Is Failing Us. I don’t think his comments are limited to the economics domain.
Why Economics Is Failing Us – Bloomberg
Economics is one of the better-funded and more scientific social sciences, but in some critical ways it is failing us. The main problem, as I see it, is standards: They are either too high or too low. In both cases, the result is less daring and creativity.
Consider academic research. In the 1980s, the ideal journal submission was widely thought to be 17 pages, maybe 30 pages for a top journal. The result was a lot of new ideas, albeit with a lower quality of execution. Nowadays it is more common for submissions to top economics journals to be 90 pages, with appendices, robustness checks, multiple methods, numerous co-authors and every possible criticism addressed along the way.
There is little doubt that the current method yields more reliable results. But at what cost? The economists who have changed the world, such as Adam Smith, John Maynard Keynes or Friedrich Hayek, typically had brilliant ideas with highly imperfect execution. It is now harder for this kind of originality to gain traction. Technique stands supreme and must be mastered at an early age, with some undergraduates pursuing “pre-docs” to get into a top graduate school.
Sam Shuster, before I departed for Strasbourg, warned me in a similar vein, with reference to The Art of War by Sun Tzu:
Even the mystique of wisdom turns out to be technique. But if today must be learning technique, don’t leave the tomorrow of discovery too long.
I would say I heard the message but didn’t listen carefully enough. As befits an economist, Cowen warns us that there is no free lunch.
For some reason — COVID of course — I keep coming back to perhaps the greatest vaccine success ever: the eradication of smallpox (here, here and here). But the figures quoted by Scott Galloway make you sit up and notice both the magic — and value — of science.
International bodies are immolated. Consider the World Health Organisation. Mr Trump’s decision to pull America out of the WHO in the midst of a pandemic (reversed under President Joe Biden) was galling, particularly as the WHO is responsible for one of humanity’s greatest public-health accomplishments: the eradication of smallpox in the 1970s. To appreciate the magnitude of this, Google images of “smallpox” and glimpse the horror that once killed millions each year. It was a victory for co-operative, state-funded projects and it cost a mere $300m. By one estimate, America, the largest contributor, recoups that value every 26 days from savings in vaccinations, lost work and other costs. [emphasis added]
Thirty years ago [now 40], scientists who studied climate change, and I am one of them, tended to have long hair and very colourful socks. We were regarded as harmless but irrelevant. But the serendipitous investment in their work revealed processes that we now recognise as threatening the future of human society, and the successors to those scientists are playing a crucial role in assessing how we need to adapt.
I think you could see the same dress sense in the golden ages of molecular biology and computing.
Another snippet from a wonderful article (and previous aside).
Or not, as the case may be.
Smallpox is the greatest success story in the history of medicine. It once took huge numbers of lives — as many as half a billion people in the 20th century alone — and blinded and disfigured many more.
So writes the distinguished historian of science, Steven Shapin, in the LRB (A Pox on the Poor, February 4, 2021). He is reviewing The Great Inoculator: The Untold Story of Daniel Sutton and His Medical Revolution.
In historical times you had a one in three chance of getting smallpox, and, if you got it, the case-fatality was 20%. Some outbreaks, however, had a case-fatality of 50% and, unlike Covid-19, its preferred targets were children.
My exposure to smallpox was (thankfully) limited. My mother told me that there was an outbreak in South Wales and the West of England when I was around five or six. There was widespread vaccination, but kids like me, with bad eczema, were spared, with the parent advised to ‘shield’ the child indoors (my sympathies go to my mother). (The risk was from the grandly named, but sometimes fatal, Kaposi’s varicelliform reaction, which was due to the vaccinia virus — not smallpox — running riot on the diseased skin.)
As a med student, I remember learning how to distinguish the cutaneous lesions of smallpox from chicken pox. Smallpox was declared eradicated in 1980, but, as a dermatology registrar, seeing the occasional adult with chickenpox who seemed so ill (in comparison with kids), I often had doubts that I had to reason away. Perhaps those stores held by the US and USSR were not so secure…
Jenner and smallpox vaccination go together in popular accounts, but the history of this particular clinical discovery is much older and richer — at least to me.
As ever, in medicine, and in particular for anything involving the skin, the terminology is confusing. The Latin name for cowpox is Variolae vaccinae, meaning the pox from the cow (vacca). It was Pasteur who subsequently honoured Jenner by deciding that all forms of inoculation be called vaccination.
Edward Jenner took advantage of the already-known fact that whilst milkmaids tended to be afflicted with the far milder cowpox virus, they rarely suffered from the more serious smallpox (they are different, but related, viruses). Jenner, in 1796, inoculated an eight-year-old boy with the pus from a milkmaid’s cowpox sore. After being exposed to smallpox material the boy appeared immune, in that he did not suffer adversely when subsequently exposed to smallpox.
Once Jenner’s method was accepted as safe, Acts of Parliament introduced free vaccination in 1840, and vaccination became obligatory in 1853.
I had never been quite certain of the distinction between inoculation and vaccination, but there is history here too. Shapin writes that the term inoculation was borrowed from horticulture — the grafting of a bud (or ‘eye’) to propagate a plant (something I was taught how to do in horticulture lessons when I was aged 11, at school in Cardiff, by Brother Luke, who, I thought, was so old he might have been a contemporary of Jenner). Why the name is apt is explained below.
Before vaccination, inoculation was actually meant to give you a direct form of smallpox (this was also referred to as variolation, after variola, the term for smallpox). The source material, again, was from a lesion of somebody with smallpox. The recipient, it was hoped, would develop a milder version of smallpox. Shapin writes:
The contract with the inoculator was to accept a milder form of the disease, and a lower chance of death, in exchange for a future secured from the naturally occurring disease, which carried a high chance of killing or disfiguring.
Shapin tells us that Lady Mary Wortley Montagu, when holed up with her husband in Constantinople in 1717, heard stories about how such ‘in-grafting’ was in widespread use by the ‘locals’. She was scarred from smallpox, and therefore she had the procedure carried out on her then five-year-old son. The needle was blunt and rusty, but her son suffered just a few spots and the procedure was judged a success. He was now immune to smallpox.
Not surprisingly, the story goes back further: inoculation was folk medicine practice in Pembrokeshire as early as 1600, and the Chinese had been blowing dried, ground-up smallpox material up the nose for many centuries.
The London medical establishment were apparently not too impressed with the non-British origin of such a scientific advance, nor with its apparent simplicity (and hence low cost). So they made the procedure much more complicated, with specific diets being required, along with advice on behaviour and, of course, blood-lettings and laxatives, all in addition to not just a ‘prick’ but a bigger incision (payment by the inch). The physicians’ ‘fees’ no doubt rose in parallel. Not a bad business model, until…
The London physicians’ ‘add-ons’ put the treatment costs of inoculation out of reach of most of the population, restricting it, for decades, to the ‘medical carriage trade’. Along comes Daniel Sutton, a provincial barber-surgeon, with no Latin or Greek, no doubt, who effectively industrialised the whole process, making it both more profitable and cheaper for the customer.
Based in a village near Chelmsford, he inoculated tens of thousands locally. The method was named after him, the Suttonian Method. On one day he inoculated 700 persons. Incisions (favoured by the physicians) were replaced by the simpler prick, and patients were not confined, but instead told to go out in the fresh air (day-case, anybody?). Product differentiation was of course possible: spa-like pampering in local accommodation was available for the top end of the market, with a set of sliding fees depending on the degree of luxury.
Shapin writes:
Splitting the market and niche pricing were aspects of Sutton’s business success, but so too was control of supply. The extended Sutton clan could satisfy a significant chunk of provincial demand, but Daniel also worked out a franchising system, which ‘authorised’ more than fifty partners throughout Britain and abroad to advertise their use of the ‘Suttonian System’ — provided they paid fees for Sutton-compounded purgatives, kicked back a slice of their take, and kept the trade secrets. Control was especially important, since practically anyone could, and did, set up as an inoculator. The Suttons themselves had become surgeons through apprenticeship, but apothecaries, clergymen, artisans and farmers were inoculating, and sometimes parents inoculated their own children. The profits of the provincial press were considerably boosted as practitioners advertised their skills at inoculation and their keen prices. Daniel went after competitors — including his own father-in-law and a younger brother — with vigour, putting it about that the Suttonian Method depended on a set of closely held secrets, to which only he and his approved partners had access. His competitors sought to winkle out these secrets, occasionally pouncing on Sutton’s patients to quiz them about their experiences.
Sutton admitted that he had ‘lost’ five out of forty thousand patients (due to smallpox). He offered a prize of 100 guineas to anybody who could prove he had ever lost a patient due to inoculation, or that any patient got smallpox a second time. More confident, and perhaps more generous, than modern Pharma, I think. By 1764, Sutton had an annual income of 6300 guineas — over one million sterling in today’s money.
In the middle of the pandemic, I got an e-mail asking whether I had access to data from the experiments behind a paper I’d published in 2014. Three months later, I requested that the paper be retracted. The experience has not left me bitter: if anything, it brought me back to my original motivation for doing research.
This is a disarmingly honest piece (in the journal Nature) about how mistakes in the analysis of complicated data sets caused inappropriate conclusions, leading, in this case, to a retraction of a manuscript.
As a student, I was even told never to attempt to replicate before I publish. That is not a career I would want — luckily, my PhD adviser taught me the opposite.
John Ziman warned over 20 years ago in Real Science that we were entering post-academic science. Here are some words of his from an article in Nature.
What is more, science is no longer what it was when [Robert] Merton first wrote about it. The bureaucratic engine of policy is shattering the traditional normative frame. Big science has become a novel way of life, with its own conventions and practices. What price now those noble norms? Tied without tenure into a system of projects and proposals, budgets and assessments, how open, how disinterested, how self-critical, how riskily original can one afford to be?
As the economists are fond of saying: institutions matter. As do incentives. Precious metals can be corrupted, and money — in the short term — made.
Mark Zuckerberg is what happens when you replace civics with computer science.
New Year Monday Note. | Jean-Louis Gassée | Jan, 2021
In the shower, all ideas look good.
Universities challenged: critical theory and culture wars | Financial Times
A comment about the above article:
As a full professor in a similar situation, a humanities department in a British teaching factory (sorry, major research university), I completely agree with Musidorus.
The Economist | Awesome, weird and everything else
Ironically, some old gender stereotypes may now be helping girls. When girls are toddlers they are read to more than boys. Their fathers are five times more likely to sing or whistle to them and are more likely to speak to them about emotions, including sadness. Their mothers are more likely to use complex vocabulary with them. Most of this gives girls a leg up in a world that increasingly prizes “soft skills”. Girls still have less leisure time than boys, but nowadays that is primarily because they spend more time on homework and grooming, rather than an unfair division of chores. And in the time left for themselves they have far more freedom.
“The one good thing about COVID-19 is that it’s good for nature and the environment and dolphins,” says Sarah, “but I wish it wouldn’t kill so many people in the process.”
He had little respect for the professors of his time, telling a friend in 1735 that “there is nothing to be learnt from a Professor, which is not to be met with in Books”. He did not graduate.
Nor did I ever submit my PhD. As David Hubel once said, the great advantage of an MD degree was (then) being able to avoid having to gain a PhD credential.
How mRNA went from a scientific backwater to a pandemic crusher | WIRED UK
For decades, Katalin Karikó’s work into mRNA therapeutics was overlooked by her colleagues. Now it’s at the heart of the two leading coronavirus vaccines
By the mid 1990s, Karikó’s bosses at UPenn had run out of patience. Frustrated with the lack of funding she was generating for her research, they offered the scientist a bleak choice: leave or be demoted. It was a demeaning prospect for someone who had once been on the path to a full professorship. For Karikó’s dreams of using mRNA to create new vaccines and drugs for many chronic illnesses, it seemed to be the end of the road… ”It was particularly horrible as that same week, I had just been diagnosed with cancer,” said Karikó. “I was facing two operations, and my husband, who had gone back to Hungary to pick up his green card, had got stranded there because of some visa issue, meaning he couldn’t come back for six months. I was really struggling, and then they told me this.”
Karikó has been at the helm of BioNTech’s Covid-19 vaccine development. In 2013, she accepted an offer to become Senior Vice President at BioNTech after UPenn refused to reinstate her to the faculty position she had been demoted from in 1995. “They told me that they’d had a meeting and concluded that I was not of faculty quality,” she said. ”When I told them I was leaving, they laughed at me and said, ‘BioNTech doesn’t even have a website.’”
Donald Knuth is a legend amongst computer scientists.
I have been a happy man ever since January 1, 1990, when I no longer had an email address. I’d used email since about 1975, and it seems to me that 15 years of email is plenty for one lifetime. Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration. I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don’t have time for such study. [emphasis added]
On retirement:
I retired early because I realized that I would need at least 20 years of full-time work to complete The Art of Computer Programming (TAOCP), which I have always viewed as the most important project of my life.
Being a retired professor is a lot like being an ordinary professor, except that you don’t have to write research proposals, administer grants, or sit in committee meetings. Also, you don’t get paid.
My full-time writing schedule means that I have to be pretty much a hermit. The only way to gain enough efficiency to complete The Art of Computer Programming is to operate in batch mode, concentrating intensively and uninterruptedly on one subject at a time, rather than swapping a number of topics in and out of my head. I’m unable to schedule appointments with visitors, travel to conferences or accept speaking engagements, or undertake any new responsibilities of any kind.
John Baez is indeed a relative of that other famous J(oan) Baez. I used to read his blog avidly.
The great challenge at the beginning of one’s career in academia is to get tenure at a decent university. Personally I got tenure before I started messing with quantum gravity, and this approach has some real advantages. Before you have tenure, you have to please people. After you have tenure, you can do whatever the hell you want — so long as it’s legal. And if you teach well, your department doesn’t put a lot of pressure on you to get grants. (This is one reason I’m happier in a math department than I would be in a physics department. Mathematicians have more trouble getting grants, so there’s a bit less pressure to get them.)
The great thing about tenure is that it means your research can be driven by your actual interests instead of the ever-changing winds of fashion. The problem is, by the time many people get tenure, they’ve become such slaves of fashion that they no longer know what it means to follow their own interests. They’ve spent the best years of their life trying to keep up with the Joneses instead of developing their own personal style! So, bear in mind that getting tenure is only half the battle: getting tenure while keeping your soul is the really hard part. [emphasis added]
Scientists fear that ‘covidization’ is distorting research
Scientists straying from their field of expertise in this way is an example of what Nathan Ballantyne, a philosopher at Fordham University in New York City, calls “epistemic trespassing”. Although scientists might romanticize the role and occasional genuine insight of an outsider — such as the writings of physicist Erwin Schrödinger on biology — in most cases, he says, such academic off-piste manoeuvrings dump non-experts head-first in deep snow. [emphasis added]
But I do love the language…
Haack, Susan, Not One of the Boys: Memoir of an Academic Misfit
Susan Haack is a wonderfully independent English-born philosopher who loves to roam, casting light wherever her interest takes her.
Better ostracism than ostrichism
Moreover, I have learned over the years that I am temperamentally resistant to bandwagons, philosophical and otherwise; hopeless at “networking,” the tit-for-tat exchange of academic favors, “going along to get along,” and at self-promotion;
That I have very low tolerance for meetings where nothing I say ever makes any difference to what happens; and that I am unmoved by the kind of institutional loyalty that apparently enables many to believe in the wonderfulness of “our” students or “our” department or “our” school or “our” university simply because they’re ours.
Nor do I feel what I think of as gender loyalty, a sense that I must ally myself with other women in my profession simply because they are women—any more than I feel I must ally myself with any and every British philosopher simply because he or she is British. And I am, frankly, repelled by the grubby scrambling after those wretched “rankings” that is now so common in philosophy departments. In short, I’ve never been any good at academic politicking, in any of its myriad forms.
And on top of all this, I have the deplorable habit of saying what I mean, with neither talent for nor inclination to fudge over disagreements or muffle criticism with flattering tact, and an infuriating way of seeing the funny side of philosophers’ egregiously absurd or outrageously pretentious claims — that there are no such things as beliefs, that it’s just superstitious to care whether your beliefs are true, that feminism obliges us to “reinvent science and theorizing,” and so forth.
The Economist | Citizen of the world
From a wonderful article in the Economist
As Michael Massing shows vividly in “Fatal Discord: Erasmus, Luther and the Fight for the Western Mind” (2018), the growing religious battle destroyed Erasmianism as a movement. Princes had no choice but to choose sides in the 16th-century equivalent of the cold war. Some of Erasmus’s followers reinvented themselves as champions of orthodoxy. The “citizen of the world” could no longer roam across Europe, pouring honeyed words into the ears of kings. He spent his final years holed up in the free city of Basel. The champion of the Middle Way looked like a ditherer who was incapable of making up his mind, or a coward who was unwilling to stand up to Luther (if you were Catholic) or the pope (if you were Protestant).
The test of being a good Christian ceased to be decent behaviour. It became fanaticism: who could shout most loudly? Or persecute heresy most vigorously? Or apply fuel to the flames most enthusiastically?
And in case there is any doubt about what I am talking about.
In Britain, Brexiteers denounce “citizens of the world” as “citizens of nowhere” and cast out moderate politicians with more talent than they possess, while anti-Brexiteers are blind to the excesses of establishment liberalism. In America “woke” extremists try to get people sacked for slips of the tongue or campaign against the thought crimes of “unconscious bias”. Intellectuals who refuse to join one camp or another must stand by, as mediocrities are rewarded with university chairs and editorial thrones. [emphasis added]
As Erasmus might have said: ‘Amen’.
The history of science is the history of rejected ideas (and manuscripts). One example I always come back to is the original work of John Wennberg and colleagues on spatial differences in ‘medical procedures’ and the idea that it is not so much medical need that dictates the number of procedures, but that it is the supply of medical services. Simply put: the more surgeons there are, the more procedures that are carried out1. The deeper implication is that many of these procedures are not medically required — it is just the billing that is needed: surgeons have mortgages and tuition loans to pay off. Wennberg and colleagues at Dartmouth have subsequently shown that a large proportion of the medical procedures or treatments that doctors undertake are unnecessary2.
Wennberg’s original manuscript was rejected by the New England Journal of Medicine (NEJM) but subsequently published in Science. Many of us would rate Science above the NEJM, but there is a lesson here about signal and noise, and how many medical journals in particular obsess over procedure and status at the expense of nurturing originality.
Angus Deaton and Anne Case, two economists, the former with a Nobel Prize to his name, tell a similar story. Their recent work has been on the so-called Deaths of Despair — where mortality rates for subgroups of the US population have increased3. They relate this to educational levels (the effects are largely on those without a college degree) and other social factors. The observation is striking for an advanced economy (although Russia had historically seen increased mortality rates after the collapse of communism).
Coming back to my opening statement, Deaton is quoted in the THE:
The work on “deaths of despair” was so important to them that they [Deaton and Case] joined forces again as research collaborators. However, despite their huge excitement about it, their initial paper, sent to medical journals because of its health focus, met with rejections — a tale to warm the heart of any academic whose most cherished research has been knocked back.
When the paper was first submitted it was rejected so quickly that “I thought I had put the wrong email address. You get this ping right back…‘Your paper has been rejected’.” The paper was eventually published in Proceedings of the National Academy of Sciences, to a glowing reception. The editor of the first journal to reject the paper subsequently “took us for a very nice lunch”, adds Deaton.
Another medical journal rejected it within three days with the following justification:
The editor, he says, told them: “You’re clearly intrigued by this finding. But you have no causal story for it. And without a causal story this journal has no interest whatsoever.”
(‘no interest whatsoever’ — the arrogance of some editors).
Deaton points out that this is a problem not just for medical journals but in economics journals, too; he thinks the top five economics journals would have rejected the work for the same reason.
“That’s the sort of thing you get in economics all the time,” Deaton goes on, “this sort of causal fetish… I’ve compared that to calling out the fire brigade and saying ‘Our house is on fire, send an engine.’ And they say, ‘Well, what caused the fire? We’re not sending an engine unless you know what caused the fire.’”
It is not difficult to see the reasons for the fetish on causality. Science is not just a loose-leaf book of facts about the natural or unnatural world, nor is it just about A/B testing or theory-free RCTs, or even just ‘estimation of effect sizes’. Science is about constructing models of how things work. But sometimes the facts are indeed so bizarre in the light of previous knowledge that you cannot ignore them because without these ‘new facts’ you can’t build subsequent theories. Darwin and much of natural history stands as an example, here, but my personal favourite is that provided by the great biochemist Erwin Chargaff in the late 1940s. Wikipedia describes the first of his ‘rules’.
The first parity rule was that in DNA the number of guanine units is equal to the number of cytosine units, and the number of adenine units is equal to the number of thymine units.
Now, in one sense a simple observation (C=G and A=T), with no causal theory. But run the clock on to Watson and Crick (and others), and see how this ‘fact’ gestated an idea that changed the world.
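For anyone who wants the ‘fact’ stated operationally: a few lines of code suffice to check the parity rule on any double-stranded sequence. The input below is a made-up toy strand plus its complement, purely for illustration, not real genomic data:

```python
from collections import Counter

def chargaff_ratios(dna: str) -> dict:
    """Return base counts and the A/T and G/C ratios for a DNA string.

    Chargaff's first parity rule: in double-stranded DNA, #A == #T and
    #G == #C, because the two strands pair A with T and G with C.
    """
    counts = Counter(dna.upper())
    return {
        "counts": {base: counts[base] for base in "ATGC"},
        "A/T": counts["A"] / counts["T"],
        "G/C": counts["G"] / counts["C"],
    }

# Toy example: one strand concatenated with its complement stands in
# for a duplex, so the ratios come out exactly 1.0.
strand = "ATGCGTAACCGTAAA"
complement = strand.translate(str.maketrans("ATGC", "TACG"))
print(chargaff_ratios(strand + complement))
```

The observation itself needed no causal story; the story (base pairing) arrived later and turned the ratio into a structural necessity.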
From this week’s Economist | Breaking through
Yet too little capital is being channelled into innovation. Spending on R&D has three main sources: venture capital, governments and energy companies. Their combined annual investment into technology and innovative companies focused on the climate is over $80bn. For comparison, that is a bit more than twice the R&D spending of a single tech firm, Amazon.
Market and state failure may go together. Which brings me back to Stewart Brand’s idea of Pace Layering:
Education is intellectual infrastructure. So is science. They have very high yield, but delayed payback. Hasty societies that can’t span those delays will lose out over time to societies that can. On the other hand, cultures too hidebound to allow education to advance at infrastructural pace also lose out.
Pace Layering: How Complex Systems Learn and Keep Learning
I won’t even mention COVID-19.
I came across a note in my diary from around fifteen years ago. It was (I assume) after receiving a grant rejection. For once, I sort of agreed with the funder’s decision1. I wrote:
My grant was trivial, at least in one sense. Niels Bohr always said (or words to that effect) that the job of science was to reduce the profound to the trivial. The ‘magical’ would be made the ordinary of the everyday. My problem was that I started with the trivial.
As for the merits of review: It’s the exception that proves the rule.
Nice article in the Economist on how our ideas about speciation have been revised and updated. And not just for those animals but for humans too. In their words:
To be human, then, is to be a multispecies mongrel.
That “scientific management” bungled the algorithm for children’s exam results verifies a maxim attributed to J.R. Searle, an American philosopher: if you have to add “scientific” to a field, it probably ain’t.
A.D. Pellegrini in a letter to the Economist.
I have written elsewhere about this in medicine and science. We used to have physiology, but now some say physiological sciences; we used to have pharmacology, but now often see pharmacological sciences1. And as for medicine, neurology and neurosurgery used to be just fine, but then the PR and money grabbing started so we now have ‘clinical neuroscience’ — except it isn’t. As Herb Simon pointed out many years ago, the professions and professional practice always lose out in the academy.
Alas, there will be no more new ones of these, as arguably the greatest of modern biology’s experimentalists, Sydney Brenner, passed away last year. One of his earlier quotes — the source I cannot find at hand — was that it is important in science to be out of phase. You can be ahead of the curve of fashion or possibly, better still, be behind it. But stay out of phase. So, no apologies for being behind the curve on these ones, which I have just come across.
Sydney Brenner remarked in 2008, “We don’t have to look for a model organism anymore. Because we are the model organisms.”
Sydney Brenner has said that systems biology is “low input, high throughput, no output” biology.
Quoted in The science and medicine of human immunology | Science
Image source and credits via WikiCommons