It is one of dermatology’s tedious and fun facts that, in contradistinction to, say, scabies or head lice, you treat body lice by treating not the patient (directly) but the clothing. The pharmacological agent is a washing machine. But the excerpt quoted below tells you something wonderful about science: you get things out that you never expected. Spontaneous generation — not in the real world, but in the world of ideas. Well, almost.
Scientific efforts to shed light on the prehistory of clothes have received an unexpected boost from another line of research, the study of clothing lice, or body lice. These blood-sucking insects make their home mainly on clothes and they evolved from head lice when people began to use clothes on a regular basis. Research teams in Germany and the United States analysed the genomes of head and clothing lice to estimate when the clothing parasites split from the head ones. One advantage of the lice research is that the results are independent of other sources of evidence about the origin of clothes, such as archaeology and palaeoclimatology. The German team, led by Mark Stoneking at the Max Planck Institute for Evolutionary Anthropology, came up with a date of 70,000 years ago, revised to 100,000 years ago, early in the last ice age. The US team led by David Reed at the University of Florida reported a similar date of around 80,000 years ago, and maybe as early as 170,000 years ago during the previous ice age. These findings from the lice research suggest that our habit of wearing clothes was established quite late in hominin evolution.
There was an article in the FT last week, commenting on an article in JAMA here. The topic is the use of AI (or, to be fair, other machine learning techniques) to help diagnose skin disease. Google will allow people to upload their own images and will, in turn, provide “guidance” as to what they think it is.
I think the topic important, and I wrote a little editorial on this subject here a few years ago with the strikingly unoriginal title of Software is eating the clinic. For about 8-10 years I used to work in this field, but although we managed to get ‘science funding’ from the Wellcome Trust (and a little from elsewhere), and published extensively, we were unable to take it further via commercialisation. As is often the case, when you fail to get funded, you may not know why. My impression was that people did not imagine that there was a viable business model in software in this sort of area (we were looking for funds around 2012-2015). Yes, it seemed crazy to me then, too (and yes, I know, Google have not proven there is a business model). Some of the answers via NHS and Scottish funding bodies were along the lines of: come back when you prove it works, and then we will fund the research. 😤
A few days back somebody interested in digital health asked me what I thought about the recent work. Below is a lightly edited version of my email response.
If only we had been funded…. 😀. Only joking.
More accurately, late night thoughts from 26 years ago. I have no written record of my Edinburgh inaugural, but my Newcastle inaugural given in 1994 was edited and published by Bruce Charlton in the Northern Review. As I continue to sift through the detritus of a lifetime of work, I have just come across it. I haven’t looked at it for over 20 years, and it is interesting to reread it and muse over some of the (for me) familiar themes. There is plenty to criticise. I am not certain all the metaphors should survive, and I fear some examples I quote from outwith my field may not be as sound as I imply. But it is a product of its time, a time when there was some unity of purpose in being a clinical academic, when teaching, research and praxis were of a piece. No more. Feinstein was right. It is probably for the best, but I couldn’t see this at the time.
The practice of medicine is made up of two elements. The first is an ability to identify with the patient: a sense of a common humanity, of compassion. The second is intellectual, and is based on an ethic that states you must make a clear judgement of what is at stake before acting. That, without a trace of deception, you must know the result of your actions. In Leo Szilard’s words, you must “recognise the connections of things and the laws and conduct of men so that you may know what you are doing”.
This is the ethic of science. William Kingdon Clifford, the 19th century mathematician, described scientific thought as “the guide of action”: “that the truth at which it arrives is not that which we can ideally contemplate without error, but that which we may act upon without fear”.
Late last year when I was starting to think what I wanted to say in my inaugural lecture, the BBC Late Show devoted a few programmes to science. One of these concerned itself with medical practice and the opportunities offered by advances in medical science. On the one side, Professor Lewis Wolpert, a developmental biologist, and Dr Marcus Pembrey, a clinical geneticist, described how they went about their work. How, they asked, can you decide whether novel treatments are appropriate for a patient except by a judgement based on your assessment of the patient’s wishes and imperfect knowledge? Science always comes with confidence limits attached.
On the opposing side were two academic ethicists, including the barrister and former Reith Lecturer Professor Ian Kennedy. You may remember it was Kennedy in his Reith lectures who, quoting Ivan Illich, described medicine itself as the biggest threat to people’s health. The debate, or at least the lack of it, clearly showed that we haven’t moved on very far from when C P Snow (in the year I was born) gave his Two Cultures lecture. What do I mean by two cultures? Is it that people are not aware of the facts of science or new techniques?… It was recently reported in the journal Science that over half the graduates of Harvard University were unable to explain why it is warmer in summer than winter. A third of the British population still believe that the sun goes round the earth.
But, in a really crucial way, this absence of cultural knowledge is not nearly so depressing as the failure to understand the activity rather than the artefacts of science. Kennedy in a memorable phrase described knowledge as a ‘tyranny’. It is as though he wanted us back with Galen and Aristotle, safe in our dogma, our knowledge fossilised and therefore ethically safe and neutered. There is, however, with any practical knowledge always a sense of uncertainty. When you lift your foot off the ground you never quite know where it is going to come down. And, as the Red Queen says in Through the Looking-Glass, “it takes all the running you can do to stay in the same place”.
It is this relationship between practice and knowledge, and how it affects my subject, that I want to talk about. And in turn, I shall talk about clinical teaching and diagnosis, research and the treatment of skin disease.
His younger co-workers, with their zippy metabolisms and surplus collagen, started referring to him as “the elder.”
I started my dermatological career in Vienna in the mid-1980s as a guest (I am deliberately not using the cognate German term) of Prof Klaus Wolff. Vienna, for close to two hundred years, has been a Mecca for all things dermatological, and Sam Shuster, in Newcastle, thought it wise to go somewhere else for up to a year — before returning to Newcastle. The plan was to learn some clinical dermatology and see how others worked. I had a great time — Vienna is a wonderful European city — and I didn’t work too hard. I learned some clinical basics, enjoyed the music (more ECM than opera) and spent some of my time doing a little lab work, more as a technician than anything else. I knew that when I returned to Newcastle I would spend a year or so as a registrar before applying for an MRC or Wellcome Training Fellowship (and for the medics amongst you, no, I never registered for higher training). In the meantime, as well as learning some clinical dermatology, I needed to learn some cell biology.
I went to medical school in 1976 and qualified in 1982, having taken a year out to study medical statistics (with an emphasis on the medical) and epidemiology, so I hadn’t any lab or cell biological experience. It was now 1986-87 and the preceding decade had seen a revolution in what we now call molecular cell biology — or just biology(?). I needed to teach myself some. Luckily, the best textbook I have ever read — the Molecular Biology of the Cell — was published by James Watson and a bunch of other wonderful scientists in 1983, and my memory is that it was this first edition I bought. The book had attitude. The authors clearly loved their subject, and thought science was to do not so much with facts but the activity of designing and implementing experiments that whispered to you how the biological universe worked. They wanted to share that feeling with you, because one day, just perhaps, you might… On the back cover there was a picture of the authors pretending to be real superstars like the ‘fab four’ on that most famous of pedestrian-crossings in the world. (There is more on this here and here.)
In the company of a good companion (a book in this case) there is little in biology that is very difficult. If you are motivated, even the absence of a personal teacher is not too serious a drawback. You would be better off with a teacher — if the cost of a teacher were zero — but it would be wasteful to imagine that you need a teacher for a significant fraction of the time you need to spend studying. For some areas of biology, say quantitative genetics, the above statements may need tweaking a little, but the general point holds.
Almost a quarter century ago, I read a paper in PNAS on statistics by Peter Donnelly and David Balding on how to interpret DNA forensic evidence. I had studied a little statistics in my intercalated degree, but a sentence from this paper made me sit up:
We argue that the mode of statistical inference which seems to underlie the arguments of some authors, based on a hypothesis testing framework, is not appropriate for forensic identification.
The paper itself was remarkably clear even to somebody with little mathematics, and unpicking it signalled that I knew even less than I thought I knew. Several years later, it prompted me to go back and try to re-learn what little mathematics I had grasped at school, so that I might appreciate some modern genetics (and medical statistics) a little better.
Learning mathematics is different from learning biology. The absence of a teacher is more of an issue, but there are lots of historical examples showing that a good ‘primer’ with questions and answers allows many children to develop, if not high-level skills, a facility with numbers. (I am talking here about using mathematics as a toolbox to follow how one can solve well-defined problems — not push back the frontiers.) A key aspect of this is the nature of mathematical proof, and how well you can obtain feedback on your abilities by submitting to the discipline of simple exercises with unambiguous answers. I don’t think there is a direct equivalent to this in most of biology, but in the process of writing this today I see there are workbooks for the Molecular Biology of the Cell textbook. No doubt they help, but the uniqueness of the correct answer in maths is a wonderful guide and fillip.
I retired earlier this year (yes, thanks for asking, it’s wonderful), and one of the projects I had lined up was to learn a little more about a domain of human knowledge in which my ignorance had been bugging me for years. I had made some attempts in this area before — bought some books as an excuse for lack of effort — but had failed. I had found an excellent primer (in fact I bought it ten or so years ago), but speaking of the present, I have to say that I find the task hard, very hard. For me, it’s tougher than intermediate mathematics, and although there are questions at the end of each chapter there are no given answers. This is not a criticism of the book, but rather reflects the nature of the subject. A teacher, or even a bunch of fellow masochists (students, I mean), would help greatly. I make progress, but some more pedagogical infrastructure would, I feel, push me around the winding path a little faster. So, for several months I have been plodding away, mostly being disciplined, but because I have other things to do, occasionally falling off the wagon (indeed I note that I can multitask by falling off several wagons simultaneously).
All three stories are germane to how I think about undergraduate medical education and how it is far too wasteful and expensive. As for the how, that I must leave for another day very soon. Even without an exam in sight, I have to get some studying done. Spaced recall and immersion are the student’s friends.
Today is my last day of (paid) work, and of course a day that will be infamous for many more people for other more important reasons. Europe and my professional life have been intertwined for near on 40 years. In the mid 1980s I went to start my dermatological career in Vienna. I had been a student at Newcastle and done junior jobs there, as well as some research on skin physiology with Sam Shuster as an undergraduate student. Sam rightly thought I should now move somewhere else — see how others did things before returning — and he suggested Paris, or Vienna under Klaus Wolff. Vienna was, and perhaps still is, the centre of the dermatological universe, and has been since the mid 19th century. Now, even though I haven’t got very far into this post, it is a day for nostalgia, so allow me an aside: the German Literature Problem.
As I have hinted at above, in many ways there have only been two schools of dermatology: the French school and the German school. The latter has been dominant. Throughout the second half of the 19th century dermatology was a ‘German speaking’ subject. To follow it you would be wise to know German, and better still to have visited the big centres in Germany, Switzerland or Austria. And, as with much of the modern research university, German medicine and science were the blueprint for the US and then belatedly — and with typos — for England (Scotland, reasonably, had taken a slightly different path).
All of the above I knew, but when I returned to Newcastle after my first sojourn away (a later one was to Strasbourg), I naturally picked up on all these allusions to the German literature, but they were accompanied by sniggering from those who had been around longer than me. Indeed there seemed to be a ‘German Literature Problem’. Unbeknown to me, Sam had written ‘das Problem’ up in ‘World Medicine’, but World Medicine had been killed off by those from Mordor, so here is a synopsis.
The German literature seemed so vast that whenever somebody described a patient with what they were convinced must be a ‘new syndrome’, some bright spark would say that it had already been described, and that it was to be found in the German literature. Now the synoptic Handbuch der Hautkrankheiten on our shelves in the library in Newcastle ran to over 10 weighty volumes. And that was just the start. But of course only German-speaking dermatologists (and we had one) could meaningfully engage in this conversation. Dermatology is enough of a nightmare even in your own mother tongue. Up to the middle of the 20th century, however, there were indeed separate literatures in German, French and English (in the 1960s the newly formed ESDR had to sort out what language was going to be used for its presentations).
Sam’s sense of play now took over (with apologies to Shaw: nothing succeeds like excess). It appeared that all of dermatology had already been previously described, but more worryingly for the researchers, the same might be true of skin science. In his article in World Medicine he set out to describe his meta-science investigation into this strange phenomenon. Sam has an unrivalled ability to take simple abstract objects — a few straight lines, a circle, a square — and meld them into an argument in the form of an Escher print. A print that you know is both real, unreal and illegal. Imagine a dastardly difficult 5 x 5 Rubik’s cube (such as the one my colleagues recently bought me for my retirement). You move and move and move the individual facets, then check each whole face in turn. All aligned, problem solved. But then you look in the mirror: whilst the faces are all perfect in your own hands, that is not what is apparent in the mirror. This is my metaphor for Sam’s explanation. Make it any more explicit, and the German literature problem rears its head. It’s real — of a sort. Anyway, this was all in the future (which didn’t exist at that time), so let’s get back to Vienna.
Having left general medical jobs behind in Newcastle, armed with my BBC language tapes and guides, I spent a month travelling through Germany from north to south. I stayed with a handful of German medical students who I had taught in Newcastle when I was a medical registrar (a small number of such students used to spend a year with us in Newcastle). Our roles were now reversed: they were now my teachers. At the end of the month I caught the night train in Ulm, arriving in Vienna early one morning.
Vienna was majestic — stiff collared, yes — but you felt in the heart of Europe. A bit of Paris, some of Berlin and the feel of what lay further east: “Wien ist die erste Balkanstadt”. For me, it was unmistakably and wonderfully foreign.
It was of course great for music, too. No, I couldn’t afford the New Year’s Day Concerts, but there were cheap seats at the Staatsoper, more modest prices at the Volksoper, and more to my taste, some European jazz and rock music. I saw Ultravox sing — yes, what else — “Vienna” in Vienna. I saw some people from the ECM label (e.g. Eberhard Weber), a style of European jazz that has stayed with me since my mid teens. And then there was the man (for me) behind ‘The Thrill is Gone’.
I saw BB King on a double bill with Miles Davis at the Stadthalle. Two very different styles of musician. I was more into Miles Davis then, but he was not at his best (as medics in Vienna found out). I was, however, very familiar with the ‘Kings’ (BB, Freddie, Albert, etc.) after being introduced to them via their English interpreters. Clapton’s blues tone on ‘All Your Love’ with John Mayall’s Bluesbreakers still makes the hairs on my neck stand up (fraternal thanks to ‘Big Al’ for the introduction).
The YouTube video at the top of the page is wonderful (Montreux 1993), but there is a later one below, taken from Crossroads in 2010 which moves me even more. He is older, playing with a bunch of craftsmen, but all still pupils before the master.
But — I am getting there — germane to my melancholia on this day is a video featuring BB King and John Mayer. Now there is a trope that there are two groups of people who like John Mayer: girlfriends, and guitarists who understand just how bloody good he is. As EC pointed out, the problem with John Mayer is that he realises just how good he is. True.
But the banter at the beginning of the video speaks some eternal truths about craft, expertise, and the onward march of all culture — including science. Mayer plays a few BB King licks, teasing King that he is ‘stealing’ them. He continues: it was as though he was ‘stealing silverware from somebody’s house right in front of them’. King replies: “You keep that up and I’m going to get up and go”. Both know it doesn’t work that way. Whatever the provenance of the phrase ‘great artists steal, not copy’, except in the most trivial way you cannot steal or copy culture: people discover it in themselves by stealing what masters show them might be there in their pupils. Teachers just help people find what they suspect or hope is there. The baton gets handed on. The thrill goes on. And on.
My earliest conscious memory of disease and doctors was in the management of my atopic dermatitis. Here is Sam Shuster writing poetically about atopic dermatitis in ‘World Medicine’ in 1983.
A dozen years of agony; years of sleeplessness for child and parents, years of weeping, itching, scaling skin, the look and feel of which is detested.
The poverty of our treatments is made all the worse by the unfair raising of expectations: I don’t mean by obvious folk remedies; I mean medical folk remedies like the recent pseudoscientific dietary treatments which eliminate irrelevant allergens. There neither is nor ever was good evidence for a dietary mechanism. And as for cows’ milk, I would willingly drown its proponents in it. We have nothing fundamental for the misery of atopic eczema and that’s why I would like to see a real treatment—not one of those media breakthroughs, and not another of those hope raising nonsenses like breast-feeding: I mean a real and monstrously effective treatment. Not one of your P<.05 drugs the effect of which can only be seen if you keep your eyes firmly double-blind, I mean a straightforward here today and gone tomorrow job, an Aladdin’s salve—put it on and you have a new skin for old.
Nothing would please me more in the practice of clinical dermatology than never again to see a child tearing its skin to shreds and not knowing how long it will be before it all stops, if indeed it does.
Things are indeed better now, but not as much as we need: we still don’t understand the itch, nor can we easily block the neural pathways involved. Nor has anything replaced ‘World Medicine’ since its untimely murder. A glass of milk has never looked the same since, either.
You can dice the results in various ways, but software is indeed eating the world — and the clinic. The (slow) transition to this new world will be interesting and eventful. A good spectator sport for some of us. (Interesting to note that this study in Lancet Oncology received no specific funding. Hmmm).
Direct URL for this post.
Genital scabies was, to the English, “Scotch itch,” and Scotland was “Itch-land.” The pox was the Spanish or Neapolitan Disease to the French; the French Disease to the Spanish, English, and Germans; the Polish Disease to the Russians; the Portuguese Disease to the Japanese. Captain Cook was chagrined to learn that it was called the British Disease in Tahiti as, in so many words, it was in Ireland: in Ulysses the Citizen, a rabid Irish nationalist, mocks Leopold Bloom’s reference to British civilization: “Their syphilisation you mean.”
On some Swedish trains, passengers carry their e-tickets in their hands—literally. About 3,000 Swedes have opted to insert grain-of-rice-sized microchips beneath the skin between their thumbs and index fingers. The chips, which cost around $150, can hold personal details, credit-card numbers and medical records. They rely on Radio Frequency ID (RFID), a technology already used in payment cards, tickets and passports.
One of these is going to end up being sectioned at some time… waiting for the first case report. Not often I can get two puns into a three-word title.
A couple of articles from the two different domains of my professional life made me riff on some old memes. The first was an article in (I think) the Times Higher about the fraud detection software Turnitin. I do not have any firsthand experience with Turnitin (‘turn-it-in’), as most of our exams use either clinical assessments or MCQs. My understanding is that submitted summative work is uploaded to Turnitin and the text compared with the corpus of text already collected. If strong similarities are present, the work might be fraudulent. A numerical score is provided, but some interpretation is necessary, because in many domains there will be a lot of ‘stock phrases’ that are part of domain expertise, rather than evidence of cheating. How was the ‘corpus’ of text collected? Well, of course, from earlier student texts that had been uploaded.
Universities need to pay for this service, because in the age of massification, lecturers do not recognise the writing style of the students they teach. (BTW, as Graham Gibbs has pointed out, the move from formal supervised exams to course work has been a key driver of grade inflation in UK universities).
I do not know who owns the rights to the texts students submit, nor whether they are able to assert any property rights. There may be other companies out there apart from Turnitin, but you can easily see that the more data they collect, the more powerful their software becomes. If the substrate is free, then the costs relate to how powerful their algorithms are. It is easy to imagine how this becomes a monopoly. However, if copies of all the submitted texts were kept by universities, then collectively it would be easier for a challenger to enter the field. But network effects will still operate.
The other example comes from medicine rather than education. The FT ran a story about the use of ‘machine learning’ to diagnose disease from retinal scans. Many groups are working on this, but this report was about Moorfields in London. I think I read that, as the work was being commercialised, the hospital would have access to the commercial software free of charge. There are several issues here.
Although I have no expert knowledge in this particular domain, I know a little about skin cancer diagnosis using automated methods. First, the clinical material, and the annotation of that material, is absolutely rate-limiting. Second, once the system is commercialised, the more subsequent images that can be uploaded, the better you would imagine the system will become. This of course requires further image annotation, but if we are interested in improving diagnosis, we should keep enlarging the database if the costs of annotation are acceptable. As in the Turnitin example, the danger is that the monopoly provider becomes ever more powerful. Again, if the image use remains non-exclusive, then there are lower barriers to entry.
Skin gets mentions (of course), but rarely do I see the ‘epidermis’ feature outwith the professional literature. But here — with the late Christopher Hitchens’ phrase — it does.
Identity politics can bring a more thoroughgoing fragmentation. In a country as diverse as the US, the number of ethnic groups — to take just that form of identity — will always exceed the number of social classes. To vote, as Christopher Hitchens once put it, “with the epidermis”, is to invite an endless subdivision of the electorate. And this does not even reckon with the criss-crossing variables of sex, sexual preference, religion and language.
In addition to its vulnerability to spoofing, for example, there is its gross inefficiency. “For a child to learn to recognize a cow,” says Hinton, “it’s not like their mother needs to say ‘cow’ 10,000 times”—a number that’s often required for deep-learning systems. Humans generally learn new concepts from just one or two examples.
There is a nice review on Deep Learning in PNAS. The spoofing referred to is an ‘adversarial patch’ — a patch comprising an image of something else. In the example here, a mini-image of a toaster confuses the AI such that a very large banana is seen as a toaster (the paper is here on arXiv — an image is worth more than a thousand of my words).
Hinton, one of the giants of this field, is of course referring to Plato’s problem: how can we know so much given so little (input). From the dermatology perspective, the humans may still be smarter than the current machines in the real world, but pace Hinton our training sets need not be so large. But they do need to be a lot larger than n=2. The great achievement of the 19th century clinician masters was to be able to create concepts that gathered together disparate appearances, under one ‘concept’. Remember the mantra: there is no one-to-one correspondence between diagnosis and appearance. The second problem with humans is that they need continued (and structured) practice: the natural state of clinical skills is to get worse in the absence of continued reinforcement. Entropy rules.
Will things change? Yes, but radiology will fall first, then ‘lesions’ (tumours), and then rashes — the latter I suspect after entropy has had its way with me.
Genome-wide study of hair colour in UK Biobank explains most of the SNP heritability.
Michael D. Morgan, Erola Pairo-Castineira, Konrad Rawlik, Oriol Canela-Xandri, Jonathan Rees, David Sims, Albert Tenesa & Ian J. Jackson
Nature Communications paper: https://doi.org/10.1038/s41467-018-07691-z
My guess is this is likely my last ‘research paper’ (although I now choose to redefine what counts as research). But not my last ‘thinking paper’. I cannot help but contrast the sheer volume of activity with that from our original papers on red hair. Things seemed so much simpler when we were young. But it is a nice coda to a career fugue.
This is from an editorial in the NEJM, discussing the results of a trial of a synthetic peanut antigen to facilitate tolerance. Previously the ‘raw’ stuff had been shown to be useful. The synthetic version will of course cost a lot, and might be considered IPR created through regulatory arbitrage.
AR101 and other, similar products such as CA002, which is being developed by the Cambridge group, would therefore appear to have a role in initial dose escalation. The potential market for these products is believed to be billions of dollars. It is perhaps salutary to consider that in the study conducted by the Cambridge group, children underwent desensitization with a bag of peanut flour costing peanuts.
Costing peanuts: I wish I had said that.
(Professor) John Burton died recently. He worked in Newcastle before my time there, but his reputation was ever present. When I did some research on eccrine glands as a medical student, I came across some of the papers he published with Sam (Shuster) on sebum. But it was his textbook “Essentials of Medicine” that I still marvel at. A textbook that in places made you break out in laughter the way a Tom Sharpe novel did (the scene on a train was as follows: somebody quietly reading would foolishly try to suppress the inevitable convulsion of laughter — a glance at the book cover would confirm the hypothesis). I suspect subsequent editions of ‘Essentials’ were sanitised — not certain the Welder from Wigan with Warts, or the differential diagnosis of lipstick on that most prized of organs, survived.
When I actually got to meet John, what struck me most was how dull and conventional he appeared. I am not certain what I expected, but at least with Sam, when you got to meet him, his appearance betrayed his intellect and ability to prick pomposity.
Since I was in charge of organising Sam’s retirement festivities in 1992, I invited John to provide the humour. The ‘old’ seminar room was standing room only, and John’s irreverence and ability to play an audience was — and I realised it at the time — magical. Perhaps the only occasion I ever saw Sam reduced to silence. As they say in Wales about such moments, ‘I was there’.
June and July are the busiest time of year for me. It is when I update all my teaching material, and I always underestimate how long it will take me. Here is a guide to some of it. But still I need to catch up with some more, as the new students have already started.
This looks even more alarming if you factor in humidity. Human beings can tolerate heat with sweat, which evaporates and cools the skin. That is why a dry 50°C can feel less stifling than a muggy 30°C. If the wet-bulb temperature (equivalent to that recorded by a thermometer wrapped in a moist towel) exceeds 35°C, even a fit, healthy youngster lounging naked in the shade next to a fan could die in six hours.
At present, wet-bulb temperatures seldom exceed 31°C.
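For the curious, the wet-bulb figure in the excerpt can be estimated from ordinary weather readings. Here is a minimal sketch in Python using Stull’s published empirical approximation (valid roughly for air temperatures of -20 to 50°C and relative humidities above about 5%); the function name and the example numbers are my own, chosen to mirror the dry-50°C versus muggy-30°C comparison above:

```python
import math

def wet_bulb_stull(temp_c: float, rh_percent: float) -> float:
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature
    and relative humidity, using Stull's empirical fit.
    Valid roughly for -20..50 deg C and RH 5..99 %."""
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# Muggy 30 deg C at 90% humidity vs dry 50 deg C at 10% humidity
muggy = wet_bulb_stull(30, 90)  # close to the 35 deg C danger zone
dry = wet_bulb_stull(50, 10)    # hotter air, but sweat still cools you
```

On these numbers the muggy 30°C day comes out with the higher (more dangerous) wet-bulb temperature, which is exactly the excerpt’s point about why a dry 50°C can feel less stifling than a muggy 30°C.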
The first paper I ever published was on sweating. It was my entry as a medical student into dermatology, and the product of meeting Sam Shuster (the rest, as they say, is history). Sweat glands don’t get a lot of attention, but the ~3 million mini-kidneys are full of fascinating biology. Did you know you can shift more fluid through your sweat glands than you can pass urine? (Quite a thought, considering how many pints of beer some people can manage — and no, I do not have a reference for this factoid, so readers beware…)
Here are the figures for skincancer909, my online textbook of skin cancer for medical students. The site was rewritten and updated in the final quarter of last year (with videos). Usage is 80% from search, with the rest from direct links. In June there were about 4,600 sessions. Local usage (Edinburgh) is around 5%. I am pleased, but financially poorer.
Despite my professional area of interest, I always forget how fast fingernails grow. But no longer.
Thanks to plate tectonics, America and Europe are moving apart at about the speed that fingernails grow. (Thanks to Bill Bryson)
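Bryson's comparison checks out to within a factor of two. A quick sanity check using commonly quoted round figures (both approximate, and my own assumptions rather than anything in the text):

```python
# Commonly quoted approximate rates (not from the original text):
nail_mm_per_month = 3.5        # typical fingernail growth rate
atlantic_cm_per_year = 2.5     # typical Atlantic sea-floor spreading rate

nail_cm_per_year = nail_mm_per_month * 12 / 10  # mm/month -> cm/year
print(nail_cm_per_year, atlantic_cm_per_year)   # 4.2 2.5 — same order of magnitude
```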
The following is an excerpt from a review in press with Acta. You can see the full article with DOI 10.2340/00015555-2916 here
From the solar constant to thong bikinis and all stops in between.
A review of: “Sun Protection: A risk management approach.” Brian Diffey. IOP Publishing, Bristol, UK. ISBN 978-0-7503-1377-3 (ebook) ISBN 978-0-7503-1378-0 (print) ISBN 978-0-7503-1379-7 (mobi)
Leo Szilard was one of half a dozen or so physical scientists who, having attended the same Budapest gymnasium, revolutionised twentieth-century physics. In 1934, whilst working in London, he realised that if one neutron hit an atom which then released two further neutrons, a chain reaction might ensue. Fearing the consequences, he tried to keep the discovery secret by assigning the patent to the British Admiralty. In 1939, he drafted the letter, signed by Einstein, warning the then US President of the coming impact of nuclear weapons.
After the war, in revulsion at the uses to which his physics had been applied, he swapped physics for biology. There was a drawback, however. Szilard liked to think in a hot bath, and he liked to think a lot. Once his interests had turned to biology he remarked that he could no longer enjoy a long uninterrupted bath — he was forever having to leave his bath, to check some factual detail (before returning to think some more). Biology seemed to lack the deep simplifying foundations of the Queen of Sciences.
Enrico Fermi was big on back-of-the-envelope calculations. I cannot match his brain, but I like playing with simple arithmetic. Here are some notes I made several years ago after reading a paper from Mistry et al in the British Journal of Cancer on cancer incidence projections for the UK.
For melanoma we will see a doubling between now (then) and 2030; half of this reflects an increase in age-specific incidence, and half changes in the age structure of the population. Numbers of cases for the UK:
If we assume we see 15 non-melanomas (mimics) for every melanoma, the numbers of outpatient (OP) visits, with or without surgery, are as follows.
This is for melanoma. The exponent for non-melanoma skin cancer is higher, so these numbers are an underestimate of the challenge we face. Once you add in ‘awareness campaigns’, things look even worse.
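The arithmetic above can be sketched in a few lines. The baseline case count below is a hypothetical placeholder of my own (not a figure from Mistry et al.); only the ratios stated in the text (a doubling by 2030, and 15 mimics seen per melanoma) drive the multipliers:

```python
# Back-of-the-envelope sketch. The baseline is a HYPOTHETICAL placeholder,
# not a number from Mistry et al.; only the ratios from the text are used.
baseline_melanoma_per_year = 10_000               # hypothetical UK annual cases "now"

projected_2030 = 2 * baseline_melanoma_per_year   # doubling by 2030
mimics_per_melanoma = 15                          # non-melanomas seen per melanoma
op_visits = projected_2030 * (1 + mimics_per_melanoma)

print(projected_2030)  # 20000 melanomas per year
print(op_visits)       # 320000 outpatient visits per year
```

The point of the exercise is the multiplier, not the absolute numbers: each projected melanoma drags 16 outpatient visits behind it, so a doubling of incidence is a far larger service problem than the headline figure suggests.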
At present perhaps 25% of consultant dermatology posts are empty (no applicants), and training numbers — and hence future staffing, once changed working patterns are allowed for — are falling. Waiting times to see a dermatologist in parts of Wales are over a year. The only formal training many doctors receive in dermatology as undergraduates can be measured in days. Things are worse than at any time in my career. It is with relief that I say I am married to a dermatologist.
He is one of 10 case studies in Black Tudors, an enlightening and constantly surprising book about the men and women of African origin who found themselves on a cold island on the fringe of Europe amid a pale and pockmarked people.
One thing about trying to put the Internet and computing in context is that you are forced to look back at the history of other communication revolutions (pace Tim Wu, John Naughton etc). It is now a well-trodden path, but one I still find fascinating — even down to the details of how the cost of distributing images or 3D moulages affected my own specialty. The following caught my eye, or maybe my nose.
“When paper was embraced in Europe, it became arguably the continent’s earliest heavy industry. Fast-flowing streams (first in Fabriano, Italy, and then across the continent) powered massive drop-hammers that pounded cotton rags, which were being broken down by the ammonia from urine. The paper mills of Europe reeked, as dirty garments were pulped in a bath of human piss.”
No, still not finished but useable.
“On the business side, its advertising inventory, whether it is sold directly or via third parties, is made of different kinds of ads, ranging from high-value brands such as Rolex or Lexus to low-paying advertisers, like the toe fungus ads used to fill unsold spaces.”
You can’t sell news for what it costs to make – The Walkley Magazine – Medium