According to a helpful app on my phone that I like to think acts as a brake on my sloth, I retired 313 days ago. One of the reasons I retired was so that I could get some serious work done; I increasingly felt that professional academic life was incompatible with the sort of academic life I signed up for. If you read my previous post, you will see this was not the only reason, but since I have always been more of an academic than clinician, my argument still stands.
Over twenty years ago, my friend and former colleague, Bruce Charlton, observed wryly that academics felt embarrassed — as though they had been caught taking a sly drag round the back of the respiratory ward — if they were surprised in their office and found only to be reading. No grant applications open; no Gantt charts being followed; no QA assessments being written. Whatever next.
I thought about retirement from two frames of reference. The first was about finding reasons to leave. After all, until I was about 50, I never imagined that I would want to retire. I should therefore be thrilled that I need not be forced out at the old mandatory age of 65. The second was about finding reasons to stay, or better still, ‘why keep going to work?’. Imagine you had a modest private income (aka a pension): what would belonging to an institution as a paid employee offer beyond what is achievable as a private scholar or an emeritus professor? Forget sunk cost; why bother to move from my study?
Many answers straddle both frames of reference, and will be familiar to those within the universities as well as to others outwith them. Indeed, there is a whole new genre of blogging about the problems of academia, and employment prospects within it (see alt-ac or quit-lit for examples). Sadly, many posts are from those who are desperate to the point of infatuation to enter the academy, but where the love is not reciprocated. There are plenty more fish in the sea, as my late mother always advised. But looking back, I cannot help but feel some sadness at the changing wheels of fortune for those who seek the cloister. I think it is an honourable profession.
Many, if not most, universities are very different places to work in from those of the 1980s when I started work within the quad. They are much larger, they are more corporatised and hierarchical and, in a really profound sense, they are no longer communities of scholars or places that cherish scholarly reason. I began to feel much more like an employee than I ever used to, and yes, that bloody term, line manager, got ever more common. I began to find it harder and harder to characterise universities as academic institutions, although from my limited knowledge, in the UK at least, Oxbridge still manage better than most1. Yes, universities deliver teaching (just as Amazon or DHL deliver content), and yes, some great research is undertaken in universities (easy KPIs, there), but their modus operandi is not that of a corpus of scholars and students, but rather increasingly bends to the ethos of many modern corporations that self-evidently are failing society. Succinctly put, universities have lost their faith in the primacy of reason and truth, and failed to wrestle sufficiently with the constraints such a faith places on action — and on the bottom line.
Derek Bok, one of Harvard’s most successful recent Presidents, wrote words to the effect that universities appear to always choose institutional survival over morality. There is an externality to this, which society ends up paying. Wissenschaft als Beruf is no longer in the job descriptions or the mission statements2.
A few years back, via a circuitous friendship, I attended a graduation ceremony at what is widely considered one of the UK’s finest city universities3. This friend’s son was graduating with a Masters. All the pomp was rolled out and I, and the others present, were given an example of hawking worthy of an East End barrow boy (‘world-beating’ blah blah…). Pure selling, with the market being overseas students: please spread the word. I felt ashamed for the Pro Vice Chancellor, who knew much of what he said was untrue. There is an adage that being an intellectual presupposes a certain attitude to the idea of truth, rather than a contract of employment; that intellectuals should aspire to be protectors of integrity. It is not possible to choose one belief system one day, and act on another the next.
The charge sheet is long. Universities have fed off cheap money — tax-subsidised student loans — with promises about social mobility that their own academics have shown to be untrue. The Russell Group, in particular, traducing what Humboldt said about the relation between teaching and research, have sought to diminish teaching in order to subsidise research, or, alternatively, claimed a phoney relation between the two. As for the “student experience”, as one seller of bespoke essays argued4, his business model depended on the fact that in many universities no member of staff could recognise the essay style of a particular student. Compare that with tuition in the sixth form. Universities have grown more and more impersonal, and yet claimed a model of enlightenment that depends on personal tuition. Humboldt did indeed say something about this:
“[the] goals of science and scholarship are worked towards most effectively through the synthesis of the teacher’s and the students’ dispositions”.
As the years have passed by, it has seemed to me that universities are playing intellectual whack-a-mole, rather than re-examining their foundational beliefs in the light of what they offer and what others may offer better. In the age of Trump and mini-Trumps, more than ever, we need that which universities once nurtured and protected. It’s just that they don’t need to do everything, nor are they for everybody, nor are they suited to solving all of humankind’s problems. As has been said before, ask any bloody question and the universal answer is ‘education, education, education’. It isn’t.
That is a longer (and more cathartic) answer to my questions than I had intended. I have chosen not to describe the awful position that most UK universities have found themselves in at the hands of hostile politicians, nor the general cultural assault by the media and others on learning, rigour and nuance. The stench of money is the accelerant of what seeks to destroy our once-modern world. And for the record, I have never had any interest in, or facility for, management beyond that required to run a small research group, and teaching in my own discipline. I don’t doubt that if I had been in charge the situation would have been far worse.
Reading debt
Sydney Brenner, one of the handful of scientists who made the revolution in biology of the second half of the 20th century, once said words to the effect that scientists no longer read papers, they just Xerox them. The problem he was alluding to was the ever-increasing size of the scientific literature. I was fairly disciplined in the age of photocopying, but with the world of online PDFs I too began to sink. Year after year, this reading debt has increased, and not just with ‘papers’ but with monographs and books too. Many years ago, in parallel with what occupied much of my time — skin cancer biology and the genetics of pigmentation, and computerised skin cancer diagnostic systems — I had started to write about topics related to science and medicine that gradually bugged me more and more. It was an itch I felt compelled to scratch. I wrote a paper in the Lancet on the nature of patents in clinical medicine and the effect intellectual property rights had on the patterns of clinical discovery; several papers on the nature of clinical discovery and the relations between biology and medicine in Science and elsewhere. I also wrote about why you cannot use “spreadsheets to measure suffering” and why there is no universal calculus of suffering or dis-ease for skin disease (here and here); and several papers on the misuse of statistics and evidence by the evidence-based-medicine cult (here and here). Finally, I ventured some thoughts on the industrialisation of medicine, and the relation between teaching and learning, industry, and clinical practice (here), as well as the nature of clinical medicine and clinical academia (here and here). I got invited to the NIH and to a couple of AAAS meetings to talk about some of these topics. But there was no interest on this side of the pond. It is fair to say that the world was not overwhelmed with my efforts.
At one level, most academic careers end in failure, or at least they should if we are doing things right. Some colleagues thought I was losing my marbles, some viewed me as a closet philosopher who was now out, and partying wildly, and some, I suspect, expressed pity for my state. Closer to home — with one notable exception — the work was treated with what I call the petit-mal phenomenon: there is a brief pause or ‘silence’ in the conversation, before normal life returns after this ‘absence’, with no apparent memory of the offending event. After all, nobody would enter such papers for the RAE/REF — they weren’t science with data and results, and since of course they weren’t supported by external funding, they were considered worthless. Pace Brenner, in terms of research assessment you don’t really need to read papers, just look at the impact factor and the amount and source of funding: sexy, or not?5
You have to continually check in with your own personal lodestar; dead reckoning over the course of a career is not wise. I thought there was some merit in what I had written, but I didn’t think I had gone deep enough into the problems I kept seeing all around me (an occupational hazard of a skin biologist, you might say). Lack of time was one issue; another was that I had little experience of the sorts of research methods I needed. The two problems are not totally unrelated; the day job kept getting in the way.
Time for coffee
Years ago, when I was sat in the coffee room6 of the biochemistry department in Newcastle, I struck up a conversation with Roger Paine, a professor of biochemistry and an FRS. He commiserated with me about being a clinical academic, pointing out that original science requires time, and ‘medics’ were always too busy, condemned to follow the crowd or the latest ‘cool’ techniques that would churn out papers to keep the academic accountants happy. Valid points, said with both objectivity and empathy. Around the same time, Peter Friedmann introduced me to a beautiful paper written by the Nobel Laureate Joe Goldstein — the deepest thinker about medical research I have ever come across. Goldstein described a fictional clinician scientist who suffered from PAIDS (paralysed academic investigator syndrome). It is a harrowing tale, and that the sad fictional researcher shared my initials, JR, meant it had a special resonance for me7.
John Stanley, a US dermatologist who used to work at the NIH8, published what are for me some of the best clinical science papers in dermatology of my lifetime. His research took an everyday clinical sign — blisters — and explained the diverse causes of many of them within a single molecular framework. In a conversation many years back, when he was visiting Edinburgh, he told me that what he found wonderful about NIH was that he could start reading at 9am, find something interesting fifteen minutes later, and know that he could keep reading until he chose to go home. The diary was empty9.
And if those who are blessed with the brain of an Andrew Wiles or John Rawls need time and space, then how much more is it necessary for those of us who run at a clock speed one or two orders of magnitude lower. Finally, it isn’t just time, it is the crushing effects on the spirit of working in a bureaucracy which no longer knows which mistress it serves. Not for me, the energy of Mahler:
“Mahler’s schedule in these [early] years was punishing—260 performances in two seasons in Kassel—and his determination to keep progressing as a composer required superhuman exertion. In 1895, he prepared the premiere of his Second Symphony with the Berlin Philharmonic while maintaining his day job at the Hamburg opera. Each evening, after the curtain fell in Hamburg, he would board a night train to Berlin. There he would direct a morning rehearsal of the symphony, lay down his baton, and get the train back to Hamburg for that evening’s performance. Composing his own work had to be squeezed into tiny gaps in his daunting schedule.”
In search of a method
If retirement might (note the emphasis) solve the time problem, what about the method issue? For most day-to-day modern science, you read papers within a field, do some experiments that you hope might surprise you, then write a manuscript in a highly formalised manner. The papers are often dull, intelligible only to a subgroup of scientists, and, in David Hubel’s words, reading them is like chewing sawdust. This is not to say that some people cannot write better than others — Sydney Brenner’s papers used to make my head hurt not because they were unclear, but because his logic and far-sighted imagination stood austerely above all that surrounded him. Pierre Chambon, whose laboratory I spent some time in, could craft the most condensed and persuasive English scientific prose despite it being his second language. The journal Nature loved it.
I think I knew how my sort of Wissenschaft worked. But what interested me more and more was not what falls within the narrow definition of Wissenschaft used in the Anglo-Saxon world, but the other areas of systematic study that in German are described as a totality within the term Wissenschaft: Geisteswissenschaften und Naturwissenschaften (the humanities and the natural sciences). Saying things this way may seem pompous, but human knowledge and experience are of a piece. We forget this at our peril — and we are doing so.
Many years ago, I came across a quote from the Canadian/US economist J. Kenneth Galbraith that throughout my career has acted like an electronic beacon in space, sending out a message about the presence of intelligent life. His words were to the effect that the denigration of value judgment is one of the ways that the scientific establishment maintains its irrelevance. If you work within the practice of medicine, you realise that biology is not synonymous with medicine, that scientific advance is not synonymous with clinical advance, and that there is more to human rationalities and human experience than science. Not all non-science is nonsense.
A meeting over coffee in Quartermile
Much of the University of Edinburgh campus is scattered across an area south of the Old Town. My academic base was within this space (alongside our dermatology clinic) whereas the ‘new’ medical school and main hospital campus is found in ‘Little France’, a few miles out from the main University campus. Little France has IMHO little going for it, and students almost invariably dislike being ‘out there’. I agree, it’s a dump. The central campus has so much more to offer, not least, lots of restaurants and eateries, and coffee shops.
One of the nicest coffee shops (note a theme of this essay?) is Peter’s Yard. It is glass-walled floor-to-ceiling on three sides, so great for people watching, and it is laptop tolerant with wooden communal tables and benches. Headphones are not viewed as odd. If you don’t like the prices, there is a Starbucks 20 metres away.
I was there to meet an academic from a well-known US university. She worked in the humanities and wanted to talk with me about some aspects of my previous research10. This has happened to me a few times before, and I enjoy such encounters. I tend to talk freely, assuming that this may be helpful, although in truth, I don’t have a functioning mute button. So, no surprise, I tend to kick myself afterwards for not finding out more about the methods of the person interviewing me — how things work in their world. This time was different: I was in search of a method.
She was writing a book and, with the exception of teaching, this was what she did as a scholar. Advancement in her career, as for virtually all academics, depended on getting published. In science, it tends to be articles, whereas in much of the humanities, the ‘unit’ is a book or monograph. I enquired more. I could imagine writing a textbook (even of the skincancer909 sort) where, since I know where A is, and where Z is, the work is made up of filling in the gaps. The gaps exist already, and the task is to make certain you do not stop off too often along the way, leaving the reader feeling short-changed because you have moved the final resting point (a bit like this essay…).
But her task was different. Her search space was literally infinite and there was no linear A-to-Z path laid out — yet. She would find out where she was going once she got there, but not before. Not quite in the same universe as a Flann O’Brien novel, but closer to his world than mine. Her words were not these, but these are the ones I choose to remember (as Flann O’Brien might have said): she would collect items as index cards, swoosh them round and see what came out to play. The skill was in ensuring that the sun was shining. Now I had a method.
She had not invented this method, although she may have re-discovered it, nor was it confined to the humanities. The following quotes are from David Epstein in his book Range11. He is talking about Charles Darwin.
He had at least 231 scientific pen pals who can be grouped roughly into 13 broad themes based on his interests, from worms to human sexual selection. He peppered them with questions. He cut up their letters to paste pieces of information in his own notebooks, in which ‘ideas tumble over each other in a seemingly chaotic fashion’. When his chaotic notebooks became too unwieldy, he tore pages out and filed them by themes of inquiry. Just for his own experience with seeds, he corresponded with geologists, botanists, ornithologists, and conchologists in France, South Africa, the United States, the Azores, Jamaica, Norway, not to mention a number of amateur naturalists and some gardeners he happened to know.
Epstein quotes the psychologist Howard Gruber on Charles Darwin:
Howard Gruber referred to what appeared from the outside to be ‘a bewildering miscellany’, and observed that ‘Charles Darwin’s greatest works represent interpretative compilations of facts first gathered by others’.
Working with the Omnium Gatherum
Keith Thomas, a Welsh historian who spent most of his life at Oxford University, wrote an article on such methods in the London Review of Books ten years ago. First, he pointed out that historians don’t like to talk much about their methodology. They are a competitive bunch, so why give the magic of method away? Many will just rediscover it themselves, because it is the only way to work successfully (‘survivor bias’).
Thomas writes:
In his splendid recent autobiography, History of a History Man, Patrick Collinson reveals that when as a young man he was asked by the medievalist Geoffrey Barraclough at a job interview what his research method was, all he could say was that he tried to look at everything which was remotely relevant to his subject: ‘I had no “method”, only an omnium gatherum of materials culled from more or less everywhere.’ Most of us would say the same.
He goes on:
Scholars have always made notes. The most primitive way of absorbing a text is to write on the book itself. It was common for Renaissance readers to mark key passages by underlining them or drawing lines and pointing fingers in the margin — the early modern equivalent of the yellow highlighter. According to the Jacobean educational writer John Brinsley, ‘the choycest books of most great learned men, and the notablest students’ were marked through, ‘with little lines under or above’ or ‘by some prickes, or whatsoever letter or mark may best help to call the knowledge of the thing to remembrance’. Newton used to turn down the corners of the pages of his books so that they pointed to the exact passage he wished to recall. J.H. Plumb once showed me a set of Swift’s works given him by G.M. Trevelyan; it had originally belonged to Macaulay, who had drawn a line all the way down the margin of every page as he read it, no doubt committing the whole to memory.
And as for those who have gone before:
The pencilled dots in the margin of many books in the Codrington Library at All Souls are certain evidence that A.L. Rowse was there before you. My old tutor, Christopher Hill, used to pencil on the back endpaper of his books a list of the pages and topics which had caught his attention. He rubbed out his notes if he sold the book, but not always very thoroughly, so one can usually recognise a volume which belonged to him.
Of course, you then need to make meta-notes (or an index) about all the primary notes in the books you have read. Painful experience warns you not to write on both sides of the paper. Before the modern computer, edge-notched index cards and knitting needles could be used to find the notes that matched your search criteria, something I first learned about in my introductory computing classes (Wikipedia explains all — see the image via Peter Frankfurt on WikiP below).
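For anyone who never met these cards, the mechanism is worth spelling out. Each card carries a row of holes, one per topic; tagging a card means clipping a hole open into a notch. To search, you pass a needle through the deck at each hole position you care about and lift: un-notched cards stay on the needles, while cards notched at every position fall free, a physical AND over your criteria. Below is a minimal sketch of that logic in Python; the topics and notes are invented for illustration, and it shows the mechanism in the abstract, not anyone’s actual card index.

```python
# A sketch of edge-notched card retrieval: one hole per topic, a notch
# means "this card is tagged with that topic". Needles pass through
# un-notched holes, so cards notched at EVERY needle position drop off
# the deck -- the falling cards are the ones matching all criteria.

CATEGORIES = ["worms", "seeds", "pigmentation", "statistics"]  # hypothetical topics


class Card:
    def __init__(self, note, tags):
        self.note = note
        # True at a category's position means the hole is notched (tagged)
        self.notches = {c: (c in tags) for c in CATEGORIES}


def drop_matches(deck, needles):
    """Lift the deck on one needle per queried topic; return the cards
    that fall, i.e. those notched at every needle position."""
    return [card for card in deck if all(card.notches[n] for n in needles)]


deck = [
    Card("Darwin's letters on earthworm behaviour", {"worms"}),
    Card("Seed dispersal by earthworm casts", {"worms", "seeds"}),
    Card("Misuse of p-values in clinical trials", {"statistics"}),
]

for card in drop_matches(deck, ["worms", "seeds"]):
    print(card.note)  # only the card tagged with both topics falls
```

The needle is, in effect, a hardware AND gate: each extra needle narrows the selection, which is why a shoebox of cards and two knitting needles could answer conjunctive queries long before databases existed.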
Back to Thomas
The resulting fragments are of varying size, depending on the length of the passage transcribed. These sliced-up pieces of paper pile up on the floor. Periodically, I file them away in old envelopes, devoting a separate envelope to each topic…. I also keep an index of the topics on which I have an envelope or a file. The envelopes run into thousands.
As for the act of creation:
When the time comes to start writing, I go through my envelopes, pick out a fat one and empty it out onto the table, to see what I have got.
For his work, he points out that since his interests have never been narrowly focussed, they are unsuitable for a doctoral thesis which might need to be finished in a few short years…they are just the thing for a lifetime’s reading [emphasis added].
The lesson of Lord Acton
Now, 313 days in, I know where I have been and just maybe where I am going. Perhaps. I have little need of paper: just check out here or here on the web, or guides on note taking, or Andy Matuschak’s tools for thinking. It is nice to know I am not alone. Still, I fear grey clouds ahead: “methods do not maketh the man”, as somebody should have said in some ancient text. Thomas again.
The awful warning is Lord Acton, whose enormous learning never resulted in the great work the world expected of him. An unforgettable description of Acton’s Shropshire study after his death in 1902 was given by Sir Charles Oman. There were shelves and shelves of books, many of them with pencilled notes in the margin. ‘There were pigeonholed desks and cabinets with literally thousands of compartments into each of which were sorted little white slips with references to some particular topic, so drawn up (so far as I could see) that no one but the compiler could easily make out the drift.’ And there were piles of unopened parcels of books, which kept arriving, even after his death. ‘For years apparently he had been endeavouring to keep up with everything that had been written, and to work their results into his vast thesis.’ ‘I never saw a sight,’ Oman writes, ‘that more impressed on me the vanity of human life and learning.’
Hmm.
- For a topical and important example see here. Also reported in the Guardian although the arguments are less clearly presented. ↩
- The title of one of Max Weber’s lectures. I make a play here and later on the meanings of the terms Beruf and Wissenschaft. ↩
- For the record, it was in England. ↩
- As in essay mills, writing and selling essays that students submit as their own. Clearly immoral, but, as ever, crooks are often keen observers of the breakpoints of a society as well as early adopters of technology. ↩
- I think the RAE/REF have been a disaster for universities and in particular medical schools. I think the measures are flawed on many levels and for medical schools have damaged both undergraduate education and patient care. As for “counting”, Larry Lessig at a Harvard talk in 2014 said: “I would push hard to resist the tyranny of counting. There is no necessary connection between ease of counting and the production of education (as in ‘likes’ etc. after leaving lecture halls). And so it will be easy for the institution to say this is what we should be doing, but we need to resist that to the extent that that kind of counting isn’t actually contributing to education. The best example of this, as I am sure many of you are familiar with, is the tyranny of counting in the British educational system for academics, where everything is a function of how many pages you produce that get published by journals. So your whole scholarship is around this metric which is about counting something which is relatively easy to count. All of us have the sense that this can’t be right. That can’t be the way to think about what is contributing to good scholarship.” ↩
- The relation between coffee rooms and intellectual pursuits is worthy of note. See for instance Martin Nowak in Nature who refers to them as genius loci, and Jacob Bronowski’s comments in the Ascent of Man. ↩
- Peter Friedmann, whom I owe an enormous debt to for so many things — both professional and personal — put me on to this paper. I fear I didn’t take enough heed of the omen. ↩
- The US National Institutes of Health, in Bethesda, close to Washington DC. NIH is a collection of separate research institutes on one campus. Their level of funding tends to induce envy in many scientists who work outwith the US. ↩
- Which brings to mind an interview with the Nobelist Bert Sakmann (who invented patch clamping). Sakmann didn’t wear a watch, as watches “make me nervous”. In the lab, he said, he doesn’t want to be nervous, and for most of his life he didn’t have to take care of time. “My family was always very generous. I could leave the lab whenever I wanted without people being offended.” In Reinventing the Future, Thomas A. Bass, 1994, ISBN 0-201-62642-X ↩
- I am not trying to be obscure or show a lack of generosity, but simply do not want to second guess her plans or progress. ↩
- The full title is Range: Why Generalists Triumph in a Specialized World. ISBN 978-1-5098-4349-7 ↩