Clinical competence as dark matter
One of the most telling metaphors about competence I know comes from Frank Davidoff:
Competence, in contrast, is like “dark matter” in astronomy: although it makes up most of the universe of working knowledge, we understand relatively little about it. What does it really consist of? Which of its components are most important? How do people acquire it? What’s the best way to measure it? And how can you tell when they have enough of it? (Carracio et al., 2002; Epstein and Hundert, 2002).
The importance of this view is echoed by Atul Gawande, who pointed out that our lack of competence is now [perhaps] as big a problem as our lack of knowledge about the natural world. Again with a lovely turn of phrase, he wrote that our task is:
To make ineptitude as much our struggle as ignorance
Competence, however, is not just a characteristic of an individual doctor, but of health systems too. You can debate which is the bigger cause of clinical incompetence: individual doctors, or dysfunctional and dishonest health systems. But limits on practising high-quality medicine are, I think, increasingly a problem at the institutional level.
An example of this in my clinical area is the way that largely untested assumptions about health delivery are forced on patients without real scrutiny of their effects on the quality of treatment. In this week’s BMJ I have a personal view in which I make the case for reconsidering how we deliver dermatology care, with the aim of increasing system-wide competence.
The organisation of care for patients with skin disease in the UK makes little sense, and reflects history, neglect, and an unrealistic expectation of the level of clinical skills that generalists can acquire and maintain.
Firstly, skin disease is not a big killer, and historically the NHS has been driven by mortality rather than disability. Skin disease has never been a priority.
Secondly, we continue to overestimate the knowledge and expertise of many doctors, and to underestimate our patients. Any debate about how to improve dermatological care is entirely predicated on the assumption that patients must first see their general practitioner.
The dogma is that if only we could teach more dermatology to our students or trainees, then dermatological care would improve. The reality is different: we will only improve dermatological care when we allow patients to bypass general practitioners and go direct to dermatologists who work outside hospitals. The resource that dermatology consumes in general practice simply needs redirecting.
Dermatology is one of the few specialties still heavily dependent on clinical perceptual skills.1 The learning curve is steep, and exposure to many cases is necessary. General practitioners may see only one melanoma every five or more years; this is not the way to remain competent. Most dermatological diagnosis is pattern recognition, so rule-based approaches, the stuff of so much general practice, are often unhelpful.2 Undergraduate training in dermatology is limited to about 10 days or fewer in most UK medical schools, and postgraduate training is even scarcer.3 It is surely a conceit to imagine that clinical abilities in this area are meaningfully tested for most non-specialist practitioners at undergraduate or postgraduate level. Patients should know this, and be aware of the contrast with much of Europe, where providing skin care outside hospital requires a certificate of specialist training in dermatology (dermatovenereology): the same certificate a UK hospital consultant is required to hold.
This link takes you to the whole article (open access via this link only).