Rankings matter! But what matters in rankings?
Apart from money, that is.
HigherEd is awash with rankings. Governments like them, and so do publishers. Just look at the THE, with its myriad of bullshit scores: the allure of bogus numbers over judgment, a feel-good frenzy of metrics. When rankings of US colleges first came in, the assessors used to actually live on campus for a while, go to lectures, and talk to students. There was at least an attempt at face validity. Not any more. All you need is GIGO data, and you can sell it, or use it to buy power and kickbacks, just like the politicians. There was even a time when common sense mattered, but then came the RAE/REF. Then the TEF. Larry Lessig’s comment is worth repeating, again.
The best example of this, and I am sure many of you are familiar with it, is the tyranny of counting in the British educational system for academics, where everything is a function of how many pages you produce that get published by journals. So your whole scholarship is around this metric, which is about counting something that is relatively easy to count. All of us have the sense that this can’t be right. That can’t be the way to think about what is contributing to good scholarship.
Anyway, at the back of last week’s Economist I came across a single-page advert in the ‘Courses’ section, about IMD (shown below).
I suspect I would have passed over it, except that I used to meet up from time to time with a professor of finance who lived in Edinburgh and worked at IMD. We had many conversations about teaching, and what impressed me was the focus on ‘education and teaching’ and on thinking hard about how to do it better. There is a lot about MBA programs that I do not understand, and a fair bit I am suspicious of, but I have little doubt they offer something medicine could learn from.
It piqued my interest enough to follow it up, so here is some more text from the IMD page.
At the end of September, we were informed that IMD would be included in the 2016 MBA ranking, despite the Economist’s initial agreement to the contrary. This is surprising, as we had not supplied any information and our participants and alumni had not been surveyed for this ranking. This contradicts the paper’s statistical method, which requires a minimum 25% survey response rate for a school to be ranked.
Needless to say, IMD has serious reservations regarding the Economist’s methodology and its outcomes. In 2015, relative to the previous year’s ranking, LBS and IESE fell 9 places, IMD fell 11, and ESMT fell 23. Meanwhile, IE, Warwick and Macquarie all jumped up 19 places. As a result, Queensland, Warwick and Henley were ranked better than Cornell, London Business School, Carnegie Mellon and IMD!
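Swings of that size are about what you would expect when a score leans on small survey samples. Here is a toy sketch of the effect (mine, not the Economist’s method; the school names, cohort sizes, salary figures and the flat 25% response rate are all assumptions for illustration):

```python
import random

random.seed(1)

# Two hypothetical schools drawing graduate salaries from the SAME
# underlying distribution; the only difference is how many alumni
# there are to survey. All numbers here are made up for illustration.
COHORTS = {"BigSchool": 2000, "SmallSchool": 60}
RESPONSE_RATE = 0.25  # the minimum the Economist's method reportedly requires

def survey_score(cohort_size):
    """Mean reported salary from one year's survey responses."""
    n = max(1, int(cohort_size * RESPONSE_RATE))
    return sum(random.gauss(100_000, 20_000) for _ in range(n)) / n

for school, size in COHORTS.items():
    yearly = [survey_score(size) for _ in range(5)]  # five mock "years"
    swing = max(yearly) - min(yearly)
    print(f"{school:>11}: five-year score range = {swing:,.0f}")
```

Run it and the small school’s mean salary bounces around several times more than the big school’s, with no change whatsoever in underlying quality. Turn those noisy scores into ranks, and you get exactly the sort of lurches IMD is complaining about.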
They then go on to argue that the Economist ranking is scale dependent, and will discriminate against small schools (and, as we know, small universities like Caltech are terrible…). They finish off with:
Again, IMD was not surveyed for the 2016 ranking and did not actively participate. Given this, we do not know which data the Economist will use to establish our position. What we do know is that the Economist ranking has just lost its last bit of credibility. Unfortunately, there is little IMD can do to stop the Economist from proceeding.
Says it all. Wake up. At least the Economist accepted the money (for the advert).