If you’re an avid reader of the Chronicle of Higher Education’s daily newsletter, you may have been watching with interest the battle of university rankings that’s going on.
Two long-time partners (well, six years, which is a very long time in the university ranking business) – Times Higher Education (“THE,” Britain’s answer to the Chronicle) and Quacquarelli Symonds (“QS”), who together delivered the UK’s world ranking of universities – have gone through a Hollywood-style messy and very public divorce. QS got the house – the previous ranking system – and is already dating. In fact, U.S. News and World Report has moved in and will be publishing QS’s 2010/11 survey in the U.S. next week. Not to be outdone, THE has found its own place and hooked up with Thomson Reuters.
You can read all about it in the Chronicle (http://chronicle.com/article/Times-Higher-Education/124455/?sid=at&utm_source=at&utm_medium=en). QS’s rankings came out last week (http://chronicle.com/blogPost/Cambridge-Takes-Top-Spot-From/26757/) and certainly pleased the home crowd by ranking the University of Cambridge top. As an Oxford man, I laughed, of course.
The THE rankings are out today (http://www.timeshighereducation.co.uk/world-university-rankings/2010-2011/top-200.html) and continue to rank Harvard top, followed by Caltech, MIT, Stanford, and Princeton, with Oxford and Cambridge tied for 6th – plausible but unlikely, IMHO. But I digress.
The third world ranking system is the one published by Shanghai Jiao Tong University, which actually pioneered the genre in 2003, a year ahead of THE/QS, who debuted in 2004. Their 2010 ranking, called the Academic Ranking of World Universities (ARWU), can be found at http://www.arwu.org/ARWU2010.jsp. It also places Harvard first, with Cambridge 5th and Oxford 10th – clearly ludicrous.
My gut feel is that the THE rankings will become the leading methodology. I base this on (a) the reputation of THE; (b) their partnership with Thomson Reuters; and (c) the objectivity of their methodology (see below), which relies far less on reputational surveys – the primary criticism of all of U.S. News and World Report’s academic ranking systems. But just as in boxing, where dueling ranking schemes allow lots of people to be world champions, people will tend to favor whichever system ranks them highest.
I go into the rankings in some detail because I think they’re going to start to impact our lives. The THE rankings are highly quantitative and relatively objective, as you would expect from their partnership with Thomson Reuters, one of the world’s premier purveyors of business information. Here’s what the Chronicle said about THE’s methodology:
Nonetheless, Times Higher Education is emphasizing what it describes as the increased rigor of its new methodology, which according to its news release “places less importance on reputation and heritage than in previous years and gives more weight to hard measures of excellence in all three core elements of a university’s mission—research, teaching, and knowledge transfer.”
Foremost among the criticisms of the previous compilation was that it relied too heavily on a reputational survey of academics, based on fewer than 4,000 responses in 2009. THE's new methodology is based on 13 indicators in five broad performance categories—teaching (weighted 30 percent); research influence as measured in citations (32.5 percent); research, based on volume, income, and reputation (30 percent); internationalization, based on student and staff ratios (5 percent); and knowledge transfer and innovation based on industry income (2.5 percent).
(my emphasis)
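To make the arithmetic concrete, here is a minimal sketch of how those five category weights roll up into an overall score. The category scores below are invented for illustration, and THE’s actual pipeline normalizes 13 underlying indicators before weighting them, which this toy calculation skips.

```python
# Minimal sketch (not THE's actual pipeline): a weighted sum of the five
# category scores quoted above. Category scores here are hypothetical and
# assumed to already sit on a common 0-100 scale.

WEIGHTS = {
    "teaching": 0.30,
    "citations": 0.325,            # research influence
    "research": 0.30,              # volume, income, and reputation
    "internationalization": 0.05,  # student and staff ratios
    "industry_income": 0.025,      # knowledge transfer and innovation
}

def overall_score(category_scores):
    """Weighted sum of category scores, each on a 0-100 scale."""
    return sum(WEIGHTS[c] * category_scores[c] for c in WEIGHTS)

# A hypothetical institution: strong on research, weak on industry income.
example = {
    "teaching": 85.0,
    "citations": 92.0,
    "research": 88.0,
    "internationalization": 70.0,
    "industry_income": 40.0,
}
print(round(overall_score(example), 1))  # 86.3
```

Even in this toy example you can see how little the 2.5 percent industry-income category moves the needle this year.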
In the discussion of the indices at http://www.timeshighereducation.co.uk/world-university-rankings/2010-2011/analysis-methodology.html#industry, THE says:
Industry income — innovation
This category is designed to cover an institution's knowledge-transfer activity. It is determined by just a single indicator: a simple figure giving an institution's research income from industry scaled against the number of academic staff.
We plan to supplement this category with additional indicators in the coming years, but at the moment we feel that this is the best available proxy for high-quality knowledge transfer. It suggests the extent to which users are prepared to pay for research and a university's ability to attract funding in the commercial marketplace — which are significant indicators of quality.
However, because the figures provided by institutions for this indicator were patchy, we have given the category a relatively low weighting for the 2010-11 tables: it is worth just 2.5 percent of the overall ranking score.
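In other words, the indicator is simply industry research income divided by the number of academic staff. Here is a back-of-the-envelope version; the figures and function name are mine, not THE’s, and the real tables also normalize the result across institutions.

```python
# Toy version of the single indicator described above: industry research
# income scaled against academic staff. All figures are hypothetical.

def industry_income_per_staff(industry_research_income, academic_staff):
    """Industry research income (in, say, USD) per academic staff member."""
    return industry_research_income / academic_staff

# Hypothetical: $30M of industry-sponsored research across 1,200 academic staff.
print(f"${industry_income_per_staff(30_000_000, 1_200):,.0f} per staff member")
# -> $25,000 per staff member
```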
I have to feel that this is only the start. Licensing income can’t be far behind. What’s next? Startups?
You manage what you measure. Suddenly, innovation and knowledge transfer are going to really matter to our Presidents and Trustees. AUTM’s “New Metrics” initiative just became very, very real.
Let the games begin!