Wednesday, August 30, 2006

Comparing the Newsweek and THES Top 100 Universities

It seems to be university ranking season again. Shanghai Jiao Tong University has just come out with their 2006 edition and it looks like there will be another Times Higher Education Supplement (THES) ranking quite soon. Now, Newsweek has joined in with its own list of the world’s top 100 universities.

The Newsweek list is, for the most part, not original but it does show something extremely interesting about the THES rankings.

What Newsweek did was to combine bits of the THES and Shanghai rankings (presumably for 2005, although Newsweek does not say). They took three components from the Shanghai index: the number of highly cited researchers, the number of articles in Nature and Science, and the number of articles in the ISI Social Sciences and Arts and Humanities Indices (the SJTU ranking itself also includes the Science Citation Index), and gave these a combined weighting of 50 per cent. Then they took four components from the THES rankings: percentage of international faculty, percentage of international students, faculty-student ratio and citations per faculty. They also added a score derived from the number of books in the university library.
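
To make the arithmetic concrete, here is a rough sketch of the kind of weighted combination Newsweek describes. Newsweek states only that the three Shanghai components together carry 50 per cent; the even split between them, the weights given to the THES components and the library score, and all of the scores themselves are invented assumptions, not Newsweek's figures:

    # A sketch of the weighted combination Newsweek describes. Only the
    # 50 per cent total for the Shanghai components comes from Newsweek;
    # every other weight, and all the scores, are invented for illustration.

    scores = {                              # hypothetical 0-100 scores
        "highly_cited_researchers": 72.0,   # Shanghai
        "nature_science_articles":  65.0,   # Shanghai
        "ssci_ahci_articles":       58.0,   # Shanghai
        "intl_faculty_pct":         40.0,   # THES
        "intl_students_pct":        55.0,   # THES
        "faculty_student_ratio":    61.0,   # THES
        "citations_per_faculty":    70.0,   # THES
        "library_volumes":          80.0,   # Newsweek's own addition
    }

    weights = {
        "highly_cited_researchers": 1 / 6,  # the 50% split evenly three ways (assumed)
        "nature_science_articles":  1 / 6,
        "ssci_ahci_articles":       1 / 6,
        "intl_faculty_pct":         0.10,   # remaining 50% split evenly (assumed)
        "intl_students_pct":        0.10,
        "faculty_student_ratio":    0.10,
        "citations_per_faculty":    0.10,
        "library_volumes":          0.10,
    }

    assert abs(sum(weights.values()) - 1.0) < 1e-9
    overall = sum(scores[k] * weights[k] for k in scores)
    print(f"combined score: {overall:.1f}")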

Incidentally, it is a bit irritating that Newsweek, like some other commentators, refers to the THES as The Times of London. The THES has in fact long been a separate publication and is no longer even owned by the same company as the newspaper.

The idea of combining data from different rankings is not bad, although Newsweek does not indicate why they assign the weightings that they do. It is a shame, though, that they keep the THES data on international students and faculty, and on faculty-student ratio, which do not show very much and are probably easy to manipulate.

Still, it seems that this ranking, as far as it goes, is probably better than either the THES or the Shanghai ones, considered separately. The main problem is that it includes only 100 universities and therefore tells us nothing at all about the thousands of others.

The Newsweek ranking is also notable for what it leaves out. It does not include the THES peer review, which accounted for 50 per cent of the ranking in 2004 and 40 per cent in 2005, or the rating by employers, which contributed 10 per cent in 2005. If we compare the top 100 universities in the THES ranking with Newsweek’s top 100, some very interesting patterns emerge. Essentially, the Newsweek ranking tells us what happens if we take the THES peer review out of the equation.
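
The comparison itself is mechanical enough to sketch. Lining the two lists up and reporting each university's movement, with invented names and ranks standing in for the real data, is all it involves:

    # A sketch of the list comparison. The names and ranks are invented
    # stand-ins, not the actual THES or Newsweek data.

    thes_rank     = {"Alpha U": 15, "Beta U": 19, "Gamma U": 33, "Delta U": 4}
    newsweek_rank = {"Beta U": 53, "Gamma U": 73, "Delta U": 2}

    for uni, t in sorted(thes_rank.items(), key=lambda kv: kv[1]):
        if uni not in newsweek_rank:
            print(f"{uni}: THES {t} -> out of the Newsweek 100")
        else:
            n = newsweek_rank[uni]
            verdict = "up" if n < t else "down" if n > t else "same"
            print(f"{uni}: THES {t} -> Newsweek {n} ({verdict})")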

First, a lot of universities have a much lower position on the Newsweek ranking than they do on the THES’s, and some even disappear altogether from the former. But the decline is not random by any means. All four French institutions suffer a decline. Of the 14 British universities, 2 go up, 2 stay in the same place and 10 go down. Altogether 26 European universities fall and five (three of them from Switzerland) rise.

The four Chinese (PRC) universities in the THES top 100 disappear altogether from the Newsweek top 100 while most Asian universities decline. Ten Australian universities go down and one goes up.

There are some truly spectacular tumbles. Peking University (which THES likes to call Beijing University), rated by THES as the best university in Asia and number 15 in the world, is out altogether. The Indian Institutes of Technology have also gone. Monash falls from 33 to 73, the Ecole Polytechnique in Paris from 10 to 43, and Melbourne from 19 to 53.

So what is going on? Basically, it looks as though the function of the THES peer and employer reviews was to allow universities from Australia, Europe (especially France and the United Kingdom) and Asia (especially China) to do much better than they would on any other possible measure or combination of measures.

Did THES see something that everybody else was missing? It is unlikely. The THES peer reviewers are described as experts in their fields and as research-active academics. They are not described as experts in teaching methodology or as involved in teaching or curricular reform. So it seems that this is supposed to be a review of the research standing of universities, not of teaching quality or anything else. And for some countries it is quite a good one. For North America, the United Kingdom, Germany, Australia and Japan, there is a high correlation between the scores for citations per faculty and the peer review. For other places it is not so good. There is no correlation between the peer review and citations for Asia overall, China, France, or the Netherlands. For the whole of the THES top 200 there is only a weak correlation.
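
For readers who want to check this sort of claim for themselves, the calculation is just Pearson's correlation coefficient computed within each country group. A minimal sketch, with invented scores standing in for the real THES figures:

    # Pearson's r between peer-review scores and citations-per-faculty,
    # computed per country group. The (peer_review, citations) pairs are
    # invented; the real THES figures would be substituted.

    def pearson_r(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    samples = {
        "UK":     [(95, 80), (88, 72), (75, 60), (70, 55)],   # tracks closely
        "France": [(90, 30), (85, 55), (60, 40), (55, 65)],   # does not
    }

    for country, pairs in samples.items():
        xs, ys = zip(*pairs)
        print(f"{country}: r = {pearson_r(xs, ys):.2f}")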

So a high score on the peer review does not necessarily reflect a high research profile and it is hard to see that it reflects anything else.

It appears that the THES peer review, and therefore the ranking as a whole, was basically a kind of ranking gerrymandering, in which the results were shaped by the method of sampling. QS took about a third each of its peers from North America, Europe and Asia and then asked them to name the top universities in their own geographic areas. No wonder that we have large numbers of European, Asian and especially Australian universities in the top 200. Had THES surveyed an equal number of reviewers from Latin America and Africa (“major cultural regions”?), the results would have been different. Had it asked reviewers to nominate universities outside their own countries (surely quality means being known in other countries or continents?), they would have been even more different.
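
A toy simulation shows how strong this sampling effect is. Assume, as a deliberately extreme case, that reviewers come in equal regional thirds and only ever nominate universities from their own region; the regional make-up of the resulting "top 100" then simply mirrors the quotas, whatever the underlying quality. Every number below is an assumption:

    # A toy model of the quota effect: equal regional thirds of reviewers,
    # each nominating only within their own region (an extreme assumption).

    import random
    from collections import Counter

    random.seed(1)
    reviewers = {"North America": 100, "Europe": 100, "Asia": 100}

    nominations = Counter()
    for region, n in reviewers.items():
        for _ in range(n):
            for _ in range(5):              # each reviewer names 5 universities
                nominations[f"{region} university #{random.randint(1, 50)}"] += 1

    top_100 = [u for u, _ in nominations.most_common(100)]
    by_region = Counter(u.split(" university")[0] for u in top_100)
    print(by_region)                        # roughly a third each, by construction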

Is it entirely a coincidence that the regions that are disproportionately favoured by the peer review, the UK, France, China and Australia, are precisely those where QS, the consultants who carried out the survey, have offices, and precisely those regions that are active in the production of MBAs and the lucrative globalised trade in students, teachers and researchers?

Anyway, it will be interesting to see if THES is going to do the same sort of thing this year.