Tuesday, September 12, 2006

More on the THES Peer Review

There are some odd things about the peer review section of the Times Higher Education Supplement (THES) world university rankings. If you compare the scores for 2004 and 2005 you will find an extremely high correlation, well over .90, between the two sets of figures. (You can do this simply by typing the data into an SPSS file.) This suggests that the two sets of scores might not really be independent.
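
For anyone who wants to check this without SPSS, here is a minimal sketch in Python. The two lists below are hypothetical placeholder scores, not the actual THES figures.

```python
from statistics import correlation  # Python 3.10+

# Hypothetical placeholder scores for the same universities in the
# two surveys; substitute the published THES figures to reproduce
# the correlation discussed above.
scores_2004 = [100, 95, 88, 74, 60]
scores_2005 = [100, 97, 91, 80, 68]

# Pearson's r between the two years
print(round(correlation(scores_2004, scores_2005), 3))
```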

THES has admitted this. It has said that in 2005 the ratings of the 2004 reviewers were combined with those of an additional and larger set of reviewers. Even so, I am not sure that this is sufficient to explain such a close association.

But there is something else that is, or ought to be, noticeable. If you look at the figures one by one (doing some quick conversions, because in 2004 the top-scoring University of California at Berkeley gets 665 in this category while in 2005 Harvard is top with 100) you will notice that everybody except Berkeley goes up. The biggest improver is the University of Melbourne, but some European and other Australian universities also do much better than average.
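
The conversion itself is just a proportional rescaling, so that Berkeley's 665 maps onto the 100-point scale used in 2005. A short sketch (the raw score in the last line is a hypothetical example, not a published figure):

```python
# Rescale a raw 2004 peer review score so that the top score,
# Berkeley's 665, becomes 100, matching the 2005 scale.
TOP_SCORE_2004 = 665

def rescale_2004(raw_score):
    return raw_score * 100 / TOP_SCORE_2004

print(round(rescale_2004(665), 1))  # Berkeley: 100.0
print(round(rescale_2004(206), 1))  # hypothetical raw score: about 31.0
```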

How is it possible that all universities can improve compared to the 2004 top scorer, with some places showing a much bigger improvement than others, while the correlation between the two scores remains very high?
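
Part of the statistical answer is that Pearson correlation measures linear association, not agreement in levels, so a broad upward shift leaves it almost untouched. A quick illustration with made-up numbers:

```python
from statistics import correlation  # Python 3.10+

# Made-up scores: everyone except the 2004 leader improves in 2005,
# some far more than others, yet the ordering barely changes, so the
# correlation with 2004 remains very high.
year_2004 = [100, 80, 60, 40, 20]
year_2005 = [100, 90, 75, 58, 40]
print(round(correlation(year_2004, year_2005), 3))  # about 0.995
```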

I've received information recently about the administration of the THES peer review that might shed some light on this.

First, it looks as though QS, THES's consultants, sent out a list of universities divided into subject and geographical areas from which respondents were invited to choose. One wonders how the original list was chosen.

Next, in the second survey, conducted in 2005, those who had done the survey a year earlier received their earlier submissions and were invited to make additions and subtractions.

So, it looks as if in 2005 those who had been on the panel in 2004 were given their submissions for 2004 and asked if they wanted to make any changes. What about the additional peers in 2005? I would guess that they were given the original list and asked to make a selection but it would be interesting to find out for certain.

I think this takes us a bit further in explaining why there is such a strong correlation between the two years. The old reviewers for the most part probably returned their lists with a few changes, and probably added more universities than they withdrew. This would help to explain the very close correlation between 2004 and 2005 and the improvements for everyone except Berkeley. Presumably, hardly anybody added Berkeley in 2005, while a few added Harvard and others.

There is still a problem, though. The improvement in peer review scores between 2004 and 2005 is much greater for some universities than for others, and it does not appear to be random. Of the 25 universities with the greatest improvements, eight are located in Australia and New Zealand, including Auckland, and seven in Europe, including Lomonosov Moscow State University in Russia. For Melbourne, Sydney, Auckland and the Australian National University the improvements are truly spectacular: Melbourne goes up from 31 to 66, Sydney from 19 to 53, Auckland from 11 to 45 and the Australian National University from 32 to 64. (Berkeley's score of 665 in 2004 was converted to 100 and the other scores adjusted accordingly.)

How can this happen? Is it plausible that Australian universities underwent such a dramatic improvement in the space of just one year? Or is it a product of a flawed survey design? Did QS just send out a lot more questionnaires to Australian and European universities in 2005?

One more thing might be noted. I've heard of one case where a respondent passed the message from QS on to others in the same institution, at least one of whom apparently managed to submit a response to the survey. If this sort of thing was common in some places, and if it was accepted by QS, it might explain why certain universities did strikingly better in 2005.

THES will, let's hope, be a lot more transparent about how they do the next ranking.
