Wednesday, September 27, 2006

Undeserved Reputations?

As well as producing an overall ranking of universities last year, the Times Higher Education Supplement (THES) also published disciplinary rankings. These comprised the world's top 50 universities in arts and humanities, social sciences, science, technology and biomedicine.

The publication of the disciplinary rankings was welcomed by some universities that did not score well on the general rankings but were at least able to claim that they had got into the top fifty for something.

But there are some odd things about these lists. They are based exclusively on peer review. For all but one list (arts and humanities), THES also provides the number of citations per paper, although this figure is not used to rank the universities. Citations per paper are a rough measure of the quality of the published work, since other researchers will normally only cite research they find interesting. It is noticeable that the correspondence between the peer reviewers' opinion of a university and the measured quality of its research is not particularly strong. For example, in science Cambridge comes top, yet its average of 12.9 citations per paper, excellent though it is (I believe the average scientific paper is cited just once), is beaten by Berkeley, Harvard, MIT, Princeton, Stanford, Caltech, ETH Zurich, Yale, Chicago, UCLA, the University of California at Santa Barbara, Columbia, Johns Hopkins and the University of California at San Diego.

It is, of course, possible that Cambridge's reputation rests on the sheer amount of research it produces rather than its overall quality, or that the overall average disguises a handful of research superstars whose work drives the reputation picked up by the peer review. But the size of the gap between the subjective peer review score and the objective citation count is still a little puzzling.
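For anyone who wants to check this sort of claim, the mismatch can be put into a single number with a rank correlation between the peer review scores and the citations per paper in one disciplinary table. The snippet below is only a sketch: the figures in it are placeholders, not the actual THES/QS data, and neither THES nor QS publishes such a correlation.

    # A minimal sketch of how the peer-review/citation mismatch could be quantified.
    # The numbers below are hypothetical placeholders, not the real THES/QS figures.
    from scipy.stats import spearmanr

    # Hypothetical (peer review score, citations per paper) pairs for a few
    # universities in one disciplinary top 50.
    peer_review = [100, 93, 90, 86, 82, 78]
    citations_per_paper = [12.9, 18.4, 17.2, 16.8, 15.1, 14.3]

    rho, p_value = spearmanr(peer_review, citations_per_paper)
    print(f"Spearman rank correlation: {rho:.2f} (p = {p_value:.2f})")
    # A low or negative rho would support the claim that reputation and
    # measured research quality do not line up particularly well.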

Another oddity is that many universities have no score at all for citations per paper. Apparently this is because they did not produce enough papers to be counted, although what they did produce might have been of high quality. But how could they acquire a reputation that puts them in a top 50 while producing so little research?

There are 45 universities that got into a disciplinary top 50 without a score for citations. Of these, 25 are in countries where QS, THES's consultants, have offices, and ten are located in the very cities where QS has an office. Of the 11 universities (the seven Indian Institutes of Technology count as one) that got into more than one top 50 list, no fewer than eight are in countries where QS has an office: Monash, the China University of Science and Technology, Tokyo, the National University of Singapore, Beijing (Peking University), Kyoto, New South Wales and the Australian National University. Four of the eleven are in cities where QS has offices: Beijing, Tokyo, Singapore and Sydney.
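To make those proportions explicit, here is the trivial arithmetic, using only the figures quoted above; nothing in it goes beyond the counts already stated.

    # Simple arithmetic on the figures quoted above; no additional data assumed.
    no_citation_score = 45   # universities in a top 50 with no citations-per-paper score
    in_qs_country = 25       # of those, in countries where QS has an office
    in_qs_city = 10          # of those, in the same city as a QS office

    multi_list = 11          # universities appearing in more than one top 50 list
    multi_in_qs_country = 8
    multi_in_qs_city = 4

    for label, part, whole in [
        ("QS-office country (no-citation group)", in_qs_country, no_citation_score),
        ("QS-office city (no-citation group)", in_qs_city, no_citation_score),
        ("QS-office country (multi-list group)", multi_in_qs_country, multi_list),
        ("QS-office city (multi-list group)", multi_in_qs_city, multi_list),
    ]:
        print(f"{label}: {part}/{whole} = {part / whole:.0%}")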

So it seems that proximity to a QS office can count for as much as the quantity or quality of research. I suspect that QS chose its peer reviewers from people it knew from meetings, seminars or MBA tours, or from those who had been personally recommended. Whatever happened, this suggests another way to get a boost in the rankings: start a branch campus in Singapore or Sydney, show up at every event organised by QS, and get onto the reviewers' panel.
