Tuesday, November 27, 2007

The THES-QS Rankings: Citations per Faculty

This year there have been two main changes in this section of the World University Rankings. First, a new database has been used. Second, as in the other sections, scores have been converted into Z scores.
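For readers unfamiliar with the procedure, a z score simply re-expresses each university's raw score as its distance from the mean in units of standard deviation. A minimal sketch in Python, with invented scores:

```python
# Minimal sketch of z-score standardisation, as used in the 2007 rankings.
# The raw scores below are invented for illustration only.
raw_scores = [55.0, 62.0, 48.0, 71.0, 64.0]

mean = sum(raw_scores) / len(raw_scores)
variance = sum((x - mean) ** 2 for x in raw_scores) / len(raw_scores)
std_dev = variance ** 0.5

# Each z score measures how far a university sits from the mean,
# in units of standard deviation.
z_scores = [(x - mean) / std_dev for x in raw_scores]
print([round(z, 2) for z in z_scores])
```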

The use of the Scopus database, which is run by the Dutch-based publishing company Elsevier, is questionable. QS correctly state that, with over 15,000 journals and many other sources, it is generally more inclusive than the ESI database used in previous years. A more comprehensive database is not necessarily a better one, however, if the objective is to evaluate the quality as well as the quantity of research. The Scopus database includes 785 conference proceedings and 703 trade journals among its 25,483 titles. Such items are likely to be subject to a much less rigorous process of review, or perhaps to none at all. Furthermore, 7,972 of the titles are listed as inactive.

It is possible therefore that the shift to Scopus means that a lot of mediocre or inferior research is being counted. Whether this is desirable in a measure of quality is debatable.

The most obvious feature of the Scopus database is its geographical bias. Here are the numbers of titles from selected countries:

US 8,090
UK 4,968
Netherlands 2,184
Germany 1,878
Japan 1,174
France 748
Australia 667
Canada 576
Switzerland 491
Russia 429
Korea 208
Singapore 121
Taiwan 119
Hong Kong 59
Belgium 39

In relation to population, number of universities, research output, research quality or almost anything else, the UK appears overrepresented compared with the USA. There are also, perhaps not surprisingly given Elsevier's Dutch base, many more journals from the Netherlands than from Belgium or other countries with a similar population.

The citations per faculty section is now as biased towards the UK as the “peer review”. With a 40% weighting given to the “peer review”, in which in 2006 UK respondents alone amounted to 71% of the number from the US, and a 20% weighting given to a citations count in which UK titles alone are 61% of the number from the USA, it is difficult to avoid the conclusion that this is a blatant exercise in academic gerrymandering.
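To see how much these two components matter, note that they carry 60% of the total weight between them. Here is a minimal sketch of the weighted composite in Python; the 40% and 20% weightings are those cited above, the remaining weights follow the published 2007 scheme as I understand it, and the component scores are invented for illustration:

```python
# Sketch of the THES-QS composite score. The 40% and 20% weightings are
# those discussed in the text; the other weights are my reading of the
# published 2007 scheme; all component scores are invented.
weights = {
    "peer_review": 0.40,
    "employer_review": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

scores = {  # hypothetical standardised scores on a 0-100 scale
    "peer_review": 90.0,
    "employer_review": 70.0,
    "faculty_student_ratio": 60.0,
    "citations_per_faculty": 85.0,
    "international_faculty": 50.0,
    "international_students": 55.0,
}

overall = sum(weights[k] * scores[k] for k in weights)
uk_skewed = weights["peer_review"] + weights["citations_per_faculty"]
print(f"overall score: {overall:.1f}")
print(f"weight on the two UK-skewed measures: {uk_skewed:.0%}")
```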

What is more, this measure seems to have little validity. Correlating the citations per faculty scores with the other five criteria in the 2007 rankings, we find coefficients that are very low and usually statistically insignificant:

“Peer review” .260
Employer review -.008
Faculty student ratio .088
International faculty .018
International students .039

The only significant correlation, and a slight one at that, is with the “peer review”. There is, then, no association between a university's performance on this criterion and four of the five others. In 2006, when the ESI database was used, the correlations were much stronger:

"Peer review” .480
Employer review . 348
Faculty student ratio .135
International faculty –045
International students .094
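For concreteness, this is how a Pearson coefficient of the kind reported above is calculated; the paired criterion scores in this sketch are invented, not taken from the rankings:

```python
# Sketch of a Pearson correlation between two ranking criteria.
# The paired scores below are invented for illustration only.
from math import sqrt

citations = [85.0, 60.0, 72.0, 90.0, 55.0, 68.0]
peer_review = [70.0, 65.0, 80.0, 75.0, 50.0, 60.0]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson(citations, peer_review):.3f}")
```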

There is also only a very modest correlation of .467 between the citations per faculty scores in 2006 and in 2007 (among the 174 universities that were in the top 200 in both years). It seems that Alejandro Pisanty is quite correct when he says that these look like two completely different sets of data:

“The canvassing of publications and citations seems to bring results which are so different, using Scopus instead of the Thomson/ISI products of the last three years, and the changes are in such way non-uniform among institutions, that it seems appropriate to consider the new version really a new ranking. There will hardly be any comparability with the previous years.”



Furthermore, some entries here look rather strange. Is the University of Alabama really the fifth best university in the world on this measure, Pohang University of Science and Technology the 12th, Rensselaer Polytechnic Institute the 36th? And do leading British universities really deserve to be so low, with Cambridge in 80th place and Imperial College in 86th? Certainly they are grossly overrated in the “peer review”, but are they as bad at research as these data suggest?

There are also dramatic and suspicious changes from 2006: Cambridge down from 47th place to 80th, the National University of Singapore up from 160th to 74th, Kuopio (Finland) down from 14th to 70th, and the Tokyo Institute of Technology up from 58th to 29th.

So we now have a database that emphasises quantity rather than quality, that has an even more pronounced pro-British and anti-American bias, and that is noticeably lacking in validity.

I will conclude by returning to the extraordinarily poor performance of Cambridge, Oxford and Imperial on this criterion, below their showing in previous years and well below their performance on the more contemporary parts of the Shanghai Jiao Tong rankings.

I wonder whether at least part of the answer lies in the faculty side of the equation. Is it possible that faculty numbers at British universities have been inflated to produce a high score for the faculty student ratio, at the price, probably a very acceptable one, of driving down the score for citations per faculty?
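The arithmetic of this hypothesis is simple: inflating the reported faculty count improves the student side of the ledger and depresses citations per faculty in exactly the same proportion. A sketch with invented figures:

```python
# Sketch of the trade-off; all figures are invented for illustration.
citations = 50_000      # total citations attributed to the university
students = 20_000       # enrolled students

# Compare an honest faculty count with an inflated one. A lower
# students-per-faculty figure scores better on the faculty student
# ratio; a lower citations-per-faculty figure scores worse here.
for faculty in (2_000, 3_000):
    print(f"faculty={faculty}: "
          f"citations/faculty={citations / faculty:.1f}, "
          f"students/faculty={students / faculty:.1f}")
```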
