Tuesday, November 27, 2007
This year there have been two main changes in this section of the World University Rankings. First, a new database has been used. Second, as in the other sections, scores have been converted into Z scores.
The use of the Scopus database, which is run by the Dutch publishing company Elsevier, is questionable. QS correctly state that, with over 15,000 journals and many other sources, it is generally more inclusive than the ESI database, which was used in previous years. A more comprehensive database is, however, not necessarily a better one if the objective is to evaluate the quality as well as the quantity of research. The Scopus database includes 785 conference proceedings and 703 trade journals out of 25,483 titles. Such items are likely to be subject to a much less rigorous process of review, or perhaps to none at all. Furthermore, 7,972 of the titles are listed as inactive.
It is possible therefore that the shift to Scopus means that a lot of mediocre or inferior research is being counted. Whether this is desirable in a measure of quality is debatable.
The most obvious feature of the Scopus database is its geographical bias. Here are the numbers of titles from selected countries:
Hong Kong 59
In relation to population, number of universities, output of research, quality of research or almost anything else, the UK appears overrepresented in relation to the USA. There are also, perhaps not surprisingly, many more journals from the Netherlands than from Belgium or other countries with a similar population.
The citations per faculty section is now as biased towards the UK as the "peer review". With a 40% weighting given to the "peer review", in which UK respondents alone in 2006 amounted to 71% of the number from the US, and a 20% weighting given to a citations count in which UK items alone are 61% of those from the USA, it is difficult to avoid the conclusion that this is a blatant exercise in academic gerrymandering.
What is more, this measure seems to have little validity. Looking at the rankings in relation to the other criteria, we find that the correlations are very low and usually insignificant.
“Peer review” .260
Employer review -.008
Faculty student ratio .088
International faculty .018
International students .039
The only significant correlation, a slight one, is with the "peer review". There is then no association between the university's performance on this criterion and four of the five others. In 2006, when the ESI database was used, the correlations were much stronger:
"Peer review" .480
Employer review .348
Faculty student ratio .135
International faculty -.045
International students .094
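These figures are ordinary Pearson correlation coefficients between universities' scores on pairs of criteria. A minimal sketch of the computation, using invented score lists rather than QS data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical citation and peer-review scores for six universities
citations = [100, 85, 60, 55, 30, 10]
peer_review = [90, 100, 70, 40, 50, 20]
print(round(pearson(citations, peer_review), 3))
```

A coefficient near zero, as in the table above, means knowing a university's score on one criterion tells you almost nothing about its score on the other.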
There is also a very modest correlation of .467 between the citations per faculty in 2006 and in 2007 (among the 174 universities that were in the top 200 in both years). It seems that Alejandro Pisanty is quite correct when he says that these look like two completely different sets of data.
“The canvassing of publications and citations seems to bring results which are so different, using Scopus instead of the Thomson/ISI products of the last three years, and the changes are in such way non-uniform among institutions, that it seems appropriate to consider the new version really a new ranking. There will hardly be any comparability with the previous years.”
Furthermore, there are some entries here that look a bit strange. Is the University of Alabama really the fifth best university in the world on this measure, Pohang University of Science and Technology the 12th, Rensselaer Polytechnic Institute the 36th? And do leading British universities really deserve to be so low, with Cambridge in 80th place and Imperial College in 86th? Certainly, they are grossly overrated in the "peer review", but are they as bad at research as these data suggest?
There are also dramatic and suspicious changes from 2006. Cambridge is down from 47th place to 80th, National University of Singapore up from 160th to 74th, Kuopio (Finland) down from 14th to 70th, and Tokyo Institute of Technology up from 58th to 29th.
So, we now have a database that emphasises quantity rather than quality, which has an even more pronounced pro-British and anti-American bias and which is noticeably lacking in validity.
I will conclude by returning to the extraordinarily poor performance of Cambridge, Oxford and Imperial on this criterion, below previous years and well below their performance on the more contemporary parts of the Shanghai Jiao Tong rankings.
I wonder whether at least a part of the answer can be found in the faculty part of the equation. Is it possible that faculty numbers in British universities have been inflated to give a high score for student faculty ratio at the price, probably very acceptable, of driving down the score for citations per faculty?
Wednesday, November 21, 2007
Three years ago the administration at Universiti Malaya (UM) was celebrating getting into the world's top 100 universities according to the THES-QS rankings. A year later it turned out that it was just the result of one of many errors by QS Quacquarelli Symonds, the consultants who produced the rankings.
Now it looks as though the same thing is happening all over again, but in the opposite direction.
This year four Malaysian universities have fallen in the rankings. UM and Universiti Kebangsaan Malaysia (UKM) went right out of the top 200.
Commentators have listed several factors responsible for the apparent slide and proposed remedies. Tony Pua at Education in Malaysia says that
our universities are not competitive, are not rigourous in nature, do not promote and encourage merit and the total lack of transparency in admission and recruitment exercises served the perfect recipe for continual decline in global recognition and quality
According to the Malaysian opposition leader, Lim Kit Siang
JUST as Vice Chancellors must be held responsible for the poor rankings of their universities, the Higher Education Minister, Datuk Mustapha Mohamad must bear personal responsibility for the dismal international ranking of Malaysian universities - particularly for Malaysia falling completely out of the list of the world's Top 200 Universities this year in the 2007 Times Higher Education Supplement (THES)-Quacquarelli Symonds (QS) World University Rankings.

An article in the Singapore Straits Times reported that
eminent academician Khoo Kay Kim felt there was too much emphasis on increasing the number of PhD holders, instead of producing quality doctorate graduates. 'If this goes on, next year I expect the rankings to slip further,' he said.
Everyone seems to assume that the decline in the rankings reflects a real decline or at least a lack of quality that the rankings have finally exposed.
But is this in fact the case ?
To understand what really happened it is necessary to look at the methodological changes that have been introduced this year.
QS have done four things. They have stopped respondents to their "peer review" from selecting their own institutions. They are using full-time equivalent (FTE) numbers for staff and students instead of counting heads. They now use the Scopus database instead of ISI. And they use Z scores, which means that the mean of all scores is subtracted from each raw score, the result is divided by the standard deviation, and the resulting figures are normalised with the mean score converted to 50.
The prohibition on voting for one's own university would seem like a good idea if it were also extended to voting for one's alma mater. I suspect that Oxford and Cambridge are getting, and will continue to get, many votes from their graduates in Asia and Australia, which would seem just as bad as picking one's current employer.
Using FTEs is not a bad idea in principle. But note that QS are still apparently counting research staff as faculty, giving a large and undeserved boost to places like Imperial College London.
I am sceptical about the shift to Scopus. This database includes a lot of conference papers, reviews and so on, and therefore not all of the items included will have been subject to rigorous peer review. It might therefore include research of a lower quality than that in the ISI database. There are also some strange things in the citations section this year. The fifth best university for citations is the University of Alabama. (You have to look at the details for the top 400 to see this because it is not in the overall top 200.) According to QS's data, the Ecole Normale Superieure in Paris is better than Harvard. Oxford is down at number 85, which seems a bit too low. It could be that the database is measuring quantity more than quality, or perhaps there have been a number of errors.
Using Z scores is a standard practice among other rankers but it does cause huge fluctuations when introduced for the first time. What Z scores do is, in effect, to compress scores at the top and stretch them out lower down the rankings. They make it easier to distinguish among the mediocre universities at the price of blurring differences among the better ones.
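The procedure can be sketched in a few lines of code. QS has not published its exact rescaling formula, so mapping the mean to 50 and the top scorer to 100 is an assumption inferred from the scores that appear in the published tables:

```python
def z_normalise(raw_scores):
    """Convert raw scores to Z scores, then rescale so that the mean maps
    to 50 and the highest scorer maps to 100 (assumed QS-style scaling)."""
    n = len(raw_scores)
    mean = sum(raw_scores) / n
    sd = (sum((x - mean) ** 2 for x in raw_scores) / n) ** 0.5
    z = [(x - mean) / sd for x in raw_scores]
    top = max(z)
    return [round(50 + 50 * zi / top) for zi in z]

# Highly skewed raw citation counts: one outlier far above the rest
raw = [5000, 900, 700, 400, 100]
print(z_normalise(raw))  # the outlier gets 100, but its lead shrinks
```

Run on these invented figures, the top university produces more than five times as much as the runner-up but receives barely twice the runner-up's score: exactly the compression at the top described above.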
So how did Malaysian universities do in 2007?
There is no point in looking at the scores for the various criteria. The introduction of Z scores means that scores in 2006 and 2007 cannot be compared. What we have to do is to work out the relative position of the universities on each component.
[Two days ago the scores for the six components of the rankings were available (registration required) at the QS topuniversities site for the top 400 universities. They could not be accessed today, which might mean that the details for the 400-500 universities are being prepared or, just possibly, that errors are being corrected.]
In 2006 UM was 90th for peer review among the top 400 in that year. In 2007 it fell to 131st position among the top 400.
In 2006 it was 238th for recruiter rating. In 2007 it rose to 159th place.
In 2006 it was in 274th place for student faculty ratio. In 2007 it rose to 261st place.
In 2006 it was 245th for international faculty. In 2007 it rose to 146th place.
In 2006 it was 308th for international students. In 2007 it rose to 241st place.
In 2006 it was 342nd for citations per faculty. In 2007 it fell to 377th place.
This means that UM did much better compared to other universities on the following measures:
- Recruiter rating
- Student faculty ratio
- International students
- International faculty
It did somewhat worse on two items, peer review and citations. But notice that the number of places by which it fell is much less than the number of places by which it rose, except for student faculty ratio.
The peer review was given a weighting of forty per cent and this meant that the modest fall here cancelled out the greater rises on the other sections.
It was, however, the citations part that scuppered UM this year. Without this, it would have remained roughly where it was. Basically falling from position 342 to 377 meant losing a bit more than 30 points on this section or about six points on the total score, sufficient to eject UM from the top two hundred.
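That six-point figure can be checked directly against the weightings: the 40% for the "peer review" and 20% for citations are stated in these posts, and the remaining figures (employer review 10%, faculty student ratio 20%, international faculty and students 5% each) are the standard THES-QS weightings. A minimal sketch:

```python
# THES-QS section weightings (40/10/20/20/5/5)
WEIGHTS = {
    "peer_review": 0.40,
    "employer_review": 0.10,
    "faculty_student": 0.20,
    "citations": 0.20,
    "intl_faculty": 0.05,
    "intl_students": 0.05,
}

def overall_change(section_changes):
    """Change in the weighted total implied by per-section score changes."""
    return sum(WEIGHTS[k] * delta for k, delta in section_changes.items())

# A drop of roughly 30 points on the citations section alone
print(overall_change({"citations": -30}))  # about six points off the total
```

So a 30-point slide on a 20%-weighted section costs six points overall, which is enough to push a university near the cut-off out of the top 200.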
Why should such a modest fall have such dire consequences?
Basically what happened is that Z scores, as noted earlier, compress scores at the top and stretch them out over the middle. Since the mean for the whole group is normalised at fifty and the maximum score is a hundred, an institution like Caltech will never get more than twice as many points as a university scoring around the mean, even if, as in fact it does, it produces ten times as much research.
So, in 2006 Caltech scored 100, Harvard 55, the National University of Singapore (NUS) 8, Peking University 2 and UM 1.
Now in 2007, Caltech gets 100, Harvard 96, NUS 84, Peking 53 and UM 14.
The scores have been stretched out at the bottom and compressed at the top. But there has almost certainly been no change in the underlying reality.
So what is the real position? UM, it seems, has, relative to other universities, recruited more international staff and admitted more international students. Its faculty student ratio has improved very slightly relative to other universities. The employers contacted by QS think more highly of its graduates this year.
This was all cancelled out by the fall in the "peer review", which may in part have been caused by the prohibition on voting for the respondent's own institution.
The real killer for UM, however, was the introduction of Z scores. I'll leave it to readers to decide whether a procedure that represents Caltech as only slightly better than Pohang University of Science and Technology and the University of Helsinki is superior to one that gives Peking only twice as many points as UM.
The pattern for the other Malaysian universities was similar, although less pronounced. It is also unfortunately noticeable that UKM got a very high score for international faculty, suggesting that an error similar to that of 2004 has occurred.
What is the real situation with regard to Malaysian universities? Frankly, I consider the peer review a dubious exercise and the components relating to student faculty ratio and internationalisation little better.
Counts of research and citations produced by third parties are, however, fairly reliable. Looking at the Scopus database (don't trust me -- get a 30-day free trial), I found that 1,226 research papers (defined broadly to include things like reviews and conference papers) by researchers affiliated to Malaysian universities and other institutions were published in 2001 and 3,372 in 2006. This is an increase of 175% over five years.
For Singapore the figures are 5,274 and 9,630, an increase of 83%.
For Indonesia they are 511 and 958, an increase of 87%.
For Australia they are 25,939 and 38,852, an increase of 50%.
For Japan they are 89,264 and 103,428, an increase of 16%.
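The percentages follow directly from the counts. A quick check in code, using the Scopus figures quoted above:

```python
def pct_growth(old, new):
    """Percentage increase from old to new, to the nearest whole percent."""
    return round((new - old) / old * 100)

# Scopus paper counts for 2001 and 2006, as quoted in the text
counts = {
    "Malaysia": (1226, 3372),
    "Singapore": (5274, 9630),
    "Indonesia": (511, 958),
    "Japan": (89264, 103428),
}
for country, (old, new) in counts.items():
    print(country, f"{pct_growth(old, new)}%")
```

Malaysia comes out at 175%, Singapore at 83%, Indonesia at 87% and Japan at 16%, matching the figures above.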
It is of course easy to grow rapidly when you start from a low base and these statistics say nothing about the quality of the research. Nor do they distinguish between a conference paper with a graduate student as sixth author and an article in a leading indexed journal.
Still the picture is clear. The amount of research done in Malaysia has increased rapidly over the last few years and has increased more rapidly than in Singapore, Japan and Australia. Maybe Malaysia is not improving fast enough but it is clear that there has been no decline, either relative or absolute, and that the THES-QS rankings have, once again, given a false impression of what has happened.
Tuesday, November 20, 2007
Note: corrections have been made to an earlier draft. Some of the figures for 2006 have been changed.
No, I am not being sarcastic. Information just released by QS, the consultants who prepare the data for the THES-QS rankings, shows that Malaysian universities have improved quite significantly in some respects over the last year.
QS have now published detailed information on the top 400 universities in the 2007 rankings. This confirms what I had suspected, namely that there has been no real decline in the quality of any Malaysian university and that the apparent fall in the positions of Universiti Malaya (UM), Universiti Kebangsaan Malaysia (UKM), Universiti Sains Malaysia (USM) and Universiti Putra Malaysia (UPM) is largely the result of nothing more than a change in methodology.
There is no point in comparing the scores in 2006 and 2007 for the various components because of changes in methods this year. Basically, the introduction of Z scores means that any such comparison has no meaning. Last year, for example, UM was given a score of 1 for citations and 14 this year. That does not mean anything, since the mean score among the top 400 universities for citations was 9 in 2006 and 66 in 2007. To measure genuine change it is necessary to look at the relative positions of the universities.
I have calculated the position of the Malaysian universities on the various criteria in 2006 and 2007. In both years I have looked only at the top 400, since information for the universities below 400th place in 2007 is not currently available.
In 2006, Universiti Malaya was 90th for the "peer review", 238th for recruiter rating, 274th for student faculty ratio, 245th for international faculty, 308th for international students and 342nd for citations per faculty. UM managed to get into the top 200 because the score for the "peer review" was given a much larger weighting than any other criterion.
This year, UM was 131st for the "peer review", 159th for recruiter rating, 261st for student faculty ratio, 146th for international faculty, 241st for international students and 377th for citations per faculty.
Thus, if QS are to be believed, UM has improved its standing with local employers, recruited more teaching staff, and increased the numbers of international faculty and students, all relative to other universities. It did slightly worse on citations per faculty.
The only serious blemish was that UM did rather worse on the "peer review", perhaps because as QS has suggested, respondents were no longer allowed to vote for their own institutions.
So, how could UM suffer such a catastrophic fall?
The answer lies in the use of Z scores. To summarise: a Z score is constructed by subtracting the population mean from the raw score, dividing the result by the standard deviation, and then normalising.
The effect of this is to squash scores together at the top rather than at the bottom, as was previously the case. To go back to the scores for citations: in 2006 UM got a score of 1, which was quite a bit below average. In 2007, because of the introduction of Z scores, the average was much higher. So UM got 1 for citations in 2006 and Peking ("Beijing" then) got 2, while this year UM got 14 and Peking got 53. The gap between them has widened from one point to 39.
The switch from ISI to Scopus may also have had some effect, but probably not all that much.
Similarly, we find that UPM improved its position on two criteria and USM and UKM on three each. All suffered a decline on the "peer review".
UM's fall in the "peer review" section did not make a dramatic difference. Had UM remained in 90th place, it would have made a difference of only 10 points, 76 instead of 66, for that section.
UM's supposed tumble happened solely because universities that are doing a bit more research are now getting a lot more points than before.
There has been no decline. Maybe Malaysian universities are not improving fast enough but that is quite a different thing from what the rankings appear to show and what is causing so much anxiety among Malaysian commentators.
Monday, November 19, 2007
- Imperial College Press is a joint venture of Imperial College and World Scientific.
- World Scientific is a Singapore-based publishing company whose subscription list is used by QS to construct their "peer review".
- Imperial gets a perfect score of 100 (rounded) for the "peer review" in the 2007 THES-QS rankings.
- Until last year, citation data were collected for QS by Evidence Ltd, a company headed by a former Imperial faculty member.
- QS gave Imperial a much better student faculty ratio than even the college itself claimed.
- Imperial is, according to the THES-QS rankings, the fifth best university in the world.
- Richard Sykes, Vice-chancellor of Imperial, is the second highest paid in the UK.
- Richard Sykes wants a massive increase in fees.
- Imperial is now the most popular UK destination for Singapore students.
- Richard Sykes is on QS's questionnaire telling respondents that it takes smart people to recognise smart people.
Is it conceivable that some of these might just possibly have something to do with one another?
One of the most remarkable things about the THES-QS rankings is the steady rise of Imperial College London. It has now reached 5th place, just behind Oxford, Cambridge and Yale and ahead of Princeton, MIT, Stanford and Tokyo.
How did this happen? Imperial's research performance is rather lacklustre compared with many American universities. The Shanghai Jiao Tong index puts it at 23rd overall, 33rd for highly cited researchers , 28th for publications in Science and Nature, and 29th for citations in the Science Citation Index.
Google Scholar also indicates that Imperial does much worse than many other places. A quick search comes up with 22,500 items for research published since 2002, compared with 22,700 for Seoul National University, 25,800 for McGill, 44,000 for Tokyo and 151,000 for Princeton.
Imperial does well on the THES QS rankings partly because of outstanding scores on the peer review (99 out of 100) , employer review (99) and international students (100).
It also comes first (along with 15 others with scores of 100) for student faculty ratio. Is this justified?
On its web site QS indicates that Imperial has 2,963 full time equivalent (FTE) faculty and 12,025 FTE students, a ratio of 4.06.
However, if we look at Imperial's site we find that the college claims 12,129 FTE students and 1,114 academic and 1,856 research staff.
It appears that QS has counted both academic and research staff when calculating Imperial's ratio. Looking at other universities, it appears that it is QS's standard practice to count research staff who do not teach as part of the faculty total. In contrast, Imperial itself calculates the ratio by dividing students by academic staff to produce a ratio of 11.2. If that ratio had applied Imperial would have been many places lower.
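The gap between the two definitions is easy to see with a quick calculation using the FTE figures quoted above (Imperial's published ratio of 11.2 presumably rests on slightly different staff or student totals):

```python
# Figures quoted in the text
qs_students = 12025       # QS's FTE student count for Imperial
qs_faculty = 2963         # QS's FTE faculty count
imp_students = 12129      # Imperial's own FTE student count
academic_staff = 1114     # Imperial's FTE academic (teaching) staff
research_staff = 1856     # Imperial's FTE research staff

# QS's faculty total is close to academic plus research staff combined
print(academic_staff + research_staff)          # near QS's 2,963

print(round(qs_students / qs_faculty, 2))       # QS's implied ratio, about 4
print(round(imp_students / academic_staff, 1))  # teaching-staff-only ratio, about 11
```

In other words, counting non-teaching researchers as faculty nearly triples Imperial's apparent staffing generosity on this criterion.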
If QS has been counting research staff in the total faculty score it would lead to the truly bizarre result that universities could hire a large number of researchers and get a substantial boost for the student faculty score.
So far it looks as though this is a general procedure and not a special privilege granted to Imperial alone, but it would introduce a definite bias in favour of those universities that, like Imperial, employ large numbers of non-teaching researchers.
Sunday, November 18, 2007
The QS topuniversities site does not provide information about ranking components for universities outside the top 200. It does, however, provide links to pages with the raw data and their sources, for which they are to be highly commended.
It seems that this year Macquarie is recorded as having 1,018 full time equivalent faculty (865 headcount) and 255 full time equivalent international faculty (267 headcount). So, 25% of Macquarie's faculty are international.
The information on total faculty was submitted by Baerbel Eckelmann on 8/10/07 and on international faculty by "Director" on 15/6/07.
The figure for 2006 was presumably much higher and incorrect. It would be interesting if Macquarie or QS could indicate how it was derived.
It would seem that one reason for the apparent decline of Macquarie was simply the correction of a previous error.
Friday, November 16, 2007
There have been quite different reactions to the latest THES-QS rankings in the USA and Australia. It seems that nobody has noticed that Washington University in St. Louis fell from 48th place in 2006 to 161st this year. But there has been a great deal of discussion about why Macquarie University fell from 82nd to 168th.
It is difficult to figure out exactly what happened, since the introduction of a new scoring method makes it hard to compare the 2006 and 2007 scores directly, but it is possible to compare a university's relative positions in those years on each component of the rankings.
In 2006 Macquarie was 93rd in the "peer review", 46th for recruiter rating, 198th for student faculty ratio, 159th for citations per faculty, 1st for international faculty and 13th for international students.
In 2007 Macquarie was 142nd for the "peer review", 62nd for recruiter rating, 189th for student faculty ratio, 190th for citations per faculty, 55th for international faculty and 11th for international students.
It seems that the decline of Macquarie is due primarily to a poorer score for the "peer review", possibly because of a change in QS's methodology that meant that respondents could not select their own institutions, and to a dramatic fall from first place for numbers of international faculty.
It is impossible that the latter represents a real change over the year unless Macquarie has been expelling hundreds of international lecturers. Either QS used the wrong figure for 2006 and the correct one this year or they got it right in 2006 but made a mistake this year.
The administration at Macquarie ought to be able to answer these questions:
Did Macquarie provide QS with any information about international faculty in 2006 and in 2007? If so, what information was given and was it correct?
Sunday, November 11, 2007
Since THES has suggested that Malaysian and Singaporean universities suffered in this year's rankings because respondents were not allowed to vote for their own institutions, and since this is obviously not true of the Singaporean universities, which got scores of 100 and 84 for the "peer review", I think it would be a good idea to wait a bit before making assumptions about the cause of the apparent decline of Malaysian universities. It is not totally impossible that QS has made another error or errors.
Saturday, November 10, 2007
This year QS has introduced several "methodological enhancements" into the THES-QS rankings. One is the use of Z-scores. Basically, this means that the mean for all universities is deducted from the raw score and the result is then divided by the standard deviation. In effect, the score represents not an absolute number but how far each university is from the average. One consequence of using Z-scores is that differences at the very top are reduced.
In principle this is not a bad idea and other rankers do it but it has produced some odd results in this case.
In the survey of academic opinion, for example, the following universities all get a maximum score of 100: Harvard, Cambridge, Oxford, Yale, Caltech, MIT, Columbia, McGill, Australian National University, Stanford, Cornell, Berkeley, Melbourne, British Columbia, National University of Singapore, Peking and Toronto.
Do THES and QS really expect us to believe that Melbourne, British Columbia and Peking are just as good at research as Harvard ? Especially since Harvard is far ahead on every one of the subject rankings?
THES has a headline about fine tuning revealing distinctions. Really?
The National University of Singapore is among the best in Asia and has always been ranked highly by THES-QS. This year, however, it has fallen from 19th to 33rd.
THES suggest that Malaysian and Singaporean universities have suffered because the "peer review" no longer allows respondents to pick their own institutions. This would not, however, seem to apply to NUS -- and I wonder whether it applies to Malaysian universities either -- which got the maximum score of 100 (along with Oxford, Harvard and Caltech) on the survey. What happened was that NUS scored very poorly on the faculty student section.
It got 100 for "peer review", international faculty and international students, 93 for recruiter review, 84 for citations per faculty and 34 for faculty student ratio.
NUS has a self-reported ratio of about 17 students per faculty. Peking reports about 10 but QS gives it a score of 98, almost the same as Caltech at 100 with a well known ratio of about three.
There is something about this that needs some explanation.
Friday, November 09, 2007
The Kuala Lumpur New Straits Times has a report on the performance of Malaysian universities on the latest THES-QS rankings
Malaysian universities are on a slippery slope. None of them made it to the top 200 placing in the Times Higher Education Supplement (THES)-Quacquarelli Symonds (QS) World University Rankings this year.
This poor showing comes on the back of a recent government survey of local public universities which found that none deserved a place in the outstanding category. Last year, Universiti Kebangsaan Malaysia and Universiti Malaya made it to the top 200 in the THES-QS rankings. UKM ranked 185th, up from the 289th spot in 2005, beating well-known universities like University of Minnesota in the United States and University of Reading, Britain. This year, it has fallen to 309th. Similarly UM, which was ranked among the world's top 100 universities three years ago, was in 169th position in 2005 and tied with University of Reading in the 192nd spot last year. It has dropped to the 246th spot. Universiti Sains Malaysia has fallen to 307 from 277 last year.
UKM and UM vice-chancellors attributed their fall to the new methodology used to calculate rankings this year. "Even the National University of Singapore (NUS) has dropped to the 33rd spot when it was always within the top 10," Universiti Malaya vice-chancellor Datuk Rafiah Salim said. "The way I look at it, smaller countries like Malaysia are bound to lose out as THES has introduced new criteria which is peer review and has changed the citation and list of publications." Rafiah said with more than 3,000 universities getting ranked by THES annually, Malaysian universities had to improve if they wanted to remain on top of the list. "If we want to compete with some of the top universities in the world, first we have to be in the same league. Right now, we are not. One way to overcome that is through adequate funding." She said NUS received an annual funding of S$1.2 billion (RM2.7 billion) a year compared to UM's RM400 million annual
There is no mention of Universiti Putra Malaysia or Universiti Teknologi Malaysia both of which were on the list of universities sent out by QS this year.
It is impossible to be sure until the full data is released but I suspect that the "decline" of Malaysian universities has nothing to do with any real change but with QS preventing survey respondents from voting for their own institutions this year.
The THES-QS Top 200 universities list is available here.
There is a press release here.
The two Malaysian universities, UM and USM, are out of the top 200. Most probably this is because of new procedures for the "peer review".
Berkeley, National University of Singapore, Peking (well done QS for getting the name right), and LSE have fallen dramatically.
The IITs and IIMs are out of the top 200, maybe out of the rankings altogether.
Two Brazilian universities have risen dramatically.
Changes such as these could not possibly result from real changes but are most likely the consequence of "methodological enhancements", errors or the correction of errors.
The Sydney Morning Herald reports on the THES-QS rankings. Macquarie has fallen from 82 to 168.
AUSTRALIAN universities have slipped in one of the most respected world rankings. The most dramatic drop was suffered by Macquarie University - jeopardising a $100,000 bonus for its vice-chancellor, Steven Schwartz.
His bonus depends on improving Macquarie University's ranking in the Australian sector, but it has plummeted from 82 to 168 in the Times Higher Education Supplement's annual survey, released in Britain overnight. It has dropped from seventh to ninth among local universities.
Professor Schwartz, an American academic who had previously been head of Brunel University in Britain, replaced Di Yerbury in a messy coup last year. There was a bitter dispute between Macquarie and Professor Yerbury over ownership of paintings and other material she had accumulated over 19 years.
We will have to wait until the online results are available but the fall of Macquarie and perhaps of Steven Schwartz may have something to do with a reported change in the percentage of international faculty or possibly the introduction of z scores in the rankings. Last year Macquarie held top place for international faculty but QS did not reveal how they got the information and Macquarie did not confirm what the correct number was. Given the money at stake, it would not be totally astonishing if the 2006 figure for international faculty had been massaged a little bit.
The Economic Times of India has a report on the THES-QS rankings:
Three Latin American universities make it to the world’s top 200, while even Africa makes a debut, with Cape Town ranked at 200. IIMs and IITs are not universities.
According to Martin Ince, who compiles and edits the survey, “The 2007 THES-QS World University Rankings are the most rigorous and complete so far. They show that the US and the UK model of independent universities supported with significant state funding produces great results.”
UK universities are closing in on their American counterparts, with University College, London, making it into the top 10 for the first time, and Imperial College, London, moving up from 9th to 5th this year. Chicago, too, is a first-time entrant into the top 10.
While the top 10 list is still restricted to US and UK universities, with the addition of the Netherlands, 12 countries are featured in the top 50 compared with 11 in 2006.
Universities of Tokyo, Hong Kong, Kyoto, National University of Singapore, Peking, Chinese University of Hong Kong, Tsinghua and Osaka lead Asian higher education, all featuring in the top 50. The top 100 sees the number of Asian universities increase to 13 (12 in 2006), while the number of European Universities has dropped to 35 (41 in 2006).
North America strengthened its tally to 43 Universities (37 in 2006). McGill tops in Canada, and a number of universities from New Zealand and Australia have also joined the top 50 list.
The increasing trend in internationalisation is also borne out by the fact that 143 of the top 200 universities reported an increase in their percentage of international faculty to total faculty, while 137 of the top 200 universities reported an increase in their percentage of international students to total students.
The last comment is rather interesting. Is this genuine internationalisation or simply a manipulation of data provided by universities?
Thursday, November 08, 2007
Beerkens Blog has the top 100. Here are the top 20.
|1||HARVARD University||United States|
|2=||University of CAMBRIDGE||United Kingdom|
|2=||YALE University||United States|
|2=||University of OXFORD||United Kingdom|
|5||Imperial College LONDON||United Kingdom|
|6||PRINCETON University||United States|
|7=||CALIFORNIA Institute of Technology (Caltech)||United States|
|7=||University of CHICAGO||United States|
|9||UCL (University College LONDON)||United Kingdom|
|10||MASSACHUSETTS Institute of Technology (MIT)||United States|
|11||COLUMBIA University||United States|
|12||McGILL University||Canada|
|13||DUKE University||United States|
|14||University of PENNSYLVANIA||United States|
|15||JOHNS HOPKINS University||United States|
|16||AUSTRALIAN National University||Australia|
|17||University of TOKYO||Japan|
|18||University of HONG KONG||Hong Kong|
|19||STANFORD University||United States|
|20=||CORNELL University||United States|
|20=||CARNEGIE MELLON University||United States|
The Canadian newspaper The Gazette has a report on the performance of Canadian universities in the 2007 rankings. McGill has risen from 21st place to 12th. The Gazette reports that:
McGill University is the cream of Canadian schools, the best public university in North America and ranks 12th among the world's top 200 universities, according to a prestigious global survey.
Released today, the Times Higher Education Supplement has McGill bounding up from last year's 21st place showing based on such factors as emphasis on science programs, the strong contingent of international students and faculty, student/faculty ratios, and publications by faculty and graduate researchers. Harvard placed first on the list, while Oxford, Cambridge and Yale tied for second spot.
The report puts McGill ahead of such research-intensive powerhouses as Duke, Johns Hopkins, Stanford and Cornell. Findings are based on a combination of facts and opinions, with more than 5,000 academics around the world invited to rate a given institution. A key change in methodology this year made it impossible for professors to rate their own school.
"I'm really thrilled," said McGill principal Heather Munroe-Blum, who sees the results as a vindication of McGill's disciplined approach to academic planning, targeted hiring of 800 new professors and efforts to enhance both research and the undergraduate experience.
There are even more spectacular rises by Montreal (181 to 108), Queen's (176 to 88), Waterloo (204 to 112), Western Ontario (215 to 126) and Simon Fraser (266 to 166).
Changes like this are most unlikely to be produced by real improvements at the universities concerned. Either the methodological changes introduced by QS are having a greater impact than expected or some serious errors have occurred.
Also, if the statement about 5,000 academics originated from QS, does this mean that this year QS sent out only 5,000 e-mails instead of the nearly 200,000 they claimed to have sent last year, or that they received 5,000 forms, or that they counted 5,000?
Three British universities in the THES-QS Top Five
Bits and pieces about the THES-QS 2007 rankings are appearing in online newspapers. Here is a quotation from the London Times
Cambridge and Oxford are the second best universities in the world according to the latest rankings, and British universities are closing the gap with those in the United States.
Oxford and Cambridge share the number two spot with Yale, with Harvard ranked number one in the latest league tables from The Times Higher Education Supplement.
The findings will bring cheer to the higher education sector in Britain at a time of growing concern among vice-chancellors and employers that British universities will lose students to better-financed institutions abroad and that business will then follow them with jobs and investment.
The commercial implications of the rankings are made very clear:
Professor Rick Trainor, the president of Universities UK, representing vice-chancellors, added: “Our competitors are increasingly marketing themselves more aggressively so it is vital that the UK remains among the foremost destinations for international students and staff.”

Harvard, whose endowment of $35 billion (£16.6 billion) is roughly equal to the combined annual funding for all English universities, tops the table, but its lead over its closest rivals has fallen from 3.2 to 2.4 points. Nunzio Quacquarelli, the managing director of QS, the careers and education group that compiled the rankings, said: “In an environment of increasing student mobility, the UK is putting itself forward as a top choice for students worldwide.
“They are taking a closer look at the quality of faculty, international diversity and, of course, to the education they will receive.”
A detailed analysis will have to wait until the component scores are available but the continued closing of the gap between Oxbridge and Harvard and the rise of University College London from 25th to 9th and Imperial College London from 8th to 5th are rather suspicious.
The top ten are
1 Harvard University US
2 University of Cambridge UK
2 University of Oxford UK
2 Yale University US
5 Imperial College, London UK
6 Princeton University US
7 California Institute of Technology (Caltech) US
7 University of Chicago US
9 University College London (UCL) UK
10 Massachusetts Institute of Technology (MIT) US
Wednesday, November 07, 2007
Changes in the THES-QS Rankings
QS Quacquarelli Symonds have announced that the 2007 World University Rankings will be published on November 9th and that there will be a number of changes.
Firstly, QS will not allow respondents to their academic survey to vote for their own institutions. I am not sure how this could be enforced if QS send out over a quarter of a million e-mails to World Scientific subscribers, but it would in principle appear to be a sensible change. However, this in itself will not affect other, more serious problems with the “peer review”, such as its marked regional bias and a suspiciously and unbelievably low response rate.
The second change is that QS will now use Scopus rather than the Web of Science for data about citations. This will favour universities outside the US.
I suspect that the difference will not be very great, given the continuing dominance of the US.
Thirdly, QS will give Full Time Equivalent (FTE) counts for numbers of students and faculty rather than headcounts. This would eliminate some of the worst errors in previous rankings such as those relating to Ecole Polytechnique and Ecole Normale Superieure. However, there could be problems if the procedure is not applied consistently. QS say that where an FTE number has not been supplied, one will be calculated from the relationship between headcount and FTE numbers at other institutions in the same country or region.
This raises questions about which country or region is used for benchmarking and whether QS will indicate how the ratio between headcount and FTE is derived. Also, it seems rather dangerous to allow universities to submit their own data.
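The benchmarking procedure QS describe might be sketched as follows. This is only a guess at their method, and all the numbers are invented for illustration; QS have not published an actual formula:

```python
# Hypothetical sketch of estimating a missing FTE count from the
# headcount-to-FTE ratio observed at peer institutions in the same
# country or region. All figures are invented for illustration.
def estimate_fte(headcount, peers):
    # peers: list of (headcount, fte) pairs for institutions that
    # supplied both figures
    ratio = sum(fte for _, fte in peers) / sum(hc for hc, _ in peers)
    return headcount * ratio

# Two invented national peers and one university with no FTE figure
peers = [(30000, 24000), (15000, 13500)]
print(estimate_fte(20000, peers))
```

Even in this simple form, the estimate depends entirely on which peers are chosen, which is exactly the transparency problem raised above.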
Finally, QS will calculate z scores for all components. Basically a z score is calculated by subtracting the population mean from the raw score and then dividing by the standard deviation. The effect of this will be to flatten the curves for each component and to ensure that similar changes will have similar effects on each section of the ranking.
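The calculation described above can be sketched in a few lines of Python; the scores here are invented for illustration:

```python
# Standardise a set of raw scores into z scores: subtract the mean,
# then divide by the (population) standard deviation.
def z_scores(raw):
    mean = sum(raw) / len(raw)
    sd = (sum((x - mean) ** 2 for x in raw) / len(raw)) ** 0.5
    return [(x - mean) / sd for x in raw]

# Invented raw component scores for five universities
scores = [100.0, 92.0, 85.0, 60.0, 40.0]
print([round(z, 2) for z in z_scores(scores)])
```

After this transformation every component has a mean of zero and a standard deviation of one, which is why a one-point movement carries the same weight in each section of the ranking.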
QS are to be commended for introducing these changes, provided that they are implemented transparently and competently. There is no point in calculating z scores if you enter data for every university in the wrong row, as someone did for the student-faculty ratio in QS’s book Guide to the World’s Top Universities, and create hundreds of errors.
I have a further reservation. QS seem to have done nothing about using a database for the “peer review” that is provided by an Asian-based and Asian-orientated publishing company, about explaining how they could get an unprecedentedly low response rate without filtering the data in some way, or about giving a large weighting to such an obviously biased and suspect set of data. It will be interesting to recalculate the scores to see what they look like without the peer review.
The combined effect of these changes is likely to be that some universities outside the top 100 will go down several places, even though nothing has really changed, leading to anguished debates about declining standards.