
Thursday, October 19, 2006

Why Beijing University is not the Best in Asia

According to the Times Higher Education Supplement's (THES) recent ranking of world universities, Beijing University (the correct name is actually Peking University, but never mind) is the best university in Asia and 14th in the world.

Unfortunately, it is not. It is just another mistake by QS Quacquarelli Symonds, THES's consultants. Unless, of course, they have information that has been kept secret from everybody else.

In 2005, Beijing University was, according to THES, ranked 15th in the world. This was partly due to remarkably high scores for the peer review and the recruiter ratings. It also did quite well on the faculty/student section with a score of 26. In that year the top scorer on that part of the ranking was the Ecole Polytechnique in Paris, whose score of 100 appears to represent a ratio of 1.3 students per faculty. It seems that QS derived this ratio from their datafile for the Ecole, although they also give other figures in another part of their page for this institution. Comparing the Ecole's score to others confirms that this was the data used by QS. It is also clear that for this measure QS was counting all students, not just undergraduates, although there is perhaps some inconsistency about the inclusion of non-teaching faculty. It seems then that, according to QS, Beijing University had a ratio of five students per faculty.

Here is the page from QS's web site with the 2005 data for Beijing University.



Datafile

Demographic
No. of faculty: 15,558
No. of international faculty: 617
No. of students: 76,572
No. of international students: 2,015
No. of undergraduates: 15,182
No. of international undergraduates: 1,025
No. of postgraduates: 13,763
No. of international postgrads: 308

Financial
Average undergrad course fees: USD$ 3,700
Average postgrad course fees: USD$ 4,700
Annual library spend: USD$ 72,000

Source: World University Research (QS & Times Higher Education Supplement)

Postgraduate Course List
For information on undergraduate courses, please look out for the launch of TopUniversities.com in March 2006



Notice that it indicates that there are 76,572 students and 15,558 faculty, which would give a ratio of 4.92, very close to 5. We can therefore safely assume that this is where QS got the faculty/student ratio.

But there is something wrong with the data. QS gives a total of 76,572 students but there are only 15,182 undergraduates and 13,763 postgraduates, a total of 28,945. So where did the 46,000 plus students come from? When there is such a glaring discrepancy in a text it usually means that two different sources were used and were imperfectly synthesised. If we look at Beijing University's web site (it calls itself Peking University), we find this data.





Faculty
At present, Peking University has over 4,574 teachers, 2,691 of whom are full or associate professors. Among the teachers are not only a number of senior professors of high academic standing and world fame, but also a host of creative young and middle-aged experts who have been working at the forefront of teaching and research.

And this.





At present, Peking University has 46,074 students:
15,001 undergraduates
8,119 master candidates
3,956 doctoral candidates
18,998 candidates for correspondence courses or study at the night school
1,776 international students from 62 countries and regions


QS's data were used for the 2005 ranking exercise. The information on Peking University's web site has no doubt been updated since then. However, it looks like QS obtained the numbers of undergraduates and postgraduates from Peking University's site, although they left out the 18,998 correspondence and night school students that the university counted.

According to the university's definition of students and teachers, the faculty-student ratio would be 10.07. Excluding correspondence and night-school students but counting international students gives a ratio of 6.31. The former ratio would probably be the correct one to use: THES's definition of a student is someone "studying towards degrees or substantial qualifications" and there is no indication that these students are studying for anything less. Therefore, it seems that the correct ratio for Beijing University should be around 10 students per faculty.
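
For readers who want to check the arithmetic, here is a minimal sketch in Python. The student and teacher counts are the ones quoted above from Peking University's pages; the variable names and the grouping are mine.

    # Faculty-student ratios for Peking University, using the figures quoted above.
    faculty = 4574                 # teachers reported by the university
    undergraduates = 15001
    masters = 8119
    doctoral = 3956
    correspondence_night = 18998   # correspondence and night-school students (excluded from the second ratio)
    international = 1776
    all_students = 46074           # total given by the university

    ratio_all = all_students / faculty
    ratio_excl_corr = (undergraduates + masters + doctoral + international) / faculty

    print(f"All students per faculty: {ratio_all:.2f}")            # about 10.07
    print(f"Excluding corr./night school: {ratio_excl_corr:.2f}")  # about 6.31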

Looking at the reference work The World of Learning 2003 (2002), we find that Beijing University had 55,000 students and 4,537 teachers. Probably the data reported to this reference included several thousand students from research institutes or branch campuses, or was simply an overstatement. The number of teachers is, however, almost identical. But whatever the exact numbers, it is clear that QS made a serious mistake and that the score for faculty/student ratio in 2005 was incorrect. Since it appears that a similar or identical ratio was used for this year's ranking as well, the ratio for 2006 is also wrong.

We still have the problem of where QS came up with the figure of 76,572 students and 15,558 faculty on its web site. It did not come from Peking University.

Or maybe it did. This is from a brief history of Peking University on its site.





After the readjustment, Peking University became a university comprising departments of both liberal Arts and Sciences and emphasizing the teaching and research of basic sciences. By 1962, the total enrollment grew to 10,671 undergraduate students and 280 graduate students. Since 1949, Peking University has trained for the country 73,000 undergraduates and specialty students, 10,000 postgraduates and 20,000 adult-education students, and many of them have become the backbones on all fronts in China.

There has evidently been a massive expansion in the number of postgraduate students recently. The figure of 73,000 undergraduates who ever completed studies at Peking University is close enough to QS's total of students to arouse suspicion that somebody may have interpreted the data for degrees awarded as that for current enrollment.

There is another possible source. There are several specialist universities in the Beijing area, which is one reason why it is rather silly of THES and QS to refer to Peking University as Beijing University. These include the Beijing Foreign Studies University, the Beijing University of Aeronautics and Astronautics, the Beijing University of Business and Technology and so on.

The totals for these institutions, according to the World of Learning, are 75,746 students and 12,826 teachers. The former is very close to QS's figure for students and the latter somewhat close to its figure for faculty. A bit of double counting somewhere might have brought the number of teachers closer to that given by QS. I am inclined to suspect that the figures resulted from an enquiry that was interpreted as a request for information about the specialist Beijing universities.

So what about 2006? Wherever the numbers came from, this much is clear. Using Yale as a benchmark for 2006 (there are problems, discussed already, with top-scoring Duke), it would appear that the ratio of 5 students per faculty was used in 2006 as well as in 2005. But according to the data on the university's web site, the ratio should be around 10.

What this means is that Beijing University should have got a score for faculty/student ratio of 31 and not 69. I calculate that Beijing University's overall score, applying THES's weightings, dividing by Harvard's total score and then multiplying by 100, should be 57.3. This would put Beijing University in 28th position and not 14th. It would also mean that Beijing University is not the best in the Asia-Pacific region. That honour belongs to the Australian National University. Nor is it the best in Asia. That would be the National University of Singapore. Tokyo and Melbourne are also ahead of Beijing University.
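
As a rough illustration of the method described above, here is a sketch in Python. The weightings are those generally reported for the 2006 THES-QS exercise (peer review 40 per cent, recruiter review 10, faculty-student 20, citations per faculty 20, international faculty 5, international students 5); apart from the corrected faculty/student score of 31, the component scores below are placeholders, so the output illustrates the calculation rather than reproducing the 57.3 figure.

    # A sketch of the recalculation described above. The component scores are
    # placeholders (except the corrected faculty/student score of 31).
    WEIGHTS = {
        "peer_review": 0.40,
        "recruiter_review": 0.10,
        "faculty_student": 0.20,
        "citations_per_faculty": 0.20,
        "international_faculty": 0.05,
        "international_students": 0.05,
    }

    def weighted_total(scores):
        return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

    def overall_score(scores, top_scores):
        # Each weighted total is expressed as a percentage of the top
        # university's (Harvard's) weighted total.
        return 100 * weighted_total(scores) / weighted_total(top_scores)

    harvard = {k: 100 for k in WEIGHTS}          # placeholder component scores
    beijing = {"peer_review": 70, "recruiter_review": 60, "faculty_student": 31,
               "citations_per_faculty": 2, "international_faculty": 20,
               "international_students": 30}     # placeholders except faculty_student

    print(round(overall_score(beijing, harvard), 1))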

If there is a mistake in these calculations please tell me and I will correct it.

This is, of course, assuming that the data for these universities are correct. We have already noted that the score for Duke is too high, but if there are no further errors (a very big assumption, I admit) then Beijing should have a much lower position than the one assigned by QS. If QS have information from Beijing University that has not been divulged to the public then they have a duty to let us know.


In a little while I shall write to THES and see what happens.




Tuesday, June 24, 2008

Resumption of Posting

Teaching and family affairs have kept me away from this blog for a few months. I hope to start posting regularly again soon.


QS’s Greatest Hits: Part One

For the moment, it might be interesting to review some of the more spectacular errors of QS Quacquarelli Symonds Ltd (QS), the consultants who collect the data for the Times Higher Education Supplement’s (THES) World University Rankings.

During its short venture into the ranking business QS has shown a remarkable flair for error. In terms of quantity and variety they have no peers. All rankers make mistakes now and then but so far there has been nobody quite like QS.

Here is a list of what I think are QS’s ten best errors, mainly from the rankings and their book, Guide to the World’s Top Universities (2007). Most of them have been discussed in earlier posts on this blog. The dates of these posts are given in brackets. There is one error, relating to Washington University in St. Louis, from last year’s rankings.

It has to be admitted that QS seem to be doing better recently. Or perhaps I have not been looking as hard as I used to. I hope that a list of another ten errors will follow shortly.


One: Faculty Student Ratio in Guide to the World’s Top Universities (2007). (July 27, 2007; May 11, 2007)

This is a beautiful example of the butterfly effect, with a single slip of the mouse leading to literally hundreds of mistakes.

QS’s book, Guide to the World’s Top Universities, was produced at the end of 2006 after the publication of the rankings for that year and contained data about the student faculty ratios of over 500 ranked universities. It should have been obvious immediately that there was something wrong with this data. Yale is given a ratio of 34.3, Harvard 18, Cambridge 18.9 and Pretoria 590.3. On the other hand, there are some ridiculously low figures such as 3.5 for Dublin Institute of Technology and 6.1 for the University of Santo Tomas (Philippines).

Sometimes the ratios given flatly contradict information given on the same page. So, on page 127 in the FACTFILE, we are told that Yale has a student faculty ratio of 34.3. Over on the left we are informed that Yale has around 10,000 students and 3,333 faculty.

There is also no relationship between the ratios and the scores out of 100 in the THES QS rankings for student faculty ratio, something that Matt Rayner asked about, without ever receiving a reply, on QS’s topuniversities web site.

So what happened? It’s very simple. Someone slipped three rows when copying and pasting data, and every single student faculty ratio in the book, over 500 of them, is wrong. Dublin Institute of Technology was given Duke’s ratio (more about that later), Pretoria got Pune’s, RWTH Aachen got Aberystwyth’s (Wales). And so on. Altogether over 500 errors.


Two: International Students and Faculty in Malaysian Universities.

In 2004 there was great jubilation at Universiti Malaya (UM) in Malaysia. The university had reached 89th place in the THES-QS world rankings. Universiti Sains Malaysia (USM) also did very well. Then in 2005 came disaster. UM crashed 100 places, seriously damaging the Vice-Chancellor’s career, and USM disappeared from the top 200 altogether. The Malaysian political opposition had a field day blasting away at the supposed incompetence of the university leadership.

The dramatic decline should have been no surprise at all. A Malaysian blogger had already noticed that the figures for international students and faculty in 2004 were unrelated to reality. What happened was that in 2004 QS were under the impression that large numbers of foreigners were studying and teaching at the two Malaysian universities. Actually, there were just a lot of Malaysian citizens of Indian and Chinese descent. In 2005 the error was corrected, causing the scores for international faculty and students to fall precipitously.

Later, THES referred to this as “a clarification of data”, a piece of elegant British establishment obfuscation that is almost as good as “being economical with the truth”.


Three: Duke’s student faculty ratio 2005 (October 30, 2006)

Between 2004 and 2005 Duke rose dramatically in the rankings. It did so mainly because it had been given an implausibly low student faculty ratio in 2005, less than two students per faculty. This was not the best ratio in the rankings. That supposedly belonged to the Ecole Polytechnique in Paris (more of that later). But it was favourable enough to give Duke a powerful boost in the rankings.

The ratio was the result of a laughable error. QS listed Duke as having 6,244 faculty, well in excess of anything claimed on the university’s web site. Oddly enough, this was exactly the number of undergraduate students enrolled at Duke in the fall of 2005. Somebody evidently had copied down the figure for undergraduate students and counted them as faculty, giving Duke four times the number of faculty it should have.


Four: Duke’s student faculty ratio 2006 (December 16, 2006)

Having made a mess of Duke’s student faculty ratio in 2005, QS pulled off a truly spectacular feat in 2006 by making an even bigger mess. The problem, I suspect, was that Duke’s public relations office had its hands full with the lacrosse rape hoax and that the web site had not been fully updated since the fall of 2005. For students, QS apparently took undergraduate enrollment in the fall of 2005, subtracted the number of undergraduate degrees awarded and added the 2005 intake. This is a bit crude because some students would leave without taking a degree, Reade Seligmann and Colin Finnerty for example, but probably not too inaccurate. Then there was a bit of a problem because, while the number of postgraduate degrees awarded was indicated on the site, there was no reference to postgraduate admissions. So QS seem to have deducted the degrees awarded and added what they thought was the number of postgraduate students admitted, 300 of them, to the Pratt School of Engineering, which is an undergraduate, not a graduate, school. Then, in a final flourish, they calculated the number of faculty by doubling the figure on the Duke site, apparently because Duke listed the same number classified first by department and then by status.

The result was that the number of students was undercounted and the number of faculty seriously overcounted, giving Duke the best student faculty ratio for the year. Although the ratio was higher than in 2005, Duke was now in first place for this section because QS had calculated more realistic ratios for the Ecole Polytechnique and the Ecole Normale Superieure.


Five: Omission of Kenan Flagler from the Fortune business school rankings. (March 05, 2007)

On the surface this was a trivial error compared to some that QS has committed: they mixed up the business school at the University of North Carolina with that of North Carolina State University. What made it so damaging is that, while most American universities seem unconcerned about the things that QS writes or does not write about them, business schools evidently feel that more is at stake and also have considerable influence over the magazines and newspapers that publish rankings. Kenan-Flagler protested vociferously over its omission, Fortune pulled the ranking off its site, and Nunzio Quacquarelli, director of QS, explained that it was the result of a lapse by a junior employee and stated that this sort of thing had never happened before and would never happen again.


Six: "Beijing University"

China’s best or second best university is Peking University. The name has not been changed to Beijing University, apparently to avoid confusion with Beijing Normal University. There are also over twenty specialist universities in Beijing: Traditional Chinese Medicine, Foreign Languages, Aeronautics and so on.

In 2004 and 2005 THES and QS referred to Beijing University, finally correcting it to Peking University in 2006.

This was perhaps not too serious an error except that it revealed something about QS’s knowledge of its own sources and procedures.

In November 2005, Nunzio Quacquarelli went to a meeting in Kuala Lumpur, Malaysia. Much of the meeting was about the international students and faculty at UM and USM. There was apparently also a question about how Beijing University could have got such a magnificent score on the peer review while apparently producing almost no research. The correct answer would have been that QS had been looking for research written by scholars at “Beijing University”, an institution that does not exist. Quacquarelli, however, answered that “we just couldn’t find the research” because Beijing University academics published in Mandarin (Kuala Lumpur New Straits Times, 20/11/05).

This is revealing because QS’s “peer review” is actually nothing more than a survey of the subscribers to World Scientific, a Singapore-based company that publishes academic books and journals, many of them Asia-orientated and mostly written in English. World Scientific has very close ties with Peking University. If Quacquarelli knew very much about the company that produces his company’s survey, he would surely have known that it had a cozy relationship with Peking University and that Chinese researchers, in the physical sciences at least, do quite a lot of publishing in English.


Seven: Student faculty ratios at Yonsei and Korea universities (November 08, 2006)

Another distinguished university administrator whose career suffered because of a QS error was the president of Yonsei University. Yonsei is a rival of Korea University and was on most measures its equal or superior. But on the THES-QS rankings it was way behind, largely because of a poor student faculty ratio. As it happened, the figure given for Korea University was far too favourable, much better even than the ratio admitted by the university itself. This did not, however, help Jung Chang-Young, who had to resign.


Eight: Omission of SUNY – Binghamton, Buffalo and Albany

THES and QS have apologized for omitting the British universities of Lancaster, Essex and Royal Holloway. A more serious lapse is the omission of the State University of New York’s (SUNY) University Centres at Buffalo, Albany and Binghamton. SUNY has four autonomous university centres, which are normally treated as independent and are now often referred to as the Universities of Buffalo and Albany and Binghamton University. THES-QS does refer to one university centre as Stony Brook University, probably being under the impression that this is the entirety of the SUNY system. Binghamton is ranked 82nd according to the USNWR and 37th among public national universities (2008). It can boast several internationally known scholars such as Melvin Dubofsky in labour history and Immanuel Wallerstein in sociology. To exclude it from the rankings while including the likes of Dublin Institute of Technology and the University of Pune is ridiculous.


Nine: Student faculty ratio at Ecole Polytechnique (September 08, 2006)

In 2005 the Ecole Polytechnique went zooming up the rankings to become the best university in continental Europe. Then in 2006 it went zooming down again. All this was because of extraordinary fluctuations in the student faculty ratio. What happened could be determined by looking at the data on QS’s topgraduate site. Clicking on the rankings for 2005 led to the data that was used for that year (it is no longer available). There were two sets of data for students and faculty for that year, evidently one containing part-time faculty and another with only full-time faculty. It seems that in 2005 part-time faculty were counted but not in 2006.


Ten: Washington University in St Louis (November 11, 2007)

This is a leading university in every respect. Yet in 2007, QS gave it a score of precisely one for citations per faculty, behind Universitas Gadjah Mada, the Dublin Institute of Technology and Politecnico di Milano, and sent it falling from 48th to 161st in the overall rankings. What happened was that QS got it mixed up with the University of Washington (in Seattle) and gave all WUSL’s citations to the latter school.

Wednesday, September 22, 2010

The THE World University Rankings With Citations Set to Not Important

The THE rankings iPhone app has the excellent feature of allowing users to adjust the weightings of the five indicator groups. This is the top 200 when the citations (research impact) indicator is set to 'not important'. The number in brackets on the right is the position in the official ranking.


  1. Harvard (1)
  2. Caltech (2)
  3. MIT (3)
  4. Stanford (4)
  5. Princeton (5)
  6. Imperial College London (9)
  7. Cambridge (6)
  8. Oxford (6)
  9. Yale (10)
  10. UC Berkeley (8)
  11. UC Los Angeles (11)
  12. Johns Hopkins (13)
  13. Swiss Federal Institute of Technology Zurich (15)
  14. University of Michigan (15)
  15. Chicago (12)
  16. Tokyo (26)
  17. Cornell (14)
  18. Toronto (17)
  19. University College London (22)
  20. Columbia (18)
  21. University of Pennsylvania (19)
  22. University of Illinois-Urbana (33)
  23. McGill (35)
  24. Carnegie Mellon (20)
  25. Hong Kong (21)
  26. Georgia Institute of Technology (27)
  27. Kyoto (57)
  28. British Columbia (30)
  29. University of Washington (23)
  30. National University of Singapore (34)
  31. Duke (24)
  32. Peking (37)
  33. University of North Carolina (30)
  34. Karolinska Institute (34)
  35. Tsinghua University, Beijing (58)
  36. Northwestern University (25)
  37. Pohang University of Science and Technology (28)
  38. UC San Diego (32)
  39. Melbourne (36)
  40. UC Santa Barbara (29)
  41. Korea Advanced Institute of Science and Technology (79)
  42. UC Davis (54)
  43. University of Massachusetts (56)
  44. Washington University St Louis (38)
  45. Edinburgh (40)
  46. Australian National University (43)
  47. Minnesota (52)
  48. Purdue (106)
  49. Vanderbilt (51)
  50. LSE (86)
  51. Ecole Polytechnique (39)
  52. Case Western Reserve (65)
  53. Wisconsin (43)
  54. Ohio State (66)
  55. Delft University of Technology (151)
  56. Sydney (71)
  57. Brown (55)
  58. EPF Lausanne (48)
  59. Tokyo Institute of Technology (112)
  60. Osaka (130)
  61. Catholic University of Leuven (119)
  62. University of Virginia (72)
  63. Tohoku (132)
  64. Ecole Normale Superieure Paris (64)
  65. Tufts (53)
  66. University of Munich (61)
  67. Manchester (87)
  68. Hong Kong University of Science and Technology (41)
  69. Emory (61)
  70. Gottingen (43)
  71. Seoul National University (109)
  72. Pittsburgh (54)
  73. Rutgers (105)
  74. New York University (60)
  75. Yeshiva (68)
  76. University of Southern California (73)
  77. Alberta (127)
  78. Uppsala (147)
  79. UC Irvine (49)
  80. University of Science and Technology of China (49)
  81. Queensland (81)
  82. Ghent (124)
  83. Zurich (90)
  84. King’s College London (77)
  85. Eindhoven University of Technology (114)
  86. Ruprecht Karl University of Heidelberg (83)
  87. National Chiao Tung University (181)
  88. Rice (47)
  89. Lund (89)
  90. University of Utah (83)
  91. Royal Institute of Technology Sweden (193)
  92. Bristol (68)
  93. McMaster (93)
  94. Boston (59)
  95. Rensselaer Polytechnic Institute (104)
  96. University of Colorado (67)
  97. Montreal (138)
  98. University of Iowa (132)
  99. National Taiwan University (115)
  100. Leiden (124)
  101. Notre Dame (63)
  102. University of Arizona (95)
  103. George Washington (103)
  104. Texas A & M (207)
  105. Georgetown (164)
  106. Lomonosov Moscow State (237)
  107. National Tsing Hua University (107)
  108. Geneva (118)
  109. Birmingham (145)
  110. Southampton (90)
  111. Wageningen (114)
  112. Medical College of Georgia (158)
  113. Technical University of Munich (101)
  114. New South Wales (152)
  115. Illinois-Chicago (197)
  116. Michigan State (122)
  117. Trinity College Dublin (76)
  118. Tokyo Medical and Dental (217)
  119. Nanyang Technological (174)
  120. Technical University of Denmark (122)
  121. Sheffield (137)
  122. York (81)
  123. St Andrews (103)
  124. Nanjing (120)
  125. Lausanne (136)
  126. Glasgow (128)
  127. VU Amsterdam (13()
  128. Twente (185)
  129. Utrecht (143)
  130. Sung Kyun Kwan (230)
  131. Stony Brook (78)
  132. Wake Forest (90)
  133. Helsinki (102)
  134. Basel (95)
  135. Freiburg (132)
  136. Adelaide (73)
  137. Nagoya (206)
  138. Ruhr University Bochum
  139. Sao Paulo (232)
  140. Free University of Berlin (212)
  141. Maryland College Park (98)
  142. Warwick (220)
  143. Technion (221)
  144. Iowa State (156)
  145. Chalmers University of Technology (223)
  146. Dartmouth (99)
  147. RWTH Aachen (182)
  148. Kansas (232)
  149. Swedish University of Agricultural Sciences (199)
  150. Groningen (170)
  151. State University of Campinas (248)
  152. Nottingham (174)
  153. Leeds (168)
  154. Penn State (109)
  155. Maastricht (209)
  156. Zhejiang (197)
  157. Humboldt (178)
  158. Vienna (195)
  159. Hong Kong Polytechnic (149)
  160. Queen Mary London (120)
  161. Aarhus (167)
  162. Sussex (79)
  163. University of Georgia (246)
  164. National Sun Yat-Sen (163)
  165. William and Mary (75)
  166. Kiel (210)
  167. Lancaster (214)
  168. Indiana University (156)
  169. Newcastle, UK (152)
  170. UC Santa Cruz (68)
  171. Aberdeen (149)
  172. Durham
  173. University College Dublin
  174. Liverpool (165)
  175. Dalhousie (193)
  176. University of Delaware (159)
  177. UC Riverside (117)
  178. University of Amsterdam (165)
  179. Surrey (302)
  180. Konstanz (186)
  181. University of South Carolina (214)
  182. Wurzburg (168)
  183. Cape Town (107)
  184. Tokushima (317)
  185. Reading (210)
  186. Stockholm (129)
  187. University of Waterloo, Canada (267)
  188. Washington State University (264)
  189. Copenhagen (177)
  190. Hokkaido (293)
  191. Hawaii (105)
  192. Yonsei (190)
  193. Leicester (216)
  194. Kyushu (294)
  195. Bergen (135)
  196. Shanghai Jiao Tong (258)
  197. Pierre and Marie Curie (140)
  198. ENS De Lyon (100)
  199. Erasmus (159)
  200. Tromso (227)


Monday, October 30, 2006

More on the Duke and Beijing Scandals

Sorry, there's nothing here about lacrosse players or exotic dancers. This is about how Duke supposedly has the best faculty-student ratio of any university in the world and how Beijing (Peking) University is supposedly the top university in Asia.

In previous posts I reported how Duke, Beijing and the Ecole Polytechnique in Paris (see archives) had apparently been overrated in the Times Higher Education Supplement (THES) world university rankings because of errors in counting the number of faculty and students.

QS Quacquarelli Symonds, the consultants who conducted the collection of data for THES, have now provided links to data for each of the universities in the top 200 in the latest THES ranking.
Although some errors have been corrected, it seems that new ones have been committed.

First of all, this year Duke was supposed to be top for faculty-student ratio. The QS site gives a figure of 3,192 faculty and 11,106 students, that is, a ratio of 3.48, which is roughly what I suspected it might be for this year. Second-placed Yale, with 3,063 faculty and 11,441 students according to QS, had a ratio of 3.74, and Beijing (Peking University -- congratulations to QS for getting the name right this year even if THES did not), with 5,381 faculty and 26,912 students, a ratio of 5.01.

But are QS's figures accurate? First of all, looking at the Duke site, we find 13,088 students. So how did QS manage to reduce the number by nearly 2,000? No doubt the site needs updating, but universities do not lose nearly a sixth of their students in a year.

Next, the Duke site lists 1,595 tenure and tenure-track faculty and 925 non-teaching faculty. Even counting the latter, we are still far short of QS's 3,192.

If we count only teaching faculty, the Duke faculty-student ratio would be 8.21 students per faculty. Counting non-teaching faculty would produce a ratio of 5.20, still a long way behind Yale.
It is clear then from data provided by QS themselves that Duke should not be in first place in this part of the rankings. This means that all the scores for this component are wrong, since all universities are benchmarked against the top scorer in each category, and therefore that all the overall scores are wrong. Probably not by very much, but QS does claim to be the best.
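
The arithmetic behind that claim is worth spelling out. If the component is scored by dividing the benchmark (lowest) students-per-faculty ratio by each university's ratio and multiplying by 100 -- an assumption on my part, although it is consistent with the 2005 figures quoted elsewhere on this blog (100 x 1.3 / 4.92 gives Beijing's score of 26) -- then changing the benchmark changes every score. A minimal sketch, using the ratios discussed in this post:

    # Component scores indexed against the top scorer: the university with the
    # fewest students per faculty gets 100 and everyone else is scaled to it.
    # The linear inversion rule is an assumption; the ratios are from this post.

    def component_score(ratio, benchmark_ratio):
        # Lower students-per-faculty is better, so invert the ratio.
        return round(100 * benchmark_ratio / ratio)

    ratios = {
        "Yale (QS figures)": 3.74,
        "Beijing (QS figures)": 5.01,
        "Duke (teaching faculty only)": 8.21,
    }

    # With QS's (apparently inflated) Duke figure of 3.48 as the benchmark:
    print({name: component_score(r, 3.48) for name, r in ratios.items()})

    # With Yale as the benchmark, if Duke's figure is discarded:
    print({name: component_score(r, 3.74) for name, r in ratios.items()})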

Where did the incorrect figures come from? Perhaps Duke gave QS a different set of figures from those on its web site. If so, this surely is deliberate deception. But I doubt if that is what happened, for the Duke administration seems to have been as surprised as anyone by the THES rankings.

I am wondering if this has something to do with Duke in 2005 being just below the Ecole Polytechnique Paris in the overall ranking. The Ecole was top scorer for the faculty-student component in 2005. Is it possible that the data for 2006 was entered into a form that also included the 2005 data and that the Ecole's 100 for 2005 was typed in for Duke for 2006? Is it possible then that the data for numbers of students and faculty were constructed to fit the score of 100 for Duke?

As for Beijing (Peking), QS this year provides a drastically reduced number of faculty and students, 5,381 and 26,912 respectively. But even these figures seem to be wrong. The Peking University site indicates 4,574 faculty. So where did the other 800 plus come from?

The number of students provided by QS is roughly equal to the number of undergraduates, master's and doctoral students listed on Peking University's site. It presumably excludes night school and correspondence students and international students. It could perhaps be argued that the first two groups should not be counted, but this would be a valid argument only if the university itself did not count them in the total number of students and if their teachers were not counted in the number of faculty. It still seems that the most accurate ratio would be about 10 students per faculty and that Beijing's overall position is much too high.

Finally, QS has now produced much more realistic data for the number of faculty at the Ecole Polytechnique Paris, Ecole Normale Superieure Paris and Ecole Polytechnique Federale Lausanne. Presumably, this year part-time staff were not counted.

Saturday, December 16, 2006

Open Letter to the Times Higher Education Supplement

This letter has been sent to THES

Dear John O’Leary
The Times Higher Education Supplement (THES) world university rankings have acquired remarkable influence in a very short period. It has, for example, become very common for institutions to include their ranks in advertising or on web sites. It is also likely that many decisions to apply for university courses are now based on these rankings.

Furthermore, careers of prominent administrators have suffered or have been endangered because of a fall in the rankings. A recent example is that of the president of Yonsei University, Korea, who has been criticised for the decline of that university in the THES rankings compared to Korea University (1), although it still does better on the Shanghai Jiao Tong University index (2). Ironically, the President of Korea University seems to have got into trouble for trying too hard and has been attacked for changes designed to promote the international standing, and therefore the position in the rankings, of the university (3). Another case is the Vice-Chancellor of Universiti Malaya, Malaysia, whose departure is widely believed to have been linked to a fall in the rankings between 2004 and 2005, which turned out to be the result of the rectification of a data error.

In many countries, administrative decisions and policies are shaped by the perception of their potential effect on places in the rankings. Universities are stepping up efforts to recruit international students or to pressure staff to produce more citable research. Also, ranking scores are used as ammunition for or against administrative reforms. Recently, we saw a claim that Oxford’s performance renders any proposed administrative change unnecessary (4).

It would then be unfortunate for THES to produce data that is in any way misleading, incomplete or affected by errors. I note that the publishers of the forthcoming book that will include data on 500+ universities include a comment by Gordon Gee, Chancellor of Vanderbilt University, that the THES rankings are “the gold standard” of university evaluation (5). I also note that on the website of your consultants, QS Quacquarelli Symonds, readers are told that your index is the best (6).

It is therefore very desirable that the THES rankings should be as valid and as reliable as possible and that they should adhere to standard social science research procedures. We should not expect errors that affect the standing of institutions and mislead students, teachers, researchers, administrators and the general public.

I would therefore like to ask a few questions concerning three components of the rankings that add up to 65% of the overall evaluation.

Faculty-student ratio
In 2005 there were a number of obvious, although apparently universally ignored, errors in the faculty-student ratio section. These include ascribing inflated faculty numbers to the Ecole Polytechnique in Paris, the Ecole Normale Superieure in Paris, the Ecole Polytechnique Federale in Lausanne, Peking (Beijing) University and Duke University, USA. Thus, the Ecole Polytechnique was reported on the site of QS Quacquarelli Symonds (7), your consultants, to have 1,900 faculty and 2,468 students, a ratio of 1.30 students per faculty; the Ecole Normale Superieure 900 faculty and 1,800 students, a ratio of 2.00; the Ecole Polytechnique Federale 3,210 faculty and 6,530 students, a ratio of 2.03; Peking University 15,558 faculty and 76,572 students, a ratio of 4.92; and Duke 6,244 faculty and 12,223 students, a ratio of 1.96.

In 2006 the worst errors seem to have been corrected, although I have not noticed any acknowledgement that any error had occurred or any explanation that dramatic fluctuations in the faculty-student ratio or the overall score were not the result of any achievement or failing on the part of the universities concerned.

However, there still appear to be problems. I will deal with the case of Duke University, which this year is supposed to have the best score for faculty-student ratio. In 2005 Duke, according to the QS Topgraduates site, had, as I have just noted, 6,244 faculty and 12,223 students, giving it a ratio of about one faculty member to two students. This is quite implausible and most probably resulted from a data entry error, with an assistant or intern confusing the number of undergraduates listed on the Duke site, 6,244 in the fall of 2005, with the number of faculty (8).

This year the data provided are not so implausible but they are still highly problematical. In 2006 Duke, according to QS, has 11,106 students, but the Duke site refers to 13,088. True, the site may be in need of updating, but it is difficult to believe that a university could reduce its total enrollment by about a sixth in the space of a year. Also, the QS site would have us believe that in 2006 Duke has 3,192 faculty members. But the Duke site refers to 1,595 tenure and tenure-track faculty. Even if you count other faculty, including research professors, clinical professors and medical associates, the total of 2,518 is still much less than the QS figure. I cannot see how QS could arrive at such a low figure for students and such a high figure for faculty. Counting part-timers would not make up the difference, even if this were a legitimate procedure, since, according to the US News & World Report (America’s Best Colleges 2007 Edition), only three percent of Duke faculty are part-time. My incredulity is increased by the surprise expressed by a senior Duke administrator (9) and by Duke’s being surpassed by several other US institutions on this measure, according to the USNWR.

There are of course genuine problems about how to calculate this measure, including the question of part-time and temporary staff, visiting professors, research staff and so on. However, it is rather difficult to see how any consistently applied conventions could have produced your data for Duke.

I am afraid that I cannot help but wonder whether what happened was that data for 2005 and 2006 were entered in adjacent rows in a database for all three years and that the top score of 100 for Ecole Polytechnique in 2005 was entered into the data for Duke in 2006 – Duke was immediately below the Ecole in the 2005 rankings – and the numbers of faculty and students worked out backwards. I hope that this is not the case.

-- Could you please indicate the procedures that were employed for counting part-timers, visiting lecturers, research faculty and so on?
-- Could you also indicate when, how and from whom the figures for faculty and students at Duke were obtained?
-- I would like to point out that if the faculty-student ratio for Duke is incorrect then so are all the scores for this component, since the scores are indexed against the top scorer, and therefore all the overall scores. Also, if the ratio for Duke is based on an incorrect figure for faculty, then Duke’s score for citations per faculty is incorrect. If the Duke score does turn out to be incorrect would you consider recalculating the rankings and issuing a revised and corrected version?


International faculty
This year the university with the top score for international faculty is Macquarie, in Australia. On this measure it has made a giant leap forward from 55 to 100 (10).

This is not, I admit, totally unbelievable. THES has noted that in 2004 and 2005 it was not possible to get data for Australian universities about international faculty. The figures for Australian universities for these years therefore simply represent an estimate for the sector as a whole, with every Australian university getting the same, or almost the same, score. This year the scores are different, suggesting that data have now been obtained for specific universities.

I would like to digress a little here. On the QS Topgraduate website the data for 2005 gives the number of international faculty at each Australian university. I suspect that most visitors to the site would assume that these represent authentic data and not an estimate derived from applying a percentage to the total number of faculty. The failure to indicate that these data are estimates is perhaps a little misleading.

Also, I note that in the 2005 rankings the international faculty score for the Australian National University is 52, for Monash 54, for Curtin University of Technology 54 and for the University of Technology Sydney 33. For the other thirteen Australian and New Zealand universities it is 53. It is most unlikely that if data for these four universities were not estimates they would all differ from the general Australasian score by just one digit. It is likely then that in four out of seventeen cases there have been data entry errors or rounding errors. This suggests that it is possible that there have been other errors, perhaps more serious. The probability that errors have occurred is also increased by the claim, uncorrected for several weeks at the time of writing, on the QS Topuniversities site that in 2006 190,000 e-mails were sent out for the peer review.

This year the Australian and New Zealand universities have different scores for international faculty. I am wondering how they were obtained. I have spent several hours scouring the Internet, including annual reports and academic papers, but have been unable to find any information about the numbers of international faculty in any Australian university.

-- Can you please describe how you obtained this information? Was it from verifiable administrative or government sources? It is crucially important that the information for Macquarie is correct because if not then, once again, all the scores for this section are wrong.

Peer Review
This is not really a peer review in the conventional academic sense, but I will use the term to avoid distracting arguments. My first concern with this section is that the results are wildly at variance with data that you yourselves have provided and with data from other sources. East Asian, Australian and some European universities do spectacularly better on the peer review, either overall or in specific disciplinary groups, than they do on any other criterion. I shall, first of all, look at Peking University (which you usually call Beijing University) and the Australian National University (ANU).

According to your rankings, Peking is in 2006 the 14th best university in the world (11). It is 11th on the general peer review, which according to your consultants explicitly assesses research accomplishment, and 12th for science, 20th for technology, 8th for biomedicine, 17th for social science and 10th for arts and humanities.

This is impressive, all the more so because it appears to be contradicted by the data provided by THES itself. On citations per paper Peking is 77th for science and 76th for technology. This measure is an indicator of how a research paper is regarded by other researchers. One that is frequently cited has aroused the interest of other researchers. It is difficult to see how Peking University could be so highly regarded when its research has such a modest impact. For biomedicine and social sciences Peking did not even do enough research for the citations to be counted.

If we compare overall research achievements with the peer review we find some extraordinary contrasts. Peking does much better on the peer review than the California Institute of Technology (Caltech), with a score of 70 to 53, but for citations per faculty Peking’s score is only 2 compared to Caltech’s 100.

We find similar contrasts when we look at ANU. It was 16th overall and had an outstanding score on the peer review, ranking 7th on this criterion. It was also 16th for science, 24th for technology, 26th for biomedicine, 6th for social science and 6th for arts and humanities.

However, the scores for citations per paper are distinctly less impressive. On this measure, ANU ranks 35th for science, 62nd for technology and 56th for social science. It does not produce enough research to be counted for biomedicine.

Like Peking, ANU does much better than Caltech on the peer review, with a score of 72, but its research record is less distinguished, with a score of 13.

I should also like to look at the relative position of Cambridge and Harvard. According to the peer review Cambridge is more highly regarded than Harvard. Not only that, but its advantage increased appreciably in 2006. But Cambridge lags behind Harvard on other criteria, in particular citations per faculty and citations per paper in specific disciplinary groups. Cambridge is also decidedly inferior to Harvard and a few other US universities on most components of the Shanghai Jiao Tong index (12).

How can a university that has such an outstanding reputation perform so consistently less well on every other measure? Moreover, how can its reputation improve so dramatically in the course of two years?

I see no alternative but to conclude that much of the remarkable performance of Peking University, ANU and Cambridge is nothing more than an artifact of the research design. If you assign one third of your survey to Europe and one third to Asia on economic rather than academic grounds and then allow or encourage respondents to nominate universities in those areas, then you are going to have large numbers of universities nominated simply because they are the best of a mediocre bunch. Is ANU really the sixth best university in the world for social science and Peking the tenth best for arts and humanities, or is it just that there are so few competitors in those disciplines in their regions?

There may be more. The performance on the peer review of Australian and Chinese universities suggests that a disproportionate number of e-mails were sent to and received from these places even within the Asia-Pacific region. The remarkable improvement of Cambridge between 2004 and 2006 also suggests that a disproportionate number of responses were received from Europe or the UK in 2006 compared to 2005 and 2004.

Perhaps there are other explanations for the discrepancy between the peer review scores for these universities and their performance on other measures. One is that citation counts favour English-speaking researchers and universities but the peer review does not. This might explain the scores of Peking University but not Cambridge and ANU. Perhaps Cambridge has a fine reputation based on past glories, but this would not apply to ANU, and why should there be such a wave of nostalgia sweeping the academic world between 2004 and 2006? Perhaps citation counts favour the natural sciences and do not reflect accomplishments in the humanities, but the imbalances here seem to apply across the board in all disciplines.

There are also references to some very suspicious procedures. These include soliciting more responses to get more universities from certain areas in 2004. In 2006, there is a reference to weighting responses from certain regions. Also puzzling is the remarkable closing of the gap between high- and low-scoring institutions between 2004 and 2005. Thus in 2004 the mean score for the peer review of all universities in the top 200 was 105.69 compared to a top score of 665, while in 2005 it was 32.82 compared to a top score of 100; in other words, the average institution moved from about 16 per cent of the top score to about 33 per cent.

I would therefore like to ask these questions.

-- Can you indicate the university affiliation of your respondents in 2004, 2005 and 2006?
-- What was the exact question asked in each year?
-- How exactly were the respondents selected?
-- Were any precautions taken to ensure that those and only those to whom it was sent completed the survey?
-- How do you explain the general inflation of peer review scores between 2004 and 2005?
-- What exactly was the weighting given to certain regions in 2006 and to whom exactly was it given?
-- Would you consider publishing raw data to show the number of nominations that universities received from outside their regions and therefore the genuine extent of their international reputations?

The reputation of the THES rankings would be enormously increased if there were satisfactory answers to these questions. Even if errors have occurred it would surely be to THES’s long-term advantage to admit and to correct them.

Yours sincerely
Richard Holmes
Malaysia


Notes
(1) http://times.hankooki.com/lpage/nation/200611/kt2006110620382111990.htm
(2) http://ed.sjtu.edu.cn/ranking.htm
(3) http://english.chosun.com/w21data/html/news/200611/200611150020.html
(4) http://www.timesonline.co.uk/article/0,,3284-2452314,00.html
(5) http://www.blackwellpublishing.com/more_reviews.asp?ref=9781405163125&site=1
(6) http://www.topuniversities.com/worlduniversityrankings/2006/faqs/
(7) www.topgraduate.com
(8) http://www.dukenews.duke.edu/resources/quickfacts.html
(9) www.dukechronicle.com
(10) www.thes.co.uk
(11) www.thes.co.uk
(12) http://ed.sjtu.edu.cn/ranking.htm








Thursday, May 03, 2007

Book Review

This is a draft of a review that may appear shortly in an academic journal.

Guide to the World’s Top Universities, John O’Leary, Nunzio Quacquarelli and Martin Ince. QS Quacquarelli Symonds Ltd.: London. 2006.


The THES (Times Higher Education Supplement)-QS World University Rankings have aroused massive interest throughout the world of higher education, nowhere more so than in East and Southeast Asia. Very few university teachers and administrators in the region can be unaware of the apparent dramatic collapse of quality at Universiti Malaya, which was in fact nothing of the sort. That this resulted from nothing more than an error by THES’s consultants and its belated correction has done little to diminish public fascination.

Now, QS Quacquarelli Symonds, the consultants who compiled the data for the rankings, have published a large 512-page volume. The book, written by John O’Leary and Martin Ince of THES and Nunzio Quacquarelli of QS, comes with impressive endorsements. It is published in association with IELTS, TOEFL and ETS, names that quite a few Asian students and teachers will know, and is distributed by Blackwell Publishing of Oxford. At the top of the front cover, there is a quotation from Tim Rogers, former Head of Student Recruitment and Admissions, London School of Economics: “A must-have book for anyone seeking a quality university education at home and abroad.” Tim Rogers, by the way, has been a consultant for QS.

The Guide to the World’s Top Universities certainly contains a large amount of material. There are thirteen chapters as follows.

1. Welcome to the world’s first top university guide
2. Ranking the world’s universities
3. How to choose a university and course
4. The benefits of studying abroad
5. What career? Benefits of a top degree
6. Tips for applying to university
7. What parents need to know -- guide to study costs and more
8. Financing and scholarships
9. The world’s top 200 universities. This is the ranking that was published last year in the THES.
10. The world’s top universities by subject. This was also published in the THES.
11. The top 100 university profiles. This provides two pages of information about each university.
12. The top ten countries
13. Directory of over 500 top world universities.

Basically, there are two parts. The earlier chapters mostly consist of advice that is generally interesting, well written and sensible. Later, we have data about various characteristics of the universities, often ranking them in order. The latter comprise much of the book. The profiles of the top 100 universities take up 200 pages and the directory of 500 plus universities another 140.

So, is this a must-have book? At £19.99, $35.95 or €28.50, the answer has to be: not really. Maybe it would be a good idea to glance through the earlier advisory chapters, but as a source of information and evaluation it is not worth the money. First of all, there are serious problems with the information presented in the rankings, the profiles and the directory. The book’s credibility is undermined by a succession of errors, indicating an unacceptable degree of carelessness. At 35 dollars or 20 pounds we surely have the right to expect something a little better, especially from the producers of what is supposed to be “the gold standard” of university rankings.

Thus we find that the Technical University of Munich appears twice in the profiles in positions 82 (page 283) and 98 (page 313). The latter should be the University of Munich. In the directory the University of Munich is provided with an address in Dortmund (page 407). The Technical University of Helsinki is listed twice in the directory (pages 388 and 389). A number of Swiss universities are located in Sweden (pages 462 and 463). The authors cannot decide whether there is only one Indian Institute of Technology and one Indian Institute of Management (page 416) or several (pages 231 and 253). New Zealand is spelt ‘New Zeland’ (page 441). The profile for Harvard repeats the same information in the factfile under two different headings (page 119). There is something called the ‘Official University of California, Riverside’ on page 483. Kyungpook National University in Korea has a student faculty ratio of zero (page 452). Something that is particularly irritating is that the authors or their assistants still cannot get the names of Malaysian universities right. So we find ‘University Putra Malaysia’ on page 435 and ‘University Sains Malaysia’ on page 436. After that famous blunder about Universiti Malaya’s international students and faculty, one would expect the authors to be a bit more careful.

Still, we must give some credit. At least the book has at last started to use the right name for China’s best or second best university -- Peking University, not Beijing University -- and ‘University of Kebangsaan Malaysia’ in the 2006 rankings in the THES has now been corrected to ‘Universiti Kebangsaan Malaysia’.

The Guide really gets confusing, to put it mildly, when it comes to the number of students and faculty. A perceptive observer will note that the data for student-faculty ratio in the top 200 rankings reproduced in chapter 9 are completely different from those in the profiles in chapter 11 and the directory in chapter 13.

For example, in the rankings Duke University, in North Carolina, is given a score of 100, indicating the best student faculty ratio. Going to QS’s topuniversities website we find that Duke supposedly has 11,106 students and 3,192 faculty, representing a ratio of 3.48 students per faculty. But then we turn to the profile and see that Duke is assigned a ratio of 16.7 students per faculty (page 143). On the same page we are told that Duke has 6,301 undergraduates and 4,805 postgraduates and “just under 1,600 faculty”. That makes a ratio of about 6.94. So, Duke has 3.48 or 6.94 or 16.7 students per faculty. Not very helpful.

Looking at Yale University, the book tells us on the same page (127) that the student faculty ratio is 34.3 and that the university has “around 10,000 students” and 3,333 faculty, a ratio of 3 students for each faculty member.

On page 209 we are told that the University of Auckland has a student-faculty ratio of 13.5 and in the adjacent column that it has 2,000 academic staff and 41,209 students, a ratio of 20.6. Meanwhile, the top 200 rankings give it a faculty-student score of 38, which works out at a ratio of 9.2. So, take your pick from 9.2, 13.5 and 20.6.

The data for research expertise is also contradictory. Universities in Australia and China get excellent scores for the “peer review” of best research in the rankings of the top 200 universities in chapter 9 but get relatively poor scores for research impact. The less glamorous American universities like Boston and Pittsburgh get comparatively low scores for peer review of research but actually do excellent research.

Errors and contradictions like these seriously diminish the book’s value as a source of information.

It would not be a good idea to buy this book, although it might be worth looking at the early chapters if you can borrow it from a library. To judge the overall global status of a university, the best bet would be to look east and turn to the Shanghai Jiao Tong University Index, available on the Internet, which ranks the top 500 universities. This index focuses entirely on research but there is usually at least a modest relationship between research activity and other variables such as the quality of the undergraduate student intake and teaching performance. Those thinking about going to the US should look at the US News and World Report’s America’s Best Colleges. Anyone concerned about costs -- who isn’t? -- should look at Kiplinger’s Index, which calculates the value for money of American universities. Incidentally, the fifth place here goes to the State University of New York at Binghamton, which is not even mentioned in the Guide. The Times (which is not the same as the Times Higher Education Supplement) and Guardian rankings are good for British universities.

Students who are not certain about going abroad or who are thinking about going to a less well known local institution could try doing a Google Scholar search for evidence of research proficiency and a Yahoo search for miscellaneous activity. Whatever you do, it is not a good idea to rely on any one source alone and certainly not this one.

Saturday, October 28, 2006

The Best Universities for Technology?

The Times Higher Education Supplement (THES) have published a list of the supposed top 100 universities in the world in the field of technology. The list purports to be based on the opinions of experts in the field. However, like the ranking for science, it cannot be considered valid. First, let us compare the top 20 universities according to peer review and then the top 20 according to the data provided by THES for citations per paper, a reasonable measure of the quality of research.

First, the peer review:

1. MIT
2. Berkeley
3. Indian Institutes of Technology (all of them)
4. Imperial College London
5. Stanford
6. Cambridge
7. Tokyo
8. National University of Singapore
9. Caltech
10. Carnegie-Mellon
11. Oxford
12. ETH Zurich
13. Delft University of Technology
14. Tsing Hua
15. Nanyang Technological University
16. Melbourne
17. Hong Kong University of Science and Technology
18. Tokyo Institute of Technology
19. New South Wales
20. Beijing (Peking University)

Now, the top twenty ranked according to citations per paper:

1. Caltech
2. Harvard
3. Yale
4. Stanford
5. Berkeley
6. University of California at Santa Barbara
7. Princeton
8. Technical University of Denmark
9. University of California at San Diego
10. MIT
11. Oxford
12. University of Pennsylvania
13. Pennsylvania State University
14. Cornell
15. Johns Hopkins
16. Boston
17. Northwestern
18. Columbia
19. Washington (St. Louis)
20. Technion (Israel)

Notice that the Indian Institutes of Technology, Tokyo, National University of Singapore, Nanyang Technological University, Tsing Hua, Melbourne, New South Wales and Beijing are not ranked in the top 20 according to the quality of published research. Admittedly, it is possible that in this field a substantial amount of research consists of unpublished reports for state organizations or private companies, but this would surely be more likely to affect American rather than Asian or Australian universities.

Looking a bit more closely at some of the universities in the top twenty for technology according to the peer review, we find that, when ranked for citations per paper, Tokyo is in 59th place, National University of Singapore 70th, Tsing Hua 86th, Indian Institutes of Technology 88th, Melbourne 35th, New South Wales 71st, and Beijing 76th. Even Cambridge, sixth in the peer review, falls to 29th.

Again, there are a large number of institutions that did not even produce enough papers to be worth counting, raising the question of how they could be sufficiently well known for there to be peers to vote for them. This is the list:

Indian Institutes of Technology
Korean Advanced Institute of Science and Technology
Tokyo Institute of Technology
Auckland
Royal Institute of Technology Sweden
Indian Institutes of Management
Queensland University of Technology
Adelaide
Sydney Technological University
Chulalongkorn
RMIT
Fudan
Nanjing

Once again there is a very clear pattern of the peer review massively favouring Asian and Australasian universities. Once again, I can see no explanation other than an overrepresentation of these regions (and, somewhat less glaringly, of Europe) in the survey of peers, combined with questions that allow or encourage respondents to nominate universities from their own regions or countries.

It is also rather disturbing that once again Cambridge does so much better on the peer review than on citations. Is it possible that THES and QS are manipulating the peer review to create an artificial race for supremacy (“Best of British Closing in on Uncle Sam’s Finest”)? Would it be cynical to suspect that next year Cambridge and Harvard will be in a circulation-boosting race for the number one position?

According to citations per paper, Harvard was 4th for science, second for technology and 6th for biomedicine, while Cambridge was 19th, 29th and 9th.

For the peer review, Cambridge was first for science, 6th for technology and first for biomedicine. Harvard was 4th, 23rd and second.

Overall, there is no significant relationship between the peer review and research quality as measured by citations per paper. The correlation between the two is .169, which is not statistically significant. For the few Asian universities that produced enough research to be counted, the correlation is .009, effectively no better than chance.
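
The calculation itself is straightforward. Here is a minimal Python sketch of the kind of check reported above, using scipy’s Pearson correlation; the two score lists below are placeholders, since the real inputs are the published THES tables:

from scipy.stats import pearsonr

# hypothetical paired scores, for illustration only
peer_review = [100, 97, 93, 90, 88, 85, 82, 80, 78, 75]
citations_per_paper = [60, 95, 40, 88, 35, 90, 30, 85, 25, 80]

r, p = pearsonr(peer_review, citations_per_paper)
print("r = %.3f, p = %.3f" % (r, p))

# With the actual technology data the reported correlation was about .17 and not
# statistically significant, i.e. the peer review tells us almost nothing about
# research quality as measured by citations per paper.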

At the risk of being boringly repetitive, it is becoming clearer and clearer that the THES rankings, especially the peer review component, are devoid of validity.

Thursday, October 26, 2006

The World’s Best Science Universities?

The Times Higher Education Supplement (THES) has now started to publish lists of the world’s top 100 universities in five disciplinary areas. The first to appear were those for science and technology.

For each disciplinary area, THES publishes the scores for its peer review, conducted by people described variously as “research-active academics” or just as “smart people”, along with the number of citations per paper. The ranking is, however, based solely on the peer review, although a careless reader might conclude that the citations were considered as well.

We should ask for a moment what a peer review, essentially a measure of a university’s reputation, can accomplish that an analysis of citations cannot. A citation is basically an indication that another researcher has found something of interest in a paper. The number of citations of a paper indicates how much interest a paper has aroused among the community of researchers. It coincides closely with the overall quality of research, although occasionally a paper may attract attention because there is something very wrong with it.

Citations, then, are a good measure of a university’s reputation for research. For one thing, the votes are weighted: a researcher who publishes a great deal has more votes, and his or her opinion carries more weight than that of someone who publishes nothing. There are abuses, of course. Some researchers are rather too fond of citing themselves, and journals have been known to ask authors to cite papers by other researchers whose work they have published, but such practices do not make a substantial difference.

In providing the number of citations per paper as well as the score for peer review, THES and their consultants, QS Quacquarelli Symonds, have really blown their feet off. If the scores for peer review and the citations are radically different it almost certainly means that there is something wrong with the review. The scores are in fact very different and there is something very wrong with the review.

This post will review the THES rankings for science.

Here are the top twenty universities for the peer review in science:

1. Cambridge
2. Oxford
3. Berkeley
4. Harvard
5. MIT
6. Princeton
7. Stanford
8. Caltech
9. Imperial College, London
10. Tokyo
11. ETH Zurich
12. Beijing (Peking University)
13. Kyoto
14. Yale
15. Cornell
16. Australian National University
17. Ecole Normale Superieure, Paris
18. Chicago
19. Lomonosov Moscow State University
20. Toronto


And here are the top 20 universities ranked by citations per paper:


1. Caltech
2. Princeton
3. Chicago
4. Harvard
5. Johns Hopkins
6. Carnegie-Mellon
7. MIT
8. Berkeley
9. Stanford
10. Yale
11. University of California at Santa Barbara
12. University of Pennsylvania
13. Washington (Saint Louis?)
14. Columbia
15. Brown
16. University of California at San Diego
17. UCLA
18. Edinburgh
19. Cambridge
20. Oxford


The most obvious thing about the second list is that it is overwhelmingly dominated by American universities, with the top 17 places going to the US. Cambridge and Oxford, first and second in the peer review, are 19th and 20th by this measure. Imperial College London, Beijing, Tokyo, Kyoto and the Australian National University are in the top 20 for peer review but not for citations.

Some of the differences are truly extraordinary. Beijing is 12th for peer review and 77th for citations, Kyoto 13th and 57th, the Australian National University 16th and 35th, Ecole Normale Superieure, Paris 17th and 37th, Lomonosov Moscow State University 18th and 82nd, National University of Singapore 25th and 75th, Sydney 35th and 70th, and Toronto 20th and 38th. Bear in mind that there are almost certainly several universities that were not in the peer review top 100 but have more citations per paper than some of these institutions.

It is no use saying that citations are biased against researchers who do not publish in English. For better or worse, English is the lingua franca of the natural sciences and technology, and researchers and universities that do not publish extensively in English will simply not be noticed by other academics. Nor does a bias towards English explain the gap between the comparatively poor citations performance of Sydney, ANU and the National University of Singapore and their high rankings on the peer review.

Furthermore, there are some places for which no citation score is given. Presumably, they did not produce enough papers to be even considered. But if they produce so few papers, how could they become so widely known that their peers would place them in the world’s top 100? These universities are:

Indian Institutes of Technology (all of them)
Monash
Auckland
Universiti Kebangsaan Malaysia
Fudan
Warwick
Tokyo Institute of Technology
Hong Kong University of Science and Technology
Hong Kong
St. Petersburg
Adelaide
Korean Advanced Institute of Science and Technology
New York University
King’s College London
Nanyang Technological University
Vienna Technical University
Trinity College Dublin
Universiti Malaya
Waterloo

These universities are overwhelmingly East Asian, Australian and European. None of them appear to be small, specialized universities that might produce a small amount of high quality research.

The peer review and citations per paper thus give a totally different picture. The first suggests that Asian and European universities are challenging those of the United States and that Oxford and Cambridge are the best in the world. The second indicates that the quality of research of American universities is still unchallenged, that the record of Oxford and Cambridge is undistinguished and that East Asian and Australian universities have a long way to go before being considered world class in any meaningful sense of the word.

A further indication of how different the two lists are can be found by calculating their correlation. Overall, the correlation is, as expected, weak (.390). For Asia-Pacific (.217) and for Europe (.341) it is even weaker and statistically insignificant. If we exclude Australia from the list of Asia-Pacific universities and just consider the remaining 25, there is almost no association at all between the two measures. The correlation is .099, for practical purposes no better than chance. Whatever criteria the peer reviewers used to pick Asian universities, quality of research could not have been among them.
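
The subgroup figures are the same kind of calculation. Here is a minimal Python sketch with placeholder records; the university names, regions and scores below are invented for illustration, and the real inputs are the THES science tables together with each university’s region:

from scipy.stats import pearsonr

# (university, region, peer review score, citations-per-paper score)
# hypothetical records, for illustration only
records = [
    ("Univ A", "Asia-Pacific", 90, 30),
    ("Univ B", "Asia-Pacific", 85, 70),
    ("Univ C", "Asia-Pacific", 88, 25),
    ("Univ D", "Australia",    80, 35),
    ("Univ E", "Europe",       75, 60),
    ("Univ F", "Europe",       70, 40),
    ("Univ G", "North America", 95, 95),
    ("Univ H", "North America", 65, 55),
]

def correlation(rows):
    """Pearson correlation between peer review and citations scores."""
    peer = [r[2] for r in rows]
    cites = [r[3] for r in rows]
    return pearsonr(peer, cites)[0]

print("overall:", round(correlation(records), 3))

asia_pacific = [r for r in records if r[1] in ("Asia-Pacific", "Australia")]
print("Asia-Pacific:", round(correlation(asia_pacific), 3))
print("Asia-Pacific excluding Australia:",
      round(correlation([r for r in asia_pacific if r[1] != "Australia"]), 3))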

So has the THES peer review found out something that is not apparent from other measures? Is it possible that academics around the world are aware of research programmes that have yet to produce large numbers of citations? This, frankly, is quite implausible since it would require that nascent research projects have an uncanny tendency to concentrate in Europe, East Asia and Australia.

There seems to be no explanation for the overrepresentation of Europe, East Asia and Australia in the science top 100 other than a combination of a sampling procedure that included a disproportionate number of respondents from these regions, questions that allowed or encouraged respondents to nominate universities in their own regions or even countries, and a disproportionate distribution of forms to certain countries within regions.

I am not sure whether this is the result of extreme methodological naivety, with THES and QS thinking that they are performing some sort of global affirmative action by rigging the vote in favour of East Asia and Europe, or whether it is a cynical attempt to curry favour with regions that are heavily involved in the management education business or at the forefront of globalization.

Whatever is going on, the peer review gives a very false picture of current research performance in science. If people apply to universities, accept jobs or award grants in the belief that Beijing is better at scientific research than Yale, ANU than Chicago, Lomonosov than UCLA, or Tsinghua than Johns Hopkins, then they are going to make bad decisions.

If this judgement is unfair, then there is no reason why THES or QS should not publish the following:

The universities and institutions to which the peer review forms were sent.
The precise questions that were asked.
The number of nominations received by universities from outside their own regions and countries.
The response rate.
The criteria by which respondents were chosen.

Until THES and/or QS do this, we can only assume that the rankings are an example of how almost any result can be produced with the appropriate, or inappropriate, research design.