Sunday, May 13, 2007

The University of Santo Tomas

The Varsitarian, the newspaper of the University of Santo Tomas (UST) in the Philippines, has published an article questioning the credibility of the THES-QS World University Rankings.

The complaint appears to be valid although the newspaper makes several errors about the rankings.

Alberto Laurito, assistant to the rector for planning and development at UST, has claimed that QS got the number of students wrong. The consultants reported 11,764 students whereas the correct number is 32,971. The university's figure seems to be correct: an article by Guzman and Torres in the Asia Pacific Education Review reports 32,322 students in 2002-03. However, QS's deflating of student numbers, if it were the only mistake, would work to UST's advantage in a number of ways. Firstly, fewer students mean fewer students per faculty member, if the number of the latter is constant, and hence a lower student-faculty ratio, which means a better score on that component of the rankings. Secondly, if the number of international students is the same, fewer students overall means a higher percentage of international students.

However, this is not QS's only error. They report that UST has 524 faculty, making a student-faculty ratio of 22.45. According to the article, in 2002-03 UST had 1,500 faculty. With 32,322 students, this would mean a student-faculty ratio of 21.55. QS has made two errors and they have pretty much cancelled each other out.
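As a quick back-of-the-envelope check, here is a minimal Python sketch, using only the figures quoted above, showing how closely the two errors cancel:

```python
# Figures as quoted above: QS's numbers versus the figures reported
# for UST in the Asia Pacific Education Review article (2002-03).
qs_students, qs_faculty = 11_764, 524
ust_students, ust_faculty = 32_322, 1_500

qs_ratio = qs_students / qs_faculty      # about 22.45 students per faculty member
ust_ratio = ust_students / ust_faculty   # about 21.55 students per faculty member

print(f"QS figures:  {qs_ratio:.2f} students per faculty member")
print(f"UST figures: {ust_ratio:.2f} students per faculty member")
# The two mistakes shift the ratio by less than one student per faculty member,
# so they largely cancel out in the student-faculty component.
```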

Laurito then complained:

that THES-QS research on peer review was also irregular, considering that it was worth 40 per cent of the entire survey when only 1,600 universities turned in their responses or about one per cent of the 190,000 needed

The low response rate does, of course, invalidate the “peer review”, but it was individual academics who were surveyed, not universities.

Laurito then points out that UST got a zero for research citations:

The score is obtained through a research collation database maintained by Thomson, an information-based solutions provider, called Essential Science Indicators (ESI). For every citation given to a university researcher or professor, the university would acquire a point.

The procedure is not like this at all. Laurito continues:

Based also on the survey, UST received the lowest grade on international outlook (meaning UST has no international students or faculty) when the University actually has seven international professors and 300 international students.”

Again, not quite. UST gets a score from QS of 3.6 for international faculty and 0.6 for international students, representing 12 international faculty members and 47 international students.

Laurito has got the wrong end of several sticks but the basic point still remains that QS got the data for students, faculty and international students wrong.

The newspaper then quotes Laurito as saying:

We were told by the research representative (of THES-QS) that the data they used were personally given to them by a University personnel, but they were not able to present who or from what office it came from

If Laurito is reported correctly and if this is what the “research representative” told him, there is something very strange here.

If QS have a documentary record of an e-mail or a phone call to UST, how could the record not indicate the person or office involved?

If they do not, how can QS be sure that the information came from an official university source or that there was any contact at all?

Friday, May 11, 2007

More about Student-Faculty Ratios

I have just discovered a very good site by Ben Wilbrink, Prestatie-indicatoren (performance indicators). He starts off with "Een fantastisch document voor de kick-off" ("a fantastic document for the kick-off"), referring to a monograph by Sharon L. Nichols and David C. Berliner (2005), The Inevitable Corruption of Indicators and Educators Through High-Stakes Testing, Education Policy Studies Laboratory, Arizona State University, pdf (180 pp.).

The summary of this study reports that:

"This research provides lengthy proof of a principle of social science known as Campbell's law: "The more any quantitative social indicator is used for social decisionmaking, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor." "

This insight might well be applied to current university ranking systems. We have seen, for example, some US universities making it optional for applicants to submit their SAT results. It is predictable that good scores will be submitted to admissions officers, but not bad ones. Universities will then find that the average scores of their applicants will rise and therefore so will their scores on rankings that include SAT data.

I would like to propose a new law, an inversion of Gresham's: good scores drive out bad.

Wilbrink has some good comments on the THES-QS rankings but I would like to focus on what he says about the student-faculty ratio.

"The faculty/student score (20%)The scores in this rubric are remarkable, to say the least. I do not think the student/staff ratio is less reliable than the other indicators, yet the relation to the world rank score seems to be nil. The first place is for (13) Duke, the second for (4=) Yale, the third for (67) Eindhoven University of Technology. Watch who have not made it here in the top twenty: Cambridge is 27th, Oxford 31st, Harvard 37th, Stanford 119, Berkeley 158. This is one more illustration that universities fiercely competing for prestige (see Brewer et al.) tend to let their students pay at least part of the bill.
"We measure teaching by the classic criterion of staff-to-student ratio." Now this is asking for trouble, as Ince is well aware of. Who is a student, who is a teacher? In the medieval universities these were activities, not persons. Is it much different nowadays? How much? ...


Every administration will creatively fill out the THES/QS forms asking them for the figures on students and teachers, this much is absolutely certain. If only because they will be convinced other administrations will do so. Ince does not mention any counter-measure, hopefully the THES/QS people have a secret plan to detect fraudulent data."

It is possible to test whether Wilbrink's remarks are applicable to the student-faculty scores in the 2006 THES-QS rankings. THES has published a table of student-faculty ratios at British universities from the University and College Union that is derived from data from the Higher Education Statistics Agency (HESA). These include further education students and exclude research-only staff. These results can be compared with the data in the THES-QS rankings.


In 2006 QS reported that the top scorer for student-faculty ratio was Duke. Looking at QS's website we find that this represents a ratio of 3.48 students per faculty member. Cross-checking shows that QS used the data on their site to construct the scores in the 2006 rankings. Thus, the site reports that Harvard had 3,997 faculty and 24,648 students, a ratio of 6.17 students per faculty member; ICL 3,090 faculty and 12,185 students, a ratio of 3.94; Peking 5,381 faculty and 26,972 students, a ratio of 5.01; and Cambridge 3,886 faculty and 21,290 students, a ratio of 5.48. These ratios yielded scores of 56, 88, 69 and 64 on the student-faculty component of the 2006 rankings.
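The cross-check is simple arithmetic; a minimal Python sketch, using only the student and faculty counts quoted above from QS's site, reproduces the ratios:

```python
# (faculty, students) counts as quoted above from QS's website.
qs_counts = {
    "Harvard": (3_997, 24_648),
    "Imperial College London": (3_090, 12_185),
    "Peking": (5_381, 26_972),
    "Cambridge": (3_886, 21_290),
}

for university, (faculty, students) in qs_counts.items():
    ratio = students / faculty
    print(f"{university}: {ratio:.2f} students per faculty member")
# Expected output: roughly 6.17, 3.94, 5.01 and 5.48, matching the ratios above.
```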


Now we can compare the QS data with those from HESA for the period 2005-06. Presumably, this represents the period covered in the rankings. If Wilbrink is correct, then we would expect the ratios in the rankings to be much lower and more favourable than those provided by HESA.

That in fact is the case. Seven British universities have lower ratios in the HESA statistics. These are Cranfield, Lancaster, Warwick, Belfast, Swansea, Strathclyde and Goldsmiths College. In 35 cases the THES-QS score was much better. The most noticeable differences were ICL, 3.94 and 9.9; Cambridge, 5.48 and 12.30; Oxford, 5.70 and 11.9; LSE, 6.57 and 13; Swansea, 8.49 and 15.1; and Edinburgh, 8.29 and 14.
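As a minimal sketch of the size of the gap, here are the pairs of ratios quoted above (QS figure first, HESA figure second) compared in Python:

```python
# (QS ratio, HESA ratio) pairs as quoted above.
ratio_pairs = {
    "Imperial College London": (3.94, 9.9),
    "Cambridge": (5.48, 12.30),
    "Oxford": (5.70, 11.9),
    "LSE": (6.57, 13.0),
    "Swansea": (8.49, 15.1),
    "Edinburgh": (8.29, 14.0),
}

for university, (qs, hesa) in ratio_pairs.items():
    print(f"{university}: QS {qs} vs HESA {hesa} "
          f"({hesa / qs:.1f} times higher)")
# In each of these cases the HESA ratio is roughly 1.7 to 2.5 times the QS figure.
```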

It is possible that the differences are the result of different but consistent and principled conventions. Thus one set of data might specifically include people excluded by the other. The HESA data, for example, include further education students, presumably meaning non-degree students, but the THES-QS data apparently do not. This would not, however, seem to make much of a difference between the two sets of data for places like Oxford and LSE.

Both HESA and QS claim not to count staff engaged only in research.

It is possible, then, that the data provided by universities to QS have been massaged a bit to give favourable scores. I suspect that this does not amount to deliberate lying. It is probably more a case of choosing the most beneficial option whenever there is any ambiguity.

Overall, the mean ratio derived from the QS data is much lower: 11.37, compared with 14.63 from HESA.

Wednesday, May 09, 2007

Another Comment on QS and Kenan-Flagler

A blog by MBA student Shawn Snyder remarks:

"So CNN recently published its "Top 50 Business Schools to get Hired in 2007" and I was glad to see Maryland's Smith school listed, but I was confused to see the George Washington University right above Smith. After all, by their own ranking the GW grads had one less job offer and starting salary almost $10,000 lower. Umm, maybe recruiters think that George Washington is a better deal because they can snag grads for cheap, but from a business student perspective (the people reading the rankings) wouldn't Smith be the better choice? And why wouldn't it rank higher? Business rankings are crap in my opinion....and yet I still read all of them as if it matters. Maybe I have the problem."


And there is a comment by Dave:


" I too noticed some discrepancies in the ratings on CNN.com. Specifically, UNC Kenan-Flagler is not in the top 50! I dug a bit deeper and looked at the data from topmba.com - the website where the list came from - and found some startling errors. UNC KFBS average salary is listed as $76k when the actual average is $89k! I wrote a letter to TopMBA.com and found that not only did they screw up the salaries, but they did not distinguish between University of North Carolina and North Carolina State U in the recruiter rankings! It's really incredible the garbage that these people are allowed to print. What ever happened to 'trust but verify'?"

More on QS and Kenan-Flagler


There is an interesting post at Accepted Admissions Almanac about the QS-Kenan-Flagler affair. The writer remarks:


"It's safe to say that this mess is a nightmare for QS, CNNMoney, and Fortune. Providing and publishing rankings so sloppily slapped together is beneath criticism for an industry that even when the data is accurate has more than its share of critics and is deserving of skepticism. The CNNMoney/QS fiasco is about as bad as it gets for rankings."


I am afraid that it gets very much worse for QS. They have made errors as bad as this in the compilation of the THES-QS World University Rankings: a response rate of less than one per cent to an online survey, counting ethnic minority students in Malaysia as international students, renaming Peking University as Beijing University, boosting Duke University's score for student-faculty ratio by counting undergraduates as faculty, and so on.

But nobody seems to mind very much when it comes to the THES rankings. Is it something about the brand name?

The post concludes with a very appropriate comment:

"When accurate, unlike the removed QS/CNNMoney version, they are sources of information. Sometimes valuable information. Databanks. I use the data, and so should you. If you want to know the average salaries of graduates from particular schools or their average entering test scores, the rankings will have that information compiled in one place. Like a library, they are sources of information. They are not an excuse for decision-making; using them mindlessly could be the equivalent of a lobotomy. And an expensive one at that."

Best Value Colleges

The Princeton Review (registration required) has published a list of the best value colleges in the US.



Here is what they say about their methodology:

"We chose the schools that appear on our Top Ten Best Value Public and Private Colleges ranking lists based on institutional data we collected from more than 650 schools during the 2005-2006 academic year and our surveys of students attending them. Broadly speaking, the factors we weighed covered undergraduate academics, costs, and financial aid.

More specifically, academic factors included the quality of students the schools attracted, as measured by admissions credentials, as well as how students rated their academic experiences. Cost considerations were tuition, room and board, and required fees.

Financial aid factors included the average gift aid (grants, scholarships, or free money) awarded to students, the average percentage of financial need met for students who demonstrated need, the percentage of students with financial need whose need was fully met by the school, the percentage of graduating students who took out loans to pay for school, and the average debt of those students. We also took into consideration how satisfied students were with the financial aid packages they received."



There are a few questions that should be asked about the methodology, especially concerning the student surveys, but this approach may be more useful for undergraduate students than that of the THES-QS and Shanghai Jiao Tong rankings.

The top 10 best value private colleges for undergraduates are:

1. Rice University
2. Williams College
3. Grinnell College
4. Swarthmore College
5. Thomas Aquinas College
6. Wabash College
7. Whitman College
8. Amherst College
9. Scripps College
10. Harvard College


The top 10 best value public colleges are:

1. New College of Florida
2. Truman State University
3. University of North Carolina at Asheville
4. University of Virginia
5. University of California at Berkeley
6. University of California at San Diego
7. University of California at Santa Cruz
8. University of Minnesota, Morris
9. University of Wisconsin-Madison
10. St. Mary's College of Maryland

Thursday, May 03, 2007

‘again!?’ Yep... Quacquarelli Symonds Ltd (QS) did it again.


Eric Beerkens has written some excellent posts on his blog about the internationalization of higher education.

A recent one concerns Quacquarelli Symonds Ltd (QS), who were responsible for collecting data for a ranking of business schools by Fortune magazine. It seems that QS committed a major blunder by leaving out the Kenan-Flagler Business School at the University of North Carolina at Chapel Hill, one of the top American business schools and one that regularly appears among the high fliers in other business school rankings. Apparently QS mixed it up with North Carolina State University’s College of Management. They also left out the Boston University School of Business. Beerkens refers to an article in the Economist (subscription required) and remarks:

“After reading the first line, I thought: 'again!?' Yep... Quacquarelli Symonds Ltd (QS) did it again.”

Beerkens then points out that this is not the first time that QS has produced flawed research, referring – for which many thanks – to this blog and others. He concludes:

“It's rather disappointing that reputable publications like THES and Forbes use the services of companies like QS. QS clearly doesn't have any clue about the global academic market and has no understanding of the impact that their rankings are having throughout the world. There has been a lot of critique about the indicators that they use, but at least we can see these indicators. It are the mistakes and the biases that are behind the indicators that make it unacceptable!”


There was a vigorous response from the University of North Carolina. They pointed out that QS had admitted to not contacting the university about the rankings, using outdated information and getting the University of North Carolina mixed up with North Carolina State University. QS did not employ any proper procedures for verification and validation, apparently failed to check with other rankings, gave wrong or outdated information about salaries and provided data from 2004 or 2005 while claiming that these referred to 2006.

Fortune has done the appropriate and honest, although probably expensive, thing and removed the rankings from its website.

What is remarkable about this is the contrast between Fortune and the THES. All of the errors committed by QS with regard to the Fortune rankings are paralleled in the World University Rankings. They have, for example, grossly inflated the scores of the Ecole Normale Superieure in Paris in 2004 and the Ecole Polytechnique in 2005 by counting part-time faculty as full-time, and done the same for Duke University -- QS does seem to have bad luck in North Carolina, doesn’t it? -- in 2005 by counting undergraduate students as faculty and in 2006 by counting faculty twice. They have also used a database from a Singapore-based academic publishing company that specializes in Asia-Pacific publications to produce a survey meant to represent world academic opinion, conducted a survey with an apparent response rate of less than one per cent, and got the names of universities wrong -- Beijing University and the Official University of California among others.

It is probably unrealistic for THES to remove the rankings from its website. Still, they could at the very least start looking around for another consultant.

Book Review

This is a draft of a review that may appear shortly in an academic journal.

Guide to the World’s Top Universities, John O’Leary, Nunzio Quacquarelli and Martin Ince. QS Quacquarelli Symonds Ltd.: London. 2006.


The THES (Times Higher Education Supplement)-QS World University Rankings have aroused massive interest throughout the world of higher education, nowhere more so than in East and Southeast Asia. Very few university teachers and administrators in the region can be unaware of the apparent dramatic collapse of quality at Universiti Malaya, which was in fact nothing of the sort. That this resulted from nothing more than an error by THES’s consultants and its belated correction has done little to diminish public fascination.

Now QS Quacquarelli Symonds, the consultants who compiled the data for the rankings, have published a large 512-page volume. The book, written by John O’Leary and Martin Ince of THES and Nunzio Quacquarelli of QS, comes with impressive endorsements. It is published in association with IELTS, TOEFL and ETS, names that quite a few Asian students and teachers will know, and is distributed by Blackwell Publishing of Oxford. At the top of the front cover there is a quotation from Tim Rogers, former Head of Student Recruitment and Admissions, London School of Economics: “A must-have book for anyone seeking a quality university education at home and abroad.” Tim Rogers, by the way, has been a consultant for QS.

The Guide to the World’s Top Universities certainly contains a large amount of material. There are thirteen chapters as follows.

1. Welcome to the world’s first top university guide
2. Ranking the world’s universities
3. How to choose a university and course
4. The benefits of studying abroad
5. What career? Benefits of a top degree
6. Tips for applying to university
7. What parents need to know -- guide to study costs and more
8. Financing and scholarships
9. The world’s top 200 universities. This is the ranking that was published last year in the THES.
10. The world’s top universities by subject. This was also published in the THES.
11. The top 100 university profiles. This provides two pages of information about each university.
12. The top ten countries
13. Directory of over 500 top world universities.

Basically, there are two parts. The earlier chapters mostly consist of advice that is generally interesting, well written and sensible. Later, we have data about various characteristics of the universities, often ranking them in order. The latter comprise much of the book. The profiles of the top 100 universities take up 200 pages and the directory of 500 plus universities another 140.

So, is this a must-have book? At £19.99, $35.95 or €28.50, the answer has to be: not really. Maybe it would be a good idea to glance through the earlier advisory chapters, but as a source of information and evaluation it is not worth the money. First of all, there are serious problems with the information presented in the rankings, the profiles and the directory. The book’s credibility is undermined by a succession of errors, indicating an unacceptable degree of carelessness. At 35 dollars or 20 pounds we surely have the right to expect something a little better, especially from the producers of what is supposed to be “the gold standard” of university rankings.

Thus we find that the Technical University of Munich appears twice in the profiles, in positions 82 (page 283) and 98 (page 313). The latter should be the University of Munich. In the directory the University of Munich is provided with an address in Dortmund (page 407). The Technical University of Helsinki is listed twice in the directory (pages 388 and 389). A number of Swiss universities are located in Sweden (pages 462 and 463). The authors cannot decide whether there is only one Indian Institute of Technology and one Indian Institute of Management (page 416) or several (pages 231 and 253). New Zealand is spelt ‘New Zeland’ (page 441). The profile for Harvard repeats the same information in the factfile under two different headings (page 119). There is something called the ‘Official University of California, Riverside’ on page 483. Kyungpook National University in Korea has a student-faculty ratio of zero (page 452). Something that is particularly irritating is that the authors or their assistants still cannot get the names of Malaysian universities right. So we find ‘University Putra Malaysia’ on page 435 and ‘University Sains Malaysia’ on page 436. After that famous blunder about Universiti Malaya’s international students and faculty one would expect the authors to be a bit more careful.

Still, we must give some credit. At least the book has at last started to use the right name for China’s best or second best university – Peking University, not Beijing University -- and ‘University of Kebangsaan Malaysia’ in the 2006 rankings in the THES has now been corrected to ‘Universiti Kebangsaan Malaysia’.

The Guide really gets confusing, to put it mildly, when it comes to the number of students and faculty. A perceptive observer will note that the data for student-faculty ratio in the top 200 rankings reproduced in chapter 9 are completely different from those in the profiles in chapter 11 and the directory in chapter 13.

For example, in the rankings Duke University, in North Carolina, is given a score of 100, indicating the best student-faculty ratio. Going to QS’s topuniversities website, we find that Duke supposedly has 11,106 students and 3,192 faculty, representing a ratio of 3.48 students per faculty member. But then we turn to the profile and see that Duke is assigned a ratio of 16.7 students per faculty member (page 143). On the same page we are told that Duke has 6,301 undergraduates and 4,805 postgraduates and “just under 1,600 faculty”. That makes a ratio of about 6.94. So, Duke has 3.48 or 6.94 or 16.7 students per faculty member. Not very helpful.

Looking at Yale University, the book tells us on a single page (127) that the student-faculty ratio is 34.3 and that the university has “around 10,000 students” and 3,333 faculty, a ratio of three students for each faculty member.

On page 209 we are told that the University of Auckland has a student-faculty ratio of 13.5 and, in the adjacent column, that it has 2,000 academic staff and 41,209 students, a ratio of 20.6. Meanwhile, the top 200 rankings give it a faculty-student score of 38, which works out at a ratio of 9.2. So, take your pick from 9.2, 13.5 and 20.6.
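To make the contradictions concrete, here is a minimal Python sketch that recomputes the implied ratios from the headcounts quoted above; the Auckland figure of 9.2 is simply repeated, since the book does not publish the scaling from ranking score to ratio:

```python
# Headcounts and published ratios as quoted above from the Guide.
duke_from_counts = (6_301 + 4_805) / 1_600   # "just under 1,600 faculty" -> about 6.94
yale_from_counts = 10_000 / 3_333            # "around 10,000 students"   -> about 3.0
auckland_from_counts = 41_209 / 2_000        # 2,000 academic staff       -> about 20.6

print(f"Duke: {duke_from_counts:.2f} from headcounts, "
      "3.48 on the rankings website, 16.7 in the profile")
print(f"Yale: {yale_from_counts:.2f} from headcounts, 34.3 in the profile")
print(f"Auckland: {auckland_from_counts:.1f} from headcounts, "
      "13.5 in the profile, about 9.2 implied by the ranking score")
```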

The data for research expertise is also contradictory. Universities in Australia and China get excellent scores for the “peer review” of best research in the rankings of the top 200 universities in chapter 9 but get relatively poor scores for research impact. The less glamorous American universities like Boston and Pittsburgh get comparatively low scores for peer review of research but actually do excellent research.

Errors and contradictions like these seriously diminish the book’s value as a source of information.

It would not be a good idea to buy this book, although it might be worth looking at the early chapters if you can borrow it from a library. To judge the overall global status of a university, the best bet would be to look east and turn to the Shanghai Jiao Tong University Index, available on the Internet, which ranks the top 500 universities. This index focuses entirely on research, but there is usually at least a modest relationship between research activity and other variables such as the quality of the undergraduate student intake and teaching performance. Those thinking about going to the US should look at the US News and World Report’s America’s Best Colleges. Anyone concerned about costs -- who isn’t? -- should look at Kiplinger’s Index, which calculates the value for money of American universities. Incidentally, the fifth place here goes to the State University of New York at Binghamton, which is not even mentioned in the Guide. The Times (which is not the same as the Times Higher Education Supplement) and Guardian rankings are good for British universities.

Students who are not certain about going abroad or who are thinking about going to a less well known local institution could try doing a Google Scholar search for evidence of research proficiency and a Yahoo search for miscellaneous activity. Whatever you do, it is not a good idea to rely on any one source alone and certainly not this one.