Sunday, August 10, 2008

Are Things Improving at QS?

I think it is worth quoting from a recent comment.




Just wanted to add a note about submitting data to THES based on my experience
here at a large Australian university. We forwarded our submission earlier this
year and now we have received a query on some of our numbers - a check just to
confirm if they are correct. And I can understand why they would need to be
checked. The numbers we submitted for staff (in the thousands) seem to have
changed to a number in the very low hundreds. Also, comments (made by us) that
were attached to specific sections have, in the past come back with typos... I
think this indicates that there is greater scope for human error in the
compilation of the data, even at such an early and relatively uncomplicated
phase of the data gathering process...


There are signs that QS is making a commendable effort to avoid the errors that have been so prevalent in previous rankings. Still, it is rather disconcerting that thousands of faculty have turned into hundreds, especially since it is not altogether impossible that some universities might conveniently forget to correct an error that works to their advantage.

So, I was wondering how common simple errors are in the QS rankings. I have been looking at QS's topuniversities site and checking the number of students listed in the description for each university, comparing the combined number of undergraduates and postgraduates with the total of all students. Here are the results just for the universities beginning with A.

For these twelve universities no problems were noticed: Aarhus, Aberdeen, Aberystwyth, Antwerp, Arizona State University, Athens, Aston, Amsterdam, Adelaide, Australian National University, Austral, and Vrije Universiteit Amsterdam. In some cases there were minor discrepancies, but not enough to cause concern.

About two weeks ago there were, in three cases, discrepancies between the total number of students and the combined numbers of undergraduates and postgraduates: the University of Arizona (more combined undergraduates and postgraduates than total students), the University of Auckland (the number of postgraduates and the number of total students were the same) and the Athens University of Economics and Business (more undergraduate international students than total international students).

In those three cases, the errors had been corrected with new entries at the time of writing.
There were, however, three cases where the errors had not been corrected at the time of writing. These are:

University of Arkansas
Full-time equivalent (FTE) undergraduates: 20,416.
FTE graduate/postgraduate students: 4,163.
Total FTE students: 15,182.
Over 9,000 students "missing" from the total.

(correction: not the University of Arizona as was indicated in an earlier version of this post)


University of Alabama
FTE undergraduates: 33,986.
FTE graduate/postgraduate students: 8,291.
Total FTE students: 19,651.
Over 22,000 students "missing" from the total.

University of Alberta
FTE undergraduates: 29,178.
FTE graduate/postgraduate students: 5,419.
Total FTE students: 32,341.
About 2,250 students "missing" from the total.
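The arithmetic behind these three cases is simple enough to automate. A minimal sketch of the consistency check, using only the figures quoted above (the dictionary and its layout are my own illustration, not QS's data format):

```python
# Consistency check: combined FTE undergraduates and postgraduates
# should not exceed the listed FTE total. Figures are those quoted
# in this post for the three uncorrected profiles.
profiles = {
    "University of Arkansas": {"undergrad": 20416, "postgrad": 4163, "total": 15182},
    "University of Alabama":  {"undergrad": 33986, "postgrad": 8291, "total": 19651},
    "University of Alberta":  {"undergrad": 29178, "postgrad": 5419, "total": 32341},
}

for name, p in profiles.items():
    combined = p["undergrad"] + p["postgrad"]
    missing = combined - p["total"]
    if missing > 0:
        print(f"{name}: {combined:,} combined vs {p['total']:,} total "
              f"({missing:,} students unaccounted for)")
```

Running a check like this against every profile on the site would flag such discrepancies in seconds, which makes it all the more surprising that they survive publication.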

I suspect that the problem with these three schools is that the student totals were compiled and entered separately, and that the data for undergraduates and postgraduates included students at branch campuses, professional schools and/or research institutes while the data for total students did not. It will be interesting to see whether these errors will be corrected and whether new ones will emerge.
