Monday, November 29, 2010


An article by William Patrick Leonard in the Korea Herald discusses how this year's rankings by Shanghai Jiao Tong University, THE and QS, whatever their faults, indicate a long-term shift in academic excellence from the USA and Europe towards China and other parts of Asia.

"All three release their findings as the school year begins. Each employs a similar blend of indicators, which purportedly measure the relative quality of the institutions surveyed. All emphasize institutional reputation expressed in the quality and quantity of faculty publications, peer assessments, faculty/student ratios, budgets and other input quality measures.

These rankings are clearly flawed; for example, it is not evident that the volume of scholarly publications or peer assessments reflects quality in the classroom. Nevertheless, the rankings show that other fast-growing countries are willing to apply their resources to higher education, just as the United States has been doing for years."
He then observes how in the US sport seems to take precedence over education.

"Yet, instead of strengthening our academic programming, some are planning costly recreational diversions. In May, the National Football Foundation & College Hall of Fame announced that “six new college football teams are set to take the field for the first time this season with 11 more programs set to launch between 2011 and 2013.” Such an announcement is simultaneously sad and humorous. The resources spent to implement and subsequently prop up these programs could be used to improve technology, science, and engineering programs. Sadly, some institutions have opted for stadiums over instructional infrastructure."

Saturday, November 20, 2010

Comment on the New York Times Article

This is from Paul Wouters at CWTS, Leiden University:

"However, the reason for this high position is the performance of exactly one (1) academic: Mohamed El Naschie, who published 323 articles in the Elsevier journal Chaos, Solitons and Fractals of which he is the founding editor. His articles frequently cite other articles in the same journal by the same author. On many indicators, the Alexandrian university does not score very high, but on the number of citations indicator the university scores 99.8 which puts it as the 4th most highly cited in the world. This result clearly does not make any sense at all. Apparently, the methodology used by the THES is not only problematic because it puts a high weight on surveys and perceived reputation. It is also problematic because the way the THES survey counts citations and makes them comparable across fields (in technical terms, the way these counts are normalized) is not able to filter out these forms of self-promotion by self-citations. In other words: the way the THES uses citation analysis does not meet one of the requirements of sound indicators: robustness against simple forms of manipulation."
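The normalisation Wouters refers to usually means dividing each paper's citation count by the world average for its field and year, then averaging those ratios. A minimal sketch, with entirely invented numbers, of how a handful of hyper-cited (here, heavily self-cited) papers can dominate such an average:

```python
# Sketch of a field-normalised citation indicator (all numbers invented).
# Each paper's citations are divided by a hypothetical world average for
# its field; the institution's score is the mean of those ratios. A few
# papers with counts far above the field average can swamp the mean.

def normalised_impact(citations, field_average):
    # Mean of per-paper citation ratios against the field average.
    return sum(c / field_average for c in citations) / len(citations)

field_average = 5.0  # hypothetical world average citations per paper

ordinary = [3, 4, 5, 6, 2] * 60        # 300 typical papers
inflated = ordinary + [200] * 10       # plus 10 heavily self-cited papers

print(round(normalised_impact(ordinary, field_average), 2))
print(round(normalised_impact(inflated, field_average), 2))
```

Ten outliers among 310 papers are enough to more than double the institution's normalised score, which is exactly the kind of manipulation a robust indicator should resist.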

BTW, to be a bit pedantic, it's not THES any more.
The Influence of Rankings

From an announcement about merit scholarships from The Islamic Development Bank.

"The successful candidate must secure admission to one universities listed in the Times Higher Education Supplement (THES)."

That obviously needs updating, but does it mean Alexandria but not Texas?

Thursday, November 18, 2010


A previous post, "Debate, Anyone", contained some data about comparative rates of self-citation among various universities. The methodology used was not appropriate (it calculated the number of articles containing self-citations as a percentage of the total number of citations). I am recalculating, although the relative levels of self-citation among universities are most unlikely to be affected.
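The difference between the flawed measure and a sounder one can be sketched in a few lines (the figures below are invented for illustration, not real data):

```python
# Why "articles containing self-citations as a percentage of total
# citations" is not a meaningful rate: it divides an article count by a
# citation count. A sounder self-citation rate divides self-citations by
# total citations, so numerator and denominator count the same events.

def flawed_rate(articles_with_self_citations: int, total_citations: int) -> float:
    # Mixes units: articles in the numerator, citations in the denominator.
    return 100 * articles_with_self_citations / total_citations

def self_citation_rate(self_citations: int, total_citations: int) -> float:
    # Share of all citations that point back to the same author/institution.
    return 100 * self_citations / total_citations

# Hypothetical university: 500 articles carry at least one self-citation,
# and 2,000 of its 10,000 citations are self-citations.
print(flawed_rate(500, 10_000))           # 5.0 -- but 5.0 of what, exactly?
print(self_citation_rate(2_000, 10_000))  # 20.0 -- 20% of citations are self-citations
```

Because the flawed figure shrinks or grows with total citation volume regardless of self-citing behaviour, it can still preserve the rough ordering of universities, which is why the relative comparison is unlikely to change.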
Article in New York Times

The New York Times has a long article about the THE World University Rankings which is being quoted globally. So far it has been cited in newspapers in Italy, Spain, France and Egypt, with no doubt more to come.

Sunday, November 14, 2010

Another Ranking

An organisation called Eduroute has produced a ranking of universities by "web quality". The idea sounds interesting but we need to be given some more information. The top five are:

1. Harvard
2. MIT
3. Cornell
4. Stanford
5. UC Berkeley

After that there are some surprises, with National Taiwan University in 7th place, the University of the Basque Country 16th, Sofia University 44th and Yildiz Technical University 57th.

The structure of these rankings is as follows:

Volume 20%  "The volume of information published is measure by a set of commands that run on the major search engines. "

Online scientific information  10%  "Eduroute measures this aspect through the search engines which specialize in publishing researches and scholarly articles and which search a university's website for all available publications."

Links quantity 30%:   "Here Eduroute measures the number of incoming links whether these links are from academic or nonacademic websites."

Quality of links and contents 40%:  "it was of great importance to measure this aspect of any website in order to reflect the true size of a university's website on the internet and to measure the degree in which the university is concerned with the quality of content it provides on its website."
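Eduroute does not publish its formula, but taken at face value the stated weights imply a simple linear composite over normalised indicator scores. A sketch under that assumption (the indicator names are paraphrased and the scores invented):

```python
# Hypothetical reconstruction of Eduroute's weighting scheme: a weighted
# sum of four indicator scores, each assumed to be normalised to 0-100.
# Eduroute publishes only rank order, so this formula is an assumption.

WEIGHTS = {
    "volume": 0.20,
    "online_scientific_information": 0.10,
    "links_quantity": 0.30,
    "quality_of_links_and_contents": 0.40,
}

def composite(scores: dict) -> float:
    # Weighted sum of the four indicator scores.
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

example = {
    "volume": 90,
    "online_scientific_information": 70,
    "links_quantity": 80,
    "quality_of_links_and_contents": 85,
}
print(round(composite(example), 2))
```

Note that 40% of the total rides on "quality of links and contents", the one indicator whose measurement is described least precisely.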

This is rather vague and, since only the rank order is given, there is no explanation for the high positions of the University of the Basque Country, Yildiz Technical University and so on.

The location and personnel of Eduroute also remain mysterious. I did, however, receive this message:

"Eduroute is an organization involved in determining rankings for universities. We pride ourselves in offering people with a true collection of information that will assist them when it comes to classifying universities and their rankings.

When coming up with rankings for universities we put into consideration several parameters in order to come up with as accurate a conclusion as possible. This methodology is frequently evaluated and improved on to obtain a solid benchmark that can cast a true reflection of rankings for universities. Eduroute focuses on studying universities’ websites. We believe that the support and investment a university inputs into its website is proportional to the degree of interaction of the website and its users (students, staff and lecturers). The volume and content of the university’s website is analysed while also putting into consideration the traffic flow to the website. The number of external links leading to the university’s website is also a key factor as it is a reflection of how popular the site is. Such parameters and more are useful while determining the rankings for universities. Educational institutions also have the opportunity of registering with us so as to be included in the rankings.

At Eduroute we put all our energies in ensuring the rankings for universities we offer are as accurate as possible. We believe this information is of vital importance both to the general public and to the universities and as such we offer a professional service that satisfies both parties.


May Attia

Project Manager"

Friday, November 12, 2010

Article by Philip Altbach

Inside Higher Ed has a substantial and perceptive article on international university rankings by Philip Altbach.

Towards the end there is a round-up of the current season. Some quotations:

Forty percent of the QS rankings are based on a reputational survey. This probably accounts for the significant variability in the QS rankings over the years. Whether the QS rankings should be taken seriously by the higher education community is questionable.


Some of AWRU’s criteria clearly privilege older, prestigious Western universities — particularly those that have produced or can attract Nobel prizewinners. The universities tend to pay high salaries and have excellent laboratories and libraries. The various indexes used also heavily rely on top peer-reviewed journals in English, again giving an advantage to the universities that house editorial offices and key reviewers. Nonetheless, AWRU’s consistency, clarity of purpose, and transparency are significant advantages.


Some of the rankings are clearly inaccurate. Why do Bilkent University in Turkey and the Hong Kong Baptist University rank ahead of Michigan State University, the University of Stockholm, or Leiden University in Holland? Why is Alexandria University ranked at all in the top 200? These anomalies, and others, simply do not pass the "smell test." Let it be hoped that these, and no doubt other, problems can be worked out.