Wednesday, November 15, 2017

Rankings Calendar: QS BRICS University Rankings

The QS BRICS (Brazil, Russia, India, China, South Africa) university rankings will be announced on November 23 at the QS-APPLE conference in Taiwan.


Tuesday, November 14, 2017

China overtakes USA in supercomputing

The website TOP500 keeps track of the world's most powerful computers. Six months ago the USA had 169 supercomputers in the top 500 and China 160. Now China has 202 and the USA 143.

They are followed by Japan with 35, Germany 20, France 18 and the UK 15.

There are four supercomputers in India, four in the Middle East (all in Saudi Arabia), one in Latin America (Mexico), and one in Africa (South Africa).



The closing gap: When will China overtake the USA in research output?

According to the Scopus database, China produced 387,475 articles in 2016 and the USA 409,364, a gap of 21,889.

To be precise, there were 387,475 articles with at least one author affiliated to a Chinese university or research center and 409,364 with at least one author affiliated to an American university or research center.

So far this year there have been 346,425 articles with Chinese affiliations and 352,275 with US affiliations.

The gap is now 5,850 articles.

I think it is safe to say that at some point early next year the gap will close and that China will then pull ahead of the USA.
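For what it is worth, here is the back-of-envelope arithmetic behind that guess, written out as a small Python sketch. The article counts are the Scopus figures quoted above; the assumption that roughly ten and a half months of 2017 have elapsed, and the straight-line extrapolation itself, are mine.

```python
# A crude straight-line projection, nothing more. The article counts are
# the Scopus figures quoted above; treating the 2017 counts as covering
# roughly ten and a half months of the year is my own assumption.

gap_2016 = 409_364 - 387_475        # full-year 2016 gap: 21,889 articles
gap_2017_ytd = 352_275 - 346_425    # 2017 gap so far: 5,850 articles

months_elapsed = 10.5               # assumed: counts taken in mid-November
gap_2017_projected = gap_2017_ytd * 12 / months_elapsed   # about 6,700

shrinkage_per_year = gap_2016 - gap_2017_projected        # about 15,200
years_into_2018 = gap_2017_projected / shrinkage_per_year # about 0.44

print(f"Projected full-year 2017 gap: {gap_2017_projected:,.0f} articles")
print(f"Gap reaches zero roughly {years_into_2018:.1f} years into 2018")
```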

Some caveats. A lot of those articles are routine work and not very significant. For a while the US may continue to do better in high-impact research as measured by citations. Also, US universities contribute more leaders of research projects.

On the other hand, I suspect that many of the researchers listed as having American affiliations did their undergraduate degrees or secondary education in China.

And if we counted Hong Kong as part of China, then the gap would already have been closed.

Sunday, November 05, 2017

Ranking debate: What should Malaysia do about the rankings?


A complicated relationship

Malaysia has had a complicated relationship with global university rankings. There was a moment back in 2004 when the first Times Higher Education Supplement-Quacquarelli Symonds (THES-QS) world rankings put the country's flagship, Universiti Malaya (UM), in the top 100. That was the result of an error, one of several QS made in its early days. Over the following years UM went down and up in the rankings, but the general trend was upwards, with other Malaysian universities following behind. This year it is 114th in the QS world rankings and the top 100 seems in sight once again.

There has been a lot of debate about the quality of the various ranking systems, but it does seem that UM and some other universities have been steadily improving, especially with regard to research, although, as the recent Universitas 21 report shows, output and quality are still lagging behind the provision of resources.  

There is, however, an unfortunate tendency in many places, including Malaysia, for university rankings to get mixed up with local politics. A good ranking performance is proclaimed a triumph by the government and a poor one is deemed by the opposition to be punishment for failed policies.

QS rankings criticised

Recently Ong Kian Ming, a Malaysian opposition MP, said that it was a mistake for the government to use the QS world rankings as a benchmark to measure the quality of Malaysian universities and that the ranking performance of UM and other universities is not a valid measure of quality.

"Serdang MP Ong Kian Ming today slammed the higher education ministry for using the QS World University Rankings as a benchmark for Malaysian universities.
In a statement today, the DAP leader called the decision “short-sighted” and “faulty”, pointing out that the QS rankings do not put much emphasis on the criteria of research output.

According to the QS World University Rankings  for 2018, released on June 8, five Malaysian varsities were ranked in the top 300, with Universiti Malaya (UM) occupying 114th position."

The article went on to say:


"However, Ong pointed to the Times Higher Education (THE) World University Rankings for 2018, which he said painted Malaysian universities in a different light.

According to the THE rankings, which were released earlier this week, none of Malaysia’s universities made it into the top 300."



Ong suggested that the ministry should instead rely on locally developed measures:

"Instead of being 'obsessed' with the ranking game, he added, the ministry should work to improve the existing academic indicators and measures which have been developed locally by the ministry and the Malaysian Qualifications Agency to assess the quality of local public and private universities."

Multiplication of rankings

It is certainly not a good idea for anyone to rely on any single ranking. There are now over a dozen global rankings, and several regional ones, that assess universities according to a variety of criteria. Universities in Malaysia and elsewhere could make more use of these rankings, some of which are technically much better than the well-known big three or four: QS, THE, the Shanghai Academic Ranking of World Universities (ARWU) and, sometimes, the US News Best Global Universities.

Dr. Ong is also quite right to point out that the QS rankings have methodological flaws. However, the THE rankings are not really any better, and they are certainly not superior in the measurement of research quality. They also have the distinctive attribute that 11 of their 13 indicators are not presented separately but are bundled into three groups, so that the public cannot, for example, tell whether a good score for research is the result of an increase in research income, more publications, an improvement in reputation for research, or a reduction in the number of faculty.
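A small worked example shows what that bundling hides. Using sub-indicator weights along the lines of THE's published scheme for the research pillar (reputation 18%, income 6%, productivity 6% of the overall 30%), two universities with quite different movements can publish identical pillar scores; the indicator values below are invented for illustration.

```python
# Two hypothetical universities whose sub-indicators move in opposite
# directions end up with identical published "Research" pillar scores.
# Weights follow THE's published scheme for the pillar (reputation 18%,
# research income 6%, productivity 6% of the overall 30%).

def research_pillar(reputation, income, productivity):
    # Each argument is a 0-100 indicator score.
    return (18 * reputation + 6 * income + 6 * productivity) / 30

# University A: strong reputation, modest income, weak productivity.
a = research_pillar(reputation=70, income=50, productivity=30)
# University B: weaker reputation compensated by much higher income.
b = research_pillar(reputation=60, income=80, productivity=30)

print(a, b)  # 58.0 58.0 -- indistinguishable in the published tables
```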

The important difference between the QS and THE rankings is not that the latter are focussed on research: QS's academic survey is specifically about research, and its faculty-student ratio, unlike THE's, includes research-only staff. The salient difference is that the THE academic survey is restricted to published researchers, while QS's allows universities to nominate potential respondents, something that gives an advantage to upwardly mobile institutions in Asia and Latin America.


Ranking vulnerabilities

All three of the well-known rankings, THE, QS and ARWU, now have vulnerabilities: metrics that can be influenced by institutions, and where a modest investment of resources can produce a disproportionate and implausible rise in the rankings.

In the Shanghai rankings the loss or gain of a single highly cited researcher can make a university go up or down dozens of places in the top 500. In addition, the recruitment of scientists whose work is frequently cited, even for adjunct positions, can help universities excel in ARWU’s publications and Nature and Science indicators.
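A rough illustration of why one researcher can move a university dozens of places: ARWU's highly cited researchers (HiCi) indicator carries a 20% weight, and in the lower reaches of the top 500 the overall scores are tightly bunched. The indicator score and the points-per-place spacing below are hypothetical.

```python
# Hypothetical illustration. The 20% weight for the highly cited
# researchers (HiCi) indicator is ARWU's published figure; the indicator
# score and the points-per-place spread below are invented for the example.

hici_weight = 0.20
hici_with = 8.0      # hypothetical HiCi indicator score with one listed researcher
hici_without = 0.0   # losing the only one zeroes the indicator

overall_drop = hici_weight * (hici_with - hici_without)   # 1.6 overall points

points_per_place = 0.05   # hypothetical spacing between neighbours in the 400-500 band
print(f"Score falls {overall_drop:.1f} points, about "
      f"{overall_drop / points_per_place:.0f} places in a tightly bunched band")
```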

The THE citations indicator has allowed a succession of institutions to over-perform in the world or regional rankings: Alexandria University, Anglia Ruskin University in Cambridge, Moscow Engineering Physics Institute, Federico Santa Maria Technical University in Chile, Middle East Technical University, Tokyo Metropolitan University, Veltech University in India, and Universiti Tunku Abdul Rahman (UTAR) in Malaysia. The indicator officially has a 30% weighting, but its real influence is greater because of THE’s “regional modification”, which gives a boost to every university except those in the top-scoring country. The modification used to apply to all of the citations score but now covers half.
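As far as I can tell from THE's methodology notes, the modification works roughly as sketched below; the formula and the sample figures are my reconstruction, not anything THE has published as code.

```python
import math

# My reconstruction from THE's methodology notes: the field-normalised
# citation impact is divided by the square root of the country's average
# impact, so universities in low-scoring countries are boosted, and since
# 2015-16 only half of the citation score receives the adjustment.

def citation_score(raw_impact, country_avg_impact):
    adjusted = raw_impact / math.sqrt(country_avg_impact)
    return 0.5 * raw_impact + 0.5 * adjusted

# The same raw impact lands differently depending on the home country
# (figures are illustrative; world average impact is normalised to 1.0).
print(citation_score(raw_impact=1.0, country_avg_impact=1.2))  # ~0.96
print(citation_score(raw_impact=1.0, country_avg_impact=0.4))  # ~1.29
```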

The vulnerability of the QS rankings is the two survey indicators, which together account for 50% of the total weighting and which allow universities to propose their own respondents. In recent years some Asian and Latin American universities, such as Kyoto University, Nanyang Technological University (NTU), the University of Buenos Aires, the Pontifical Catholic University of Chile and the National University of Colombia, have received scores for research and employer reputation that are out of line with their performance on any other indicator.
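The arithmetic of that 50% weighting is worth spelling out. The weights in the sketch below are QS's published ones; the indicator scores, and the size of the survey boost, are hypothetical.

```python
# QS's published weights: academic reputation 40%, employer reputation 10%,
# faculty-student ratio 20%, citations per faculty 20%, international
# faculty 5%, international students 5%. The indicator scores are hypothetical.

weights = {"academic": 0.40, "employer": 0.10, "faculty_student": 0.20,
           "citations": 0.20, "intl_faculty": 0.05, "intl_students": 0.05}

middling = dict(academic=45, employer=45, faculty_student=50,
                citations=50, intl_faculty=50, intl_students=50)
# The same university after a successful campaign of nominated respondents:
boosted = dict(middling, academic=75, employer=75)

def overall(scores):
    return sum(weights[k] * scores[k] for k in weights)

print(overall(middling), overall(boosted))  # 47.5 -> 62.5 from the surveys alone
```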

QS may have discovered a future high-flyer in NTU, but I have my doubts about the Latin American places. It is also most unlikely that Anglia Ruskin, UTAR and Veltech would do so well in the THE rankings if they were to lose their highly cited researchers.

Consequently, there are limits to the reliability of the popular rankings, and none of them should be treated as the sole mark of excellence. Ong is quite correct to point out the problems of the QS rankings, but the other well-known ones also have defects.


Beyond the Big Four


Ong points out that if we look at "the big four" then the high position of UM in the QS rankings is anomalous. It is in 114th place in the QS world rankings (24th in the Asian rankings), 351-400 in THE, 356 in the US News global rankings and 401-500 in ARWU.

The situation looks a little different when you consider all of the global rankings. Below are UM's positions in a dozen global rankings. The QS world rankings are still where UM does best, but there it sits at the end of a curve rather than standing apart from it (see the sketch after the list below). UM is 135th for publications in the Leiden Ranking, generally considered by experts to be the best technically, although it is lower for high-quality publications; 168th in the Scimago Institutions Rankings, which combine research and innovation; and 201-250 in the QS Graduate Employability Rankings.

The worst performance is in the uniRank rankings (formerly 4icu), based on web activity, where UM is 697th.

The Shanghai rankings are probably a better guide to research prowess than either QS or THE, since they deal only with research and, with one important exception, have a generally stable methodology. UM is 402nd overall, having fallen from 353rd in 2015 because of changes in the list of highly cited researchers used by the Shanghai rankers. UM does better for publications: 143rd this year and 142nd in 2015.

QS World University Rankings: 114 [general, mainly research]
CWTS Leiden Ranking:  publications 135,  top 10% of journals 195 [research]
Scimago Institutions Rankings:  168 [research and innovation]
QS Graduate Employability Rankings: 201-250 [graduate outcomes]
Round University Ranking: 268 [general]
THE World University Rankings: 351-400 [general, mainly research]
US News Best Global Universities: 356 [research]
Shanghai ARWU: 402 [research]
Webometrics: overall 418 (excellence 228) [mainly web activity]
Center for World University Rankings: 539 [general, quality of graduates]
Nature Index: below 500 [high impact research]
uniRank: 697 [web activity]
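To put a number on that curve, here is the spread of the ranks listed above, with banded positions taken at approximate midpoints (my simplification):

```python
import statistics

# Positions from the list above. Banded ranks (201-250, 351-400) are taken
# at approximate midpoints -- my simplification -- and Nature Index is
# omitted because "below 500" has no usable midpoint.

ranks = {
    "QS World": 114, "Leiden (publications)": 135, "Scimago": 168,
    "QS Employability": 225, "Round University Ranking": 268,
    "US News Global": 356, "THE World": 375, "ARWU": 402,
    "Webometrics": 418, "CWUR": 539, "uniRank": 697,
}

print(f"best {min(ranks.values())}, median {statistics.median(ranks.values())}, "
      f"worst {max(ranks.values())}")
# best 114, median 356, worst 697: QS is the best case, but it sits at the
# end of a spread rather than far outside it.
```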


The QS rankings are not such an outlier: looking at the indicators in other rankings devoted to research gives fairly similar results. Malaysian universities would, however, be wise to avoid concentrating on any single ranking, and they should look at the specific indicators that measure the features they consider important.


Universities with an interest in technology and innovation could look at the Scimago rankings, which include patents. Those with strengths in global medical studies might find it beneficial to go for the THE rankings, but should always watch out for changes in methodology.

Using local benchmarks is not a bad idea, and it can be valuable for those institutions that are not so concerned with research. But many Malaysian institutions are now competing on the global stage and are subject to international assessment, and that, whether they like it or not, means assessment by rankings. It would be an improvement if benchmarks and targets were expressed as reaching a certain level in two or three rankings, not just one. Institutions should also focus on specific indicators rather than the overall score, and different rankings and indicators should be used to assess and compare different places.


For example, the Round University Rankings from Russia, which include five of the six metrics in the QS rankings, plus others, but with more sensible weightings, could be used to supplement the QS world rankings.


For measuring research output and quality, the Leiden Ranking might be a better alternative to either the QS or the THE rankings. Universities with an innovation mission could refer to the innovative knowledge metric in the Scimago Institutions Rankings.

When we come to measuring teaching and the quality of graduates there is little of value from the current range of global rankings. There have been some interesting initiatives such as the OECD's AHELO project and U-Multirank but these have yet to be widely accepted. The only international metric that even attempts to directly assess graduate quality is QS's employer survey.

So universities, governments and stakeholders need to stop treating a single ranking as a benchmark for everyone, and to stop looking only at overall scores rather than specific indicators.

Friday, November 03, 2017

Ranking Calendar

Over on the right there will be a list of events such as conferences, workshops, and announcements of rankings.

First is the 7th World-Class Universities Conference in Shanghai starting next Monday, November 6th.




Resuming Posting

I have been busy with family and work matters recently but I shall resume posting tomorrow.

I shall be adding some features that I hope will make the blog more of a useful resource.

Sunday, September 17, 2017

Criticism of rankings from India

Some parts of the world seem to be increasingly sceptical of international rankings, or at least those produced by Times Higher Education (THE). MENA (Middle East and North Africa) and Africa did not seem to be very enthusiastic about THE's snapshot or pilot rankings. Many Latin American universities have chosen not to participate in the world and regional rankings.

India also seems to be suspicious of the rankings. An article by Vyasa Shastri in the e-paper livemint details some of the ways in which universities might attempt to manipulate rankings to their advantage.

It is well worth reading, although I have one quibble. The article refers to King Abdulaziz University recruiting faculty who would list the university as their secondary affiliation when publishing papers (41 highly cited researchers currently do so). The original idea was to get top marks in the Shanghai Ranking's highly cited researchers indicator. The article correctly notes that the Shanghai rankings no longer count secondary affiliations for that indicator, but such recruits can still help in the Nature and Science and publications indicators, and in the citations and publications metrics of other rankings.

Also, other Saudi universities have not recruited large numbers of secondary affiliates: there are only four for the rest of Saudi Arabia, although I notice that there are now quite a few for Chinese and Australian universities, including five for the University of Melbourne.

Last word, I hope, on Babol Noshirvani University of Technology

If you type 'Babol University of Technology' rather than 'Babol Noshirvani University of Technology' into the Scopus search box, then the university does have enough publications to meet THE's criteria for inclusion in the world rankings.

So it seems that it was those highly cited researchers in engineering that propelled the university into the research impact stratosphere. That, and a rather eccentric methodology.