Wednesday, December 21, 2016

The University of Tokyo did not fall in the rankings. It was pushed.


Times Higher Education (THE) has published an article by Devin Stewart that refers to a crisis in Japanese universities. He says:

"After Japan’s prestigious University of Tokyo fell from its number one spot to number seven in Times Higher Education’s Asia University Rankings earlier this year, I had a chance to travel to Tokyo to interview more than 40 people involved with various parts of the country’s education system.
Students, academics and professionals told me they felt a blow to their national pride from the news of the rankings drop. I found that the THE rankings result underscored the complex problems plaguing the country’s institutions of higher learning."
If Japanese academics and university administrators really do believe that the fall of the University of Tokyo (aka Todai) in the THE Asian rankings is an indicator of complex problems, and if they really do feel it as a blow to their national pride, then there is indeed a crisis in Japanese higher education: a failure of critical thinking and a naive trust in unstable, opaque and methodologically dubious international rankings.

The fall of Todai in the THE Asian rankings was preceded by a fall in the World University Rankings (WUR) from 23rd place in the 2014 rankings (2014-2015) to 43rd in 2015 (2015-2016). Among Asian universities in the WUR it fell from first place to third.

This was not the result of anything that happened to Todai over the course of a year. There was no exodus of international students, no collapse of research output, no mass suicide of faculty, no sudden and miraculous disappearance of citations. It was the result of a change in methodology, including the exclusion from citation counts of mega-papers, mainly in particle physics, with more than a thousand authors. This had a disproportionate impact on the University of Tokyo, whose citation score fell from 74.7 to 60.9, and on some other Japanese universities.
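
To see how a single counting rule can move a score that far, consider a minimal sketch with invented numbers. The real indicator is field- and year-normalised, so this is only the shape of the mechanism, not THE's actual calculation:

    # Toy illustration with invented numbers (not THE's actual data or its
    # normalised formula): how excluding hyper-authored "mega-papers" from
    # the citation count can slash a university's average.

    papers = [
        # (citations, authors)
        (3200, 2900),   # particle-physics mega-paper, thousands of authors
        (2100, 2500),   # another hyper-authored collaboration paper
        (40, 4), (25, 3), (60, 5), (15, 2), (30, 6),   # ordinary papers
    ]

    def mean_citations(papers, max_authors=None):
        """Average citations per paper, optionally excluding papers whose
        author count exceeds max_authors (the mega-paper cut)."""
        kept = [c for c, a in papers if max_authors is None or a <= max_authors]
        return sum(kept) / len(kept)

    print(round(mean_citations(papers)))                    # 781: all papers counted
    print(round(mean_citations(papers, max_authors=1000)))  # 34: mega-papers excluded

A university hosting large particle-physics collaborations loses disproportionately from such a cut, which is broadly what the new rules did to Todai's citation score.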

The university made a bit of a comeback in the world rankings this year, rising to 39th (with a slightly improved citations score of 62.4) after THE did some more tweaking and gave limited credit for citations of the mega-papers.

Todai did even worse in the 2016 Asian rankings, which are derived from the world rankings, falling to an embarrassing seventh place behind two Singaporean, two Chinese and two Hong Kong universities. How did that happen? There was nothing like this in the other rankings. Todai's position in the Shanghai Academic Ranking of World Universities (ARWU) actually improved between 2015 and 2016, from 21st to 20th; its position in the Round University Rankings rose from 47th to 37th; and it remained the top Asian university in the CWUR, URAP and National Taiwan University rankings.

Evidently THE saw things that others did not. They decided that Hong Kong and Mainland China were separate entities for ranking purposes and that Mainland students, faculty and collaborators at Hong Kong universities would be counted as international. The international orientation score of the University of Hong Kong (UHK) in the Asian rankings accordingly went up from 81.9 to 99.5 between 2015 and 2016. Peter Mathieson of UHK was aware of this and warned everyone not to get too excited. Meanwhile, universities such as the Hong Kong University of Science and Technology (HKUST) and Nanyang Technological University (NTU) in Singapore were getting higher scores for citations, almost certainly as a result of the methodological changes.

In addition, as noted in earlier posts, THE recalibrated the weighting assigned to its indicators, reducing that given to the research and teaching reputation surveys, where Todai is a high flier, and increasing that for income from industry, where Peking and Tsinghua universities have perfect scores and NTU, HKUST and UHK do better than Tokyo.

In 2015 THE issued a health warning:

"Because of changes in the underlying data, we strongly advise against direct comparisons with previous years’ World University Rankings."

They should have issued the same warning for the 2016 Asian rankings, which added further changes. It is regrettable that THE has published an article that refers to a fall in the rankings. There has been no fall in any real sense, only a lot of recalibration and changes in the way the data are processed.

Japanese higher education has nothing to be ashamed of: if there had been any real decline in quality, especially in research, it would have shown up in the other, more stable and less opaque rankings. Japanese academics and administrators should, however, be embarrassed if they allow national and university policies to be driven by methodological tweaking.





Friday, December 16, 2016

A new Super-University for Ireland?

University rankings have become extremely influential over the last few years. This is not entirely a bad thing. The initial publication of the Shanghai rankings in 2003, for example, exposed the pretensions of many European universities, revealing just how far behind they had fallen in scientific research. It also showed China how far it had to go to achieve scientific parity with the West.

Unfortunately, rankings have also had malign effects. The THE and QS world rankings have acquired a great deal of respect, trust, even reverence, that may not be entirely deserved. Both introduced significant methodological changes in 2015, and THE made further changes in 2016. The consequence is that there have been some remarkable rises and falls within the rankings that have received a lot of publicity but have little to do with any real change in quality.

In addition, both QS and THE have increased the number of ranked universities, which affects the indicator means from which the processed scores shown to the public are derived. Both rely on surveys that can be biased and subjective. Both are unbalanced: QS gives a 50% weighting to its academic and employer surveys, while THE gives an official 30% weighting to field- and year-normalised citations plus a partial regional modification (the modification means that everybody except the top scorer gets a bonus for citations). The remarkable rise of Anglia Ruskin University to parity with Oxford and Princeton in this year’s THE research impact (citations) indicator, and the high placing of the Pontifical Catholic University of Chile and the National University of Colombia in QS’s employer survey, are evidence that these rankings continue to be implausible and unstable. To make higher education policy dependent on their fluctuations is very unwise.
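
Why should merely ranking more universities matter? Because raw indicator values are standardised against the whole ranked pool before publication (THE's published methodology describes z-scoring). A sketch with invented numbers shows how adding weaker entrants can lift a university's processed score while its raw performance stays exactly the same:

    import statistics

    def z_score(value, pool):
        """Standardise a raw indicator value against the whole ranked pool."""
        return (value - statistics.mean(pool)) / statistics.stdev(pool)

    old_pool = [55, 60, 65, 70, 75, 80]          # the original ranked pool
    new_pool = old_pool + [20, 25, 30, 35, 40]   # pool extended with weaker entrants

    raw = 70   # this university's raw performance is unchanged
    print(round(z_score(raw, old_pool), 2))   # 0.27
    print(round(z_score(raw, new_pool), 2))   # 0.92: a 'rise' from recalibration alone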

This is particularly unwise in the case of the two leading Irish universities, Trinity College Dublin (TCD) and University College Dublin (UCD), which have in fact been advancing in the Round University Rankings (RUR), produced by a Russian organisation, and in ShanghaiRanking’s Academic Ranking of World Universities (ARWU), two global rankings whose methodologies are generally stable and transparent.

I pointed out in 2015 that TCD had been steadily rising in the Shanghai ARWU since 2004, especially in the Publications indicator (papers in the Science Citation Index-Expanded and the Social Science Citation Index) and in PCP (productivity per capita, that is, the combined indicator scores divided by the number of faculty). This year, to repeat an earlier post, TCD’s publications score again went up very slightly, from 31 to 31.1 (27.1 in 2004), and its PCP quite significantly, from 19 to 20.8 (13.9 in 2004), compared with top scores of 100 for Harvard and Caltech respectively.

UCD has also continued to do well in the Shanghai rankings, with its publications score rising this year from 34.1 to 34.2 (27.3 in 2004) and its PCP from 18.0 to 18.1 (8.1 in 2004).
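
For readers unfamiliar with PCP, the arithmetic is roughly as follows. The weights are ARWU's published ones; the indicator scores and the staff figure below are hypothetical, for illustration only:

    # Schematic of ARWU's PCP indicator: the weighted sum of the other five
    # indicator scores divided by full-time-equivalent academic staff, then
    # rescaled so the top performer (Caltech this year) gets 100.

    scores  = {"Alumni": 20.0, "Award": 10.0, "HiCi": 15.0, "N&S": 18.0, "PUB": 31.0}
    weights = {"Alumni": 0.10, "Award": 0.20, "HiCi": 0.20, "N&S": 0.20, "PUB": 0.20}

    weighted_sum = sum(scores[k] * weights[k] for k in weights)
    fte_staff = 500                      # hypothetical academic staff count
    pcp_raw = weighted_sum / fte_staff   # then rescaled against the top scorer

    print(round(weighted_sum, 1), round(pcp_raw, 4))   # 16.8 0.0336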

The Shanghai rankings are, of course, famous for not counting the arts and humanities and for not trying to measure anything related to teaching. The RUR rankings from Russia are based on Thomson Reuters data, which THE also used until two years ago, and they do include publications in the humanities as well as teaching-related metrics. They include 12 of the 13 indicators in the THE World University Rankings, plus eight others, but with more sensible weightings: for example, 8% instead of 30% for field-normalised citations.

The RUR rankings show that TCD rose from 174th overall in 2010 to 102nd in 2016 (193rd to 67th for research). UCD rose from 213th overall to 195th (157th to 69th for research), although some other Irish universities, such as NUI Galway, NUI Maynooth, University College Cork and Dublin City University, have fallen.

It is thoroughly disingenuous for Irish academics to claim that academic standards are declining because of a lack of funds. Perhaps standards will decline in the future, but so far everything suggests that the two leading Irish universities are making steady progress, especially in research.

The fall of UCD in this year’s THE rankings, TCD’s fall in 2015, and the fall of both in the QS rankings mean very little. When there are such large methodological changes it is pointless to discuss how to improve in the rankings. Methodological changes can be made and unmade, and universities made and unmade with them, as the Middle East Technical University found in 2015 when it fell from 85th place in the THE world rankings to below 501st.

The Irish Times of November 8th carried an article by Philip O’Kane proposing that Irish universities should combine in some way to boost their position in the global rankings.

He suggested that:
“The only feasible course of action for Ireland to avert continued sinking in the world rankings is to create a new “International University of Ireland”.

This could be a world-class research university that consists exclusively of the internationally-visible parts of all our existing institutions, and to do so at marginal cost using joint academic appointments, joint facilities and joint student registration, in a highly flexible and dynamic manner.

Those parts that are not internationally visible would be excluded from this International University of Ireland.”

It sounds as though he is proposing that universities maintain their separate identities for some purposes but present a united front internationally. A similar idea was proposed in India a while ago and was quickly shot down by Phil Baty of THE. It is most unlikely that universities could separate out the data for the faculty, students, income and publications of their internationally visible parts and send those figures to the rankers.

The idea of a full merger is more practical but could be pointless or even counter-productive. In 2012 a group of experts, headed by European Commissioner Frans Van Vught, suggested that UCD and TCD be merged to become a single world-class university.

The ironic thing about this idea is that a merger would help with the Shanghai rankings that university bosses are studiously pretending do not exist but would be of little or no use with the rankings that the bureaucrats and politicians do care about.

The Shanghai rankings are known for being as much about quantity as quality. A merger of TCD and UCD would produce a significant gain by combining the two universities’ publications, papers in Nature and Science, and highly cited researchers. It would do no good for the Nobel and Fields awards indicators, since Trinity has two laureates now and UCD none, so the new institution would still have only two (ShanghaiRanking does not count the Peace and Literature prizes). Overall, it is likely that the new Irish super-university would rise about a dozen places in the Shanghai rankings, perhaps even getting into the top 150 (TCD is currently 162nd).

But it would probably not help with the rankings that university heads are so excited about. Many of the indicators in the QS and THE rankings are scaled in some way. A merger might add together the citations of TCD and UCD, for instance, but QS divides them by the number of faculty, which would also be combined. You could combine the incomes of TCD and UCD, but then the combined income would be divided by the combined staff numbers.
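
A worked example with invented figures makes the point: a merger adds numerators and denominators alike, so any per-capita indicator for the merged institution is simply a weighted average of the existing two, never higher than both:

    # Invented figures: why a merger does not lift scaled, per-capita indicators.
    tcd_citations, tcd_faculty = 60_000, 1_500
    ucd_citations, ucd_faculty = 75_000, 2_500

    print(tcd_citations / tcd_faculty)   # 40.0 citations per faculty member
    print(ucd_citations / ucd_faculty)   # 30.0

    merged = (tcd_citations + ucd_citations) / (tcd_faculty + ucd_faculty)
    print(merged)   # 33.75: between the two, never above both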

The only place where a merger might pay off is the survey criteria, weighted 50% in QS and 33% in THE, but the problem here is that the reputation of a new University of Dublin or Ireland, or whatever it is called, is likely to be inferior to that of TCD and UCD for some years to come. There are places where merging universities is a sensible way of pooling the strengths of a multitude of small specialist schools and research centres, France and Russia for example. But for Ireland there is little point if the idea is to get ahead in the QS and THE rankings.


It would make more sense for Irish universities to focus on the Shanghai rankings where, if present trends continue, TCD will catch up with Harvard in about 240 years, although by then the peaks of the intellectual world will probably be in Seoul, Shanghai, Moscow, Warsaw and Tallinn.
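
For the curious, that sort of estimate is a simple linear extrapolation of the scores quoted above. The horizon shifts by decades depending on which indicator and trend window one picks, but every reasonable choice puts it centuries away:

    # Back-of-envelope extrapolation of TCD's ARWU publications score toward
    # the benchmark of 100 (Harvard), using the figures quoted earlier.
    score_2004, score_2016, target = 27.1, 31.1, 100.0

    rate = (score_2016 - score_2004) / (2016 - 2004)   # points gained per year
    years_to_target = (target - score_2016) / rate

    print(round(rate, 2), round(years_to_target))   # 0.33 points/year, ~207 years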

Saturday, December 03, 2016

Yale Engages with the Rankings

Over the last few years, elite universities have become increasingly concerned with their status in the global rankings. A decade ago university heads were inclined to ignore rankings or to regard them as insignificant, biased or limited. The University of Texas at Austin, for example, did not take part in the 2010 Times Higher Education (THE) rankings, although it relented and submitted data in 2011 after learning that other US public institutions had done so and had scored better than in the preceding THES-QS rankings.

It seems that things are changing. Around the world, excellence initiatives, one element of which is often improving the position of aspiring universities in international rankings, are proliferating.

It should be a major concern that higher education policies and priorities are influenced or even determined by publications that are problematic and incomplete in several ways. Rankings count what can be counted, and that usually means a strong emphasis on research. Indeed, in the case of the Taiwan, URAP and Shanghai rankings, that is all they are concerned with. Attempts to measure teaching, especially undergraduate teaching, have been rather haphazard. Although the US News Best Colleges ranking includes measures of class size, admission standards, course completion and peer evaluation, indicators in global rankings such as those of THE and Quacquarelli Symonds (QS) focus on inputs, such as staff-student ratio or income, that might have some relation to eventual student or graduate outcomes.

It is sad that some major universities are less interested in developing the assessment of teaching or student quality than in adjusting their policies and missions to the agenda of the rankings, particularly the THE world rankings.

Yale is now jumping on the rankings carousel. For decades it has sat happily at the top of the US News college rankings, making up the top three along with Princeton and Harvard. But Yale does much less well in the current global rankings. This year it is ranked 11th in the Shanghai rankings (9th among US universities); 15th by QS (7th among US universities, behind Nanyang Technological University and the Ecole Polytechnique Federale de Lausanne); and 12th in the THE world rankings (8th in the USA).

And so:

"For an example of investing where Yale must be strong, I want to touch very briefly on rankings, although I share your nervousness about being overly reliant on what are far-from-perfect indicators. With our unabashed emphasis on undergraduate education, strong teaching in Yale College, and unsurpassed residential experience, Yale has long boasted one of the very highest-ranked colleges, perennially among the top three. In the ratings of world research universities, however, we tend to be somewhere between tenth and fifteenth. This discrepancy points to an opportunity, and that opportunity is science, as it is the sciences that most differentiate Yale from those above us on such lists."


The reasons for the difference between the US and the world rankings are that Yale is relatively small compared to the other Ivy League members and the leading state universities, that it is strong in the arts and humanities, and that it has a good reputation for undergraduate teaching.

One of the virtues of global rankings is that they expose the weaknesses of western universities, especially in the teaching of, and research in, STEM subjects, and it would do Yale no harm to shift a little from the humanities and social sciences towards the hard sciences. To take account of research-based rankings with a consistent methodology, such as URAP, the National Taiwan University rankings or the Shanghai rankings, is quite sensible. But Yale is asking for trouble if it becomes overly concerned with rankings such as THE’s or QS’s, which are prone to destabilising changes in methodology, rely on subjective survey data, assign disproportionate weights to certain indicators, emphasise inputs such as income or faculty resources rather than actual achievement, are demonstrably biased, and include indicators that are extremely counter-intuitive (Anglia Ruskin with a research impact equal to Princeton’s and greater than Yale’s; the Pontifical Catholic University of Chile 28th in the world for employer reputation).

Yale would be better off encouraging the development of cross-national tools to measure student achievement and teaching quality, or of ranking metrics that assign more weight to the humanities and social sciences.