Sunday, December 22, 2013

Twenty Ways to Rise in the Rankings Quickly and Fairly Painlessly


Times Higher Education has just republished an article by Amanda Goodall, ‘Top 20 ways to improve your world university ranking’.  Much of her advice is very sensible -- appointing university leaders with a strong research record, for example -- but in most cases the road from her suggestions to a perceptible improvement in the rankings is likely to be winding and very long. It is unlikely that any of her proposals would have much effect on the rankings in less than a decade or even two.


So here are 20 realistic proposals for a university wishing to join the rankings game.


Before starting, note that any advice about how a university can rise in the rankings should be based on the following principles.


·         Rankings are proliferating and no doubt there will be more in the future. There is something for almost anybody if you look carefully enough.


·         The indicators and methodology of the better known rankings are very different. Something that works with one may not work with another. It might even have a negative effect.


·         There is often a price to pay for getting ahead in the rankings. Everybody should consider whether it is worth it. Also, while rising from 300th place to 250th is quite easy, going from 30th to 25th is another matter.


·         Don’t forget the number on the bottom. It might be easier to reduce the number of academic staff than to increase the number of citations or publications.

·         Rankings are at best an approximation to what universities do. Nobody should get too excited about them.


The top 20 ways in which universities can quickly improve their positions in one or more of the international university rankings are:

1.  Get rid of students

Over the years many universities acquire a collection of branch campuses, general studies programmes, night schools, pre-degree programmes and so on. Set them free to become independent universities or colleges. Almost always, these places have relatively more students and relatively fewer faculty than the main campus. The university will therefore do better in the Quacquarelli Symonds (QS) and Times Higher Education (THE) faculty student ratio indicators.  Also, staff in the spun off branches and schools generally produce less research than those at the main campus so you will get a boost in the productivity per capita indicator in the Shanghai ARWU rankings.

2.  Kick out the old and bring in the young

Get rid of ageing professors, especially if they are unproductive and expensive, and hire lots of indentured servants, sorry, adjunct and temporary teachers and researchers. Again, this will improve the university’s performance on the THE and QS faculty student ratio indicators. They will not count as senior faculty, so this will also be helpful for ARWU.

3.  Hire research assistants

Recruiting slave labour, sorry, cheap or unpaid research assistants (unemployed or unemployable graduate interns?) will boost the score for faculty student ratio in the QS rankings, since QS counts research-only staff for its faculty student indicator. It will not, however, work for the THE rankings. Remember that for QS more faculty are good for faculty student ratio but bad for citations per faculty, so you have to analyse the potential trade-off carefully.
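The trade-off can be put in numbers. A minimal sketch, with entirely invented figures for a hypothetical university of 20,000 students:

```python
# Hypothetical illustration of the QS trade-off: hiring research-only staff
# improves the faculty student ratio but dilutes citations per faculty.
# All figures are invented.

def qs_ratios(faculty: int, students: int, citations: int) -> tuple:
    """Return (faculty per student, citations per faculty member)."""
    return faculty / students, citations / faculty

# Before hiring 200 extra research assistants...
before = qs_ratios(faculty=1000, students=20000, citations=50000)
# ...and after: the first ratio rises, the second falls.
after = qs_ratios(faculty=1200, students=20000, citations=50000)

print(f"before: F/S = {before[0]:.3f}, citations per faculty = {before[1]:.1f}")
print(f"after:  F/S = {after[0]:.3f}, citations per faculty = {after[1]:.1f}")
```

Whether the gain on one indicator outweighs the loss on the other depends on the weightings, so the sums are worth doing before the hiring spree.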

4.  Think about an exit option

If an emerging university wants to be included in the rankings it might be better to focus on just one of them. Panjab University is doing very well in the THE rankings but does not appear in the QS rankings. Remember that if you apply to be ranked by THE and you do not like your placing, you can always opt out by not submitting data next year. QS, however, has a Hotel California policy: once in, you can check out but you can never leave. No matter how much you complain about the unique qualities of your institution and how they are neglected by the rankers, QS will go on ranking you whether you like it or not.

5. Get a medical school

If you do not have a medical school or a research and/or teaching hospital then get one from somewhere. Merge with an existing one or start your own. If you already have one, get another. Medical research produces a disproportionate number of papers and citations, which is good for the QS citations per faculty indicator and the ARWU publications indicator. Remember that this strategy may not help so much with THE, which uses field normalisation: citations of medical research will help there only if they are above the world average for field and year.

Update August 2016: QS now have a moderate form of field normalisation so the advantage of a medical school is reduced but the Shanghai rankings are still biased towards medical research.

6. But if you are a medical school, diversify

QS and THE supposedly do not include single subject institutions in their general rankings, although from time to time one will slip through, like the University of California at San Francisco, Aston Business School or (National Research Nuclear University) Moscow Engineering Physics Institute (MEPhI). If you are an independent medical or single subject institution, consider adding one or two more subjects; then QS and THE will count you, although you will probably start sliding down the ARWU table.

Update August 2016: the QS BRICS rankings include some Russian institutions that look like they focus on one field and National Research Nuclear University MePhI is back in the THE world rankings.

7. Amalgamate

The Shanghai rankings count the total number of publications in the SCI and SSCI, the total number of highly cited researchers and the total number of papers in Nature and Science without regard for the number of researchers. THE and QS count the number of votes in their surveys without regard for the size of the institution.

What about a new mega university formed by merging LSE, University College London and Imperial College? Or a très grande école formed from all those little grandes écoles around Paris?

Update August 2016: This is pretty much what the University of Paris-Saclay is doing.

8. Consider the weighting of the rankings

THE gives a 30% weighting to citations and 2.5% to income from industry. QS gives 40% to its academic survey and 5% to international faculty. So think about where you are going to spend your money.

9.  The wisdom of crowds

Focus on research projects in fields that produce huge multi-“author” publications, particle physics, astronomy and medicine for example. Such publications often have very large numbers of citations. Even if your researchers make a one in two thousandth contribution, Thomson Reuters, THE’s data collector, will give them the same credit as they would get if they were the only authors. This will not work for the Leiden Ranking, which uses fractionalised counting of citations. Note that this strategy works best when combined with number 10.

Update August 2016: THE methodological changes in 2015 mean that this does not work any more. Look at what happened to Middle East Technical University. But it is still worth looking out for projects with dozens or scores of contributors. 

10.  Do not produce too much

You need to produce 200 papers a year to be included in the THE rankings. But producing more papers than this might be counterproductive. If your researchers are producing five thousand papers a year then those five hundred citations from a five hundred “author” report on the latest discovery in particle physics will not have much impact. But if you are publishing three hundred papers a year those citations will make a very big difference. This is why Dr El Naschie’s frequently cited papers in Chaos, Solitons and Fractals were a big boost for Alexandria University but not for Cambridge, Surrey, Cornell and Frankfurt universities with whom he also claimed affiliation. However, Leiden will not rank universities until they reach 500 papers a year.

Update August 2016: See number 9.
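The arithmetic behind number 10 can be sketched as follows; the figures are invented but mirror the example in the text, with one 500-citation multi-author paper added to a year's output:

```python
# How one heavily cited "mega-paper" moves the average differently
# depending on total output. Baseline citation rates are invented.

def average_citations(total_papers: int, baseline_cites_per_paper: float,
                      mega_paper_cites: int) -> float:
    """Mean citations per paper after adding one highly cited paper."""
    total_cites = total_papers * baseline_cites_per_paper + mega_paper_cites
    return total_cites / (total_papers + 1)

# A university publishing 5,000 papers a year barely notices the windfall...
big = average_citations(5000, 2.0, 500)
# ...while one publishing 300 papers a year gets a large boost.
small = average_citations(300, 2.0, 500)

print(f"5,000 papers a year: {big:.2f} citations per paper")
print(f"300 papers a year:   {small:.2f} citations per paper")
```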

11.  Moneyball Strategy

In his book Moneyball, Michael Lewis recounted the rise of the Oakland A's baseball team through a strategy of buying undervalued players. The idea was to find players who did things that led to their teams winning even if they did not match the stereotype of a talented player.

This strategy was applied by George Mason University in Virginia, which created a top basketball team by recruiting players who were overlooked by scouts because they were too small or too fat, and a top economics department by recruiting advocates of a market economy at a time when such ideas were unfashionable.

Universities could recruit researchers who are prolific and competent but are unpromotable or unemployable because they are in the wrong group or fail to subscribe enthusiastically to current academic orthodoxies. Maybe start with Mark Regnerus and Jason Richwine.

Update August 2016: See the story of Tim Groseclose's move from UCLA to George Mason

12. Expand doctoral programmes

One indicator in the THE world rankings is the ratio of doctoral to bachelor degree students.

Panjab University recently announced that it will introduce integrated master's and doctoral programmes. This could be a smart move if it means students no longer go into master's programmes but instead into something that can be counted as a doctoral programme.

13.  The importance of names

Make sure that your researchers know which university they are affiliated to and that they know its correct name. Make sure that branch campuses, research institutes and other autonomous or quasi- autonomous groups incorporate the university name in their publications. Keep an eye on Scopus and ISI and make sure they know what you are called. Be especially careful if you are an American state university.

14.   Evaluate staff according to criteria relevant to the rankings

If staff are to be appointed and promoted according to their collegiality, the enthusiasm with which they take part in ISO exercises, community service, ability to make the faculty a pleasant place for everybody or commitment to diversity, then you will get collegial, enthusiastic, etc. faculty. But those are things that the rankers do not – for once with good reason – attempt to measure.

While you are about it, get rid of interviews for staff and students. Their predictive validity ranges from zero to low.

15.  Collaborate

The more authors a paper has the more likely it is to be cited, even if only through self-citation. Also, the more collaborators you have the greater the chances of a good score in the reputation surveys. And do not forget that the percentage of collaborators who are international is also an indicator in the THE rankings.

16. Rebrand

It would be good to have a name that is as distinctive and memorable as possible, so consider a name change. Do you really think that the average scientist filling out the QS or THE reputation surveys is going to remember which of the sixteen (?) Indian Institutes of Technology is especially good in engineering?

Update August 2016: But not too memorable. I doubt that Lovely Professional University will get the sort of public interest it is hoping for.

17. Be proactive

Rankings are changing all the time so think about indicators that might be introduced in the near future. It would seem quite easy, for example, for rankers to collect data about patent applications.

Update August 2016: Make sure everyone keeps their Google Scholar Citations Profiles up to date.

18. Support your local independence movement

It has been known for a long time that increasing the number of international students and faculty is good for both the THE and QS rankings. But there are drawbacks to just importing students. If it is difficult to move students across borders why not create new borders?

If Scotland votes for independence in next year’s referendum its scores for international students and international faculty in the QS and THE rankings would go up since English and Welsh students and staff would be counted as international.

Update August 2016: Scotland didn't but there may be another chance. 

19. Accept that some things will never work

Realise that there are some things that are quite pointless from a rankings perspective. Or any other for that matter.  Do not bother telling staff and students to click away at the website to get into Webometrics. Believe it or not, there are precautions against that sort of thing. Do not have motivational weekends. Do not have quality initiatives unless they get rid of the cats.

Update August 2016: That should read do not do anything "motivational". The only thing they motivate is the departure of people with other options.

20.  Get Thee to an Island

The Leiden Ranking has a little-known indicator that measures the geographical distance between collaborators. At the moment first place goes to the Australian National University. Move to Easter Island or the Falklands and you will be top for something.

Thursday, December 19, 2013

The QS BRICS Rankings

Quacquarelli Symonds (QS), in partnership with Interfax, the Russian news agency, has just published its BRICS [Brazil, Russia, India, China, South Africa] University Rankings. The top ten are:

1.   Peking University
2.   Tsinghua University
3.   Lomonosov Moscow State University
4.   Fudan University
5.   Nanjing University
6=  University of Science and Technology of China
6=  Shanghai Jiao Tong University
8.   Universidade de Sao Paulo
9.   Zhejiang University
10.  Universidade Estadual de Campinas

The highest ranked Indian university is the Indian Institute of Technology Delhi in thirteenth place and the top South African institution is the University of Cape Town which is eleventh.

The methodology is rather different from the QS World University Rankings. The weighting for the academic survey has been reduced to 30% and that for the employer survey has gone up to 20%. Faculty student ratio accounts for 20% as it does in the world rankings, staff with PhDs for 10%, papers per faculty for 10%, citations per paper for 5% and international faculty and students for 5%.
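The weightings listed above sum to 100%, and the overall score is presumably just a weighted sum of indicator scores. A minimal sketch; the indicator scores below are invented for illustration:

```python
# Weighted overall score under the QS BRICS weightings described above.
# Indicator scores (0-100) are invented; only the weights come from the post.

QS_BRICS_WEIGHTS = {
    "academic_survey": 0.30,
    "employer_survey": 0.20,
    "faculty_student_ratio": 0.20,
    "staff_with_phd": 0.10,
    "papers_per_faculty": 0.10,
    "citations_per_paper": 0.05,
    "international_faculty_and_students": 0.05,
}

def overall_score(indicator_scores: dict) -> float:
    """Weighted sum of indicator scores under the QS BRICS weightings."""
    return sum(QS_BRICS_WEIGHTS[k] * v for k, v in indicator_scores.items())

example = {
    "academic_survey": 80,
    "employer_survey": 70,
    "faculty_student_ratio": 90,
    "staff_with_phd": 60,
    "papers_per_faculty": 50,
    "citations_per_paper": 40,
    "international_faculty_and_students": 30,
}
print(f"overall: {overall_score(example):.1f}")
```

Changing any indicator score by x points moves the overall score by x times that indicator's weight, which is why the halved academic survey weighting matters so much here.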

There are some noticeable differences between these rankings and the BRICS and emerging countries rankings produced by Times Higher Education and Thomson Reuters.    

Moscow State University is ahead of the University of Cape Town in the QS rankings but well behind in the THE rankings.

In the QS rankings the Indian Institutes of Technology are supreme among Indian institutions. There are seven before the University of Calcutta appears in 52nd place. In the THE rankings the best Indian performer was Panjab University, which is absent from the QS rankings.

I suspect that Panjab University is an example of rankings shopping, where universities target one specific ranking, and that a very smart person is directing its ranking strategy. Panjab University has invested money in participation in the Large Hadron Collider project, exactly where it would profit from TR's field normalised citations indicator, while its number of publications has not risen excessively. Recently the university has proposed to establish integrated master's and doctoral programmes, good for two TR indicators, and to increase research collaboration, good for another.

The Moscow State Engineering Physics Institute, which was removed from the THE world rankings this year because it is a single subject institution, is in 65th place in this table.

Sunday, December 08, 2013

Africa Excels in Latest THE Rankings, China Performs Poorly

It is unlikely that you will see a headline like this in the mainstream media. Experts and analysts have focused almost exclusively on the number of universities in the top 10 or the top 50 or the top 100 of the Times Higher Education (THE) BRICS and Emerging Economies Rankings (BRICSEE) -- powered, in case anyone has forgotten, by Thomson Reuters -- and concluded that China is the undisputed champion of the world.

Looking at the number of universities in the BRICSEE rankings relative to population -- as in the previous post -- gives a different picture, with China still ahead of Russia, India and Brazil, but not by as much.

Another way of analysing a country's higher education  system is by looking at the proportion of  universities that achieve "world class" status.

Assuming -- a big assumption, I agree -- that getting into the BRICSEE top 100 is a measure of world class quality, then the percentage of a country's universities that are world class might be considered a guide to the overall quality of the higher education system.

Here is a ranking of the BRICS and emerging countries according to the percentage of universities in the THE BRICSEE  top 100.

The total number of universities is in brackets and is derived from Webometrics.

First place goes to South Africa. Egypt is third. Even Morocco does better than Russia and Brazil. China does not do very well, although it is still quite a bit ahead of India, Russia and Brazil. Taiwan remains well ahead of Mainland China.

Of course, this should not be taken too seriously. It is probably a lot harder to start a university in Taiwan than it is in Brazil or India. And South Africa has a large number of professional schools and private colleges that are not counted by Webometrics but may be of a similar standard to universities in other countries.

Some of the high fliers might find that their positions are precarious. Egypt's third university in the BRICSEE rankings is Alexandria, which is still reaping the benefits of Dr El Naschie's much cited papers of 2007 and 2008, but that will not last long. The UAE's role as an international higher education hub may not survive a fall in the price of oil.


1.     South Africa (25)     20.00%
2.     Taiwan (157)     13.38%
3.     Egypt (59)     5.08%
4.     Turkey (164)      4.27%
5=    UAE (50)     4.00%
5=    Hungary (75)     4.00%
7.    Czech Republic (82)     3.66%
8.    Thailand (188)    2.66%
9.    Chile (78)     2.56%
10.   Malaysia  (91)     2.20%
11.   China (1164)     1.98%
12.   Poland (440)     0.91%
13.   India (1604)     0.62%
14.   Morocco (212)     0.47%
15.   Colombia  (285) 0.35%
16.   Brazil  (1662)    0.24%
17.   Mexico  (898)     0.22%
18.   Russia  (1188)     0.17%
19=   Indonesia  (358)     0%
19=   Philippines (265)  0%
19=   Pakistan  (300)  0%
19=   Peru (92)     0%
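The table above is simply the ratio of two published counts. A sketch of the calculation using a few of the figures from the post:

```python
# Percentage of a country's universities (Webometrics count) that appear
# in the THE BRICSEE top 100, using counts quoted in the post.

counts = {
    "South Africa": (5, 25),
    "Taiwan": (21, 157),
    "Egypt": (3, 59),
    "China": (23, 1164),
    "India": (10, 1604),
    "Russia": (2, 1188),
}

# Sort by the proportion in the top 100, highest first.
ranked = sorted(counts.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for country, (in_top_100, total) in ranked:
    print(f"{country}: {100 * in_top_100 / total:.2f}%")
```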

Thursday, December 05, 2013

The THE BRICS and Emerging Markets Rankings: Good News for China?

Times Higher Education (THE) has just published its BRICS and Emerging Economies Rankings. The methodology is the same as that used in their World University Rankings and the data was supplied by Thomson Reuters. Emerging economies are those listed in the FTSE Emerging Markets Indices.

At first sight, China appears to do very well, with Peking University in first place, Tsinghua University in second and a total of 23 universities in the top 100.

Third place goes to the University of Cape Town, while National Taiwan University is fourth and Bogazici University in Turkey is fifth.

Taiwan has 21 universities in the Top 100, India 10, Turkey 7 and South Africa and Thailand 5 each.

Although China tops the list of "hot emergent properties", as THE puts it, and Simon Marginson compares the PRC favourably to Russia which is "in the doldrums", we should remember that China does have a large population. When we look at population size, China's achievement shrinks considerably while Taiwan emerges as the undisputed winner, Eastern Europe does very well and the gap between Russia and China is drastically reduced.

The following is the number of universities in the BRICS and Emerging Economies University Rankings per 1,000,000 population (Economist Pocket World in Figures 2010). The total number of universities in the rankings is in brackets.

1.    Taiwan (21)   0.913
2.     United Arab Emirates (2)   0.400
3=    Czech Republic (3)  0.300
3=    Hungary   (3)   0.300
5.     Chile (2)   0.118
6.     South Africa (5)   0.114
7.     Poland (4)   0.102
8.     Turkey (7)   0.093
9=     Malaysia (2)   0.077
9=    Thailand (5)   0.077
11.    Egypt (3)   0.039
12.    Morocco (1)   0.031
13.    Brazil (4)   0.021
14.   Colombia (1)   0.021
15.   Mexico (2)   0.018
16.   China (23)   0.017
17.    Russia (2)   0.014
18.    India (10)   0.008
19=   Indonesia (0)   0.000
19=   Pakistan (0)   0.000
19=   Peru   (0)   0.000
19=   Philippines   (0)   0.000
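The per-million figures can be checked the same way; the populations below (in millions) are rough approximations consistent with the published ratios, not independent data:

```python
# Universities in the THE BRICS and Emerging Economies rankings per million
# population. Counts are from the table above; populations (millions) are
# rough figures consistent with the published ratios.

data = {
    "Taiwan": (21, 23.0),
    "South Africa": (5, 44.0),
    "China": (23, 1330.0),
    "Russia": (2, 142.0),
}

for country, (universities, pop_millions) in data.items():
    print(f"{country}: {universities / pop_millions:.3f} universities per million")
```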


It is very significant that the top two universities in these rankings are in China.  But, taking population size into consideration, it looks as though Mainland China is still way behind Taiwan, Singapore and Hong Kong and even the smaller nations of Eastern Europe.