Tuesday, September 29, 2015

Japanese Barbarians Out to Crush Humanities!

The international education media has been getting very excited recently about what appeared to be an extraordinary act of cultural vandalism by the Japanese Ministry of Education.

It seems that the ministry has been behaving like the Taliban on a rampage through the Louvre and has ordered public universities to stop teaching the humanities and social sciences.

Noah Smith, an Assistant Professor of Finance at Stony Brook University, SUNY, and a freelance writer, wrote that public universities had been ordered to stop teaching social sciences, humanities and law, although apparently the "order" was non-binding.

Meanwhile Takamitsu Sawa announced in the Japan Times that the humanities were under attack and that someone on the ministry's panel of learned persons had said that students should study accounting software instead of Samuelson's Economics and translation instead of Shakespeare.

Eventually, the Financial Times revealed that the ministry had been misinterpreted and that the "abolition" of the humanities referred to a number of unneeded teacher training programs. This was supported by an authoritative comment from a former government official.

So it seems that Samuelson and Shakespeare are safe from the rampage of utilitarian barbarians.

Perhaps Japanese universities can now adopt the best practices of Columbia and the University at Buffalo for the teaching of art.


Sunday, September 27, 2015

Latest on the THE Rankings Methodology

Times Higher Education (THE) have now officially announced the methodology of next week's World University Rankings. There are some changes, although major problems remain unaddressed.

First, THE is now getting data from Scopus rather than Thomson Reuters. The Scopus database is more inclusive -- it covers 22,000 publications compared to 12,000 -- and includes more papers from non-English speaking countries so this may give an advantage to some universities in Eastern Europe and Asia.

Second, THE has tried to make its reputation survey more inclusive, making forms available in an additional six languages and reducing the bias towards the USA.

Third, 649 papers with more than 1,000 listed authors, mainly in physics, will not be counted for the citations indicator.

Fourth, the citations indicator will be divided into two parts with equal weighting. One half will be with and one half without the "regional modification", by which the overall citation impact score of a university is divided by the square root of the score for the country in which it is located. In previous editions of these rankings this modification gave a big boost to universities in low-scoring countries such as Chile, India, Turkey and Russia.
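The arithmetic of the modification can be sketched as follows. This is my own illustration, not RUR's or THE's published code; I am assuming, for the sake of the example, that indicator scores sit on a 0-100 scale and that the country score is rescaled to a 0-1 ratio before the square root is taken.

```python
import math

def regional_modification(university_score, country_mean_score):
    """Divide a university's citation impact score by the square root of
    its country's mean score. Scores are assumed to be on a 0-100 scale;
    the country score is rescaled to 0-1 so the output stays comparable."""
    return university_score / math.sqrt(country_mean_score / 100.0)

def blended_citations(university_score, country_mean_score):
    """The 2015 change: half the citations weighting is modified, half is not."""
    return 0.5 * university_score + 0.5 * regional_modification(university_score, country_mean_score)

# Identical raw scores, different national contexts:
print(regional_modification(40, 25))   # 80.0 -- a weak national field doubles the score
print(regional_modification(40, 81))   # ~44.4 -- a strong national field adds little
print(blended_citations(40, 25))       # 60.0 -- the dilution halves the boost
```

On these assumed numbers, halving the weighting of the modification cuts the boost for a university in a weak national field from 40 points to 20, which is why its former beneficiaries can be expected to slip.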

It is likely that institutions such as Bogazici University, Panjab University, Federico Santa Maria Technical University and Universite Cadi Ayyad, which have benefited from contributing to mega-papers such as those emanating from the Large Hadron Collider project, will suffer from the exclusion of these papers from the citations indicator. Their pain will be increased by the dilution of the regional modification.

It is possible that such places may get some compensation in the form of more responses in the reputation survey or higher publication counts in the Scopus database but that is far from certain. I suspect that several university administrators are going to be very miserable next Thursday.

There is something else that should not be forgotten. The scores published by THE are not raw data but standardised scores derived from standard deviations and means. Since THE are including more universities in this year's rankings and since most of them are likely to have low scores for most indicators it follows that the overall mean scores of ranked universities will fall. This will have the effect of raising the standardised scores of the 400 or so universities that score above the mean. It is likely that this effect will vary from indicator to indicator and so the final overall scores will be even more unpredictable and volatile.
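The mechanics can be shown with a toy calculation. This is purely illustrative -- the scores are invented, and THE's actual standardisation involves more than a plain z-score -- but it captures why a bigger, weaker field flatters those above the mean:

```python
import statistics

def z_scores(raw):
    """Standardise raw indicator scores: (x - mean) / standard deviation."""
    mean = statistics.mean(raw)
    sd = statistics.pstdev(raw)
    return [(x - mean) / sd for x in raw]

last_year = [90, 60, 50, 40]              # a small ranked field
this_year = last_year + [30, 30, 30, 30]  # same universities plus low-scoring newcomers

# The top university's raw score is unchanged, but its standardised score
# rises because the newcomers drag the mean down:
print(round(z_scores(last_year)[0], 2))  # 1.6
print(round(z_scores(this_year)[0], 2))  # 2.25
```

Whether any given university gains depends on how the new entrants shift both the mean and the standard deviation of each indicator, which is why the effect is hard to predict in advance.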

Tuesday, September 22, 2015

Looking Inside the Engine: The Structure of the Round University Rankings

Many of those interested in international university rankings have been frustrated by the lack of transparency in the Quacquarelli Symonds (QS) and the Times Higher Education (THE) rankings.

The QS rankings assign a fifty per cent weighting to two surveys collected from a variety of channels -- I think six for the employer survey and five for the academic survey -- with different and fluctuating response rates.

The THE rankings have lumped five indicators in a Teaching cluster, three in a Research cluster and three in an International cluster. So how can anyone figure out just what is causing a university to rise or fall in the rankings?

A major step forward in transparency has now come with the recent publication of the Round University Rankings (RUR) by a Russian organisation that uses data from Thomson Reuters (TR), who provided the data for the Times Higher Education world and regional rankings from 2009 until the end of last year.

RUR have published the separate scores for all of the indicators. They have retained 12 out of the 13 indicators used in the THE rankings from 2011 to 2014, dropping income from industry as a percentage of research income, and added another eight.

I doubt that RUR could afford to pay TR very much for the data and I suspect that TR's motive in allowing the dissemination of such a large amount of information is to preempt THE or anyone else trying to move upstream in the drive to monetise data.

It is now possible to see whether the various indicators are measuring the same thing and hence are redundant, whether and to what extent they are associated with other indicators and whether there is any link between markers of input and markers of output.

Here is a crude analysis of a very small sample of sixteen universities, one in fifty, from the RUR rankings, starting with Harvard and ending with the Latvia Transport and Telecom Institute. I hope that a more detailed analysis of the entire corpus can be done in a few weeks.
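For anyone who wants to replicate the figures below: they are ordinary Pearson correlations over the sampled universities, with significance judged from the usual t statistic for n = 16. A minimal sketch, using invented scores rather than the actual RUR data:

```python
import math
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two lists of indicator scores."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def t_statistic(r, n):
    """t statistic for testing whether a correlation differs from zero."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

# Invented teaching and research reputation scores for five universities:
teaching = [95, 80, 70, 55, 40]
research = [90, 85, 65, 50, 45]
r = pearson_r(teaching, research)
print(round(r, 3))                   # 0.967
print(round(t_statistic(r, 16), 1))  # ~14.2 -- far beyond any conventional threshold
```

With only sixteen cases a correlation needs to be roughly .5 or above to reach significance at the five per cent level, which is why some of the weaker correlations reported below are described as insignificant.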

The combined indicator groups

Three of the four indicator groups -- Teaching, Research and Financial Sustainability -- are fairly closely associated with one another. The Teaching cluster correlates .634 with Research and .735 with Financial Sustainability. Research correlates .702 with Financial Sustainability.

The International Diversity group appears to be the odd one out here. It correlates significantly with Research (.555) but not with Teaching or Financial Sustainability. This suggests that internationalisation, at least in the form of recruiting more international students, may not always be a strong marker of quality.


The Reputation Indicators

Looking at the three reputation indicators, teaching, international teaching and research, we can see that for practical purposes they are measuring the same thing. The correlation between the Research Reputation and Teaching Reputation scores is .986 and between Research Reputation and International Teaching Reputation .925. Between Teaching Reputation and International Teaching Reputation it is .941.

Alex Usher of Higher Education Strategy Associates has claimed a correlation of .99 between teaching and research reputation scores in the THE rankings up to 2014. The figures from the RUR rankings are a bit lower but essentially the reputation indicators are measuring the same thing, whatever it is, and there is no need to count them more than once.

Other Unnecessary Indicators

Turning to the staffing indicators, the correlation between Academic Staff per Student and Academic Staff per Bachelor Degrees is very high at .834. The latter, which has not appeared in any previous ranking, could be omitted without a significant loss of information.

There is an extremely high correlation, .989, between Citations per Academic and Research Staff and Papers per Academic and Research Staff. It sounds rather counter-intuitive but it seems that as a measure of research productivity one is as good as the other, at least when dealing with more than a few hundred elite universities.

There is a correlation of .906 between Institutional Income per Academic Staff and Institutional Income per Student.

It would appear then that the THE rankings of 2011-2014 with 13 indicators had passed the point beyond which additional indicators become redundant and provide no additional information.

Input and Outputs

There are some clues about the possible relationship between indicators that could be regarded as inputs and those that might be counted as outputs.

Academic Staff per Student does not significantly affect teaching reputation (.350, sig .183). It is positively and significantly associated only with doctoral degrees per bachelor degrees (.510). The correlation with the overall score is, however, quite high and significant at .552.

There is some evidence that a diverse international faculty might have a positive impact on research output and quality. The correlations between International Faculty and Normalised Citation Impact, Papers per Academic and Research Staff and the overall score are positive and significant. On the other hand, the correlations between international collaboration and overall score and international students and overall score are weak and insignificant.

Money seems to help, at least as far as research is concerned. There are moderately high and significant correlations between Institutional Income per Academic Staff and Citations per Academic and Research Staff, Papers per Academic and Research Staff, Normalised Citation Impact and the overall score.

Research Income per Academic Staff correlates highly and significantly with Teaching Reputation, International Teaching Reputation, Research Reputation, Citations per Academic and Research Staff, Papers per Academic and Research Staff, Normalised Citation Impact and the overall score.

Saturday, September 19, 2015

Who's Interested in the QS World University Rankings?

And here are the first ten results (excluding this blog and the QS page) from a Google search for this year's QS world rankings. Compare with ARWU and RUR. Does anyone notice any patterns?


Canada falls in World University Rankings' 2015 list

UBC places 50th, SFU 225th in QS World University Rankings

Who's interested in the Round University Rankings?


The top results from a Google search for responses to the recently published Round University Rankings.



New Ranking from Russia

Who's Interested in the Shanghai Rankings?

First results from a Google search for responses to the latest edition of the Shanghai world rankings.

Radboud University: 132nd place on ARWU/Shanghai ranking 2015

Friday, September 18, 2015

The Italian Ranking Dance

As noted in the previous post, the latest QS world rankings have not been well received by the Italian blog ROARS. Their opinion of the reaction of the Italian media and public was summarised by posting the following video.





Who believes QS?

From the Italian site ROARS: Return on Academic Research (translated from the Italian):


According to the Quacquarelli Symonds (QS) ranking, something must have happened in Siena: in a single year it has lost 220 places. Pavia and Turin have collapsed by over 150 positions, falling out of the top 500, while Pisa, Tor Vergata, Federico II of Naples, Milan Catholic, Genoa, Perugia and Bicocca have each lost more than 100 positions. The meltdown is simply due to the fact that QS has changed the methodology used to construct its ranking. Only the Polytechnics of Milan and Turin gained places, as predicted by Richard Holmes more than a month ago, when news of the change in methodology spread. I hope that the collapse of the Italian university in 2015, "certified" by QS and caused by the change of methodology, will be a lesson: rankings are not a serious way to evaluate the performance of universities. Unfortunately, judging from the press releases of POLIMI and POLITO, it seems the lesson has not sunk in.

Thursday, September 17, 2015

University Quality and Bias

Anticipating requests, here is the link for a significant paper by Christopher Claassen of the University of Essex.

Measuring University Quality by Christopher Claassen


http://www.chrisclaassen.com/University_rankings.pdf


Tuesday, September 15, 2015

Auto-Induced Fly Catching

Taking a break from the most exciting or second most exciting educational event this month, I have just received a message from Google Scholar Citations asking if I wanted to add the following to my profile:

JG LANE, RJ Holmes
BRITISH VETERINARY JOURNAL 128 (9), 477-&, 1972

Unfortunately, I couldn't claim credit for this work. In 1972 I was still immersed in the Irish Home Rule Debate and the rise of the Sokoto Caliphate.

I wonder whether this was an attempt to develop an environmentally friendly form of pest control or a report of a serious mental disorder among certain kinds of dogs.

Is it too late to submit a nomination for the IgNobel awards?




Tuesday, September 08, 2015

Global Ranking From Russia

A very interesting new set of global rankings appeared seven days ago, the Round University Ranking from Russia. The organization is rather mysterious, although probably not so much in Russia and nearby places.

The rankings are based entirely on data from Thomson Reuters (TR) and the structure and methodology are similar to last year's Times Higher Education (THE) World University Rankings. They include 12 out of the 13 indicators used in the 2014 THE rankings, with only the percentage of research income derived from industry omitted. There are eight more measures making a total of twenty, five each  for teaching, research, international diversity and financial sustainability.

There is a normalized citations indicator with a weighting of only eight per cent, balanced by a simple count of citations per academic and research staff, also with eight per cent.

Altogether the three reputation indicators account for 18 per cent of the weighting, compared to 33 per cent in the 2014 THE rankings and 50 per cent in the Quacquarelli Symonds (QS) world rankings.
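Mechanically, an overall score under a scheme like this is just a weighted sum of the indicator scores. The sketch below collapses the twenty indicators into five for brevity; the 8 + 8 per cent citations split and the 18 per cent reputation total come from the text above, but the remaining weights are invented placeholders, not RUR's published figures.

```python
def overall_score(scores, weights):
    """Weighted sum of indicator scores (scores on 0-100, weights summing to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[name] * w for name, w in weights.items())

# Illustrative weights for a collapsed five-indicator version:
weights = {
    "normalised_citation_impact": 0.08,   # stated in the text
    "citations_per_staff": 0.08,          # stated in the text
    "reputation": 0.18,                   # three reputation surveys combined
    "other_teaching_research": 0.46,      # invented placeholder
    "international_and_financial": 0.20,  # invented placeholder
}

scores = {
    "normalised_citation_impact": 90,
    "citations_per_staff": 70,
    "reputation": 85,
    "other_teaching_research": 80,
    "international_and_financial": 60,
}
print(overall_score(scores, weights))  # 76.9
```

The point of the sketch is simply that a ranking's behaviour is entirely determined by its weights: shift weight from reputation to citations and a different set of universities rises, which is why the 18 versus 33 versus 50 per cent reputation loadings matter so much.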

To date these rankings appear to have been ignored by the world media except in Russia and its neighbors. Compared to the excitement with which the THE or even the QS or Shanghai rankings are greeted this might seem a bit odd. If the THE rankings were sophisticated because they had 13 indicators then these are even more so with 20. If the THE rankings were trusted because they were powered by Thomson Reuters so are these. If the survey in the THE rankings was real social science then so is this.

Could it be that the THE rankings are beloved of the Russell Group and its like around the world not because of their robustness, comprehensiveness, transparency or superior methodology but because of the glamour derived from a succession of prestigious events, networking dinners and exclusive masterclasses designed to appeal to the status anxieties of upwardly or downwardly mobile university administrators?

There are some problems with the RUR rankings. There is incoherence about what the indicators are supposed to measure. The methodology says that '[i]t is assumed that "undergraduate" level is the core of higher education', so there is an indicator measuring academic staff per bachelor degree. But then we have a weighting of eight per cent for doctoral degrees per bachelor degrees.

One excellent thing about these rankings is that the scores for all of the indicators can be found in the profiles of the individual universities. If anyone has the energy and time there are some important questions that could be answered. Is the correlation between teaching and research reputation so high that a distinction between the two is redundant? Is income or number of faculty a better predictor of research performance?

The presentation leaves a lot to be desired. Cooper League? The explanation of the methodology verges on the incomprehensible. Can somebody tell RUR to get a competent human to translate for them and forget about the Google Translator?

The economics of the relationship between TR and RUR are puzzling. There are no obvious signs that RUR has a large income from which to pay TR for the data and  I doubt that TR has passed it on for purely altruistic reasons. Could it be that TR are simply trying to undercut THE's attempt to go it alone? If nothing else, it could undermine any THE plans to go into the benchmarking and consulting trade.

Anyway, here are some first places. No surprises here, except maybe for Scuola Normale Superiore Pisa. You can find out exactly where the strengths of that school are by checking the scores for the twenty indicators.


Overall:                              Harvard
Teaching:                           Caltech
Research:                           Chicago
International Diversity:     EPF Lausanne
Financial Sustainability:    Caltech

China:                     Peking 49th
Russia:                    Moscow State 187th
India:                      IIT Kharagpur  272nd
UK:                         ICL 5th
Germany:                Munich 22nd
France:                    Ecole Polytechnique   19th
Egypt:                     American University Cairo 571st
South Africa:          Cape Town  201st
Brazil:                     Sao Paulo 65th
Italy:                       Scuola Normale Superiore Pisa 66th
Turkey:                   METU 308th
Malaysia:                Universiti Putra Malaysia 513th
Australia:                ANU 77th
Japan:                     Tokyo 47th
Korea:                     KAIST  41st.


Sunday, September 06, 2015

More on Alternative Indicators for Ranking African Universities


Continuing with our exploration of how to rank universities outside the world's top 200 or 400, where it is necessary to develop robust and sophisticated techniques of standardisation, normalisation, scaling, regional modification, taking away the number you first thought of (just kidding), verification, weighting and validation to figure out that Caltech's normalised research impact is slightly better than Harvard's or that Cambridge is a bit more international than that place in the other Cambridge, here is a ranking of African universities according to recommendations on LinkedIn.

There are obvious problems with this indicator, not least of which is the tiny number of responses compared to all the students on the continent. It might, however, be the precursor to a useful survey of student opinion or graduate employability later on.

First place goes to the University of South Africa, an open distance education institution whose alumni include Nelson Mandela, Cyril Ramaphosa and Jean-Bertrand Aristide. Makerere University, the University of Nairobi and Kenyatta University do well.

Data was compiled on the 28th and 29th of July. The list comprises all universities included in the THE experimental African ranking, the top fifty African universities in Webometrics, plus the top universities in Webometrics or 4icu of any country not already included.


Rank | University | Country | LinkedIn Recommendations
1 | University of South Africa | South Africa | 154
2 | Makerere University | Uganda | 116
3 | University of the Witwatersrand | South Africa | 94
4 | University of Ibadan | Nigeria | 86
5 | University of Johannesburg | South Africa | 79
6 | University of Nairobi | Kenya | 75
7 | Cairo University | Egypt | 67
8 | Stellenbosch University | South Africa | 63
9 | University of Pretoria | South Africa | 62
10 | Kenyatta University | Kenya | 61
11 | University of Cape Town | South Africa | 60
12 | University of Lagos | Nigeria | 58
13 | Addis Ababa University | Ethiopia | 55
14 | Obafemi Awolowo University | Nigeria | 50
15 | Alexandria University | Egypt | 47
16 | Rhodes University | South Africa | 42
17 | Jomo Kenyatta University of Agriculture and Technology | Kenya | 40
18 | American University in Cairo | Egypt | 28
19 | University of Kwazulu-Natal | South Africa | 26
20 | University of Ilorin | Nigeria | 24
21 | University of Zimbabwe | Zimbabwe | 22
22 | Kwame Nkrumah University of Science and Technology | Ghana | 21
23 | Helwan University | Egypt | 20
24= | North West University | South Africa | 18
24= | University of Ghana | Ghana | 18
24= | University of Port Harcourt | Nigeria | 18
27= | Durban University of Technology | South Africa | 16
27= | University of Dar Es Salaam | Tanzania | 16
29= | Nelson Mandela Metropolitan University | South Africa | 14
29= | University of the Western Cape | South Africa | 14
31 | Cape Peninsula University of Technology | South Africa | 13
32 | Mansoura University | Egypt | 12
33 | University of Botswana | Botswana | 10
34 | Covenant University | Nigeria | 9
35= | Zagazig University | Egypt | 7
35= | Suez Canal University | Egypt | 7
37 | Tanta University | Egypt | 6
38= | Assiut University | Egypt | 5
38= | Université Constantine 1 | Algeria | 5
40= | University of the Free State | South Africa | 4
40= | Universite des Sciences et de la Technologie Houari Boumediene | Algeria | 4
42+ | South Valley University | Egypt | 3
42+ | Université Cadi Ayyad | Morocco | 2
42+ | University of Tunis | Tunisia | 2
42+ | University of Namibia | Namibia | 1
42+ | University of Mauritius | Mauritius | 1
42+ | Université Cheikh Anta Diop | Senegal | 0
42+ | Université Mohammed V Souissi | Morocco | 0
42+ | University of Khartoum | Sudan | 0
42+ | University of Malawi | Malawi | 0
42+ | Université Hassan II Ain Chock | Morocco | 0
42+ | Kafrelsheikh University | Egypt | 0
42+ | University of Zambia | Zambia | 0
42+ | Bejaia University | Algeria | 0
42+ | Minia University | Egypt | 0
42+ | Benha University | Egypt | 0
42+ | Universidade Católica de Angola | Angola | 0
42+ | Université de Lomé | Togo | 0
42+ | Université Abou Bekr Belkaid | Algeria | 0
42+ | Beni-Suef University | Egypt | 0
42+ | Université Omar Bongo | Gabon | 0
42+ | University of the Gambia | Gambia | 0
42+ | Université de Toliara | Madagascar | 0
42+ | Université Kasdi Merbah Ouargla | Algeria | 0
42+ | Universite de la Reunion | Reunion | 0
42+ | Université d'Abomey-Calavi | Benin | 0
42+ | Universidade Eduardo Mondlane | Mozambique | 0
42+ | Université de Ouagadougou | Burkina Faso | 0
42+ | University of Rwanda | Rwanda | 0
42+ | Universite de Bamako | Mali | 0
42+ | University of Swaziland | Swaziland | 0
42+ | Université Félix Houphouët-Boigny | Ivory Coast | 0
42+ | Université de Kinshasa | Democratic Republic of the Congo | 0
42+ | National University of Lesotho | Lesotho | 0
42+ | Universidade Jean Piaget de Cabo Verde | Cape Verde | 0
42+ | National Engineering School of Sfax | Tunisia | 0
42+ | Université Marien Ngouabi | Republic of the Congo | 0
42+ | University of Liberia | Liberia | 0
42+ | Université Djillali Liabes | Algeria | 0
42+ | Université Abdou Moumouni de Niamey | Niger | 0
42+ | Misurata University | Libya | 0
42+ | Université de Dschang | Cameroon | 0
42+ | Université de Bangui | Central African Republic | 0
42+ | Université de Nouakchott | Mauritania | 0
42+ | Eritrea Institute of Technology | Eritrea | 0
42+ | Université de Djibouti | Djibouti | 0
42+ | University of Seychelles | Seychelles | 0
42+ | Mogadishu University | Somalia | 0
42+ | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 0
42+ | Universite Gamal Abdel Nasser de Conakry | Guinea | 0
42+ | University of Makeni | Sierra Leone | 0
42+ | John Garang Memorial University | South Sudan | 0
42+ | Hope Africa University | Burundi | 0
42+ | Universite de Moundou | Chad | 0
42+ | Universite de Yaounde I | Cameroon | 0




Tuesday, September 01, 2015

Best German and Austrian Universities if you Want to get Rich

If you want to go to a university in Germany or Austria and get rich afterwards, the website Wealth-X has a ranking for you. It counts the number of UHNW (ultra high net worth) alumni, those with assets of US$30 million or above.

Here are the top five with the number of UHNW individuals in brackets.

1. University of Cologne     (18)
2. University of Munich      (14)
3. University of Hamburg    (13)
4. University of Freiburg     (11)
5. University of Bonn          (11)

There may well be protests about who should be first. In tenth place is "Ludwig Maximilians University Munich (LMU Munich)", which I assume is another name for the University of Munich, with six UHNW alumni.