Monday, November 21, 2016

TOP500 Supercomputer Rankings

Every six months TOP500 publishes a list of the five hundred most powerful computer systems in the world. This is probably a good guide to the economic, scientific and technological future of the world's nation states.

The most noticeable change since November 2015 is that the number of supercomputers in China has risen dramatically from 108 to 171 systems while the USA has fallen from 200 to 171. Japan has fallen quite considerably from 37 to 27 and Germany and the UK by one each. France has added two supercomputers to reach 20.

In the whole of Africa there is exactly one supercomputer, in Cape Town. In the Middle East there are five, all in Saudi Arabia, three of them operated by Aramco.

Here is a list of countries with the number of computers in the top 500.

China 171
USA 171
Germany 32
Japan 27
France 20
UK 17
Poland 7
Italy 6
India 5
Russia 5
Saudi Arabia 5
South Korea 4
Sweden 4
Switzerland 4
Australia 3
Austria 3
Brazil 3
Netherlands 3
New Zealand 3
Denmark 2
Finland 2
Belgium 1
Canada 1
Czech Republic 1
Ireland 1
Norway 1
Singapore 1
South Africa 1
Spain 1

Friday, November 18, 2016

QS seeks a Passion Integrity Empowerment and Diversity compliant manager

The big ranking brands seem to be suffering from a prolonged fit of megalomania, perhaps caused by the toxic gases of Brexit and the victory of the deplorables. The "trusted" THE, led by the "education secretary of the world", has just made a foray into the US college ranking market, published a graduate employability ranking and is now going to the University of Johannesburg for a BRICS Plus Various Places summit.

Meanwhile the "revered" QS, creator of "incredibly successful ranking initiatives", also appears to be getting ready for bigger and better things. They are advertising for a Ranking Manager who will be

"a suitably accomplished and inspirational leader", and possess "a combination of analytical capability, thought leadership and knowledge of the global higher education landscape" and "ensure an environment of Passion, Integrity, Empowerment and Diversity is maintained" and be "(h)ighly analytical with extensive data modelling experience" and have "great leadership attributes".

And so on and so on. Read it yourself. If you can get through to the end without laughing you could be a suitable candidate.

I can't wait to see who gets the job.

Wednesday, November 02, 2016

More on teaching-centred rankings

The UK is proposing to add a Teaching Excellence Framework (TEF) to the famous, or infamous, Research Excellence Framework (REF). The idea is that universities are to be judged according to their teaching quality, which is to be measured by how many students manage to graduate, how satisfied students are with their courses, and whether graduates are employed or in postgraduate courses shortly after graduation.

There are apparently going to be big rewards for doing well according to these criteria. It seems that universities that want to charge high tuition fees must reach a certain level.

Does one have to be a hardened cynic to suspect that there is going to be a large amount of manipulation if this is put into effect? Universities will be ranked according to the proportion of students completing their degrees? They will make graduating requirements easier, abolish compulsory courses in difficult things like dead white poets, foreign languages or maths, or allow alternative methods of assessment, group work, art projects and so on. We have, for example, already seen how the number of first and upper second class degrees awarded by British universities has risen enormously in the last few years.

Universities will be graded by student satisfaction? Just let the students know, very subtly of course, that if they say their university is no good then employers are less likely to give them jobs. Employment or postgraduate courses six months after graduation? Lots of internships and easy admissions to postgraduate courses.

In any case, it is all probably futile. A look at the Guardian University Guide rankings in a recent post here shows that if you want to find out about student outcomes six months after graduation the most relevant number is the average entry tariff, that is, 'A' level grades three or four years earlier.

I doubt very much that employers and graduate, professional and business schools are really interested in the difference between an A and an A* grade or even an A and a B. Bluntly, they choose from candidates who they think are intelligent and trainable, something which correlates highly with 'A' Level grades or, across the Anglosphere Lake, SAT, ACT and GRE scores, and who display other non-cognitive characteristics such as conscientiousness and open-mindedness. Also, they tend to pick people who generally resemble themselves as much as possible. Employers and schools tend to select candidates from those universities that are more likely to produce large numbers of graduates with the desired attributes.

Any teaching assessment exercise that does not measure or attempt to measure the cognitive skills of graduates is likely to be of little value.

In June Times Higher Education (THE) ran a simulation of a ranking of UK universities that might result from the TEF exercise. There were three indicators: student completion of courses, student satisfaction and graduate destinations, that is, the number of graduates employed or in postgraduate courses six months after graduation. In addition to absolute scores, universities were benchmarked for gender, ethnicity, age, disability and subject.

There are many questions about the methodology of the THE exercise, some of which are raised in the comments on the THE report.

The THE simulation appears to confirm that students' academic ability is more important than anything else when it comes to their career prospects. Comparing the THE scores for graduate destinations (absolute) with the other indicators in the THE TEF simulation and the Guardian rankings we get the following correlations.

Graduate Destinations (THE absolute) and:

Average Entry Tariff (Guardian)  .772
Student completion (THE absolute)  .750
Staff student ratio (Guardian inverted)  .663
Spending per student (Guardian)  .612
Satisfaction with course (Guardian)  .486
Student satisfaction (THE absolute)  .472
Satisfaction with teaching (Guardian)  .443
Value added (Guardian)  .347
Satisfaction with feedback (Guardian)  -.239

So, a high score in the THE graduate destinations metric, like its counterpart in the Guardian rankings, is associated most closely with students' academic ability and their ability to finish their degree programmes, next with spending, moderately with overall satisfaction and satisfaction with teaching, and substantially less so with value added. Satisfaction with feedback has a negative association with career success narrowly defined.
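For anyone who wants to reproduce correlations like these, the calculation itself is straightforward. Here is a minimal Python sketch with invented scores for six hypothetical universities (the numbers are illustrative, not the actual THE or Guardian data):

```python
# Pearson correlation between two ranking indicators. The scores below
# are invented for illustration; they are not real THE or Guardian data.
from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

entry_tariff = [190, 175, 160, 150, 140, 120]  # hypothetical average entry tariffs
destinations = [92, 88, 85, 80, 78, 70]        # hypothetical % positive outcomes

r = pearson(entry_tariff, destinations)
print(round(r, 3))  # a strong positive association, close to 1
```

The coefficient runs from -1 to 1, which is why the negative figure for satisfaction with feedback above indicates an inverse association.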

Looking at the benchmarked score for Graduate Destinations we find that the correlations are more modest than with the absolute score. But average entry tariff is still a better predictor of graduate outcomes than value added.

Graduate Destinations (THE distance from benchmark) and:

Student completion (THE benchmarked) .487
Satisfaction with course (Guardian)  .404
Staff student ratio (Guardian inverted)  .385
Average entry tariff (Guardian)  .383
Spending per student (Guardian)  .383
Satisfaction with teaching (Guardian) .324
Student satisfaction (THE benchmarked)  .305
Value added (Guardian)  .255
Satisfaction with feedback  (Guardian)  .025

It is useful to know about student satisfaction and very useful for students to know how likely they are to finish their programmes. But until rankers and government agencies figure out how to estimate the subject knowledge and cognitive skills of graduates, and the impact, if any, of universities on them, the current trend to teaching-centred rankings will not be helpful to anyone.

Sunday, October 23, 2016

NORTH KOREA: Some advice on how to become a world-class university


The October 8th post has been republished in University World News.

Monday, October 17, 2016

More lamentation from Dublin

Rankings have become a major weapon in the struggle of universities around the world to get their fair share or what they think is their fair share of public money. The Times Higher Education (THE) world and regional rankings are especially useful in this regard. They have a well known brand name, occasionally confused with the "Times of London", and sponsor prestigious summits at which rankers, political leaders and university heads wallow together in a warm bath of mutual flattery.

In addition, the THE rankings are highly volatile, with significant methodological changes in 2011, 2015 and 2016. Another source of instability is the growing number of ranked universities. The scores used for calculating the various indicators in these rankings are not raw scores but standardised scores derived from means and standard deviations. So if there is an influx of new universities, the means are likely to change, and consequently so are the processed scores of those above or below the mean.
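A toy calculation shows the mechanism (all raw scores here are invented): when lower-scoring newcomers enter the pool, the mean falls and an incumbent's standardised score rises even though its own raw score has not changed.

```python
# Toy illustration: an influx of new, lower-scoring universities shifts
# the standardised (z) scores of incumbents even though their raw
# indicator scores are unchanged. All numbers are invented.
from statistics import mean, pstdev

def z_scores(raw):
    """Standardise a list of raw scores against its own mean and SD."""
    m, s = mean(raw), pstdev(raw)
    return [(x - m) / s for x in raw]

incumbents = [70.0, 60.0, 50.0, 40.0]         # raw scores of ranked universities
before = z_scores(incumbents)

newcomers = [20.0, 15.0, 10.0]                # new entrants with lower raw scores
after = z_scores(incumbents + newcomers)[:4]  # the same four incumbents

# The top incumbent's standardised score rises from about 1.34 to about
# 1.48 purely because the mean of the whole field has fallen.
print(round(before[0], 2), round(after[0], 2))
```

The same arithmetic works in reverse: an influx of stronger newcomers pushes incumbents' processed scores down without anything at those universities having changed.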

The THE rankings can be interpreted to provide useful arguments whatever happens. If Western universities rise that is a sign of authentic excellence but one that is threatened by reduced funding, restrictions on foreign students and researchers, and reputations sullied by xenophobic electorates. If they fall that means of course that those threats have materialised.

The QS rankings are also sometimes unstable, having made significant methodological changes in 2015 and giving a 50% weighting to very subjective reputation indicators.

Irish universities seem to be especially fond of using these rankings as a ploy to gain public favour and largess. In 2015 Ireland's top university, Trinity College Dublin (TCD), fell seven places in the QS world rankings and 22 places in THE's.

TCD announced of course that government cuts had a lot do with it. The Dean of Research said:
“Notwithstanding these combined achievements the cuts in funding and increased investments made by our global competition, continue to have a direct impact on the rankings. Trinity is battling against intense international competition, particularly from Asian universities and from certain European countries where governments are investing heavily in higher education. The continued reduction in government investment in Irish universities has impacted negatively on the international standing of our universities and our ability to compete in a global arena.”

“Trinity’s top 100 position globally and top 30 in Europe is remarkable in the context of its reduced income. Trinity’s annual budget per academic is 45% lower than that of the average university in the world top 200. It is to the credit of Trinity’s dedicated teaching and research staff that the University continues to maintain its global position against such challenges.”
“As a knowledge economy we need an excellent competitive education system.  Trinity remains a world leading research-intensive university and the knowledge and innovation created are critical for the economic development of Ireland.”
I pointed out in 2015 that TCD had been steadily rising in the Shanghai ARWU rankings since 2004, especially in the Publications indicator (papers in the Science Citation Index and the Social Science Citation Index) and PCP (productivity per capita, that is the combined indicator scores divided by the number of faculty). This year, TCD's publication score again went up very slightly from 31 to 31.1 and the PCP quite significantly from 19 to 20.8, compared to top scores of 100 for Harvard and Caltech respectively.

University College Dublin has also continued to do well in the Shanghai rankings, with the publications score rising this year from 34.1 (27.3 in 2004) to 34.2 and PCP from 18.0 (8.1 in 2004) to 18.1.

The Shanghai rankings are famous for not counting the arts and humanities or trying to measure anything related to teaching. The RUR rankings from Russia are based on Thomson Reuters data, also used by THE until two years ago, and they do include publications in the humanities and teaching-related metrics. They have 12 out of the 13 indicators in the THE world rankings, plus eight others, but with a more sensible weighting, for example 8% instead of 30% for field-normalised citations.

The RUR rankings show that TCD rose from 174th overall in 2010 to 102nd in 2016. (193rd to 67th for research).

University College Dublin (UCD) rose from 213th overall to 195th (157th to 69th for research) although some Irish universities, NUI Galway, NUI Maynooth, University College Cork, and Dublin City University, have fallen.

Nonetheless TCD decided in March of this year to develop a rankings strategy aimed at QS and THE with a Rankings Steering Group chaired by the Provost. The competence and knowledge displayed by such groups and committees often have little relationship to the status and salaries of their members, and that appears to be the case for TCD.

It seems that there was a misplaced decimal point in the financial data submitted to THE for the 2016 rankings, which would have left TCD with a lower rating than it deserved, and so it has withdrawn from the rankings until the error is corrected.

If TCD cannot find an administrator or a statistician to check things like that it really has no business asking for taxpayers' money. I suspect that decimal points are not misplaced -- or if they are it is to the right rather than the left --  in submissions for grants or subsidies.

This raises the question of whether the THE checking procedures are adequate. I was under the impression that if there was a change of 20% then red flags would start waving. For THE to allow a large change in reported income, and therefore at least one, maybe two or three, income indicators, sounds rather odd. What about that unique game-changing audit?

Meanwhile UCD, 176th in the THE rankings last year, has dropped out of the top 200 altogether.

The QS rankings were also bad news for Ireland. Every university fell except for NUI Galway and there were none in the top 100.

But has there in fact been any real decline in the quality of TCD and UCD?

The evidence of the RUR and Shanghai rankings is that the two main universities are steadily improving or at least holding their own, especially with regard to research. Possibly less highly regarded places like NUI Galway and NUI Maynooth are struggling but that could be fairly easy to remedy.

The Irish Universities Association issued a statement:

'The continued slide of the Irish Universities in the QS World University Rankings should be greeted with alarm. Strenuous efforts on the part of the universities has resulted in strong performance on some measures in the rankings such as those relating to research citations and internationalisation of the staff and student cohort. Unfortunately, this good work is being undermined by the negative impact of underfunding on key indicators such as the student:faculty ratio. The latter is highly influential in scoring in the QS rankings.
It would also appear likely that almost a decade of austerity is spilling over into the reputational component of the rankings, with consequent negative repercussions. IUA Chief Executive, Ned Costello said: “we can no longer hide from the corrosive effect which years of cutbacks are having on our higher education system. At a time when we are more dependent than ever on the talent of our people for our economic future, we simply must invest in our universities. An immediate injection of funding is required in the upcoming Budget and Estimates to fund more lecturers, deliver smaller group teaching and restore quality in our system.” '
The decline of TCD and UCD in the QS and THE rankings cannot reasonably be attributed to any real deficiencies on the part of those universities. A decline in the number of lecturers would have a negative effect on the faculty student metric but would help indicators scaled for faculty size. The alleged decline is largely a consequence of methodological changes and adjustments, the instability resulting from the influx of new universities and growing ranking sophistication in other places.

It is a shame that researchers and scholars should collude with those rankings that show them in a bad light while ignoring more stable and less biased ones that show a continuing and genuine improvement especially in research.

Saturday, October 08, 2016

Will North Korea Engage with the Rankings?

Kim Jong-un has declared that Kim Il-sung University must become a world-class institution. No doubt there will be chuckles at Oxford, Anglia Ruskin University, the University of Iceland and the Free University of Bozen - Bolzano but it could be surprisingly easy if being world class means getting a high place in the rankings. After all, there are now quite a few places appearing in the various global and regional tables that would have been just as surprising a few years ago.

First, I should mention that there already is a ranking in which Kim Il-sung University is listed: a ranking of international influence as measured by Google's ranking of search results where the institution is 254th.

Here is my plan for North Korea to become world class in just a few years.

1. Offer adjunct professorships to 150 researchers and ask them to put the university as a secondary affiliation. Maybe they can come and visit Pyongyang sometimes but that is not really necessary. In a little while they will be producing 150 papers or more a year with the university name on them, eventually one thousand over a five-year period, which will meet the threshold for inclusion in the THE world rankings.

2. Make sure that one or two of those adjunct professors are involved in multi-author projects that attract large numbers of citations (but keep the author count below 1,000). Medicine is probably a better bet than physics at the moment. This will get a good score in the THE citations indicator.

3. Make sure that research funds to the university go through something with the word industry in it. That way the university will go to the top of the THE Industry Income: Innovation indicator.

4. Don't forget the other rankings. Give the university a boost in the QS world rankings by drafting lots of research assistants who will count in the student faculty ratio indicator.

5. Start a branch campus somewhere and get a high score in the international indicators that nearly everybody has nowadays. If the branch is in the USA go for Princeton Review's top party school.

6. Send a few hundred closely supervised graduate students abroad and tell them they know what to do for the QS reputation survey. When they come back as faculty with a co-authored article or two tell them they know what to do for the THE survey.

7. When Kim Il-sung University is a rising star of the university world, try hosting a summit to rise even higher. Better make sure that hotel is finished though.

Tuesday, October 04, 2016

About those predictions

On September 16th I made some predictions about the latest Times Higher Education (THE) world rankings and summit at Berkeley. My record is not perfect but probably a bit better than the professional pollsters who predicted a hung parliament at the last UK elections, a crushing defeat for Brexit and humiliation for Donald Trump in the Republican primaries.

I predicted that Trump would not be invited to give a keynote speech. I was right but it was a pity. He would certainly have added a bit of diversity to a rather bland affair and he does seem to have a talent for helping unpromising beginners into successful careers, something that the current fad for value added ranking is supposed to measure.

I also said that UC Berkeley as the summit host would get into the top ten again after falling to thirteenth last year. This has now become a tradition at THE summits. I suspect though that even THE will find it hard to get King's College London, the 2017 world summit host, into the top ten. Maybe they will have to settle for top twenty.

The prediction that adding books to the indicator mix would help British universities seems to have been fulfilled. Oxford was number one for the first time. I was also right about the renewed rise of Asia, some of it anyway.  The Korean favourites, Seoul National University, POSTECH, KAIST, Sungkyunkwan University, Korea University, have all risen significantly this year.

The decline of US public universities blamed on lack of funding? Yes, although I never thought Robert Reich would say that public higher education is dying.

Danger of Brexit and immigration controls for UK universities? I did not see anything specific but I did not look very hard and probably everybody thinks it's self evident.

I have to confess that I have not counted the number of times that the words prestige and prestigious were used at the summit or in the Christopher Priest novel. In the latter it is a contraction of prestidigitation and refers to the effect, the third segment of a stage illusion following the setup and the performance: the moment when the rabbit is pulled out of the hat or Anglia Ruskin revealed to have a greater world research impact than Cambridge or Imperial.

Phil Baty gave a masterclass and so did Duncan Ross. I am pretty certain that no feminists complained about this outrageous sexism so I am prepared to admit that I was wrong there.

Incidentally, according to Wikipedia, a master class is "a class given to students of a particular discipline by an expert of that discipline -- usually music, but also painting, drama, any of the arts, or on any other occasion where skills are being developed."

Saturday, October 01, 2016

Who says rankings are of no significance?

From Mansion Global 


Six High-End Homes Near America’s Top-Ranked University

Who needs dorms at Stanford when you can live in one of these?


Stanford is, in case you haven't noticed, top of the Wall Street Journal/Times Higher Education US college ranking [subscription required for full results] and, more significantly, of the list of the world's 100 most innovative universities.

Saturday, September 24, 2016

The THE World University Rankings: Arguably the Most Amusing League Table in the World

If ever somebody does get round to doing a ranking of university rankings and if entertainment value is an indicator the Times Higher Education (THE) World University Rankings (WUR) stand a good chance of being at the top.

The latest global rankings contain many items that academics would be advised not to read in public places lest they embarrass the family by sniggering to themselves in Starbucks or Nandos.

THE would, for example, have us believe that St. George's, University of London is the top university in the world for research impact as measured by citations. This institution specialises in medicine, biomedical science and healthcare sciences. It does not do research in the physical sciences, the social sciences, or the arts and humanities and makes no claim that it does. To suggest that it is the best in the world across the range of scientific and academic research is ridiculous.

There are several other universities with scores for citations that are disproportionately higher than their research scores, a sure sign that the THE citations indicator is generating absurdity.  They include Brandeis, the Free University of Bozen-Bolzano, Clark University, King Abdulaziz University, Anglia Ruskin University, the University of Iceland, and Orebro University, Sweden.

In some cases, it is obvious what has happened. King Abdulaziz University has been gaming the rankings by recruiting large numbers of adjunct faculty whose main function appears to be listing the university as a secondary affiliation in order to collect a share of the credit for publications and citations. The Shanghai rankers have stopped counting secondary affiliations for their highly cited researchers indicator but KAU is still racking up the points in other indicators and other rankings.

The contention that Anglia Ruskin University is tenth in the world  for research impact, equal to Oxford, Princeton, and UC Santa Barbara, and just above the University of Chicago, will no doubt be met with donnish smirks at the high tables of that other place in Cambridge, 31st for citations, although there will probably be less amusement about Oxford being crowned best university in the world.

Anglia Ruskin's output of research is not very high, about a thirtieth of Chicago's according to the Web of Science Core Collection. Its faculty does, however, include one Professor who is a frequent contributor to global medical studies with a large number of authors, although never more than a thousand, and hundreds of citations a year. Single-handedly he has propelled the university into the research stratosphere since the rest of the university has been generating few citations (there's nothing wrong with that: it's not that sort of place) and so the number of papers by which the normalised citations are divided is very low.
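The arithmetic behind this is worth spelling out. If a citations indicator is, in effect, an average of field-normalised citation scores over a university's papers, then one participant in a heavily cited mega-project can dominate a small denominator. A toy model (all figures invented, and a simplification of whatever normalisation THE actually applies):

```python
# Toy model of an averaged, field-normalised citations indicator: each
# paper's citations are divided by the world average for its field and
# year, and the university's score is the mean over its papers.
# All numbers are invented.

def citation_impact(paper_scores):
    """Mean field-normalised citation score over a university's papers."""
    return sum(paper_scores) / len(paper_scores)

# A large research university: thousands of papers slightly above the
# world average of 1.0.
big = [1.2] * 30000

# A small institution: 99 papers at half the world average, plus one
# author on a mega-paper cited at hundreds of times its field norm.
small = [0.5] * 99 + [400.0]

print(citation_impact(big))    # ~1.2
print(citation_impact(small))  # ~4.5, so the small place "outperforms" the big one
```

The large university's thirty thousand papers dilute any single outlier, while the small institution's one mega-paper is divided by only a hundred.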

The THE citations methodology is badly flawed. That university heads give any credence to rankings that include such ludicrous results is sad testimony to the decadence of the modern academy.

There are also many universities that have moved up or down by a disproportionate number of places. These include:

Peking University rising from 42nd to 29th
University of Maryland at College Park rising from 117th to 67th
Purdue University rising from 113th to 70th
Chinese University of Hong Kong rising from 138th to 76th
RWTH Aachen rising from 110th to 78th
Korea Advanced Institute of Science and Technology rising from 148th to 89th


Vanderbilt University falling from 87th to 108th
University of Copenhagen falling from 82nd to 120th
Scuola Normale Pisa falling from 112th to 137th
University of Cape Town falling from 120th to 148th
Royal Holloway, University of London falling from 129th to 173rd
Lomonosov Moscow State University falling from 161st to 188th


The point cannot be stressed too strongly that universities are large and complex organisations. They do not, in 12 months or less and short of major restructuring, change sufficiently to produce movements such as these. The only way such instability could occur is through significant methodological changes, or through the entry into the rankings of universities with attributes different from the established ones, thus changing the means from which standardised scores are derived.

There have in fact been significant changes to the methodology this year, although perhaps not as substantial as in 2015. First, books and book chapters are included in the count of publications and citations, an innovation pioneered by US News in its Best Global Universities. Almost certainly this has helped English-speaking universities with a comparative advantage in the humanities and social sciences, although THE's practice of bundling indicators together makes it impossible to say exactly how much. It would also work to the disadvantage of institutions such as Caltech that are comparatively less strong in the arts and humanities.

Second, THE have used a modest version of fractional counting for papers with more than a thousand authors. Last year they were not counted at all. This means that universities that have participated in mega-papers such as those associated with the Large Hadron Collider will get some credit for citations of those papers, although not as much as they did in 2014 and before. This has almost certainly helped a number of Asian universities that have participated in such projects but have a generally modest research output. It may also have benefited universities in California such as UC Berkeley.
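THE's exact formula is not spelled out here, but the general idea of fractional counting is simple: instead of every participating institution getting full credit for a mega-paper's citations (as before 2015), or none at all (as in 2015), each gets a share proportional to its contribution. A toy comparison, with all figures invented:

```python
# Three ways of crediting one university (30 authors) with the citations
# of a hypothetical 3,000-author mega-paper. All numbers are invented and
# this shows only the general idea, not THE's actual formula.
citations = 6000
total_authors = 3000
our_authors = 30

full_credit = citations                               # pre-2015: full credit
no_credit = 0                                         # 2015: paper excluded
fractional = citations * our_authors / total_authors  # proportional share

print(full_credit, no_credit, fractional)  # 6000 0 60.0
```

Hence "some credit ... although not as much as in 2014 and before": the proportional share sits between the two earlier extremes.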

Third, THE have combined the results of the academic reputation survey conducted earlier this year with that used in the 2015-16 rankings. Averaging reputation surveys is a sensible idea, already adopted by QS and US News in their global rankings, but one that THE has avoided until now.

This year's survey saw a very large reduction in the number of responses from researchers in the arts and humanities and a very large increase, for reasons unexplained, in the number of responses from business studies and the social sciences, separated now but combined in 2015.

Had the responses for 2016 alone been counted there might have been serious consequences for UK universities, relatively strong in the humanities, and a boost for East Asian universities, relatively strong in business studies. Combining the two surveys would have limited the damage to British universities and slowed down the rise of Asia to media-acceptable proportions.

One possible consequence of these changes is that UC Berkeley, eighth in 2014-15 and thirteenth in 2015-16, is now, as predicted here, back in the top ten. Berkeley is host for the forthcoming THE world summit although that is no doubt entirely coincidental.

The overall top place has been taken by Oxford to the great joy of the vice-chancellor who is said to be "thrilled" by the news.

I do not want to be unfair to Oxford but the idea that it is superior to Harvard, Princeton, Caltech or MIT is nonsense. Its strong performance in the THE WUR is in large measure due to the over-emphasis in these tables on reputation, income and a very flawed citations indicator. Its rise to first place over Caltech is almost certainly a result of this year's methodological changes.

Let's look at Oxford's standing in other rankings. The Round University Ranking (RUR) uses Thomson Reuters data just like THE did until two years ago. It has 12 of the indicators employed by THE and eight additional ones.

Overall Oxford was 10th, up from 17th in 2010. In the teaching group of five indicators Oxford is in 28th place. For specific indicators in that group the best performance was teaching reputation (6th) and the worst academic staff per bachelor degrees (203rd).

In Research it was 20th. Places ranged from 6th for research reputation to 206th for doctoral degrees per admitted PhD. It was 5th for International Diversity and 12th for Financial Sustainability.

The Shanghai ARWU rankings have Oxford in 7th place and Webometrics in 10th (9th for Google Scholar Citations).

THE is said to be trusted by the great and the good of the academic world. The latest example is the Norwegian government including performance in the THE WUR as a criterion for overseas study grants. That trust seems largely misplaced. When the vice-chancellor of Oxford University is thrilled by a ranking that puts the university on a par for research impact with Anglia Ruskin, one really wonders about the quality of university leadership.

To conclude my latest exercise in malice and cynicism (thank you, ROARS), here is a game to amuse international academics.

Ask your friends which university in their country is the leader for research impact and then tell them who THE thinks it is.

Here are THE's research champions, according to the citations indicator:

Argentina: National University of the South
Australia: Charles Darwin University
Brazil: Universidade Federal do ABC (ABC refers to its location, not the courses offered)
Canada: University of British Columbia
China: University of Science and Technology of China
France: Paris Diderot University: Paris 7
Germany: Ulm University
Ireland: Royal College of Surgeons
Japan: Toyota Technological Institute
Italy: Free University of Bozen-Bolzano
Russia: ITMO University
Turkey: Atilim University
United Kingdom: St George's, University of London.



Monday, September 19, 2016

Update on previous post

The reputation data used by THE in the 2016 world rankings, for which the world is breathlessly waiting, is that which was used in their reputation rankings released last May and collected between January and March.

Therefore, the distribution of responses from disciplinary groups this year was 9% for the arts and humanities, 15% for the social sciences and 13% for business (28% for the last two combined). In 2015 it was 16% for the arts and humanities and 19% for the social sciences (which then included business).

Since UK universities are relatively strong in the humanities and Asian universities relatively strong in business studies the result of this was a shift in the reputation rankings away from the UK and towards Asian universities. Oxford fell from 3rd (score 80.4) to 5th (score 69.1) in the reputation rankings and Bristol and Durham dropped out of the top 100 while Tsinghua University rose from 26th place to 18th, Peking University from 32nd to 21st and Seoul National University from 51-60 to 45th.

In the forthcoming world rankings British universities (although threatened by Brexit) ought to do better because of the inclusion of books in the publications and citations indicators and certain Asian universities, but by no means all, may do better because their citations for mega-projects will be partially restored.

Notice that THE have also said that this year they will combine the reputation scores for 2015 and 2016, something that is unprecedented. Presumably this will reduce the fall of UK universities in the reputation survey. Combined with the inclusion of books in the database, this means that UK universities may not fall this year and may even go up a bit (ATBB).

Friday, September 16, 2016

Some predictions for the THE rankings and summit

Here are my predictions for the THE rankings on the 21st and academic summit on the 26th -28th.

  • Donald Trump will not be invited to give a keynote address.
  • The decline of US public universities will be blamed on government spending cuts.
  • British universities will be found to be in mortal danger from Brexit and visa controls.
  • Phil Baty will give a rankings "masterclass" but will have to apologise to feminists because he couldn't think of anything else to call it.
  • The words 'prestige' and 'prestigious' will be used more times than in the novel by Christopher Priest or the film by Christopher Nolan.
  • The counting of books will help British universities, especially Oxford and Cambridge, but they will still be threatened by Brexit.
  • The partial reinclusion of citations of papers with 1,000+ authors, mainly in physics, will lead to a modest recovery of some universities in France, Korea, Japan and Turkey. The rise of Asia will resume.
  • Since the host city or university of THE summits somehow manages to get in the top ten, Berkeley will recover from last year's fall to 13th place. 
  • Last year the percentage of survey responses from the arts and humanities fell to 9% from 16%. I suspect that this year the fall might be reversed and that the reason THE are combining the reputation survey results for this year and 2015 is to reduce the swing back to UK universities, which are suffering because of visa controls and Brexit.
  • At least one of the above will be wrong.




Sunday, September 11, 2016

Waiting for the THE world rankings



The world, having recovered from the shocks of the Shanghai, QS and RUR rankings, now waits for the THE world rankings, especially the research impact indicator measured by field normalised citations.

It might be helpful to show the top 5 universities for this criterion since 2010-11.

2010-11
1. Caltech
2. MIT
3. Princeton
4. Alexandria University
5. UC Santa Cruz

2011-12
1. Princeton
2. MIT
3. Caltech
4. UC Santa Barbara
5. Rice University

2012-13
1. Rice University
2. National Research Nuclear University MEPhI
3. MIT
4. UC Santa Cruz
5. Princeton

2013-14
1. MIT
2. Tokyo Metropolitan University
3. Rice University
4. UC Santa Cruz
5. Caltech

2014-15
1. MIT
2. UC Santa Cruz
3. Tokyo Metropolitan University
4. Rice University
5. Caltech

2015-16
1. St George's, University of London
2. Stanford University
3. UC Santa Cruz
4. Caltech
5. Harvard

Notice that no university has been in the top five for citations in every year.

Last year THE introduced some changes to this indicator, one of which was to exclude papers with more than 1000 authors from the citation count. This, along with a dilution of the regional modification that gave a bonus to universities in low scoring countries, had a devastating effect on some universities in France, Korea, Japan, Morocco, Chile and Turkey.

The citations indicator has always been an embarrassment to THE, throwing up a number of improbable front runners aka previously undiscovered pockets of excellence. Last year they introduced some reforms but not enough. It would be a good idea for THE to get rid of the regional modification altogether, to introduce full scale fractional counting, to reduce the weighting assigned to citations, to exclude self-citations and secondary affiliations and to include more than one measure of research impact and research quality.

Excluding the papers, mainly in particle physics, with 1,000 plus "authors" meant avoiding the bizarre situation where a contributor to a single paper with 2,000 authors and 2,000 citations would get the same credit as 1,000 authors writing a thousand papers each of which had been cited twice.

But this measure also  meant that some of the most significant scientific activity of the century would not be counted in the rankings. The best solution would have been fractional counting, distributing the citations among all of the institutions or contributors, and in fact THE did this for their pilot African rankings at the University of Johannesburg.
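Fractional counting is simple to sketch: each institution's share of a paper's citations is proportional to its share of the author list. A minimal illustration, with hypothetical numbers:

```python
def fractional_credit(citations, institution_authors, total_authors):
    """Share of a paper's citations credited to one institution,
    distributed in proportion to its share of the author list."""
    return citations * institution_authors / total_authors

# A hypothetical 2,000-author physics paper with 2,000 citations:
# an institution with a single contributor gets 1 citation, not 2,000.
print(fractional_credit(2000, 1, 2000))    # 1.0
# An institution contributing 200 of the 2,000 authors gets a tenth of the credit.
print(fractional_credit(2000, 200, 2000))  # 200.0
```

Under this scheme the thousand authors of a thousand twice-cited papers would still collect far more credit than a marginal contributor to one hyper-cited paper, which is the point of the reform.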

Now, THE have announced a change for this year's rankings. According to their data chief Duncan Ross.

" Last year we excluded a small number of papers with more than 1,000 authors. I won’t rehearse the arguments for their exclusion here, but we said at the time that we would try to identify a way to re-include them that would prevent the distorting effect that they had on the overall metric for a few universities.


This year they are included – although they will be treated differently from other papers. Every university with researchers who author a kilo-author paper will receive at least 5 per cent credit for the paper – rising proportionally to the number of authors that the university has.
This is the first time that we have used a proportional measure in our citations score, and we will be monitoring it with interest.

We’re also pleased that this year the calculation of the Times Higher Education World University Rankings has been subject to independent audit by professional services firm PricewaterhouseCoopers (PwC)."
This could have perverse consequences. If an institution has one contributor to a 1,000-author paper with 2,000 citations then that author will bring all 2,000 citations to the university. But if there are 1,001 authors then he or she would bring in only 100 citations, the 5 per cent minimum.

It is possible that we will see a cluster of papers with 998, 999, 1000 authors as institutions remove their researchers from the author lists or project leaders start capping the number of contributors.
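The discontinuity is easy to see in a sketch of the rule as THE describe it (this is my reading of their announcement; the exact implementation has not been published):

```python
def the_credit(citations, institution_authors, total_authors):
    """Citations credited to one institution under THE's announced
    2016 rule: papers with more than 1,000 authors give each
    contributing institution at least 5% of the citations, rising
    proportionally with its share of the author list. Papers at or
    below 1,000 authors are credited in full, as before."""
    if total_authors <= 1000:
        return citations  # full credit, the traditional treatment
    share = institution_authors / total_authors
    return citations * max(0.05, share)

# One contributor to a hypothetical 2,000-citation paper:
print(the_credit(2000, 1, 1000))  # 2000 - full credit just under the threshold
print(the_credit(2000, 1, 1001))  # 100.0 - one extra author and the 5% floor applies
```

A single extra name on the author list cuts the institution's credit by a factor of twenty, which is exactly the incentive to cap author lists at 1,000 described above.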

This could be a way  of finding out if research intensive universities really do care about the THE rankings.

Similarly, QS now excludes papers with more than ten contributing institutions. If researchers are concerned about the QS rankings they will ensure that the number of institutions does not go above ten. Let's see if we start getting large numbers of papers with exactly ten institutions but few or none with 11, 12, 13 and so on.

I am wondering why THE would bother introducing this relatively small change. Wouldn't it make more sense to introduce a lot of small changes all at once and get the resulting volatility over and done with?

I wonder if this has something to do with the THE world academic summit being held at Berkeley on 26-28 September in cooperation with UC Berkeley. Last year Berkeley fell from 8th to 13th in the THE world rankings. Since it is a contributor to several multi-contributor papers it is possible that the partial re-inclusion of hyper-papers will help the university back into the top ten.



Thursday, September 08, 2016

More on Brexitophobic hysteria

John Field, an expert on lifelong learning and a small Scotswoman, comments on the growing Brexit hysteria blowing through academia.

Professor Field quotes the Vice Chancellor of the University of York:

"York, along with many other British universities, appears to have fallen in the QS league table because of concerns about the impact of Brexit; specifically, this has been attributed to worries about future access to research funding and whether we will be able to recruit excellent academic staff and students from all over the world."

The shadow of Brexit falls across the land


The western chattering and scribbling classes sometimes like to reflect on their superiority to the pre-scientific attitudes of the local peasantry: astrology, nationalism, religion and the like. But it seems that the credentialled elite of Britain are now in the grip of a great fear of an all-pervading spirit called Brexit, whose malign power is unlimited in time and space.

Thus the Independent tells us that university rankings (QS in this case) show that "post Brexit uncertainty and long-term funding issues" have hit UK higher education.

The Guardian implies that Brexit has something to do with the decline of British universities in the rankings without actually saying so.

"British universities have taken a tumble in the latest international rankings, as concern persists about the potential impact of Brexit on the country’s higher education sector. "

Many British universities have fallen in the QS rankings this year but the idea that Brexit has anything to do with it is nonsense. The Brexit vote was on June 23rd, well after QS's deadlines for submitting respondents for the reputation surveys and updating institutional data. The citations indicator refers to the period 2011-2015.

The belief that rankings reveal the dire effects of funding cuts and immigration restrictions is somewhat more plausible but fundamentally untenable.

Certainly, British universities have taken some blows in the QS rankings this year. Of the 18 universities in the top 100 in 2015 two are in the same place this year, two have risen and 14 have fallen. This is associated with a general decline in performance in the academic reputation indicator which accounts for 40% of the overall score.

Of those 18 universities three, Oxford, Cambridge and Edinburgh, hold the same rank in the academic reputation indicator, one, King's College London, has risen and fourteen are down.

The idea that the reputation of British universities is suffering because survey respondents have heard that the UK government is cutting spending or tightening up on visa regulations is based on some unlikely assumptions about how researchers go about completing reputation surveys.

Do researchers really base their assessment of research quality on media headlines, often inaccurate and alarmist? Or do they make an honest assessment of performance over the last few years or even decades? Or do they vote according to their self interest, nominating their almae matres or former employers?

I suspect that the decline of British universities in the QS reputation indicator has little to do with perceptions about British universities and a lot more to do with growing sophistication about and interest in rankings in the rest of the world, particularly in East Asia and maybe parts of continental Europe.






Wednesday, September 07, 2016

What was that about the origins of science in seventeenth century England?

Trigger warning

If you're triggered by just about anything, don't read this.

Those who dislike inherited privilege will be entertained by this account of the last days of Charles II. It is from a post by Gregory Cochran at the blog West Hunter.

It seems that there has been a little bit of progress over the centuries. The future Charles III has a thing about homeopathy, expensive pseudoscientific rubbish but at least it's harmless.

I can't help wondering whether the malign spirit of pseudoscience has now taken refuge in university faculties of social science with their endless crises of irreproducible research.

"Back in the good old days, Charles II, age 53, had a fit one Sunday evening, while fondling two of his mistresses.

Monday they bled him (cupping and scarifying) of eight ounces of blood. Followed by an antimony emetic, vitriol in peony water, purgative pills, and a clyster. Followed by another clyster after two hours. Then syrup of blackthorn, more antimony, and rock salt. Next, more laxatives, white hellebore root up the nostrils. Powdered cowslip flowers. More purgatives. Then Spanish Fly. They shaved his head and stuck blistering plasters all over it, plastered the soles of his feet with tar and pigeon-dung, then said good-night.


Tuesday. ten more ounces of blood, a gargle of elm in syrup of mallow, and a julep of black cherry, peony, crushed pearls, and white sugar candy.
Wednesday. Things looked good: only senna pods infused in spring water, along with white wine and nutmeg.
Thursday. More fits. They gave him a spirituous draft made from the skull of a man who had died a violent death. Peruvian bark, repeatedly, interspersed with more human skull. Didn’t work.
Friday. The king was worse. He tells them not to let poor Nelly starve. They try the Oriental Bezoar Stone, and more bleeding. Dies at noon."

Saturday, September 03, 2016

Another Important Ranking


Ranking fans have a busy week ahead of them. On Tuesday the QS world rankings will be announced and results will probably start leaking on Sunday or Monday. Then there will be the Shanghai broad subject rankings.

Times Higher Education have promised a major revelation on Monday. I suspect that this might just be the top ten or twenty of the world rankings or a preview of their new US college rankings. 

But this ranking might be more important. Hackerrank, "a platform that ranks engineers based on their coding skills and helps companies discover talent faster", has just published a ranking of countries according to the speed and accuracy with which developers can solve a variety of coding challenges. 

China is first and Russia second.

The USA is 28th and the UK 29th. Eastern Europe and East Asia generally perform well.

For once, there is some fairly good news for Africa and the Muslim world: Turkey is 30th, Egypt 42nd, Bangladesh 44th and Nigeria 48th. 

The top ten are

1. China
2. Russia
3. Poland
4. Switzerland
5. Hungary
6. Japan
7. Taiwan
8. France
9. Czech Republic
10. Italy

Tuesday, August 30, 2016

The Pursuit of Excellence



Congratulations to the Institute of Excellence and Higher Education (IEHE) in Bhopal, India, which has managed to maintain its 'A' grade from the National Assessment and Accreditation Council.

If you are wondering how they did it, see the story in the Hindustan Times
"The Institute of Excellence and Higher Education (IEHE) in Bhopal improved its teacher and student ratio from 1:47 to 1:24 a day before the National Assessment and Accreditation Council (NAAC) team was scheduled to visit to retain the institute’s Grade ‘A’.
A three-member NAAC team, led by former vice chancellor SK Singh, will reach on Monday and inspect the institute in 24 sessions.
IEHE, which was facing hardships due to shortage of teachers, appointed 54 guest faculties in a week. The strength of teachers increased from 58 to 112."

There is nothing very unusual about this sort of thing. There have been, for example, suspicions about some British universities offering "relatively short- term contracts"  that expire just after the Research Excellence Framework (REF) assessment is completed.




Tuesday, August 23, 2016

The Naming of Universities



You can tell a few things about universities from their names. If, for example, a university has a direction in its name then it is usually not ranked very highly: University of East London, Southern Illinois University. Those that are named after long dead people -- Yale, Duke -- are often but not always -- Bishop Grosseteste -- very prestigious.

It might be an idea to have a ranking of institutions with the most interesting or strangest names. After all nearly everything else about higher education is ranked somewhere or other. California University of Pennsylvania should be near the top. And of course there is Hamburger University and Butler University. Or, from a few years ago, The Universal Institute of Food Management and Hotel Administration I was Called by the Almighty and I Said Yes My Lord, which was actually a restaurant somewhere along the road from Maiduguri to Kano in Nigeria.

Another high flier might be the Lovely Professional University, a "semi-residential university college in North India", which is ranked 4326th in the world and 213th in India by Webometrics. I doubt if it will get many votes in the QS or THE academic surveys unless it changes its name, which I suspect might be a literal translation from Hindi or Sanskrit.

Monday, August 22, 2016

Worth Watching



Video
Salvatore Babones, Gaming the Rankings Game: University Rankings and the Future of the University


Friday, August 19, 2016

Shanghai Rankings 2016 Update




An interesting tweet from    reports that the average change in rank position in this year's Shanghai rankings was 32 compared to 11.7 between 2014 and 2015. Changes in methodology, even simple ones, can lead to a lot of churning.

Meanwhile, here are the correlations between the various indicators in the ranking. In general, it seems that the indicators are not measuring exactly the same thing, nor do they raise red flags by showing low or zero associations with each other.

Some of the lowest correlations are between publications and the alumni and award indicators (alumni and faculty winning Nobel and Fields awards). Publications counts papers in the Science Citation Index and the Social Sciences Citation Index in 2015, while the alumni and award indicators reach back several decades. Time makes a difference, and as measures of contemporary research excellence Nobel and Fields awards may be losing their relevance.




                   alumni   award   highly   Nature &   publications   PCP     total
                                    cited    Science
alumni             1        .764    .480     .708       .439           .612    .783
award              .764     1       .544     .751       .403           .686    .846
highly cited       .480     .544    1        .738       .581           .588    .825
Nature & Science   .708     .751    .738     1          .628           .687    .925
publications       .439     .403    .581     .628       1              .394    .733
PCP                .612     .686    .588     .687       .394           1       .757
total              .783     .846    .825     .925       .733           .757    1

All correlations are significant at the 0.01 level (2 tailed).
N is 500 in all cases except for Nature and Science where it is 497