Friday, August 31, 2012

The Shanghai Rankings 4

The publications indicator in the Shanghai ARWU simply counts all publications in the ISI Science Citation and Social Science Citation Indexes over the previous year. The Arts and Humanities Citation Index is excluded.

It is safe to assume that developing universities will start producing large numbers of publications before producing work of sufficient quality to justify a Nobel or Fields award or publication in Science or Nature. This indicator should therefore tell us something about the universities that are likely to forge ahead in the coming decade.

The top five for this indicator are:

1.  Harvard
2.  Toronto
3.  Michigan at Ann Arbor
4.  Tokyo
5.  Sao Paulo

Among the rising stars in the top fifty for this indicator are Seoul National University (15th), Peking (27th), National Taiwan University (36th) and Tsinghua (44th).

The Productivity per Capita indicator is rather incoherent: it combines the scores for the other indicators, which may represent achievements from decades ago or from last year, and divides them by the current number of senior faculty, including those in the humanities whose papers are not counted. This is one indicator where spending cuts could actually have a positive effect.
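As a rough illustration of how such a per-capita indicator behaves, here is a minimal sketch with invented weights and scores (the real ARWU weights and data differ):

```python
# Illustrative sketch of a Productivity per Capita calculation: the
# weighted scores of the other indicators divided by academic staff.
# All weights and scores below are hypothetical.

def pcp_score(indicator_scores: dict, weights: dict, fte_staff: int) -> float:
    """Weighted sum of the other indicator scores per staff member."""
    weighted = sum(indicator_scores[k] * weights[k] for k in weights)
    return weighted / fte_staff

weights = {"alumni": 0.1, "awards": 0.2, "hici": 0.2, "ns": 0.2, "pub": 0.2}
scores = {"alumni": 20.0, "awards": 15.0, "hici": 30.0, "ns": 25.0, "pub": 40.0}

# Cutting staff from 2000 to 1800 raises the per-capita score, which is
# the perverse incentive noted above.
print(pcp_score(scores, weights, 2000))
print(pcp_score(scores, weights, 1800))
```

On these made-up numbers the score rises simply because the denominator shrinks, with no change in research output at all.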

For once, Harvard is second and first place is taken by Caltech. The top fifty contains an assortment of specialised institutions such as Scuola Normale Superiore Pisa, Ecole Normale Superieure Paris, Stockholm School of Economics and the Technical University of Munich. 

Wednesday, August 29, 2012

Moving Up

Every so often political and academic leaders announce plans for getting into the top 50, 100 or 200 of the global rankings. The problem is that it is not always clear which rankings they are talking about. Not only do the various league tables have different indicators, but doing well in one can actually have negative effects in another. Racking up a large number of publications in ISI-indexed journals would be great for the Shanghai rankings but could be detrimental in the THE World University Rankings until those publications start getting citations that are above average for year, field and country.

A paper by Angela Yung Chi Hou and Chung-Lin Chiang of Fu Jen Catholic University, Taiwan, and Robert Morse of US News & World Report has been published in Higher Education Research and Development. Here is the abstract:

Since the start of the twenty-first century, university rankings have become
internationalized. Global rankings have a variety of uses, levels of popularity and
rationales and they are here to stay. An examination of the results of the current
global ranking reveals that well-reputed world-class universities are amongst the
top ranked ones. A major concern for university administrators in many parts of
the world is how to use the global rankings wisely in their mid-term and longterm
strategic planning for building their institutions into world-class
universities. Four major global rankings have been developed: the Academic
Ranking of World Universities, the World University Rankings, the
Webometrics Ranking of World Universities and the Performance Ranking of
Scientific Papers for World Universities. The main purpose of this paper is to
explore the most influential indicators in these global university rankings that
will affect the rank mobility of an institution. Based on an analysis of correlation
coefficients and K-means clustering, a model of strategic institutional planning
for building a world-class university is proposed.

The paper shows that for universities wishing to stay in the top 30 in various rankings, the most influential indicators are Nobel and Fields Awards in the Shanghai ARWU, Citations per Faculty and Faculty Student Ratio in the QS World University Rankings, Visibility in Webometrics and Citations in the Last Two Years in HEEACT.

For the ambitious intent on moving up the rankings, the indicators to watch are Papers in Nature and Science and Productivity per Capita in ARWU, the Academic Survey in QS, Visibility in Webometrics and the H-index in HEEACT.

An International Student Bubble?

Something of a mania for international students seems to be developing. In a single issue of University World News there are stories from Canada, China and Poland about plans to recruit students from abroad.


A new report urging Canadian universities to nearly double international student enrolment by 2022 signals a fundamental policy change in Canada.

The report, released last week, recommends that Canada increase the number of foreign students from 240,000 in 2011 to 450,000 by 2022.

The government-appointed panel led by Amit Chakma, president and vice-chancellor of the University of Western Ontario, also laid out a blueprint for how the federal government ought to support universities in their recruitment efforts.

From China

China has been wooing foreign universities and foreign students in a bid to internationalise its universities and as part of a ‘soft power’ policy to project itself internationally.

“China wants to be seen as a major player internationally in terms of education,” said Anthony Welch, a professor of international education at the University of Sydney.

“There is a clear national policy in China of ‘soft power’ using education. I would argue that is a good thing for all partners,” said Yang Rui, an assistant professor in Hong Kong University’s faculty of education.

The article by Yojana Sharma also refers to efforts by universities and governments in Malaysia and Singapore to recruit more students from abroad.

From Poland

Polish universities have introduced a free iPhone and iPad app to spread information internationally about opportunities in Polish higher education, and an Android version is promised soon.

The use of the latest technology will move the promotion of Polish higher education to a completely new level, according to a Polish Press Agency report quoting Dr Wojciech Marchwica of the Perspektywy Educational Foundation (Fundacja Edukacyjna Perspektywy), coordinator of the Study in Poland programme.
The universities are hoping to attract high-quality students from Ukraine, Russia, Belarus and Kazakhstan.

A few years ago, Ukraine was declared by the Study in Poland coordinating committee to be a priority source country, as it is tied to Poland by history, culture and geographical proximity.

The effort has already brought measurable results: the number of students from Ukraine grew from 1,989 in 2005 to 6,321 in 2012, an increase of more than 300%. In 2009 Study in Poland opened its first foreign office in Kyiv, at the Kyiv Polytechnic Institute.

China, Canada, Poland, Singapore and Malaysia are not the only places struggling for more international students.

So why is there such a craze for moving students back and forth across international borders?

One reason for adding more international students is that it is probably the easiest way to rise in the rankings (excluding ARWU) and the one with the quickest returns for universities outside the top 200 or 300. Getting faculty to do research and write papers is not always popular and may produce a backlash, especially if senior staff have political connections. Writing papers that are readable and citeable is even more difficult. Recruiting faculty to boost faculty student ratios can be expensive and may have an adverse impact on other indicators. The QS surveys are rather opaque and the THE citations indicator painfully complicated. But finding students who can cross a frontier to get a degree is comparatively easy and may even pay for itself. For University College Cork, just one international student would cover the cost of joining QS Stars.

There are other reasons. Canada appears to be hunting for students from abroad as a proxy for a meritocratic immigration policy. The problem here is that those talented engineers and computer scientists may be followed by not-so-talented spouses, siblings and cousins. China appears to be using universities to further diplomatic objectives, and Poland seems to be trying to challenge Russian cultural hegemony.


Tuesday, August 28, 2012

Comment on the QS Subject Rankings

An enjoyable, though perhaps a little intemperate, comment on the QS philosophy rankings from the Leiter Reports.

Several readers sent this item, the latest worthless misinformation from the "world universities" ranking industry, in which "QS" (which, contrary to rumor, does not actually stand for 'Quirky Silliness') is a main player. As a commenter at The Guardian site notes, five of the universities ranked tops in Geography do not even have geography departments! And which are the "top five" US universities in philosophy?
1. Harvard University
2. University of California, Berkeley
3. Princeton University
4. Stanford University
5. Yale University
That corresponds decently to the top five American research universities, to be sure, but it has nothing to do with the top five U.S. philosophy departments, at least not in the 21st-century. But it should hardly be surprising that if you ask academics teaching in philosophy departments in Japan or Italy to rank the best philosophy departments, many of them will use general university reputation as a proxy. Indeed, every department that is pretty obviously "overrated" in philosophy in this list is at a top research university, and every department obviously underrated is not: so, e.g., Rutgers comes in at a mere 13th, Pittsburgh at 18th (behind Brown and Penn), and North Carolina at 20th.
One may hope that no student thinking about post-graduate work will base any decisions on this nonsense.

Important Dates

September 11th. From QS Intelligence Unit

QS Intelligence Unit is pleased to invite you to attend this afternoon event featuring the global exclusive release of the full QS World University Rankings® 2012-2013 Tuesday 11th September 2012 in Trinity College Dublin. Be among 300 university delegates present for a focused ninety minute session and networking reception on the eve of the EAIE conference.


September 12th. From Morse Code

The 2013 edition of U.S. News's Best Colleges rankings will go live on usnews.com on Wednesday, September 12. National Universities, National Liberal Arts Colleges, Regional Universities, and Regional Colleges are included in these rankings.

Our website will have the most complete version of the rankings, tables, and lists. It will have extensive statistical profiles for each school as well as wide-ranging interactivity and a college search to enable students and parents to find the school that best fits their needs. These exclusive rankings will also be published in our Best Colleges 2013 edition guidebook, which will go on sale September 18 on newsstands and at usnews.com.


October 3rd.  Times Higher Education
 
The annual THE rankings, which UK universities and science minister David Willetts said are "fast becoming something of a fixture in the academic calendar", will be published live online at 21.00 BST on 3 October.
A special rankings print supplement will also be published with the 4 October edition of THE, and the results will be available on a free interactive iPhone application.

Monday, August 27, 2012

The Shanghai Rankings 3

Two of the indicators in the Shanghai rankings measure research achievement at the highest level. The highly cited researchers indicator is based on a list of those scientists who have been cited most frequently by other researchers. Since ARWU counts current but not past affiliations of researchers, it is possible for a university to boost its score by recruiting researchers. This indicator might then be seen as signalling a willingness to invest in and to retain international talent and hence a sign of future excellence.

The top five for this indicator are

1.  Harvard
2.  Stanford
3.  UC Berkeley
4.  MIT
5.  Princeton

Quite a few US state universities and non-Ivy League schools do well on this indicator: the University of Michigan (6th), University of Washington (13th), University of Minnesota (19th), Penn State (23rd) and Rutgers (42nd).

Before this year, the methodology for this indicator was simple: if a highly cited researcher had two affiliations, there was a straightforward fifty-fifty division. Things were complicated when King Abdulaziz University (KAU) in Jeddah signed up scores of researchers on part-time contracts, a story recounted in Science. ARWU has responded deftly by asking researchers with joint affiliations to indicate how their time is divided, and this seems to have deflated KAU's score considerably while having little or no effect on anyone else.
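A sketch of the change in counting rules, with invented names and fractions:

```python
# Sketch of fractional credit for highly cited researchers with joint
# affiliations. Names and time fractions are invented for illustration;
# under the old rule every joint affiliation counted as a flat 0.5.

def university_credit(researchers, university):
    """Sum each researcher's self-reported time fraction at `university`."""
    return sum(r["fractions"].get(university, 0.0) for r in researchers)

researchers = [
    {"name": "A", "fractions": {"Home U": 1.0}},
    # A part-time contract that once earned a flat 0.5 may now count far less:
    {"name": "B", "fractions": {"Home U": 0.85, "KAU": 0.15}},
]

print(university_credit(researchers, "KAU"))     # 0.15 rather than the old 0.5
print(university_credit(researchers, "Home U"))
```

This is why the new rule deflates a university whose highly cited researchers are mostly part-timers, while leaving full-affiliation universities essentially untouched.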

The top five universities for papers in Nature and Science are:

1.  Harvard
2.  Stanford
3.  MIT
4.  UC Berkeley
5.  Cambridge

High fliers on this indicator include several specialised science and medical institutions such as Imperial College London, Rockefeller University, the Karolinska Institutet and the University of Texas Southwestern Medical Center.

Self Citation

In 2010 Mohamed El Naschie, former editor of the journal Chaos, Solitons and Fractals, embarrassed a lot of people by launching the University of Alexandria into the world's top five universities for research impact in the new Times Higher Education (THE) World University Rankings. He did this partly by diligent self citation and partly by a lot of mutual citation with a few friends and another journal. He was also helped by a ranking indicator that gave the university disproportionate credit for citations in a little cited field, for citations in a short period of time and for being in a country where there are few citations.

Clearly self citation was only part of the story of Alexandria's brief and undeserved success, but it was not an insignificant one.

It now seems that Thomson Reuters (TR), who collect and process the data for THE, are beginning to get a bit worried about "anomalous citation patterns". According to an article by Paul Jump in THE:

When Thomson Reuters announced at the end of June that a record 26 journals had been consigned to its naughty corner this year for "anomalous citation patterns", defenders of research ethics were quick to raise an eyebrow.

"Anomalous citation patterns" is a euphemism for excessive citation of other articles published in the same journal. It is generally assumed to be a ruse to boost a journal's impact factor, which is a measure of the average number of citations garnered by articles in the journal over the previous two years.

Impact factors are often used, controversially, as a proxy for journal quality and, even more contentiously, for the quality of individual papers published in the journal and even of the people who write them.

When Thomson Reuters discovers that anomalous citation has had a significant effect on a journal's impact factor, it bans the journal for two years from its annual Journal Citation Reports (JCR), which publishes up-to-date impact factors.

"Impact factor is hugely important for academics in choosing where to publish because [it is] often used to measure [their] research productivity," according to Liz Wager, former chair of the Committee on Publication Ethics.

"So a journal with a falsely inflated impact factor will get more submissions, which could lead to the true impact factor rising, so it's a positive spiral."

One trick employed by editors is to require submitting authors to include superfluous references to other papers in the same journal.

A large-scale survey by researchers at the University of Alabama in Huntsville's College of Business Administration published in the 3 February edition of Science found that such demands had been made of one in five authors in various social science and business fields.

That TR are beginning to crack down on self citation is good news. But will they follow their rivals QS and stop counting self citations in the citation indicator in their rankings? When I spoke to Simon Pratt of TR at the World Class Universities conference in Shanghai at the end of last year, he seemed adamant that they would go on counting self citations.

Even if TR and THE start excluding self citations, it would probably not be enough. It may soon become necessary to exclude intra-journal citations as well.
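Operationally, excluding both kinds of citation is a simple filter; the record format below is invented for illustration:

```python
# Sketch of filtering self-citations and intra-journal citations from a
# citation count, as the post suggests rankers may eventually need to do.

def filtered_citation_count(paper, citations):
    kept = []
    for c in citations:
        # Self-citation: the citing paper shares an author with the cited one.
        if set(c["authors"]) & set(paper["authors"]):
            continue
        # Intra-journal citation: both papers appeared in the same journal.
        if c["journal"] == paper["journal"]:
            continue
        kept.append(c)
    return len(kept)

paper = {"authors": ["El Naschie"], "journal": "Chaos, Solitons & Fractals"}
citations = [
    {"authors": ["El Naschie"], "journal": "Chaos, Solitons & Fractals"},  # self
    {"authors": ["Friend"], "journal": "Chaos, Solitons & Fractals"},      # intra-journal
    {"authors": ["Independent"], "journal": "Another Journal"},            # kept
]
print(filtered_citation_count(paper, citations))  # 1
```

On this toy example only one of three citations survives, which is roughly the Alexandria pattern: a citation count built largely on self and intra-journal citation collapses once both are stripped out.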

Saturday, August 25, 2012

Universiti Malaya Again

In many countries performance in international university rankings has become as much a symbol of national accomplishment as winning Olympic medals or qualifying for the World Cup. When a local university rises in the rankings it is cause for congratulations for everyone, especially for administrators. When they fall it is an occasion for soul-searching and a little bit of schadenfreude for opposition groups.

Malaysia has been particularly prone to this syndrome. There was a magical moment in 2004 when the first THES-QS ranking put Universiti Malaya (UM), the country's first university, in the world's top 100. Then it went crashing down. Since then it has moved erratically up and down around the 200th position.

Lim Kit Siang, leader emeritus of the Malaysian opposition has this to say in his blog:

At the University of Malaya’s centennial celebrations in June 2005, the then Deputy Prime Minister Datuk Seri Najib Razak threw the challenge to University of Malaya to raise its 89th position among the world’s top 100 universities in THES-QS (Times Higher Education Supplement-Quacquarelli Symonds) ranking in 2004 to 50 by the year 2020.

Instead of accepting Najib’s challenge with incremental improvement of its THES ranking, the premier university went into a free fall when in 2005 and 2006 it fell to 169th and 192nd ranking respectively, and in the following two years in 2007 and 2008, fell out of the 200 Top Universities ranking altogether.

In 2009, University of Malaya made a comeback to the 200 Top Universities Ranking when it was placed No. 180, but in 2010 it again fell out of the 200 Top Universities list when it dropped to 207th placing.

For the 2011 QS Top 200 Universities Ranking, University of Malaya returned to the Top 200 Universities Ranking, being placed at No. 167.

In the THES-QS World University Rankings 2009, University of Malaya leapfrogged 50 places from No. 230 placing in 2008 to No. 180 in 2009; while in the 2011 QS World University Ranking, University of Malaya leapt 40 places from No. 207 in 2010 to No. 167 in 2011.

The QS World University Rankings 2012 will be released in 20 days’ time. Can University of Malaya make another leapfrog as in 2009 and 2011 to seriously restore her place as one of the world’s top 100 universities before 2015?


The government has announced that in addition to Najib’s challenge to University of Malaya in 2005 to be among the world’s Top 50 universities by 2020, the National Higher Education Strategic Plan called for at least three Malaysian universities to be ranked among the world’s top 100 universities.

Recently, the U.S. News World’s Best Universities Rankings included five local universities in its Top 100 Asian Universities, but this is not really something to celebrate.

The U.S. News World’s Best Universities Ranking is actually based on the QS 2012 Top 300 Asian University Rankings released on May 30 this year, which commented that overall, although University of Malaya improved its ranking as compared to 2011 ranking, the majority of Malaysian universities dropped in their rankings this year as compared to 2011.

There is a lot of detail missing here. UM's fluctuating scores had nothing to do with failed or successful policies but resulted from errors, corrections of errors or "clarification of data", changes in methodology, and variations in the collecting and reporting of data.

UM was only in the top 100 of the THES-QS rankings because of a mistake by QS, the data collectors, who thought that ethnic minority students and faculty were actually foreigners and therefore handed out a massive and undeserved boost on the international faculty and international student indicators.

Its fall in 2005 was the result of QS's belated realisation of its mistake.

The continued decline in 2007 may have been because QS changed its procedures to prevent respondents to the academic survey from voting for their own institutions, or because of the introduction of Z-scores, which had the effect of substantially boosting the scores in citations per faculty for mediocre universities like Peking but only slightly for laggards like UM.
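A sketch of the mechanics, with invented citation figures; the mapping of z to 50 + 10z is one common scaling convention, not necessarily the one QS used:

```python
# Why a switch to Z-scores can reshuffle an indicator. The raw
# citations-per-faculty figures below are invented; real citation
# distributions are similarly skewed, with one runaway leader.
from statistics import mean, pstdev

def scale_linear(raw):
    """Old-style scaling: top scorer gets 100, everyone else proportionally."""
    top = max(raw.values())
    return {u: 100 * v / top for u, v in raw.items()}

def scale_zscore(raw):
    """Standardize, then map z = 0 (the mean) to 50 points, clipped to 0-100."""
    mu, sigma = mean(raw.values()), pstdev(raw.values())
    return {u: min(100, max(0, 50 + 10 * (v - mu) / sigma))
            for u, v in raw.items()}

raw = {"Harvard": 100, "Peking": 10, "UM": 2,
       "U4": 6, "U5": 5, "U6": 5, "U7": 4, "U8": 3, "U9": 3, "U10": 2}

# Linear scaling leaves the whole pack far behind the runaway leader;
# z-scoring lifts the pack towards the mean of 50.
print({u: round(s, 1) for u, s in scale_linear(raw).items()})
print({u: round(s, 1) for u, s in scale_zscore(raw).items()})
```

On these toy numbers Peking jumps from 10 points to nearly 50, so exactly where a university sits relative to the pack determines how much the methodology change helps it.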

The rise in 2009 from 230th to 180th position was largely the result of a big improvement in the score for faculty student ratio, comprising both a reported fall in the number of students and a reported rise in the number of faculty. It is unlikely that the university administration had thrown 6,000 students into the Klang River: more probably somebody told somebody that diploma and certificate students need not be included in the data reported to QS.

Whether UM rises again in the QS rankings is less interesting than its performance in the Shanghai Academic Ranking of World Universities. In 2011 it moved into the top 500 with scores of 3.4 for highly cited researchers, 34.6 for ISI-indexed publications (compared with 100 for the front-runner Harvard) and 16 for per capita productivity (in this case the top scorer was Caltech).

In 2012 UM had the same score for highly cited researchers and registered a score of 38.6 for publications and a slight improvement to 16.7 for productivity. This meant that UM was now ranked in 439th place and that reaching the 300-400 band in a few years' time would not be impossible.

UM has managed to make it into the Shanghai rankings by actively encouraging research among its faculty and by recruiting international researchers, policies that are unpopular and in marked contrast to those of other Malaysian universities.

What will happen in the QS rankings when they come out next month? Something to watch out for is the employer survey, which has a weighting of ten per cent. In 2011 something odd was going on. Apparently there had been such an enthusiastic response to the rankings in Latin America, especially the employer survey, that QS resorted to capping the scores for many universities. They reported that:


"QS received a dramatic level of response from Latin America in 2011, these counts and all subsequent analysis have been adjusted by applying a weighting to responses from countries with a distinctly disproportionate level of response."
It seems that one effect of the inflated number of responses was to raise the mean score, so that universities with below average scores saw a dramatic fall in their adjusted scores. If there is a further increase in responses this year, universities like UM may see a further reduction on this indicator.
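The arithmetic of that effect can be sketched as follows, with invented response counts and a simple mean-anchored scoring rule standing in for QS's undisclosed one:

```python
# How a surge of survey responses elsewhere can drag down a university
# whose own raw response count is unchanged. All counts are invented.
from statistics import mean, pstdev

def survey_score(counts, university):
    """Mean-anchored score: 50 at the mean, +/-10 per standard deviation."""
    mu, sigma = mean(counts.values()), pstdev(counts.values())
    return 50 + 10 * (counts[university] - mu) / sigma

before = {"UM": 80, "A": 100, "B": 90, "C": 70, "D": 60}
# A burst of responses for other universities raises the mean; UM's own
# count is unchanged but its relative score falls.
after = dict(before, A=400, B=350, C=300)

print(round(survey_score(before, "UM"), 1))
print(round(survey_score(after, "UM"), 1))
```

UM starts exactly at the mean (a score of 50) and ends well below it without losing a single response of its own, which is the pattern the capping was meant to address.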

Sunday, August 19, 2012

The Shanghai Rankings 2

The Shanghai Rankings get more interesting when we look at the individual indicators. Here are the 2012 top five for Alumni who have won Nobel and Fields awards.

1. Harvard
2. Cambridge
3. MIT
4. Berkeley
5. Columbia

In the top fifty for this indicator there are the Ecole Normale Superieure, Moscow State University, the Technical University of Munich, Goettingen, Strasbourg and the City College of the City University of New York.

Essentially, this indicator allows universities that have seen better decades to gain a few points from an academic excellence that has long been in decline. City College of New York is an especially obvious victim of politics and bureaucracy.

The top five in the Awards indicator, faculty who have won Nobel prizes and Fields medals, are:

1.  Harvard
2.  Cambridge
3.  Princeton
4.  Chicago
5.  MIT

The top fifty includes the Universities of Buenos Aires, Heidelberg, Paris Dauphine, Bonn, Munich and Freiburg. Again, this indicator may be a pale reflection of past glory rather than a sign of future accomplishments.





Saturday, August 18, 2012

The Shanghai Rankings 1

The 2012 edition of Shanghai Jiao Tong University's Academic Ranking of World Universities has been published. Here are the top ten, which are the same as last year's top ten.

1.  Harvard
2.  Stanford
3.  MIT
4.  UC Berkeley
5.  Cambridge
6.  Caltech
7.  Princeton
8.  Columbia
9.  Chicago
10. Oxford

It is necessary to go down to the 19th and 20th places to find any changes. Tokyo is now 19th and University College London 20th, reversing last year's order and restoring that of 2003.

Saturday, August 11, 2012


What’s up at Wits?
 
The University of the Witwatersrand is in turmoil. Faculty are going on strike for higher salaries, claiming that there has been a drastic decline in quality in recent years. Evidence for this decline is the university's fall of more than a hundred places in the QS world rankings. The administration has argued that these rankings are not valid.

THE University of the Witwatersrand is one of SA's largest and oldest academic institutions. According to its strategic planning division, at the end of last year there were about 1300 academic staff, 2000 administrative staff and nearly 30000 students, with 9000 of these being postgraduates.

There is no doubt that Wits has pockets of excellence, and many talented academics who are players on the global stage. However, this excellence is being overwhelmed and dragged down by inefficient bureaucracy in its administrative processes.

There are more administrative staff than academic staff, and as one academic said: "It is impossible to get anything done."

David Dickinson, president of the Academic Staff Association of Wits University - which has more than 700 members and is threatening to strike - said: "Between 2007 and last year, we fell more than 100 places in the QS World University Rankings. A significant problem is that the most important part of the university has been forgotten: its employees."

The university is ranked second in the country, after the University of Cape Town, but scraped into the top 400 in the world at 399th in the QS World University Rankings for last year.

The faculty are correct about the QS rankings. Between 2007 and 2011 the university fell from 283rd place to 399th. The decline was especially apparent in the employer review, from 191st to below 301st, and international faculty, from 69th to 176th.

But there is a problem. From 2007 to 2011 Wits steadily improved on some indicators in the Shanghai rankings: from 10.9 to 11.2 for publications in Nature and Science, from 26.2 to 29.9 for publications and from 14.8 to 16.3 for faculty productivity. The score for alumni winning Nobel prizes declined from 23.5 to 21.2, but only because its two alumni were being measured against an improving score for front-runner Harvard.

So which ranking is correct? Probably they both are, because they refer to two different periods. The alumni who contributed to the Alumni indicator in ARWU graduated in 1982 and 2002. Publications and papers in Nature and Science could reflect the fruits of research projects that began up to a decade earlier.

The QS rankings (formerly the THE-QS rankings) are heavily weighted towards two surveys of debatable validity. The declining score for Wits in the employer review, from 59 points (well above the mean of 50) to 11, is remarkable and almost certainly has nothing to do with the university: it is the result of a flooding of the survey by supporters of other institutions, leading to a massive increase in the average number of responses.

The decline in other scores, such as international faculty and faculty student ratio, could be the result of short term policy changes. However, if it is correct that research and teaching are being strangled by bureaucracy and mistaken policies, then sooner or later we should start seeing indications in the Shanghai rankings.


Monday, August 06, 2012

Philippine Universities and the QS English Rankings

The QS subject rankings have produced quite a few surprises. Among them is the high position of several Philippine universities in the 2012 English Literature and Language ranking. In the top 100 we find Ateneo de Manila University, the University of the Philippines and De La Salle University. Ateneo de Manila in 24th place is ahead of Birmingham, Melbourne, Lancaster and University College Dublin.

How did the Philippine universities do so well? First, the subject rankings are based on different combinations of criteria. The English Literature and Language ranking has a 90 per cent weighting for the academic survey conducted in 2011 and 10 per cent for the employer survey. Unlike the natural sciences, there is nothing for citations. Essentially, then, the English ranking is a measure of reputation in the subject, and these universities were picked by a large number of survey respondents.
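The subject score described above then reduces to a simple weighted sum; the survey scores below are hypothetical:

```python
# The QS English Literature and Language subject score, as described
# above: 90% academic survey, 10% employer survey, nothing for citations.
# The input scores are invented for illustration.

def subject_score(academic: float, employer: float) -> float:
    """Weighted combination of the two survey scores."""
    return 0.9 * academic + 0.1 * employer

print(subject_score(80.0, 60.0))  # 78.0
```

With nine tenths of the weight on one survey, a strong showing among a particular pool of respondents is enough to place a university very high in this subject.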

One feature of the QS academic survey is that respondents can choose to nominate universities globally or by region. Ateneo de Manila's performing better than Birmingham or Melbourne in this subject most probably means that it was being compared with others in Asia while the latter were assessed internationally.

Also, the category English Literature and Language is an extremely diverse one, covering scholars toiling away at a critical edition of Chaucer, post-modern cultural theorists and researchers in language education. I suspect that the high scores for Ateneo de Manila and the other universities came from dozens of postgraduate TESOL students in the US and Australia. It would be a good idea for QS to have separate rankings for English literature and English language education.

As usual, university administrators seem to be somewhat confused about the rankings. The Dean of the Faculty of Arts and Letters at the University of Santo Tomas is reported as saying:

The University, he pointed out, did not get any request for data from QS, the London consultancy that comes out with annual university rankings:
“With due respect to the QS, I think we should also know how the data is being collected, because as far as we are concerned, we are the academic unit taking care of arts and humanities and philosophy and literature,” he told the Varsitarian.
The QS survey may have been perception-based, and data gathering could have relied on what’s available on the Internet, Vasco added. “The question is, how do they source the data? Do they simply get it from the general information known about the University? Do they simply get it from the website? What if the website is not updated? What information will you get there?” he asked.
Vasco also said it would be difficult to compete in other clusters of the Arts and Humanities category of the QS subject rankings, namely Philosophy, Modern Languages, Geography, History, and Linguistics.
“[We] do not offer the same breadth of programs being surveyed under the arts and humanities cluster in the QS survey,” Vasco said.
The growing number of participants in the QS survey has contributed to the general decline of Philippine schools in various QS rankings, the Artlets dean noted. “More and more international universities from highly industrialized countries are participating, like universities from Europe, North America, and even Asia-Pacific,” he said. “Chances are, Philippine schools will slide down to lower rankings.”

For once, QS is being unfairly treated. The methodology of the subject rankings is explained quite clearly here.



Saturday, August 04, 2012


QS Stars

University World News (UWN) has published an article by David Jobbins about QS Stars, which are awarded to universities that pay (most of them, anyway) for an audit and a three-year licence to use the stars, and which are shown alongside the listings in the QS World University Rankings. Participation is not spread evenly around the world, and according to a QS brochure it is mainly mediocre universities, or worse, that have signed up. Nearly half of the universities that have opted for the stars are from Indonesia.

Jobbins refers to a report in Private Eye which in turn refers to the Irish Examiner. He writes:

The stars appear seamlessly alongside the listing for each university on the World University Rankings, despite protestations from QS that the two are totally separate operations.

The UK magazine Private Eye reported in its current issue that two Irish universities – the University of Limerick and University College Cork, UCC – had paid “tens of thousands” of euro for their stars.

The magazine recorded that UCC had told the Irish Examiner that the €22,000 (US$26,600) cost of obtaining the stars was worthwhile, as it could be recouped through additional international student recruitment.

The total cost for the audit and a three-year licence is US$30,400, according to the scheme prospectus.


 The Irish Examiner article by Neil Murray is quite revealing about the motivation for signing up for an audit:

UCC paid almost €22,000 for its evaluation, which includes a €7,035 audit fee and three annual licence fees of €4,893. It was awarded five-star status, which it can use for marketing purposes for the next three years.

The audit involved a visit to the college by QS researchers but is mostly based on analysis of data provided by UCC on eight criteria. The university’s five-star rating is largely down to top marks for research, infrastructure, internationalisation, innovation, and life science, but it got just three stars for teaching and engagement.
About 3,000 international students from more than 100 countries earn UCC approximately €19 million a year.

UCC vice-president for external affairs Trevor Holmes said there are plans to raise the proportion of international students from 13% — one of the highest of any Irish college — to 20%.

"Should UCC’s participation in QS Stars result in attracting a single additional, full-time international student to study at UCC then the costs of participation are covered," he said.

"In recent times, unlike many other Irish universities, UCC has not been in a position to spend significant sums on marketing and advertising domestically or internationally. QS Stars represents a very cost-effective approach of increasing our profile in international media and online."
So now we know how much a single international student adds to the revenue of an Irish university.
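The arithmetic behind that conclusion, using the quoted figures (all approximate):

```python
# A back-of-the-envelope check on the UCC figures quoted above
# (approximate numbers from the Irish Examiner article).

total_revenue = 19_000_000   # euros per year from international students
num_students = 3_000
stars_cost = 22_000          # euros: audit fee plus three annual licence fees

revenue_per_student = total_revenue / num_students
print(round(revenue_per_student))                  # roughly 6333 euros a year

# Years of one extra student's revenue needed to recoup the Stars fee:
print(round(stars_cost / revenue_per_student, 1))  # about 3.5
```

On these numbers a single additional student covers the fee over roughly the three-year licence period, which is presumably what the vice-president meant.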

So far, there is nothing really new here. The QS Stars system has been well publicised and it probably was a factor in Times Higher Education dropping QS as its data collecting partner and replacing them with Thomson Reuters.

What is interesting about the UWN article is that a number of British and American universities have been given the stars without paying anything. These include Oxford and Cambridge and 12 leading American institutions that are described by QS as "independently audited based on publicly available information". It would be interesting to know whether the universities gave QS permission to award them stars in the rankings. Also, why are there differences between the latest rankings and the QS brochure? Oxford does not have any stars in last year's rankings but is on the list in the brochure. Boston University has stars but is not on the list. It may just be a matter of updating.

It would probably be a good idea for QS to remove the stars from the rankings and keep them in the university profiles.