Friday, January 30, 2015

Who says university isn't worth it?

In Malaysia it might be.

According to a local blog, the customary dowry ("hantaran", distinct from the religiously sanctioned "mas kahwin", which is very modest) paid to the family of the bride varies significantly with the bride's level of education.

For a woman with UPSR (primary school certificate) it is 2,000-4,000 Ringgit.
For SPM (secondary school certificate) holders it is 4,000-8,000 Ringgit.
For STPM holders (equivalent to 'A' levels) it is 8,000-12,000 Ringgit.
For degree holders it is 12,000-15,000 Ringgit.
For master's holders it is 15,000-20,000 Ringgit.
For PhDs it is 20,000-30,000 Ringgit.

As far as I know, there is no premium for international universities or for those with a high place in the global rankings. Yet.
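And for the computationally minded, the whole tariff fits in a few lines. A toy lookup based on the figures above (ranges in Ringgit; purely illustrative):

```python
# Customary hantaran ranges by the bride's qualification, in Ringgit,
# taken from the schedule reported above.
HANTARAN = {
    "UPSR": (2_000, 4_000),
    "SPM": (4_000, 8_000),
    "STPM": (8_000, 12_000),
    "degree": (12_000, 15_000),
    "master's": (15_000, 20_000),
    "PhD": (20_000, 30_000),
}

low, high = HANTARAN["PhD"]
print(f"A PhD commands {low:,}-{high:,} Ringgit")
```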

Thursday, January 29, 2015

THE Under Fire in the Far North and Down Under

Times Higher Education (THE) has been attacked for the latest spin-off from its world rankings. Alex Usher of Higher Education Strategy Associates has tweeted: "your 'most international unis' rankings lack even the barest face validity. Galway more intl than Harvard? C'mon".

The higher education editor of the Australian, Julie Hare, reports that Australian observers are surprised that Monash University, reputed to be the most international Australian university, has been ranked so low, and quotes a comment at THE: "What cretin can assert that LSE and Cambridge are less 'international' than Brunel and Canterbury?"

I wonder if there will be similar comments on THE's preview, released ahead of a summit in Qatar, of its forthcoming MENA rankings. Here is the top five for a research impact indicator based on field- and year-normalised citations. First place goes to Texas A and M University Qatar; the other four are the Lebanese American University, King Abdulaziz University in Jeddah, Qatar University and the American University of Beirut.

In case you are wondering, Texas A and M Qatar does have someone on the Large Hadron Collider project.
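The LHC connection matters because of how the metric works: field and year normalisation divides each paper's citations by the world average for papers in the same field published in the same year, so a few heavily cited physics papers can lift a small publication count a very long way. A rough sketch of the calculation (hypothetical data; not the rankers' actual implementation):

```python
from collections import defaultdict

def field_year_baselines(world_papers):
    """Average citations per (field, year) cell in a reference corpus.
    world_papers: iterable of (field, year, citations) tuples."""
    sums, counts = defaultdict(float), defaultdict(int)
    for field, year, cites in world_papers:
        sums[(field, year)] += cites
        counts[(field, year)] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

def normalised_impact(papers, baselines):
    """Mean ratio of each paper's citations to its field-year world
    average. A score of 1.0 means exactly world-average impact."""
    ratios = [c / baselines[(f, y)] for f, y, c in papers]
    return sum(ratios) / len(ratios)

# One mega-collaboration paper dominates a tiny portfolio:
world = [("physics", 2013, 10), ("physics", 2013, 30), ("history", 2013, 2)]
mine = [("physics", 2013, 400), ("history", 2013, 1)]
print(normalised_impact(mine, field_year_baselines(world)))  # 10.25
```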

Wednesday, January 28, 2015

The Most International Universities

Times Higher Education has released its list of the top 100 most international universities. This is simply the International Outlook indicator extracted from last year's world rankings. It counts the proportions of international students and international staff and the percentage of papers with international collaborators.
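As a rough sketch, the pillar behaves like a simple average of three proportions. The equal weighting below is an assumption for illustration, not a statement of THE's published methodology:

```python
def international_outlook(intl_students, students,
                          intl_staff, staff,
                          intl_papers, papers):
    """Average of three proportions, scaled to 100.
    Equal weights are assumed here for illustration."""
    parts = (intl_students / students,
             intl_staff / staff,
             intl_papers / papers)
    return 100 * sum(parts) / len(parts)

# A university with 40% international students, 35% international staff
# and half its papers internationally co-authored scores about 41.7:
print(round(international_outlook(4_000, 10_000, 700, 2_000, 2_500, 5_000), 1))
```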

The top three are in Switzerland. The National University of Singapore is fourth and Ecole Polytechnique in Paris fifth. The Ecole is up from 28th place last year, which probably means a "clarification of data" of some sort.

So what makes a university international?

It helps a lot to be located in an English-speaking country. The UK and Australia get high scores.

What is more noticeable is that many small countries do well, especially if they are located next to a big country with the same or similar language and culture: Switzerland, Singapore, Macau, Hong Kong, Austria, Denmark. Big countries like Mainland China, the USA and India do not.

Perhaps THE (and QS) should think about the implications of its methods. Does it make sense to count as international a student who moves a few miles from Fermanagh to Galway, Bavaria to Austria or Johor to Singapore?

Perhaps THE should count the whole of the EU as a single country, or give extra points for students and faculty who cross an ocean rather than an increasingly meaningless line on a map. Perhaps also, Macau and Hong Kong should be reunited methodologically with the Mainland. What about counting out-of-state students at US universities?




Thursday, January 22, 2015

More University Mission Creep

Demands on western universities continue to increase and so do calls for more and more indicators in national and global rankings. Universities, it seems, are underemployed if they just provide instruction in academic, professional and technical subjects and promote research and scholarship.

Now they are supposed to support diversity and inclusiveness, build character, grit and resilience, promote anti-racism, combat sexism, homophobia, cisgender normativity and weightism, boycott Israel, reward students for overcoming adversity, engage with communities, combat terrorism, transform lives, provide gender-free bathrooms, sponsor near-professional-level sports teams, boycott fossil fuels, and make everybody safe and comfortable except for those whose privilege needs continued confrontation.

All this is now spilling over into the rankings business. We have already seen Universitas 21, which ranks national university systems, give countries a score for the number of female students and faculty, and there have been repeated proposals that the US News law school rankings should include faculty and student diversity among their criteria.

US News has published a diversity index that consists simply of calculating the percentage of minority students. First place goes to Cornell, followed by the University of Hawaii at Manoa, Whittier College, California, the University of the District of Columbia and Nova Southeastern University in Florida. A quick calculation of the correlation between the diversity index and overall scores in the law school rankings shows no significant relationship between diversity so defined and overall quality as measured by the rankings.
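Anyone can repeat the check once the two columns of numbers are typed in. A minimal sketch with invented figures (substitute the published US News numbers to reproduce the calculation):

```python
from scipy.stats import pearsonr

# Hypothetical (overall score, diversity index) pairs for a handful of
# schools; the real figures come from the US News tables.
overall_scores = [100, 84, 78, 71, 65, 58, 52, 47]
diversity_index = [0.43, 0.19, 0.51, 0.22, 0.48, 0.31, 0.55, 0.27]

r, p = pearsonr(overall_scores, diversity_index)
print(f"r = {r:.2f}, p = {p:.3f}")  # near-zero r for this toy data
```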

To incorporate such an index into the law school rankings would be a pointless exercise. If Nova Southeastern University Law School, which accepts nearly half of those who apply and a third of whose graduates are not employed nine months after graduation, were to get a high ranking, this would be seriously misleading for everybody.

The US federal government is now proposing to rate colleges according to their admission of low-income and first-generation students, affordability, and outcomes such as graduation rates, graduate employment and entry into postgraduate programs. The problem here is that in the US, and almost everywhere else, these objectives pull against one another. Low-income students are, on average, likely to be less academically capable, and that means, if academic standards remain unchanged, that fewer will graduate and fewer will go on to graduate study.

If the ratings plan ever happens, the likeliest consequence of these colliding demands is that it will become much easier to get a degree or get into graduate school. There are dozens of ways in which academic standards can be eroded, most of which have already been seen somewhere.

Another kind of creep is the rising chorus that universities should encourage and promote civic engagement, a rather slippery concept that is difficult to describe but covers a variety of worthy activities reaching beyond the campus, such as promoting local economic development, employing women and minority groups, helping poor students succeed, buying local products and encouraging students to be volunteer teachers. A recent conference in South Africa ended with a call for action that included a proposal that rankings should take account of such activities.

Adam Habib, Vice-Chancellor of the University of the Witwatersrand, even proposed to boycott rankings unless they included civic engagement as an indicator.

"Gather a group of universities and tell the rankings that you'll collectively withdraw if they don't take in civic engagement in the future. I guarantee that every one of them will listen".

But why should universities be required to do what other, far better qualified, institutions have failed to do? If entrepreneurs cannot promote economic growth, revolutionary parties cannot achieve social justice and trade unions cannot help the poor, then why should universities be expected to do so?

Part of the drive for new indicators is probably rooted in the realisation that universities are losing much of their reason for existence. Research is increasingly done by specialised institutes, companies and hospitals; for-profit organisations offer no-frills instruction at low prices; online learning is replacing traditional seminars and lectures. Civic engagement looks like the new quality audit, a way of keeping busy those who are reluctant to teach or research.

If all the demands for new indicators are met we will end up with hugely bloated rankings that fail to make any meaningful distinctions.



Saturday, January 17, 2015

New Resource from IREG

A new resource for anyone interested in university rankings is available at the International Rankings Experts Group (IREG) site.

The Inventory of National Rankings has been prepared by the Perspektywy Education Foundation, Poland, and provides basic data about a variety of national ranking systems. Two of them have been approved by IREG.

Wednesday, January 07, 2015

US Federal Ratings Plan: A Few Answers, More Questions

The US Department of Education has just revealed the progress that it has made towards its planned ratings for colleges and universities. There has been over a year of public discussion since the Obama administration announced that it was planning on introducing a new system. Unfortunately, it seems that there is still a long way to go before a final product emerges and the administration’s forecast of a launch in August or September 2015 may be too optimistic.

Since the 1980s, the US News & World Report's ‘America's Best Colleges’ has been followed avidly by students and other stakeholders. These rankings have been criticised, sometimes with justification, but they do provide a reasonably accurate guide to some of the things that students, parents, employers and counsellors want to know: how likely a student is to graduate once admitted, the typical academic ability of fellow students, reputation among peers, and the resources available for teaching.

There is, of course, much that the US News rankings do not tell us. The international rankings produced by Shanghai Jiao Tong University’s Center for World-Class Universities, Quacquarelli Symonds (QS), Times Higher Education and now US News with its Best Global Universities are probably even more limited since they focus largely or entirely on research and postgraduate training. There is also a widespread feeling that existing rankings are unfair to schools that try to educate students from non-traditional backgrounds or underrepresented groups.

The demand for more information and for greater accountability comes as American universities enter a period of increasing pressure and constraint. Costs are rising inexorably, even though many students are taught not by hugely expensive superstar professors but by poorly paid adjuncts and untrained graduate assistants. Many students graduate late or not at all and incur a large and growing debt burden from which bankruptcy rarely provides an escape. Meanwhile, the more reliable global university rankings show American universities steadily losing ground to Asian institutions.

Many colleges and universities are facing a death spiral as stagnant or declining admissions lead to a fall in the number of graduates, which in turn erodes reputation and undermines alumni contributions. Underlying everything is the grim reality that the overall quality of graduates of American high schools is apparently insufficient to supply colleges and universities with students capable of completing a degree within a reasonable length of time.

The federal government has become increasingly concerned over these trends and the failure of American higher education to provide a route to secure employment and middle-class status. The new plan had its origins in a speech by President Obama at the University at Buffalo, SUNY, in August 2013.

A succession of hearings and forums has been held and finally the Department of Education has come out with a draft framework. The department has indicated that it will publish ratings, not rankings, so that colleges and universities will be divided into three categories, high performers, low performers and those in between. Two year institutions such as community colleges and four year colleges and universities will be assessed separately and institutions that teach only postgraduates or do not grant degrees will not be included. The main source of data will be information collected by the federal government.

According to the document, ‘For Public Feedback: A College Ratings Framework’, the objectives are to help colleges measure and make progress towards the goals of access, affordability and outcomes, to provide information for students and families, and to help the government ensure that financial aid is well used.

The department has announced the indicators it is considering. These include enrolment of low-income and first-generation students, family income levels and the average net price of an institution. Student outcomes could be measured by completion rates, transfer rates and the number of students going on to graduate school. More details can be found in ‘A New System of College Ratings – Invitation to Comment’.

Several questions remain unanswered. A rating consisting of three categories may be too crude. There will almost certainly be a large gap between those at the top of the high-performing group and those on the edge of the intermediate category, and membership of the same large group will not help anyone trying to compare two universities. A small change in one or two indicators might push a college out of the intermediate group into the underperformers, where it could suffer financial sanctions.
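The cliff-edge problem is easy to illustrate. With hard cut-offs (the thresholds below are invented; the department has not published any), a fractional dip in a composite score changes the consequences entirely:

```python
def rate(score, high_cut=70.0, low_cut=40.0):
    """Three-band rating with hard cut-offs (hypothetical thresholds)."""
    if score >= high_cut:
        return "high performer"
    if score < low_cut:
        return "low performer"
    return "in between"

print(rate(40.2))  # in between
print(rate(39.8))  # low performer: a 0.4-point dip crosses the line
```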

If the department provides the scores or raw data for each indicator, then it would be relatively simple for analysts or journalists to calculate numerical rankings.
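The conversion really would be trivial. Assuming three published indicator scores per institution (the college names, figures and equal weights below are all invented for illustration):

```python
# Hypothetical indicator scores on a 0-100 scale.
COLLEGES = {
    "College A": {"access": 72, "affordability": 55, "outcomes": 81},
    "College B": {"access": 64, "affordability": 70, "outcomes": 66},
    "College C": {"access": 58, "affordability": 62, "outcomes": 74},
}
WEIGHTS = {"access": 1 / 3, "affordability": 1 / 3, "outcomes": 1 / 3}

def composite(scores):
    """Weighted sum of a college's indicator scores."""
    return sum(scores[k] * w for k, w in WEIGHTS.items())

ranked = sorted(COLLEGES, key=lambda c: composite(COLLEGES[c]), reverse=True)
for rank, name in enumerate(ranked, start=1):
    print(rank, name, round(composite(COLLEGES[name]), 1))
# 1 College A 69.3 / 2 College B 66.7 / 3 College C 64.7
```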

The most disquieting thing about this document is that the department seems to have given little thought to how easy it would be to game much of the data. There are, for example, dozens of ways in which colleges and universities could increase the number of students who graduate on time, even if doing so undermines the quality of their degrees and their value to potential employers. Rating institutions according to the repayment of student loans might encourage universities to close humanities and social science departments while coaxing students into programs for which they might not be suited.


It is likely that there will be more arguments and discussion before the ratings are launched and it remains to be seen how much credibility they will have when they do appear.