Sunday, July 19, 2009

The Paris Rankings are out

L'Ecole des Mines de Paris has produced its third Professional Ranking of World Universities. This is based solely on the number of alumni among the CEOs of Fortune's top 500 companies. The top five, in order, are Tokyo, Harvard, Stanford, Waseda and Seoul National universities. Five French schools are in the top 20, and in general France performs much better on this ranking than on any other, which, one suspects, might be the whole point of the enterprise.

According to University World News

Interviewed in the online higher education publication Educpros, Nicolas Cheimanoff, director of studies of Mines ParisTech, explained the aims of the rankings: "In France we were challenged into taking action, to say we could not base arguments exclusively on the Shanghai ranking and construct higher education policy solely on this ranking.

"We wanted to show at an international level that France is a country where you can study. Our ranking gives visibility to a school, but also to the system of French higher education as a whole."

Cheimanoff said Mines ParisTech had been in contact with Professor Liu, originator of the Shanghai rankings, to suggest Jiao Tong should incorporate the Mines criterion. "He was a priori in favour but only if we included the academic careers of company heads since 1920, as he did for the Nobel prizewinners. But that's totally impossible."


The Paris rankings do correlate quite well with others, indicating that they are measuring some sort of quality. However, the performance of French, Japanese and Korean schools may say more about the recruitment and immigration policies of their countries than anything else.

Also, one wonders whether producing the CEO of General Motors is indicative of the real quality of Duke and Harvard.

The frightening thing is that it probably is.

Monday, July 06, 2009

A Ranking System for the Philippines

Another national ranking system is on the way.

The Commission on Higher Education (Ched) will come up with a ranking system of the best schools in specific fields of study or discipline, an official said today.

“We may come up (with the ranking system) within the year,” Ched executive director Julito Vitriolo said in a phone interview. At the moment, Vitriolo said, Ched is compiling licensure examination results in different fields of study from colleges and universities nationwide.


See here for more.
IREG-4

Presentations from the fourth International Rankings Expert Group Conference in Astana, Kazakhstan, are available here.

Tuesday, June 30, 2009

Is There a q Factor in University Ranking?

Although the Shanghai rankings show a high correlation with other rankings (based on a tiny sample of US universities), the HEEACT rankings from Taiwan (Performance Ranking of Scientific Papers for World Universities) do somewhat better. The correlations are .740 with THE-QS, .984 with the Shanghai ARWU, .711 with the USNWR America's Best Colleges, .920 with the Professional Ranking of World Universities and .700 with the Center for College Affordability and Productivity ranking.

All these rankings measure different things. The USNWR measures a variety of indicators related directly or indirectly to the quality of instruction; the CCAP is quite definitely a consumer-orientated ranking; the THE-QS World University Rankings are largely a measure of research performance (reputational survey, citations per faculty, and student-faculty ratio, where researchers are counted in the faculty); the Professional Ranking of World Universities counts CEOs of top companies; while the Shanghai and Taiwan rankings focus entirely on research, mainly in the natural sciences.

The ability of the Taiwan rankings to predict scores on the other rankings suggests that underlying the various measures of university quality is a single q factor, the average intelligence of a university's faculty. If there is one single number that would tell you about the general quality of a school then it would probably be the average IQ of the faculty, although performance on standardised tests, publications and citations (especially in the hard sciences) and postgraduate degrees might be good proxies. The strength of the Taiwan rankings would be their focus on research productivity alone.
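
If the q factor idea is worth anything, it is testable: take each ranking's scores for the same set of universities and see how much of the variance a single factor accounts for. Here is a minimal sketch in Python of that check; the scores are invented placeholders, not the published figures.

# Rows are universities, columns are rankings (e.g. HEEACT, ARWU, THE-QS).
# All numbers are invented for illustration.
import numpy as np

scores = np.array([
    [100.0, 100.0, 100.0],
    [ 72.0,  81.0,  95.0],
    [ 69.0,  70.0,  90.0],
    [ 55.0,  62.0,  78.0],
    [ 48.0,  50.0,  70.0],
])

# Eigenvalues of the correlation matrix: a dominant first eigenvalue is
# what a single underlying q factor would predict.
eigvals = np.sort(np.linalg.eigvalsh(np.corrcoef(scores, rowvar=False)))[::-1]
print("variance explained by first factor:", eigvals[0] / eigvals.sum())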

Incidentally, if anyone from HEEACT reads this, please think of a new name for your rankings. PROSPWU is not exactly a memorable acronym.

Sunday, June 28, 2009

Ranking the Rankings

University rankings are popping up everywhere. So how do they compare with one another? One way to find out is to check the correlations between the rankings' total scores. Here, correlations have been calculated for the scores of ten US universities (every tenth university in the Shanghai rankings, excluding those not in the THE-QS top 400).
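
The arithmetic is nothing more than pairwise Pearson correlations over the ten total scores. A minimal sketch in Python, with placeholder numbers standing in for the actual scores:

# Pairwise Pearson correlations between rankings' total scores.
# The ten scores per ranking are placeholders, not the real data.
import numpy as np

totals = {
    "ARWU":   [100, 74, 69, 60, 54, 48, 43, 39, 33, 28],
    "THE-QS": [100, 82, 71, 65, 51, 49, 37, 32, 29, 24],
    "USNWR":  [100, 96, 91, 80, 72, 67, 59, 52, 46, 41],
}

names = sorted(totals)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r = np.corrcoef(totals[a], totals[b])[0, 1]
        print(f"{a} vs {b}: r = {r:.3f}")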

It seems that the Shanghai ARWU is the most valid of the five rankings. Correlations of its total scores with the others are .796 with THE-QS, .712 with the USNWR America's Best Colleges, .896 with the Professional Ranking of World Universities (Paris) and .628 with the Center for College Affordability and Productivity.

On the basis of this extremely small and unrepresentative sample, it looks as though, if you had to pick just one ranking to rely on, it would have to be the Shanghai ARWU.

Thursday, June 25, 2009

International Rankings Expert Group Conference in Kazakhstan

I have just returned from the International Rankings Expert Group’s fourth conference in Astana, Kazakhstan. There were some positive developments at the conference but also a few disappointments.

Starting with the negative aspects, there seems to be a global trend towards the proliferation of national rankings which are increasingly and unnecessarily detailed and which impose a serious burden on teachers and researchers. A case in point is the new ranking produced for Kazakhstan, which includes just about every variable imaginable, from "the number of Dissertation Councils" to "the availability of medical centers, sport halls, preventoriums, recreation zones". Very few at the conference seemed aware of the backwash effect of the rankings boom as universities outside the top 500 create their own rankings or compete for irrelevant awards, medals or certificates. Drudges in the periphery of the world university system now face an endless round of form filling, office tidying, meetings, committees and professional development activities which make teaching difficult and genuine research, as opposed to research-like behaviour, close to impossible.

The European Union ranking project was presented in some detail, but I suspect it is going to make little impact since it appears largely concerned with making fine distinctions between the research capabilities of faculties and departments.

There was a presentation about the Lisbon project, which proposed to ignore research altogether and measure teaching excellence. This is an interesting idea but it seems to miss two important points. One reason for emphasizing the measurement of research is that the qualities required for research (general cognitive ability, reading and writing skills, conscientiousness and interest in a subject) also correlate to some extent with teaching ability, however that is measured. Also, the assumption that learning is dependent on teaching, which in turn must be regulated by a centralized bureaucracy, is surely false, at least for the more able students.

Positive developments include a trend towards personalized rankings where consumers assign their own weighting to indicators. There is an interesting project under way in Taiwan.

Richard Vedder introduced a ranking that has the merit of being based largely on publicly accessible data. The basic idea is excellent but there are some issues to be dealt with. Using RateMyProfessors is not a bad way to assess the quality of teaching but to be really valid there needs to be some adjustment for the grades awarded by the instructor. Using the American Who’s Who is also potentially interesting – and could well be applied internationally -- but there are of course obvious issues of bias.

He also gave a presentation without using PowerPoint. I must remember that next time I fill in a form about my innovative teaching methodology.

One proposal presented was the creation of an IREG seal of approval. The logo is ready. I am not sure, though, whether this is going to be effective.

Overall, the conference has strengthened my conviction that if ranking is to be done it should not be by administrators or businesses but by universities themselves.

Monday, June 22, 2009

A New Atlantic Alliance?

Recently, the US News and World Report expanded its rankings portfolio to include the World’s 100 Best Universities. This turned out to be nothing more than the THE-QS World University Rankings with a rebranding for the US market. Now the USNWR has gone a step further and produced a list of the world’s top 400 universities along with sundry regional and subject rankings. Once again, this is the QS rankings with a new name.

This is no doubt a shrewd move for QS, who are now marketing their rankings on both sides of the Atlantic and appear to be on the way to establishing a near monopoly over the international ranking business. It could, however, be risky for USNWR. People are bound to wonder why it should link up with a company that has a history of errors where American universities are concerned. In 2007 QS got their North Carolina business schools mixed up and as a result caused Fortune magazine to withdraw its business school rankings based on QS data. Will US students and stakeholders forgive the USNWR if its data includes things like a near-zero score for research for Washington University in St Louis or an unbelievably good student-faculty ratio score for Duke?

Tuesday, June 16, 2009

An Alternative Global Ranking

This is from GLOBALHIGHERED.

Finally the decision on who has won the European Commission’s million euro tender – to develop and test a global ranking of universities – has been announced.

The successful bid – the CHERPA network (or the Consortium for Higher Education and Research Performance Assessment), is charged with developing a ranking system to overcome what is regarded by the European Commission as the limitations of the Shanghai Jiao Tong and the QS-Times Higher Education schemes. The final product is to be launched in 2011.

CHERPA comprises a consortium of leading institutions in the field within Europe; all have been developing and offering rather different approaches to ranking over the past few years (see our earlier stories here, here and here for some of the potential contenders):

CHE – Centre for Higher Education Development (Gütersloh, Germany)
Center for Higher Education Policy Studies (CHEPS) at the University of Twente (Netherlands)
Centre for Science and Technology Studies (CWTS) at Leiden University (Netherlands)
Research division INCENTIM at the Catholic University of Leuven (Belgium)
Observatoire des Sciences et des Techniques (OST) in Paris
European Federation of National Engineering Associations (FEANI)
European Foundation for Management Development (EFMD)

Monday, June 15, 2009

Rankings do matter

One of the most dangerous things about university rankings is that they are becoming -- in parts of Asia at any rate -- symbols of national grandeur or decline, attracting almost as much public concern and interest as the World Cup.

Dr Hsu has an interesting post on the divergent histories of Singapore and Malaysia that contains this comment:

Incidentally, I think this university ranking [almost certainly he means THE-QS] can be taken as representative of everything comparative among the 2 countries.

Sunday, June 14, 2009

Publish and Pay

There is a growing trend towards open access academic publishing where researchers have to pay for publication. Open access is in principle a good idea but the idea of authors rather than subscribers footing the bill has its dangers.

Firstly, it poses a threat to new academic journals in emerging countries. There are, I suspect, quite a few researchers who would find it more convenient to spend a few hundred dollars, especially if it comes out of grant money, for speedy and "prestigious" international publication rather than writing for a local journal with limited impact.

Secondly, there is a definite threat to standards if criteria for publication are relaxed or perhaps even abandoned altogether.

Recently, Philip Davis and Kent Anderson sent a totally nonsensical computer-generated paper to the Open Information Science Journal. It was accepted, supposedly after peer review, with a request for the payment of $800 in author's fees. In this case, at least, the peer review process had apparently been dropped altogether.

For more information see The Scientist and the "authors'" blog, The Scholarly Kitchen.

In all fairness, it must be pointed out that another computer-generated paper submitted to another journal run by the same company was rejected and that at least one reviewer figured out what was going on.

Still, this does have disturbing implications. If publication becomes influenced or even determined by ability to pay then we are heading for the complete corruption of the peer review system.

It would be a good idea if universities refused to consider articles in pay-for-publication journals as evidence for selection or promotion. Perhaps, also, Scopus and other databases could list such journals in a separate category.

Anyway, here is an extract from the first paper:

"In this section, we discuss existing research into red-black trees, vacuum tubes, and courseware [10]. On a similar note, recent work by Takahashi suggests a methodology for providing robust modalities, but does not offer an implementation [9]."

 

 

Thursday, June 11, 2009

Ranking the Rankings

University rankings are popping up everywhere now. It is time to start comparing them with one another. First, here are the numbers of results from a Yahoo! search using the official names of the rankings. In the lead is the THE-QS World University Rankings, followed by the USNWR America's Best Colleges. The Shanghai rankings have made a much smaller impact and the Webometrics rankings even less. No doubt a search in languages other than English would lead to different results, as would a search using different names.

Still, it seems that in the webosphere THE-QS have a strong lead among the international rankings.

"World University Rankings" (THES-QS) 942,000
"America's Best Colleges" (US News and World Report) 892,000
"Times Good University Guide" (UK) 252,000
"Academic Ranking of World Universities" (Shanghai) 123,000
"Guardian Good University Guide" (UK) 87,300
"Maclean's University Rankings" (Canada) 14,200
"World University Ranking on the Web" (Webometrics) 9,890
"CHE/Die Zeit University Ranking" (Germany) 4,570
"Ranking of Scientific Papers for World Universities" (Taiwan) 2,120

Sunday, June 07, 2009

Is this an error?

According to the QS.com ranking of Asian universities, the best university in Asia for student-faculty ratio is the "College of Medicine, Pochon Cha University". (It seems that it is actually Pochon CHA, with CHA being the name of a private medical conglomerate.)

This is a little odd since the institution is clearly a single-subject one and therefore presumably should not have been included in the rankings at all. This was the rationale for the University of California at San Francisco being removed after a brief appearance in the world rankings.

It is possible, though, that QS has different requirements for inclusion in the world and the regional rankings. If this is the case, then countries can now use a new strategy for getting excellent scores in the rankings: just designate medical schools or faculties as independent universities. They will get good scores for publications and citations, since medical researchers tend to publish short articles that are cited more frequently and more quickly than those in other disciplines, and for student/faculty ratio, since they have a lot of clinical faculty who can be added to the faculty totals.

It will be interesting to see how long Pochon CHA University remains in the Asian rankings or whether it will appear in the forthcoming world rankings.

Wednesday, June 03, 2009

Not a Good Start

The Malaysian government has awarded Universiti Sains Malaysia the coveted APEX University status, meaning that it gets a lot of money to try to get into the top 100 world universities.

Unfortunately, things went wrong last Friday when the university's website informed more than 8,000 students that they had been accepted. In fact, only 3,599 had been, and it took 24 hours for the university to correct the error. Not a good start, but it will probably boost USM's scores in the Webometrics rankings.

See Education in Malaysia for more coverage.
Ranking News

The European Union is planning to introduce a rival to the Shanghai and THE-QS rankings. This is a good idea in principle, but who is going to get the contract? It is a pity that "internationalisation" is going to be an indicator. And what exactly does "community outreach" mean?

Odile Quintin, the European Commission's director-general for education, told the HES that the Shanghai Jiao Tong was "firmly concentrated on research", anchored to the production of Nobel laureates, and narrow in scope.


"We think that universities have a strong role in research but also in teaching and employability so we are promoting an alternative ranking to measure all these dimensions," she said.


The ranking would be handled by a consortium working independently of the EC, and work would begin after the results of a tendering process were revealed next week.


The plan is to develop the ranking throughout 2009 and 2010, for implementation a year later. The project will have a budget of €1.1 million ($1.9m).


Ms Quintin said the new ranking, while based in Europe, would have a global reach. She added that the new European survey would be focused much more on disciplinary strength, "because you can be the best university in nanotechnology but not in psychology".

She said the alternative world ranking system would be independent, run neither by governments nor universities and provide a multidimensional measure of education, research, innovation, internationalisation and community outreach.

Tuesday, June 02, 2009

Saturday, May 30, 2009

Asian University Rankings

QS Quacquarelli Symonds has come out with a ranking of the top 200 Asian universities. Here is the top ten.

1. University of Hong Kong
2. Chinese University of Hong Kong
3. University of Tokyo
4. Hong Kong University of Science and Technology
5. Kyoto University
6. Osaka University
7. Korea Advanced Institute of Science and Technology
8. Seoul National University
9. Tokyo Institute of Technology
10. National University of Singapore and Peking University


There are also rankings by disciplinary cluster and by indicator.

For every single disciplinary cluster, the University of Tokyo, not the University of Hong Kong is top. How strange.

For the indicators, the National University of Singapore is first for Employer Review and for International Students; Tokyo University for Academic Peer Review; the College of Medicine at Pochon Cha University (Korea) for faculty-student ratio [I’m wondering about that as well]; the Gwangju Institute of Science and Technology for Papers per Faculty; Yokohama City University for Citations per Paper; the Hong Kong University of Science and Technology for International Faculty; Kansai Gaidai for Inbound Exchange Students; and the City University of Hong Kong for Outbound Exchange Students.


These rankings seem to be a shrewd marketing move. Universities that have no chance of getting anywhere in the World University Rankings will now be able to boast that they came in the top 50 Asian universities for outbound exchange students or the top 100 for citations per paper. A glance at the indicator rankings, for example, shows some Malaysian universities that one would not have thought had any chance of being in any sort of ranking. On the other hand, these rankings have been able to identify rising stars such as the Multimedia University.

There are two methodological innovations, both of which are questionable. They need to be discussed since this regional ranking could be a tryout for the global rankings. The first is the addition of two further measures of internationalization: inbound and outbound exchange students.

If internationalization is going to be a criterion, then having more measures might be a good idea. However, it is time to consider whether internationalization is actually a valid indicator of quality. Measures of internationalization correlate weakly, if at all, with any other indicator, and they also give an unfair advantage to the European Union and Hong Kong.

If we want to measure faculty quality, which internationalization supposedly reflects, a better method might be to calculate the percentage of a random sample of teaching and research staff listed on university web pages who obtained degrees from the top 100 universities (on the Shanghai rankings?).
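
As a rough illustration of what that indicator would involve, here is a sketch in Python; every name and number in it is invented for the example, and in practice the degree data would have to be collected from departmental web pages.

# Share of a random sample of staff holding a degree from a top-100
# university. Both lists are invented for illustration.
import random

top_100 = {"Harvard University", "Stanford University", "University of Tokyo"}

staff_degree_sources = [
    "Harvard University", "Ruritania State University",
    "University of Tokyo", "Midlands Polytechnic", "Stanford University",
]

sample = random.sample(staff_degree_sources, k=4)
share = sum(u in top_100 for u in sample) / len(sample)
print(f"estimated share of staff with top-100 degrees: {share:.0%}")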

However, since QS get a lot of their bread and butter from facilitating student movement across national boundaries, we are unlikely to see the end of this indicator.

The addition of the number of inbound and outbound exchange students might also be very easily manipulated. If it were included in the world rankings, it is likely that we would see universities setting up branch campuses a few miles away across some increasingly irrelevant frontier, moving everybody there for their second year, and calling them exchange students. So we might expect to see Queen's University Belfast setting up a branch in Dundalk in the Irish Republic, or the National University of Singapore in Johore in Malaysia, and so on.

The other innovation is that research is measured by citations per paper, which measures the average impact of papers, and papers per faculty, which measures the quantity of research in a very basic sense. This represents an improvement over the previous policy of using a single indicator. However, the problem remains that both are based on the Scopus database, which aims to be as inclusive as possible. Scopus is an excellent research tool, but inclusion in its database is an indicator of quality only in the broadest sense. To be credible, QS should consider finding a measure of research that captures genuine excellence.

These rankings contain some surprises. The most noticeable, and one lacking in face validity, is that the University of Hong Kong and not the University of Tokyo is the top university in Asia. Or perhaps this should not really be a surprise: Tokyo actually outperforms Hong Kong on all indicators except the internationalization ones and is ahead in all of the disciplinary rankings. Again, a lot of South Korean universities do very well.

It is good that QS are prepared to experiment with different indicators but the methodological innovations of these rankings do not seem to help very much.

Wednesday, May 13, 2009

Jawaharlal Nehru University has been returned to India (see previous post).

Tuesday, May 12, 2009

Don't They Teach Geography Any More?

The QS Asian University Rankings are now out. I hope to comment in a while. For the time being, I've noticed that in the "International Students Review" (I'm not sure that it's a review, but never mind), Jawaharlal Nehru University is listed as being in South Korea. I wonder how long it will remain there.
Asian University Rankings

QS will shortly release their Top 100 Asian Universities Rankings. It seems from the bits and pieces released so far that Australia, the Pacific and South West Asia are not included. There appear to be two innovations -- a trial run for the global rankings? -- namely counting student exchanges and including citations per paper as a measure of quality of research.

Monday, May 11, 2009

The Man with the Midas Touch

Those who are familiar with Malaysian gymnastics – admittedly not a large group -- will know who I am talking about.

There is a gymnastics coach in Malaysia who has achieved remarkable results with the women’s team of a very small state. For the last few years this state has won gold after gold at national competitions. Hence the title bestowed by the Malaysian press. There does not seem to be anything special about the state: the men’s team has never done very much. Nor is there anything unusual about his training methods apart from their rigour and his intolerance of poor performance.

There must then be something about his selection methods. The state is so small that it does not have a gymnastics association and so this coach is free to select anyone he wants for training.

Recently, there was an opportunity to see just what these selection methods might be. The coach has now moved up to the national team and held a selection for young gymnasts, one of whom was my daughter, to train at the national sports centre.

The selection was quite revealing. The gymnasts were required to run, jump, do a bridge, do as many chin-ups as they could and a few simple exercises. Their height and weight were measured. There was no interest in their competition records, team spirit, motivation, leadership qualities or ability to respond to adversity.

At the end there was an interesting moment. The gymnasts were told to line up with their parents behind. There was much scurrying into position as people assumed they were being summoned for a group photograph, something without which no Malaysian event of any sort is complete. But after surveying the lineup, the coach said thank you and waved everyone away. What he was in fact doing was checking to see what the gymnasts would look like in a few years' time.

So that was the secret of the man with the Midas touch: assessment of basic physical skills and characteristics, and reference to inherited traits. It was as though someone selected for elite universities by a simple test of general intelligence and a check on parental academic performance.

So here are two proposed experiments. Asian universities – and others – should scrap the proliferating complex of personality tests, language tests, co-curricular activities, interviews to test for leadership, politeness, sensitivity, appearance, interest and so on, profiles, course work and essays, and just test for general cognitive ability.

The second experiment is that this coach and others should take heed of the global consensus and introduce personality tests, interviews and the whole paraphernalia of holistic assessment to choose future athletes.

I have a horrible feeling that the first will never happen but that one day the second will.
Comments on an Article by Andrew Oswald

I have already referred to an article by Andrew Oswald (14/12/2007) that contained some acerbic comments on claims to excellence made by British universities on the strength of their showing in the THE-QS rankings.

He recently had an article in the Independent suggesting three measures that would allow universities to rise in the rankings. I will skip the first and third and just look at the second. He proposes that universities should be free to pay the market rate for highly rated researchers, and offers the example of Dartmouth College, which built up a top economics faculty by paying outstanding researchers appropriate salaries.

Dr Oswald is quite right but a lot of people are going to be depressed after reading his article. How can universities in Africa, Latin America and Asia get anywhere if they cannot afford to pay Ivy League level salaries? Is there anything that a university without a big pot of money can do?

The answer is that there is. In Moneyball, Michael Lewis described how the Oakland A's baseball team performed dramatically well even though they had only a comparatively small amount of money with which to buy players. They did it by simply ignoring the intuitions of talent scouts and looking at crude statistical data.

Basically, Oakland did the equivalent of a university scrapping search committees, interviews, personality tests and references, and simply looking at the research done by applicants. Or choosing undergraduates by testing cognitive skills, literacy and numeracy rather than by holistic assessment of leadership, communication skills, response to adversity, community involvement and so on.

At least one American university has done something like this. George Mason University has built up an excellent economics department by recruiting academics specializing in unfashionable fields that were undervalued by the academic marketplace.

Asian universities and others might consider systematically recruiting researchers whose personal characteristics and choice of unpopular research topics put them at a disadvantage when applying for academic positions. They might end up with a collection of unpleasant eccentrics but they might also see their ranking scores inching upwards.

Sunday, March 29, 2009

The Globalization of Cheating

There is a fascinating story in the Chronicle of Higher Education about the growth of the essay writing business. It seems that around the world there are shadowy companies employing dozens or hundreds of writers churning out essays for university students at every level.

'The writers for essay mills are anonymous and often poorly paid. Some of them crank out 10 or more essays a week, hundreds over the course of a year. They earn anywhere from a few dollars to $40 per page, depending on the company and the subject. Some of the freelancers have graduate degrees and can write smooth, A-level prose. Others have no college degree and limited English skills.

James Robbins is one of the good ones. Mr. Robbins, now 30, started working for essay mills to help pay his way through Lamar University, in Beaumont, Tex. He continued after graduation and, for a time, ran his own company under the name Mr. Essay. What he's discovered, after writing hundreds of academic papers, is that he has a knack for the form: He's fast, and his papers consistently earn high marks. "I can knock out 10 pages in an hour," he says. "Ten pages is nothing."

His most recent gig was for Essay Writers. His clients have included students from top colleges like the University of Pennsylvania, and he's written short freshman-comp papers along with longer, more sophisticated fare. Like all freelancers for Essay Writers, Mr. Robbins logs in to a password-protected Web site that gives him access to the company's orders. If he finds an assignment that's to his liking, he clicks the "Take Order" button. "I took one on Christological topics in the second and third centuries," he remembers. "I didn't even know what that meant. I had to look it up on Wikipedia." '



There are interviews with some of the professional essay writers. Two of them are Americans with law degrees. Another is a Nigerian with a master's degree from the University of Lagos. Many others appear to be from the Philippines and India.

Customers of the essay mills include a doctoral student in aerospace engineering at MIT and graduate students at Northern Kentucky University, James Madison University and the University of Southern Mississippi.

Nobody seems to be asking what is wrong with selection for American and British universities when there are thousands of students who cannot do the academic work required, while there are people without degrees who are able to produce acceptable work for relatively trivial wages. Some of the writers for the paper mills do have degrees. With their obvious and marketable research and writing skills, shouldn't they be in doctoral programmes or academic appointments?

Is it possible that the trend towards holistic admissions in the US and the dumbing down of A-levels in the UK have something to do with it? If students are admitted to university on the basis of leadership, social skills displayed at interviews, participation in community service, overcoming adversity and the writing (by whom?) of admission essays, rather than the cognitive abilities and background knowledge necessary to do academic work, then it would seem that the essay mill business is essential to keep the system going.

I have a suggestion for any university that wants to improve student and faculty quality within a short period. Find an essay mill operator, appoint him as admissions and recruitment officer and give scholarships to the essay writers who do not have degrees and faculty positions to those who do.

Sunday, March 08, 2009

Best Places for Postdocs

The Scientist has published its annual report on the best places for postdocs. A certain amount of scepticism is in order: there are dramatic changes in position between 2008 and 2009. This year's number 1 in the US, the Whitehead Institute for Biomedical Research, for example, was number 14 last year.

Nonetheless, the results are interesting. First, in the US the tables are dominated by non-university institutions. Is it possible that politicisation and declining academic standards are beginning to have a noticeable effect on the quality of American university research?

Second, in the international category the top three positions are held by German, Danish and Dutch institutions. The only English university in the international top ten is York. What happened to Imperial College, Oxford and Cambridge?

Thursday, March 05, 2009

Interesting New Blog

Go here.


International Education Blogs ~ A New Blogging Project by David Comp
International Education Blogs is a new blogging project I started today as an effort to bring all of the blogs on the web that touch on international education issues into one central location. Essentially it will be a blog roll with a few postings. Please bear with me as I learn the technology. This initial launch will have two different blog lists for you to review. The first list of blogs will be those of people who are blogging on the field/state of international education and related matters. The second list of blogs will be those of students and others who are currently blogging from a foreign location.

Wednesday, February 25, 2009

Semantic Shift Watch

Inside Higher Ed has a story about Sonoma State University that starts


A faculty report has stirred some racial tensions at Sonoma State University, following claims from its author that the institution’s administration has deliberately targeted those from higher-income families as potential students for the past decade. In this process, the report claims that the university has become the “whitest” public institution in California, effectively preferring white students to minorities in an admission practice that it deems “reverse affirmative action.”

One aspect of Sonoma State that is decidedly diverse is the administration, where the president, provost and director of admissions – all criticized in the report – are Latino. The professor who brought forth this report, however, is white.


Since when is a university administration whose senior members are from one ethnic group decidedly diverse?

Friday, February 20, 2009

A Note on the THE-QS Employer Review

Of all the components of the THE-QS World University Rankings the employer review is probably the least noticed. Universities in South East Asia and elsewhere, for example, are going to great lengths to recruit international students to boost their performance in the rankings but there seems to be no comparable effort to do better in the employer review.

It might be worth looking at the structure of the review for 2008. First of all, the distribution by industry (2008) seems very unrepresentative, with a disproportionate number of respondents drawn from financial services and banking, consulting and professional services, manufacturing and engineering, and IT and computer services, in that order.

The distribution by country (2006-08) is even more skewed. Take a look at the list of employer review responses by country from the QS topuniversities site. The UK and Australia between them have more respondents than the USA. Mexico has more than China, Greece more than Germany, Singapore more than Japan, Ireland more than France, and Romania more than any country in the Middle East, including Israel, Turkey, Iran and the Gulf states.

Incidentally, since the banking sector has been so incompetent at choosing whom to lend money to, is it a good idea to allow it to have such a big say in evaluating universities?

Employer review responses by country (2006-08):

United States: 346
United Kingdom: 269
Australia: 178
Mexico: 75
Netherlands: 75
Singapore: 74
Russia: 69
India: 64
Argentina: 60
Greece: 59
Germany: 56
Hong Kong: 50
Philippines: 45
Ireland: 41
Malaysia: 38
Canada: 37
Japan: 37
France: 36
New Zealand: 36
South Korea: 32
Italy: 29
Chile: 28
Spain: 27
Venezuela: 27
China: 25
Denmark: 23
Thailand: 23
Switzerland: 22
Belgium: 19
South Africa: 19
Ukraine: 18
Taiwan: 17
Czech Republic: 16
Romania: 15
Other: 354

Sunday, February 01, 2009

A Bit More About VU Amsterdam and the Previous Post

I have just remembered seeing this item at the bottom of the 2007 rankings on the QS site.

Vrije Universiteit AMSTERDAM

The data supplied by VU Amsterdam did not include faculty numbers for the VU Medical Center. Using 2006 citations data as a benchmark, it appears that the mapping of the citations database did not return the expected number of papers and citations - perhaps due to a volume of research being published under institution names not easily identifiable as being part of VU Amsterdam. The resolution of these issues would certainly result in a higher ranking for the university and improved performance in the Peer and Recruiter Reviews would suggest that VU Amsterdam may have maintained its Top 200 position.


It seems, then, that in 2007 QS did not count medical faculty but in 2008 they did, resulting in a rise from 40 to 84 in the student-faculty ratio score. Secondly, in 2007 they did not count research if the affiliation did not clearly identify VU Amsterdam, but in 2008 they did, producing a rise in the citations score from 1 to 38.

Saturday, January 31, 2009

No Way But Up

Researchtrends, a newsletter published by Scopus, has an interesting item on the THE-QS World University Rankings. It suggests that the rankings can be used to assess the recent research performance of countries as well as institutions.

This is not a bad idea in principle, but Researchtrends has failed to examine the rankings closely enough. Its writer observed that universities in some countries have improved their positions quite dramatically. Two Indian institutions in the top 200 enjoyed a net change in rank of 248 between 2007 and 2008, eleven from the Netherlands a change of 230, and seven Swiss universities one of 217, while three Israeli institutions rose 194 places between them. Researchtrends describes the Indian achievement as "astonishing" and "testament to the continued development of research in India."
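
For clarity, the "net change in rank" figures are simply sums, over a country's ranked institutions, of each institution's 2007 rank minus its 2008 rank, as in this sketch (the ranks are invented):

# Net change in rank for a country's institutions:
# sum of (2007 rank - 2008 rank). Positive means a net rise.
ranks_2007 = {"University A": 200, "University B": 260}
ranks_2008 = {"University A": 120, "University B": 190}

net_change = sum(ranks_2007[u] - ranks_2008[u] for u in ranks_2007)
print("net change in rank:", net_change)  # 80 + 70 = 150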

The writer concludes

"This suggests that national improvements in ranking may be at least
partially the result of individual universities taking a more strategic
approach: targeting international publications, aided by bibliometric tools and building and promoting library collections"

Are they?



Looking at the scores on the citations-per-faculty component of the rankings, we find that the Indian Institute of Technology (IIT) Bombay had exactly 1 point in 2007 and 43 in 2008, apparently a truly remarkable increase in research, or rather in citations of research.

IIT Delhi went up from 1 to 47, the Free University of Amsterdam from 1 to 38, the Technion Israel Institute of Technology from 1 to 79 and the Ecole Polytechnique Federale (EPF) Lausanne from 29 to 77.

A couple of American universities also apparently enjoyed spectacular rises in the citations score: Washington University in St. Louis (WUSL), already discussed in this blog, and Stony Brook University, which went from 1 to 75.

In reality, there has of course been no spectacular increase in research or citations. What has happened is that in 2007 QS had problems identifying certain universities or got them confused with others. They, or THE, took a while to decide whether there was one IIT or several, and almost certainly got confused between WUSL and the University of Washington. One reader of the blog has suggested that they got the Free University of Amsterdam mixed up with the University of Amsterdam. Something similar could have happened with the EPF Lausanne.

So, there may have been an improvement in research productivity over the last few years in India, Switzerland, Israel and the Netherlands. But that is nothing to do with the ascent of some universities from these countries in the rankings. That is testament only to errors committed by QS or -- let us give them some credit -- the correction of errors.

Thursday, January 22, 2009

Top Universities Guide

I have finally forced myself to buy and read the second edition of QS's book, retitled Top Universities Guide.

The first edition, which was called Guide to the World's Top Universities, was an unqualified disaster, full of errors, the worst of which was getting every single student-faculty ratio wrong as a result of somebody moving every university three rows down while copying data. Others included putting the Technical University of Munich in the profiles twice, in positions 82 and 98, and listing an "Official University of California, Riverside". It also started to fall apart within a few days.

The new edition, however, turned out to be a pleasant surprise. The previous errors seem to have been corrected and there do not appear to be any new ones. There is a dignified new purple cover and so far not a single page has fallen out.

This does not mean that the THE-QS rankings are faultless. Far from it. The peer review remains extremely biased, the international components are largely meaningless and the student-faculty ratio is misleading and easily manipulable. Still, the book does show that, in a technical sense, QS has improved quite a bit recently.

The change in title should be noted. There may be another reason, but this would appear to be a clever attempt to distance QS and the authors from the first edition without admitting that there was anything wrong with it.

And I wonder why Blackwell is no longer distributing the book.