Thursday, August 30, 2007

Discipline and Intelligence

Steve Sailer has a post that compares the Graduate Record Exam results of candidates by intended field of study and calculates their mean IQs. There may be some methodological leaps here, but the results are interesting. Here is a selection:

Physics & astronomy 133
Mathematical sciences 130
Philosophy 129
Economics 128
Engineering 126
Chemistry 124
English language & lit 120
History 119
Sociology 114
Business 114
Psychology 112
Business admin & mgmt 111
Student Counseling 105
Early Childhood education 104
Social Work 103
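
For readers curious about what such a conversion involves, here is a minimal sketch of one plausible approach, assuming a simple linear rescaling of mean GRE scores onto an IQ scale. The reference mean and standard deviation and the example score are made-up illustration values, not Sailer's actual method or data.

# Hypothetical sketch of a GRE-to-IQ style conversion. The linear mapping,
# the reference mean/SD (1050, 200) and the example score are illustrative
# assumptions only.
GRE_MEAN, GRE_SD = 1050, 200   # assumed reference mean and SD for combined GRE
IQ_MEAN, IQ_SD = 100, 15       # conventional IQ scale

def gre_to_iq(mean_gre):
    """Re-express a field's mean combined GRE score on an IQ-style scale."""
    z = (mean_gre - GRE_MEAN) / GRE_SD
    return IQ_MEAN + IQ_SD * z

print(round(gre_to_iq(1490)))  # a hypothetical field mean of 1490 maps to about 133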

Is it possible that the distribution of students and faculty across disciplines might be a crude but useful proxy for the overall intelligence of staff and students at specific universities?

Also, I can't help but wonder whether QS's succession of errors (counting ethnic minorities in Malaysia as international faculty and students, getting hopelessly mixed up over Duke's student-faculty ratio and so on), compared with Shanghai Jiao Tong University's relatively blemish-free rankings, has something to do with the former being led by a couple of MBAs and the latter by someone with degrees in chemistry and engineering.

Advice about Business School Rankings

Nunzio Quacquarelli, director of QS Quacquarelli Symonds (QS), has some advice for applicants to business school. He points out that rankings of business schools are controversial and that many believe they are badly flawed:

"For the last decade, the management education sector has been obsessed with the ranking of business schools. Publishers such as the Financial Times, Business Week and the Wall Street Journal sponsor regular surveys that stoke interest to the point that their coverage produces some of their top selling editions. There is now a growing controversy about whether these rankings provide useful information for MBA applicants, or are misleading and creating a 'herd instinct' towards a few schools, which is of no benefit to anyone.

Business school officials differ in their views of rankings. Although The Wharton School frequently tops rankings, Dean Pat Harker feels, "There is a very strong consensus among parties (alumni, faculty and staff of other institutions), that the ranking methodologies are severely flawed... Some people believe that if the rankings help us, who cares if they are flawed or give a limited view of the school? But we can't have it both ways. We either endorse a defective, inconsistent practice, or we speak out, offer better alternatives for information, and work with the media to enable them to report with more useful, objective data." "


After reviewing various rankings, Quacquarelli looks at QS's own MBA scorecard, which allows readers to create their own rankings by changing the weighting of the various components. Unsurprisingly, he thinks highly of the scorecard:

"Rachel Tufft, Marketing Director at Manchester Business School, feels, "Scorecard is the most in-depth and interactive information tool available for MBA applicants today." The MBA Director at Cranfield adds, "TopMBA.com Recruiter Research adds a great deal of value because it is a clear statement from the marketplace about the popularity of international MBA programmes with recruiters. It also gives us a clear indication of what we need to do to improve. Any improvements we make to enhance our visibility amongst recruiters will be of direct benefit to our students - another sign of useful research." "

Quacquarelli does not mention that earlier this year QS provided the data for Fortune's 2007 "50 Best B-Schools for Getting Hired." The University of North Carolina at Chapel Hill's Kenan-Flagler Business School was outraged at not being included and was shocked by "the shoddy, inaccurate and inappropriate research methods employed in the Ranking of Top 50 Business Schools."


Kenan-Flagler has listed QS's errors in detail:

" QS has admitted that they did not contact us for this ranking. They admitted that they used data, often out-of-date information, collected for another purpose. They explained our exclusion by saying that they confused our business school with another North Carolina school (NC State).
Every major ranking organization notifies schools of impending rankings and requests data as input. QS did not. Virtually all data-collection organizations have verification and validation procedures. QS did not. Every publication announcing rankings would at least cross check that major schools from existing, established rankings were included. QS did not.


We and other schools have already uncovered multiple serious issues in data collection and analysis. The salary figures for our and other MBA Programs are outdated or wrong. Some data come from 2004, some from 2005, and some schools have reported the numbers don’t match their data for any year, even though QS contends that the data are all from 2006. If we were to use accurate Kenan-Flagler salary data alone, we would expect to be in the top 15 schools. To document, the average Kenan-Flagler base salary for the Class of 2006 was $89,494. The average signing bonus was $22,971. The number of students employed at 90 days post-graduation was 91.5%. "


Eventually, Fortune removed the rankings from their website.


Quacquarelli concludes his article with a warning:

"Users need to delve into each ranking and identify the elements that can provide useful information or insight into schools that may interest them."

Indeed they do.

Monday, August 27, 2007

College Board and NCS Pearson pay for mistakes

A settlement has been reached in the suit brought by test takers who received incorrect scores on the October 2005 SAT. The College Board, which owns the test, and NCS Pearson, which scores it, have agreed to pay $2.85 million to about 4,400 people.

"A tentative settlement was announced Friday by the two testing entities and lawyers who filed the class action. About 4,400 people — or about 1 percent of those who took the test that month — are in the class because their scores were reported incorrectly. Under the planned settlement, they will have two options. They can fill out a short form to automatically receive $275, or they can provide more information — if they believe that their damages were greater — and a retired judge will make binding decisions on how much they are entitled to receive."


The full story is here.
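
For a rough sense of scale, using only the figures quoted above, the pot works out to a little under $650 per class member, of which the $275 automatic payment is less than half. This is a back-of-the-envelope calculation, not a figure from the settlement documents.

# Back-of-the-envelope arithmetic on the reported settlement figures.
settlement_total = 2_850_000   # dollars, as reported
class_size = 4_400             # approximate number of affected test takers
automatic_payment = 275        # dollars, the no-questions-asked option

average_available = settlement_total / class_size
print(f"Average available per claimant: about ${average_available:,.0f}")  # roughly $648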

This is certainly embarrassing for the testers and will no doubt be used as ammunition by opponents of standardised testing and its use in university admissions and assessment. But one wonders how many more people have suffered from the gender, race and class bias of interviews. And are we ever going to see THES or QS acknowledge or apologise for any of their errors?

Saturday, August 25, 2007

Poppleton Blasts the League Tables

I have been a fan of Laurie Taylor's satirical column at the Times Higher Education Supplement (subscription required) for a long time. The current issue has an amusing piece about Poppleton University's criticism of university rankings:


"Here at Poppleton, we strongly support this move. For although we were obviously gratified by our appearance at No 2 in the recently compiled Poppleton Evening News league table, our relatively lower positioning in the tables compiled by other newspapers is, we believe, the result of just such bias. Not one of these tables, for example, includes any of the following distinctive Poppleton features:

Size of Human Resources Department
Statistics show that Poppleton has more people involved in managing other people than any other university of comparable size in this country. "

Universities and colleges that complain about the failure of rankings to acknowledge their unique qualities, which somehow are not only unquantifiable but also inexpressible, deserve to be mocked. But they are a rather easy target. Will Laurie Taylor ever have a go at the THES-QS rankings?

Tuesday, August 21, 2007

Boycotting the Rankings

A number of American liberal arts colleges have refused to contribute to the reputational survey of the US News and World Report rankings. See insidehighered for the full story. This is a highly positive development since such surveys tend to be biased, self-confirming and opaque. The THES-QS "peer review" is perhaps the worst of the lot in these respects but other reputational surveys are probably little better.

Less positive is the news about Sarah Lawrence, a New York liberal arts college. This school no longer looks at the SAT scores of its applicants and therefore has been placed in the "unranked" category by USNWR, which counts SAT scores as a key indicator of student quality. There has been a fair bit of controversy about this but I doubt that Sarah Lawrence will suffer very much. The publicity will probably compensate for losing its place among the top liberal arts colleges.

Sarah Lawrence's action is, however, potentially very dangerous. The SAT is essentially an intelligence test and therefore is highly predictive of academic success and resistant to coaching. There is, it is true, a small-scale industry devoted to boosting SAT scores but its claims are grossly exaggerated.

What will likely happen is that admission to Sarah Lawrence will be based on the evaluation of high school essays, performance in class (including advanced placement courses), and recommendations from teachers and counselors, topped up with an array of interesting extra-curricular activities. It is more than likely that the admissions process will give an advantage to those whose parents can move to suburbs with good schools, provide a glut of stimulating activities that will be raw material for essays, and provide advice, assistance, Internet access and transport for high school projects.

In short, admission to Sarah Lawrence -- and no doubt, eventually, to many other colleges -- will be based on the ability to impress high school teachers and administrators and to have an interesting out-of-school life. In the end this is all far more dependent on parental wealth than an intelligence test is.

If Sarah Lawrence's stand becomes widespread -- and it probably will -- then admission to many highly valued American colleges will be determined not by cognitive ability but by the social and communicative skills that can only be acquired by long and expensive socialisation.

What is baffling about this is that, as with the abolition of the 11-plus in Britain, such a development is led by intelligent and educated people who surely must have gained enormously from the spread of standardised testing over the last century.

The Ten Happiest Colleges in the US

According to the Princeton Review they are:

1. Whitman College (Walla Walla, Washington)
2. Brown
3. Clemson University (South Carolina)
4. Princeton
5. Stanford
6. Tulsa
7. College of New Jersey
8. Bowdoin (Maine)
9. Yale
10. Thomas Aquinas College (California)

I am sure that the Princeton Review's methods could be argued about, but it is striking that four of the universities on this list are also in its top ten for selectivity (see the next item).

The Princeton Review Rankings

The Princeton Review has come out with a variety of rankings. One of them ranks US colleges by how hard they are to get into, which many would think is a good proxy for general quality. Here are the top ten.

1. Harvard
2. Princeton
3. MIT
4. Yale
5. Stanford
6. Brown
7. Columbia
8. Pennsylvania
9. Washington University in St Louis
10. Caltech

Monday, August 20, 2007

US News and World Report Top Ten

The US News and World Report rankings of American colleges are out. A full report is here.
Briefly this is how they are produced:

"To rank colleges and universities, U.S. News first assigns schools to a group of their peers, based on the basic categories developed by the Carnegie Foundation for the Advancement of Teaching in 2006. Those in the National Universities group are the 262 American universities (164 public and 98 private) that offer a wide range of undergraduate majors as well as master's and doctoral degrees; many strongly emphasize research.

In each category, data on up to 15 indicators of academic quality are gathered from each school and tabulated. Schools are ranked within categories by their total weighted score."
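
The "total weighted score" approach described above is easy to picture with a toy example. The sketch below is purely illustrative: the schools, indicators, weights and normalisation are invented, and are not USNWR's actual data or formula.

# Illustrative sketch of ranking institutions by a total weighted score.
# Schools, indicator values and weights below are invented for the example.
schools = {
    "Alpha U": {"peer_assessment": 4.5, "retention": 97, "selectivity": 92},
    "Beta U":  {"peer_assessment": 4.1, "retention": 95, "selectivity": 96},
    "Gamma U": {"peer_assessment": 3.8, "retention": 90, "selectivity": 88},
}
weights = {"peer_assessment": 0.25, "retention": 0.20, "selectivity": 0.15}

# Normalise each indicator against the best performer, weight it, and sum.
best = {k: max(s[k] for s in schools.values()) for k in weights}
scores = {
    name: sum(weights[k] * vals[k] / best[k] for k in weights)
    for name, vals in schools.items()
}
for rank, (name, score) in enumerate(sorted(scores.items(), key=lambda x: -x[1]), start=1):
    print(rank, name, round(score, 3))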

The top ten are:

1. Princeton
2. Harvard
3. Yale
4. Stanford
5. Pennsylvania
6. Caltech
7. MIT
8. Duke
9. Columbia
9. Chicago



Sunday, August 19, 2007

The Shanghai Science and Maths Rankings 2007

The full rankings, released earlier this year, are here.

Below are the top twenty.

1. Harvard
2. Berkeley
3. Princeton
4. Cambridge
5. Caltech
6. MIT
7. Stanford
8. Tokyo
9. UCLA
10. Oxford
11. Cornell
12. Columbia
13. Chicago
14. Colorado--Boulder
15. ETH Zurich
16. Kyoto
17. Wisconsin -- Madison
18. UC Santa Barbara
19. UC San Diego
20. Illinois -- Urbana-Champaign

Friday, August 17, 2007

The Shanghai Rankings: Social Science

Shanghai Jiao Tong University has also released, earlier this year, research rankings in broad subject areas. First, here are the top twenty for the social sciences; the full ranking is here. These rankings should be taken with a fair bit of salt since they are heavily biased towards economics and business studies. Credit is given for Nobel prizes, although in the social sciences these are awarded only for economics, and psychology and psychiatry are excluded. The two categories of highly cited researchers counted are Social Sciences -- general and Economics/Business.

It would, I think, be better to refer to this as an economics and business ranking.


1. Harvard
2. Chicago
3. Stanford
4. Columbia
5. Berkeley
6. MIT
7. Princeton
8. Pennsylvania
9. Yale
10. Michigan -- Ann Arbor
11. New York Univ
12. Minnesota -- Twin Cities
13. Carnegie-Mellon
14. UCLA
15. Northwestern
16. Cambridge
17. Duke
18. Maryland -- College Park
19. Texas -- Austin
19. Wisconsin -- Madison


Note that there is only one non-US university in the top twenty, Cambridge at number 16. The best Asian university is the Hebrew University of Jerusalem at 40. The best Australian university is ANU, in the 77-104 band. There is no mainland Chinese university in the top 100. This is dramatically different from the picture shown by the THES peer review in 2006.

Thursday, August 16, 2007

The Good University Guide

The London Times (not the Times Higher Education Supplement) has just produced its Good University Guide for British Universities. It is based on eight criteria: student satisfaction, research quality, student staff ratio, services and facilities spend, entry standards, completion, good honours and graduate prospects.

Here are the top ten.

1. Oxford
2. Cambridge
3. Imperial College London
4. London School of Economics
5. St Andrews
6. University College London
7. Warwick
8. Bristol
9. Durham
10. King's College London

I shall try to comment later, but for the moment it is worth pointing out that there are some spectacular rises, notably by King's College, Exeter and City University. This immediately raises questions about the stability of the methods and the validity of the data.

Wednesday, August 15, 2007

News from Shanghai

Shanghai Jiao Tong University has just released its 2007 Academic Ranking of World Universities. The top 100 can be found here. The top 500 are here.

I shall add a few comments in a day or so. Meanwhile here are the top 20.

1 Harvard Univ
2 Stanford Univ
3 Univ California - Berkeley
4 Univ Cambridge
5 Massachusetts Inst Tech (MIT)
6 California Inst Tech
7 Columbia Univ
8 Princeton Univ
9 Univ Chicago
10 Univ Oxford
11 Yale Univ
12 Cornell Univ
13 Univ California - Los Angeles
14 Univ California - San Diego
15 Univ Pennsylvania
16 Univ Washington - Seattle
17 Univ Wisconsin - Madison
18 Univ California - San Francisco
19 Johns Hopkins Univ
20 Tokyo Univ

Tuesday, August 14, 2007

Life Imitates Art

There is a web site, College Ranking Service, that is produced by "a non-profit organization dedicated to providing rankings of colleges in a manner suitable for students, university leaders, and tuition paying parents."

The home page says:

"We take our rankings seriously. Each college is painstakingly analyzed, as if under a microscope, for its flaws and degree of polish. The rankings found on www.rankyourcollege.com represent thousands of hours of research, and are updated annually or at the discretion of the Director.

The Board of the College Ranking Service, composed of Nobel Prize Winners and Captains of Industry, remains anonymous to ensure the integrity of the rankings.

The Director is also anonymous, however, rest assured that he is a prominent member of the academy and a professor of the highest regard at one of the most prestigious universities in the world."

There is also a disclaimer:

"There is no such thing as the "College Ranking Service." But the hyperbole and baloney contained in this web site are not that different from equally silly, but maddeningly serious college ranking publications and web sites offered by the media.

It is a sham and a scam to try to rank the quality of universities like sports franchises. Media publications that do this should be laughed out of existence. They simply measure wealth ("The Classic Method" on this web site), which is something that is at best obtusely related to quality.

Regardless of their lack of validity, media-based college rankings are having a negative influence on higher education. Tuition paying parents and their children are swayed by the false prestige these rankings imply. The push to get into a "top ten" school has created added pressure on students to stuff their high school years with lofty sounding, but often meaningless accomplishments. It has been partly responsible for the rise of a college application industry that provides services (like SAT prep classes and college application consulting) of dubious worth."


CRS also describes its methodology:

"In the course of developing our methodology, we found that our rankings had unique properties. First, we noted a phenomenon well known in particle physics, but unheard of heretofore in ranking systems: a college, like a subatomic particle, could be two or more places at once. In other words, individual colleges could have multiple rankings!

Second, we noted the well known and by now passe Heisenberg phenomenon in our rankings: our rankings were influenced by our evaluation. The more we looked at them in great detail, the more variability we saw. Finally, we found a butterfly effect: small perturbations in our extensive data base resulted in significant changes in our rankings.

The combined influences of these phenomena we term the Kanoeddel effect, in honor of the Director's mother's Passover matzah balls, which even though they were made at the same time, had a wide range in density (from that of cotton balls to that of granite pebbles). In Yiddish, the word for "matzah ball" is "kanoeddel."

Because of the Kanoeddel effect, we note that our rankings are not static. Hitting the refresh button on your web browser will cause the Mighty Max to recompute the rankings, resulting in a slightly different order."

In the Guide to the World's Top Universities, we find a perfect example of the first property with the Technical University of Munich occupying two different places in the rankings and also, in one case, being located in Dortmund.

The butterfly effect is illustrated perfectly by the data entry or transfer error that led to an incorrect figure for the student-faculty ratio of every university in the Guide.

Sunday, August 12, 2007

A Bit More About Student Faculty Ratios

This blog was originally supposed to be about university ranking in general but is in danger of turning into a catalogue of THES and QS errors. I shall try to move on to more varied topics in the future, but here is an elaboration of an earlier post on the student-faculty ratios in the book Guide to the World's Top Universities, published by QS Quacquarelli Symonds Ltd. (QS) and written by Nunzio Quacquarelli, a director of that company, and John O'Leary and Martin Ince, former and current THES editors.

In the first column below I have arranged the universities in the THES-QS rankings in alphabetical order. The middle column contains the student-faculty ratio used in the 2006 World University Rankings published in the THES (top 200) and on the topuniversities website. The figure is derived by converting the scores out of 100 in the rankings back to ratios and cross-checking against QS's figures for faculty and students at topuniversities. The right-hand column contains the student-faculty ratio given in the Guide's directory and in the profiles of the top 100 universities.

The two figures are completely different. But go down three rows in the rankings column and you will find a figure that is identical, or almost identical, to the Guide figure.


Thus Aachen has a ratio of 14.7 in the Guide. Go down three rows and you will find that in the rankings and at topuniversities Aberystwyth has a ratio of 14.7.

So, presumably what happened is that someone was pasting data between files and slipped three rows. This simple mistake has resulted in over 500 errors in the Guide.







University         Ranking   Guide
Aachen             12.4      14.7
Aarhus             10.5      24.1
Aberdeen           10.5      14.8
Aberystwyth        14.7      15.1
Adelaide           24.9      18.1
Adolfo Ibanez      14.8      20.5
Airlangga          15.1      12.3
Alabama            18.1      10.4
Alberta            20.5      13.1
Amsterdam          12.4      16.1
Antwerp            10.4      24.3
Aoyama Gakuin      13.1      13.6
Aristotelian       16.1      15.4
Arizona State      24.3      14.3
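
If the underlying figures were available in machine-readable form, a slip of this kind would be straightforward to confirm programmatically. Here is a minimal sketch using only the values in the table above; the 0.2 tolerance for rounding differences is my own assumption.

# Test whether each Guide figure matches the ranking figure three rows further down.
# The two lists reproduce the Ranking and Guide columns from the table above.
ranking = [12.4, 10.5, 10.5, 14.7, 24.9, 14.8, 15.1, 18.1, 20.5, 12.4, 10.4, 13.1, 16.1, 24.3]
guide   = [14.7, 24.1, 14.8, 15.1, 18.1, 20.5, 12.3, 10.4, 13.1, 16.1, 24.3, 13.6, 15.4, 14.3]

offset = 3
comparable = len(ranking) - offset
matches = sum(
    abs(guide[i] - ranking[i + offset]) <= 0.2   # tolerate small rounding differences
    for i in range(comparable)
)
print(f"{matches} of {comparable} Guide figures match the ranking value three rows down")

Run against these rows, ten of the eleven comparable pairs agree to within 0.1, which is exactly the pattern one would expect from a paste that slipped by three rows.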


I have written to Ben Sowter, director of research at QS, and to the authors at Martin Ince's address. So far the only response has been an automated message indicating that the latter is away.

Friday, August 10, 2007

Correcting Errors

Another site worth reading on law school rankings is MoneyLaw. There is an interesting post on the US News and World Report's correction policy. It seems that if USNWR makes a mistake, the rankings are corrected, but if the school is responsible, the underlying data are corrected while the rankings are not.

MoneyLaw comments:

"I wouldn't call that grossly unfair. Academic research would perhaps demand more attention to setting the record straight, granted. But USN&WR's rankings hardly constitute academic research."

I would add that this policy seems dramatically better than that of most other rankings.

Self-assessment by American universities

State universities and colleges have come up with a plan to publish essential information on their web sites.

The National Association of State Universities and Land-Grant Colleges and the American Association of State Colleges and Universities are, according to the Wall Street Journal,

" designing a template for college Web sites that, for those that opt to use it, shows in standard format: (1) details about admission rates, costs and graduation rates to make comparisons simple; (2) results from surveys of students designed to measure satisfaction and engagement, and (3) results of tests given to a representative sample of students to gauge not how smart they were when they arrived, but how much they learned about writing, analysis and problem-solving between freshman and senior years.

The last one is the biggie. Participating schools will use one of three tests to gauge the performance of students with similar entering SAT scores at tasks that any college grad ought to be able to handle. One test, the Collegiate Learning Assessment, gives students some circumstance and a variety of information about it, and asks for short essays (no multiple choice) on solving a problem or analyzing a scenario. Under the state schools' proposed grading scale, 70% of the schools will report that students did "as expected," given their SATs. An additional 15% will report they did better or much better than expected, and 15% will report students did worse or much worse than expected."


This seems like a good idea. It could even go some way towards making commercial rankings redundant.


Saturday, August 04, 2007

The Wuhan Rankings

I have just noticed, via Wikipedia, a world university ranking by the Research Centre for China Science Evaluation at Wuhan University that seems to be based on current research productivity. Since the website does not have an English version, it is not possible to comment very much about it at the moment. According to Wikipedia it "is based on Essential Science Indicators (ESI), which provides data of journal article publication counts and citation frequencies in over 11,000 journals around the world in 22 research fields". If anyone can look at the website and tell me what the research fields are and what period is covered, I'd be grateful.

I noticed some errors in the rankings. Laval University is listed as being in France rather than Canada, York in the US rather than Canada, and Bern in Sweden rather than Switzerland. Ljubljana is listed as being in "Jugoslavia", a few years out of date.

If the rankers have assessed a broad range of subjects, looked at a recent period and used valid methods, they may have produced a ranking of research achievement that is more current than the Shanghai index, which includes decades-old Nobel and Fields prize winners. The ranking gives low positions to Cambridge and Oxford, confirming suspicions that their high rating by THES-QS is unjustified. Princeton and Yale (strengths in the humanities?) have relatively low places. So do Chicago (strength in the social sciences?) and Caltech.

There are some more surprises. Texas is at number 2. Maybe this represents a genuine advance, or perhaps the presence of a large medical school has something to do with it. "Univ Washington" is at number 3. This most probably means Washington University in St Louis. Before getting too excited about this result, I would like to be sure that there has been no confusion with the University of Washington, Washington State University and George Washington University.



Here is the top 20, along with the scores. Harvard at the top gets 100. The full ranking can be found here.



1. Harvard 100
2. Texas 87.49
3. "Univ Washington" 72.39
4. Stanford 71.91
5. Johns Hopkins 71.45
6. UC Berkeley 70.76
7. UCLA 70.38
8. Michigan 69.11
9. MIT 68.62
10. Toronto 66.90
11. Wisconsin 64.83
12. Columbia 64.71
13. UC San Diego 64.54
14. Pennsylvania 64.42
15. Cambridge 62.93
16. Minnesota 62.80
17. Yale 62.20
18. Cornell 62.19
19. UC San Francisco 61.52
20. Duke 60.60

Friday, August 03, 2007

International Students in Malaysia

QS Quacquarelli Symonds seems to have bad luck with international students in Malaysia. Their topuniversities site has a piece on "Study Abroad in Malaysia" which states

"On the back of its enduring economic and industrial boom, Malaysia is trying hard to position itself as the Asian destination of choice for international students seeking to study abroad, and with some success. Currently there are around 50,000 students from 100 countries in Malaysian tertiary education, forming 20-30% of the student body - and the country wants to promote a multicultural image that reflects the country itself. "


The total number of registered students in tertiary education in Malaysia is in fact about 732,000. International students therefore make up roughly seven per cent of tertiary students, well under the 20-30% claimed.

Rankings Not a Target


According to the Kuala Lumpur New Straits Times, the vice-chancellor of Universiti Malaya has said that the university's international ranking

"should not be a target. Instead, UM’s main aim was to produce
quality work, she added"

Wednesday, August 01, 2007

Problems with Law School Rankings


Another blog worth looking at is Agoraphilia which, among other things, has posts on the US News and World Report law school rankings. One of them deals with the University of Florida receiving an erroneous and over-favourable rating from USN&WR, apparently because it reported the LSAT scores and GPAs only for the fall 2005 intake and did not include those for the spring intake.


What most impresses me about this is that the Dean and Associate Dean of Florida's law school and Robert J. Morse, Director of Data Research at USN&WR, have replied promptly and at length to questions about ranking methods.