What happened to MIT and Cambridge?
QS now has a new world number one university. Massachusetts Institute of Technology (MIT) has replaced Cambridge and overtaken Harvard.
Unfortunately, this change probably means very little.
Overall the change was very slight. MIT rose from 99.21 to 100 while Cambridge fell from 100 to 99.8.
There was no change in the two surveys that account for half of the weighting. MIT and Cambridge both scored 100 for the academic and the employer surveys in 2011 and in 2012.
On the citations per faculty indicator Cambridge did quite a bit better this year, rising from 92.7 to 97 while MIT fell slightly from 99.6 to 99.3. This could mean that, compared to front-runner Caltech, Cambridge has produced more articles, had its articles cited more often, reduced its faculty numbers, or some combination of the three.
For faculty-student ratio, Cambridge fell slightly while MIT's score remained the same. For international students, both fell slightly.
What made the difference was the international faculty indicator. Cambridge's score went from 98.4 to 98.2 while MIT's rose from 50 to 86.4. Since this indicator carries a five per cent weighting, that rise of 36.4 is worth 1.82 more points in the total score, more than enough to overcome Cambridge's improvement in citations and pull slightly ahead.
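The arithmetic above can be sketched as follows. The weights are the published QS indicator weights (academic reputation 40%, employer reputation 10%, faculty-student ratio 20%, citations per faculty 20%, international faculty 5%, international students 5%); the before/after scores are the ones quoted in this post.

```python
# How much does a change in one QS indicator move the overall score?
# Weights are the published QS World University Rankings weights.
WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def contribution(indicator: str, old: float, new: float) -> float:
    """Points added to the overall score by one indicator's change."""
    return (new - old) * WEIGHTS[indicator]

# MIT's jump on international faculty, 50 -> 86.4:
print(round(contribution("international_faculty", 50.0, 86.4), 2))   # 1.82

# Cambridge's citations improvement, 92.7 -> 97:
print(round(contribution("citations_per_faculty", 92.7, 97.0), 2))   # 0.86
```

The comparison makes the point: even a 4.3-point gain on a 20%-weighted indicator (0.86 points overall) is dwarfed by a 36.4-point jump on a 5%-weighted one (1.82 points overall).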
Having done some rapid cross-checking between the ranking scores and universities' published statistics, I would estimate that a score of 50 represents about 15% international faculty and a score of 86 about 30%.
It is most unlikely that MIT has in one year recruited about 150 international faculty while getting rid of a similar number of American faculty. We would surely have heard about it. After all, even the allocation of office space at MIT makes national headlines. Even more so if they had boosted the total number of faculty.
International faculty is a notoriously difficult statistic for data collectors. "International" could mean anything from getting a degree abroad to being a temporary visiting scholar. QS are quite clear that they mean current national status but this may not always reach the branch campuses, institutes, departments and programs where data is born before starting the long painful journey to the world rankings.
I suspect that what happened in the case of MIT is that somebody somewhere told somebody somewhere that permanent residents should be counted as international or that faculty who forgot to fill out a form were moved into the international category or something like that.
All this draws attention to what may have been a major mistake by QS, namely configuring the surveys so that a large number of universities are squashed together at the top. For the academic survey, there are 11 universities with a score of 100 and another 17 with a score of 99 to 99.9. Consequently, differentiating between universities at the top depends largely on data about students and faculty submitted by institutions themselves. Even if they are totally scrupulous about finding and disseminating data, there are all sorts of things that can cause problems at each stage of the process.
I have not heard any official reaction yet from MIT. I believe that there are some people there who are quite good at counting things, so maybe there will be a comment or an explanation soon.