Wednesday, July 29, 2009

Webometrics Rankings

This is from the Ranking Web of Universities:

"The July edition of the Ranking Web of World Universities (http://www.webometrics.info) shows important news. Most of them are due to changes done to improve the academic impact of the open web contents and to reduce the geographical bias of search engines. As a result, the US universities still lead the Ranking (MIT with its huge Open Courseware is again the first, followed by Harvard, Stanford and Berkeley), but the digital gap with their European counterparts (Cambridge and Oxford are in the region’s top) has been reduced. Even more important, some of the developing countries institutions reach high ranks, especially in Latin America where the University of Sao Paulo (38th) and UNAM (44th) benefits from the increasingly interconnected Brazilian and Mexican academic webspaces.Several countries improves their performance including Taiwan and Saudi Arabia with strong web oriented strategies, Czech Republic (Charles), the leader for Eastern Europe, Spain (Complutense) and Portugal (Minho, Porto) with huge repositories and strong Open Access initiatives. Norway (NTNU, Oslo), Egypt could be also mentioned.On the other side, the underrated are headed by France, with a very fragmented system, Korea, whose student-oriented websites are frequently duplicated, New Zealand, India or Argentina.Africa is still monopolized by South African universities (Cape Town is the first, 405th), as well as Australian Universities are the best ranked for Oceania (Australian National University, 77th)Other well performing institutions include Cornell or Caltech in the USA, Tokyo (24th) Toronto (28th), Hong Kong (91st), or Peking (104th). On the contrary, in positions below expected we find Yale, Princeton, Saint Petersburg, Seoul and the Indian Institutes of Science or Technology."

1 comment:

Anonymous said...

This is from the webometrics site:
The Web covers not only formal (e-journals, repositories) but also informal scholarly communication. Web publication is cheaper while maintaining the high quality standards of peer review. It can also reach much larger potential audiences, offering access to scientific knowledge to researchers and institutions located in developing countries, and also to third parties (economic, industrial, political or cultural stakeholders) in their own communities.

All rankings have serious problems (for the first two rounds, the THES criteria and weights were seriously questioned). But the Webometrics people seem to make some very untenable assumptions. Anyone can put anything on the internet; its value is indeterminate. Moreover, the journals that do have an impact on scientific research are no longer in the public domain on the net. Citation and impact factors are not exactly flawless indicators of scientific productivity, nor is "internationalization" a good indicator of open-minded inquiry. But they are at least weak (if very imperfect) proxies, which is more than can be said for net presence. This webometrics exercise really does not say anything about the quality of an institution in terms of teaching or scientific productivity (or the quality of that productivity). So they should drop the word "ranking," or simply call it a "web popularity" or "web presence" ranking. They should not pretend to rank universities. Conveniently, they do not let anyone contact them by email; they only left a postal address on the web page. So much for "web"-o-metrics.