Tomorrow, the new 2008 Academic Ranking of World Universities will be officially published. Not surprisingly, it’s an almost all-American affair. It’s rather interesting that the publication of the Shanghai Jiao Tong rankings almost goes unnoticed, especially if you compare it to the publication of the Times Higher Education Supplement/QS World University Rankings (the THES-QS rankings 2008 will be published on 9 October).

This is exactly the strength of the SJT ranking. After all, universities are robust organisations and don’t change a lot in a year’s time. I guess it therefore corresponds with reality that the top 10 of 2008 is exactly the same as that of 2007. Actually, not much has changed at all (although I of course did notice that the University of Sydney – my former employer – entered the top 100; the top 500 list is here).

2008(2007) University
1 (1)   Harvard University
2 (2)   Stanford University
3 (3)   University of California – Berkeley
4 (4)   University of Cambridge
5 (5)   Massachusetts Inst Tech (MIT)
6 (6)   California Inst Tech
7 (7)   Columbia University
8 (8)   Princeton University
9 (9)   University of Chicago
10 (10)   University of Oxford

The main critique of the SJT rankings is that they only give an indication of a university’s research quality. They have only one proxy for teaching quality, and that one says very little about teaching quality at all. I have already pointed to some alternatives to these research-biased rankings and league tables, for instance the new ranking being developed by CCAP (the Center for College Affordability and Productivity).

This last one has now been published by Forbes Magazine. And yes…the criteria are very different from the ones we are used to:

  1. Listing of Alumni in the 2008 Who’s Who in America (25%)
  2. Student Evaluations of Professors (25%)
  3. Four-Year Graduation Rates (16 2/3%)
  4. Enrollment-adjusted numbers of students and faculty receiving nationally competitive awards (16 2/3%)
  5. Average four-year accumulated student debt of those borrowing money (16 2/3%)
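To see how these weights combine, here is a minimal sketch of a weighted composite score over the five criteria above. The component names, the 0–100 scale, and the example values are my own illustrative assumptions, not the actual CCAP methodology:

```python
# Illustrative weighted-sum sketch of the five Forbes/CCAP criteria.
# Component names and the 0-100 scale are assumptions for this example.
WEIGHTS = {
    "whos_who_alumni": 0.25,       # 25%
    "student_evaluations": 0.25,   # 25%
    "graduation_rate": 1 / 6,      # 16 2/3%
    "competitive_awards": 1 / 6,   # 16 2/3%
    "student_debt": 1 / 6,         # 16 2/3% (lower debt would score higher)
}

def composite_score(components: dict) -> float:
    """Weighted sum of component scores, each assumed on a 0-100 scale."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights total 100%
    return sum(WEIGHTS[k] * components[k] for k in WEIGHTS)

# Hypothetical college scoring 80 on every criterion:
example = {k: 80.0 for k in WEIGHTS}
print(round(composite_score(example), 2))  # → 80.0
```

Note that the two 25% criteria together carry as much weight as the three 16 2/3% criteria combined, so alumni listings and student evaluations dominate the outcome.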

And what’s the result?

2008 University
1 Princeton University
2 California Institute of Technology
3 Harvard University
4 Swarthmore College
5 Williams College
6 United States Military Academy
7 Amherst College
8 Wellesley College
9 Yale University
10 Columbia University

Compared with the SJT rankings, it is especially the liberal arts colleges and the military academies that stand out in the Forbes ranking. The high-quality liberal arts colleges in the US (and elsewhere) are unfortunately lacking in nearly all international rankings. The reason for this is of course again that these rankings are so research biased.

Another thing that I noticed after looking through the rest of the list is the relatively low standing of the public research universities. The University of Virginia is the first one at 43, the University of North Carolina at Chapel Hill is at 66 and UC Berkeley at 73. This is probably due to another flaw in most rankings: they measure the quality of the graduates without looking at the quality of the inputs. For more criticism of this ranking, see the comments on Vedder’s article in Inside Higher Ed and the critical contribution of Patricia McGuire.

This challenge of actually measuring the added value provided by the university is taken up by the OECD’s AHELO project: assessing learning outcomes in higher education (sometimes referred to as the PISA for higher education). This exercise is still in its early stages; currently they are studying the feasibility of such an exercise. And although the OECD explicitly does not want to promote it as a ranking, it might provide an alternative to the league tables.

This article has 7 comments

  1. Kris Olds

    Good entry. But I would disagree with you, as my sense, on the basis of what I have seen going on in higher ed policy circles in Paris, Brussels, the USA and Canada, is that there is significant and *growing* interest in the Shanghai rankings. In the US, in particular, universities pay a lot more attention to the Shanghai rankings than they do to the THES-QS rankings (for a range of reasons). National-scale university and field-specific rankings (e.g., U.S. News & World Report; NRC) still matter most in the US, this said, at least right now. Kris

  2. Eric

    Thanks for that comment Kris. It gives me the opportunity to clarify what I said above.

    I agree with what you said. I do think that the SJT rankings get – and should get – more attention than the THES ranking. That is…from the professional circles.

    I still have the impression that students and media (unfortunately) pay more attention to the THES rankings. Media exposure (especially in the commonwealth) is higher on the THES rankings than on the SJT ranking. And – looking at the search terms that bring people to my blog – I think the general public is more aware of the THES rankings.

    My ‘two cents theory’ is that one of the reasons for this difference in ‘popularity’ is exactly the fact that the THES is so inconsistent. I was not really excited to see the next SJT ranking. I didn’t expect any major changes, and I didn’t get them. I am more excited about the publication of the THES rankings. Simply because you never know what major absurd shifts will be in that league table… Unfortunately!

  3. Maarja Soo

    I see the problem somewhat differently. First, I think it is good if rankings are constructed separately for teaching and research. An attempt to aggregate excellent teaching in a liberal arts college with the lack of research (which is not part of their mission) would end up in a mess that does not inform anybody of anything. Teaching and research are somewhat linked but they are mostly separate “products”. We still do not know whether they complement or substitute each other, or whether this depends on thousands of other things. The problems arise only when people use research rankings for teaching-related decisions. But what can you do?! Some people tend to go for looks and fame, not only in HE.

    Secondly, the SJT ranking is not a research ranking; it is a research-prestige ranking. This explains its stability. It is heavily influenced by Nobel prizes and highly cited authors – i.e. “research stars”. We know that the top private universities simply have more money to hire the “stars”. The real question is to what extent the “stars” actually affect research (or teaching) that goes on at the university in 2008. I noticed the picture of Einstein in your next entry. He certainly brought a lot of prestige to Princeton. With all due respect, by the time he joined Princeton his real contribution to science was done, and he also was certainly not the type to bring in a lot of grant money and encourage “the best talent” to work with him. It is correct that universities are complex organizations and it is unlikely that they substantially change over one year’s time, but this, I am afraid, is not the reason why we see the stability in this ranking. The true reason is that the ranking is tied down by history and money.

    I completely share your concern that there are no public universities on the picture. Fortunately, no reason to despair! If we count citations, University of Michigan, UCLA, University of Washington are in the top 10. This is in my view a better indicator of CURRENT research. By that criterion, by the way, Princeton drops to 120. (See interesting calculations at

    PS. BTW, this is not a personal vendetta against Princeton. It is still my favorite among the giants.

  4. ilanch

    hai friend

    i am pleased with your work. it was very helpful to me. but i need one clarification. i am indian and i wanna do my masters in canada in the field of civil engineering. but i have no idea about the good unis out there. so tell me what should i do to find such one..please help

  5. Pingback: More rankings: Shanghai Jiao Tong, Forbes & AHELO – Beerkens’ Blog | Uniform Stores
