It’s a fact! Rankings are becoming more and more important. In a recent article in Inside Higher Ed I read that they are even explicitly included in the performance criteria for some university presidents. In this case it was the Arizona State University president, who would receive an extra US$60,000 bonus if his university improved its position in the U.S. News & World Report ranking. But in the US, resistance against this ranking is gradually growing.

For the rest of the world it is probably the ranking of the Times Higher Education Supplement (THES) that has the biggest impact. I am sure that this ranking functions as a performance criterion for many university leaders around the world (although much more implicitly). All the more reason to be careful and accurate in composing the rankings. That’s what you would think…

This week’s Economist reports on an MBA ranking published by Fortune magazine. The top consists of the usual suspects like Wharton and Harvard. But one well-respected business school was missing from the list: the Kenan-Flagler Business School of the University of North Carolina at Chapel Hill. The school had, for instance, recently placed eighth in a national ranking based on recruiter ratings published by the Wall Street Journal. So… what happened?

It turned out that Fortune had used a European firm, Quacquarelli Symonds Ltd (QS), to collect data from the schools and create the list. When officials from Kenan-Flagler spoke with QS they were told that their school had been confused with North Carolina State’s College of Management. NC State rarely appears in any of the major rankings, but it placed 25th on Fortune’s list. Along with Kenan-Flagler, Boston University School of Management was also a victim of the flawed research.

After reading the first line, I thought: ‘again!?’ Yep… Quacquarelli Symonds Ltd (QS) did it again.

QS is the UK-based organisation responsible for the THES rankings, and they are making a real mess of it. The first time I noticed this was when the University of Malaya (UM) fell in the rankings from position 89 in 2004 to 169 in 2005. This caused considerable political turmoil in Malaysia. The result of the ranking was even discussed in Parliament, and Prime Minister Abdullah Ahmad Badawi expressed his concerns. The Vice Chancellor at the time of the ‘rise and fall’ of UM did not continue for a second term as VC…

What turned out to be the case was that QS had counted all the Malaysian Chinese and Malaysian Indians as foreign students (one of the criteria in their rankings) in 2004. In 2005 they corrected their mistake, with a steep drop in the rankings as a result. During last year’s publication of the THES ranking I was in Kuala Lumpur (at UM) and could experience the impact of the THES rankings first-hand. I have never seen so many articles and letters about higher education in the regular newspapers.

But… this was just the tip of the iceberg. It’s a good thing that there are people like Richard Holmes who keep a close watch on the methodology used in those rankings. He reports on many, many instances where QS messed up. Here are a few examples:

  • On the flaws of peer review, and especially of incorporating peer review as such an important criterion (40%): have a look here (on the geographical bias), here (comparing the peer review with citations) and here (about the methodology of the peer review: a response rate of 0.8%!!!).
  • On the faculty/student ratio. All indicators are indexed on the best performer on that indicator. For the faculty/student ratio in 2005 this was Duke University. It was clear that this figure was wrong (not Duke’s mistake but QS’s), yet it was not corrected for the 2006 rankings. Here he finds out what mistakes were made. Since the rest was indexed on this score, the other scores are wrong as well (see the sketch after this list)!
  • There were also other universities where things went wrong, for instance here and here.
  • And then there are simply a lot of factual mistakes, reported here. No… that’s not all. There are many more.
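Because every indicator is indexed on its top performer, one bad reference value silently rescales the whole column. Here is a minimal sketch of that mechanism (in Python, with invented figures purely for illustration; these are not QS’s actual numbers or method):

```python
def index_on_best(raw_scores):
    """Rescale raw scores so that the best performer gets 100."""
    best = max(raw_scores.values())
    return {uni: round(100 * score / best, 1) for uni, score in raw_scores.items()}

# Hypothetical faculty/student ratios (faculty per student) -- invented data.
correct = {"Duke": 0.20, "Uni A": 0.18, "Uni B": 0.10}
# The same data, but with the top performer's figure inflated by a data error.
flawed = dict(correct, Duke=0.40)

print(index_on_best(correct))  # {'Duke': 100.0, 'Uni A': 90.0, 'Uni B': 50.0}
print(index_on_best(flawed))   # {'Duke': 100.0, 'Uni A': 45.0, 'Uni B': 25.0}
# One wrong number at the top halves everyone else's indexed score.
```

This is exactly why Holmes’ point matters: if the 2005 reference figure was wrong, every other university’s indexed score on that indicator was wrong with it.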

Holmes informed THES in an open letter about all of the QS mistakes, but they are not exactly in a hurry to correct them.

There is criticism from the research community as well. See for instance this article in The Australian by Simon Marginson, a higher education researcher at the Centre for the Study of Higher Education at the University of Melbourne. He too agrees that rankings are here to stay, but he provides some valuable input for improving them.

It’s rather disappointing that reputable publications like THES and Fortune use the services of companies like QS. QS clearly doesn’t have a clue about the global academic market and has no understanding of the impact that its rankings have throughout the world. There has been a lot of critique of the indicators they use, but at least we can see these indicators. It is the mistakes and the biases behind the indicators that make it unacceptable!

Fortune has already taken the results of the MBA ranking off its website. I wonder whether THES will do the same… Probably not.

For the THES/QS World University Rankings of 2007, look here

This article has 8 comments

  1. John

    Hi Eric,

    Thanks for the post. It’s an incredibly interesting read, as I have long been aware of the poor research methodologies behind the THES university rankings (among others). It’s a clear indication of the negative social, political and economic impacts that poor research can have.

    The folks at QS really need to be sanctioned for their poor research skills. Even a first grader would know that a 0.85% response rate yields an extraordinarily unrepresentative sample. Despite this they still went ahead and published the results, which in such a case are simply invalid.

    It’s a shame, though, that no one of social or institutional prominence has stepped up and officially denounced the rankings.

    John.

  2. Eric

    John… I couldn’t agree more! There does, however, seem to be increasing resistance from small colleges against the US News & World Report rankings. I guess it takes some clout to make the resistance worthwhile. No sign of that yet in the field of international rankings.

  3. Zack

    Dear Eric,

    I have to agree with you that the THES World University ranking has the biggest impact on universities outside the US. No matter how QS conducts its university ranking, a drop in the ranking from year to year will obviously make a university suffer in terms of reputation and public perception. The continuous decline in the ranking of the University of Malaya from 2004 to 2006 is the best example of such a case. The drastic drop from 169th in 2005 to 192nd in 2006 (behind UKM) has put the University’s status as the premier university of Malaysia in doubt. Does this really indicate that the quality of the University is declining? That is pretty doubtful. Anyway, we shall see what happens to the University’s THES world ranking in 2007, which will be released on November the 9th, 2007.

  4. Eric

    Zack,
    Yes, I’ll keep my eyes on the new rankings tomorrow (or actually… I found a leak today). And of course on the reactions to them around the world, especially in Malaysia. But I wonder how much the rankings really affect reputation nationally. UM has long been the preferred (national) destination for Malaysian students, and I don’t think that last year’s rankings changed that, or that students now prefer to go to UKM.

    It will be interesting to see how the changes in methodology affect this year’s rankings. Have a look at Richard Holmes’ latest post on that.

    Also, a shift in the THES rankings from 169 to 192 is barely significant, especially if you take into account the fluctuations that some other universities have experienced in the past years. QS and THES should categorise institutions the way the Shanghai ranking does: specify ranks 1-50 individually and then create categories for 50-100, 100-200, etc. That would at least not create the illusion that number 180 is better than number 190.

  5. Zack

    Eric,

    Personally, I believe that a university should be given a precise rank in the THES world university ranking, just like in a class where students are ranked academically according to the average marks they scored in the exam. As long as QS adopts a reliable methodology to compile its university ranking, and if the outcome indisputably indicates that UKM is better than UM, there is nothing wrong with accepting that number 185 is better than number 192, as the solid evidence is there to prove it. What saddens me most is that QS continues to use peer review and recruiter review (which are rather subjective) as the major considerations in compiling the world university ranking this year. I believe that a university should be judged largely by the quality of its research, the number of international papers published, and the quality of its staff and graduates. Anyway, thanks for sharing the leaked information on the top 100 THES universities in 2007. It is not surprising to see that Harvard has been ranked as the number 1 university in the world for four consecutive years. I am anxious to see how Malaysians react after the list of the top 200 THES world universities for 2007 is released tomorrow. Will there be any Malaysian university in the top 200, or will none of them make it this time? I won’t make any comment on that until the full results are released. Let’s see what happens.

  6. Pingback: On the use of rankings and league tables | Beerkens' Blog

  7. zack

    The THES QS World University ranking for 2008 will be released on 9th October 2008. Does anyone know the new ranking of the top 200 universities in the world for 2008? Please share the leaked information if you have it. Thank you.

  8. Pingback: The Social Significance of University Rankings « Scott Sommers’ Taiwan Blog
