QS and Flawed Rankings

It’s a fact! Rankings are getting more and more important. In a recent article in Inside Higher Ed I found out that they are even explicitly included in the performance criteria for some university presidents. In this case it was the Arizona State University president, who would get an extra US$60,000 bonus if his university improved its ranking in the U.S. News & World Report. But in the US, resistance to this ranking is gradually growing.

For the rest of the world it is probably the ranking of the Times Higher Education Supplement (THES) that has the biggest impact. I am sure that this ranking functions as a performance criterion for many university leaders around the world (although much more implicitly). All the more reason to be careful and accurate in composing the rankings. That’s what you would think…

This week’s Economist reports on an MBA ranking published by Fortune magazine. The top consists of the usual suspects like Wharton and Harvard. But one well-respected business school was missing from the list: the Kenan-Flagler Business School of the University of North Carolina at Chapel Hill. The school had, for instance, recently placed eighth in a national ranking based on recruiter ratings published by the Wall Street Journal. So… what was going on?

It turned out that Fortune had used a European firm, Quacquarelli Symonds Ltd (QS), to collect data from the schools and create the list. When officials from Kenan-Flagler spoke with QS, they were told that their school had been confused with North Carolina State’s College of Management. NC State rarely appears in any of the major rankings, but it placed 25th on Fortune’s list. Along with Kenan-Flagler, Boston University School of Management was also a victim of the flawed research.

After reading the first line, I thought: ‘again!?’ Yep… Quacquarelli Symonds Ltd (QS) did it again.

QS is the (UK-based) organisation responsible for the THES rankings, and it is making a real mess of it. The first time I noticed this was when the University of Malaya (UM) fell in the rankings from position 89 in 2004 to 169 in 2005. This caused considerable political turmoil in Malaysia. The result of the ranking was even discussed in Parliament, and even Prime Minister Abdullah Ahmad Badawi expressed his concerns. The Vice Chancellor at the time of the ‘rise and fall’ of UM did not continue for another term as VC…

It turned out that QS had counted all the Malaysian Chinese and Malaysian Indian students as foreign students (one of the criteria in their rankings) in 2004. In 2005, they corrected their mistake, with a steep drop in the rankings as a result. During last year’s publication of the THES ranking I was in Kuala Lumpur (at UM) and could see the impact of the THES rankings for myself. I’ve never seen so many articles and letters about higher education in regular newspapers.

But… this was just the tip of the iceberg. It’s a good thing that there are people like Richard Holmes who keep a close watch on the methodology used in these rankings. He reports on many, many instances where QS messed up. Here are a few examples:

  • On the flaws of peer review, and especially of giving it such a heavy weight (40%), have a look here (on the geographical bias), here (comparing the peer review with citations) and here (on the methodology of the peer review: a response rate of 0.8%!).
  • On the faculty/student ratio. All indicators are indexed on the best performer on that indicator. For the faculty/student ratio in 2005 this was Duke University. It was clear that this figure was wrong (not Duke’s mistake but QS’s), yet it was not corrected for the 2006 rankings. Here he finds out what mistakes were made. Since the rest was indexed on this score, the other scores are wrong as well (see the sketch after this list)!
  • There were also other universities where things went wrong, for instance here and here.
  • And then there are simply a lot of factual mistakes reported here. No…that’s not all. There are many more.
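To see why one bad figure at the top matters so much, here is a minimal sketch of the kind of indexing described above: every university’s score is scaled against the best performer on that indicator, so an inflated figure for the top scorer drags down everybody else’s index even though their own data never changed. The scaling formula and the ratios below are my own assumptions for illustration; they are not actual QS data.

    # Minimal sketch (Python): indexing all scores against the best performer.
    # The faculty/student ratios below are made up for illustration only.

    def index_to_best(raw_scores):
        """Scale each raw score so that the best performer gets 100."""
        best = max(raw_scores.values())
        return {name: round(100 * value / best, 1) for name, value in raw_scores.items()}

    # Hypothetical, correct faculty/student ratios (faculty per student).
    correct = {"Duke": 0.11, "University B": 0.10, "University C": 0.08}
    # The same data, but with the top performer's figure inflated by a data error.
    inflated = dict(correct, Duke=0.22)

    print(index_to_best(correct))   # University B indexes at ~90.9
    print(index_to_best(inflated))  # University B drops to ~45.5, though its own data is unchanged

A single uncorrected error in the reference score therefore distorts the whole column, which is exactly the complaint about the Duke figure.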

Holmes informed THES in an open letter about all of the QS mistakes, but they are not exactly in a hurry to correct them.

There is criticism from the research community as well. See for instance this article in The Australian by Simon Marginson, a higher education researcher at the Centre for the Study of Higher Education at the University of Melbourne. He too agrees that rankings are here to stay, but he provides some valuable input for improving them.

It’s rather disappointing that reputable publications like THES and Fortune use the services of companies like QS. QS clearly doesn’t have a clue about the global academic market and has no understanding of the impact its rankings are having throughout the world. There has been a lot of critique of the indicators they use, but at least we can see those indicators. It’s the mistakes and biases behind the indicators that make it unacceptable!

Fortune has already taken the results of the MBA ranking off its website. I wonder whether THES will do the same thing… Probably not.

For the THES/QS World University Rankings of 2007, look here.
