Ranking season is over. Yesterday, the Times Higher published its new ranking, which also marked the end of the ranking season for this year. After the Shanghai Jiao Tong ranking, the Leiden ranking, the QS ranking and the Taiwan ranking, this was the fifth attempt to illustrate the differences in quality among the world’s universities. Whether they have succeeded remains a matter of debate.
Although there are quite a few differences in the results of the rankings, some common observations can be made. First of all, it is clear that the United States is still home to the best universities. In all rankings the US universities are dominant and Harvard is the undisputed leader. Only in the QS ranking did a non-US university – Cambridge – top the list.
Another observation is that none of the rankings manages to sufficiently capture the quality of teaching in its assessment. The THE ranking made an attempt to do so, but most of its indicators still reflect research quality and prestige more than the quality of teaching. The Shanghai, Leiden and Taiwan rankings put most emphasis on research.
Even though the rankings predominantly assess research – although in different ways – the results are very different. To illustrate this point, I have mapped the results of the Dutch research universities in the different rankings. The results are shown in the graph below (click to enlarge).
The results for the twelve universities (the thirteenth, Tilburg University, somehow doesn’t appear in the rankings) show substantial variation for all of them. For universities like Eindhoven, Twente and Maastricht, the variation seems exceptionally large. Eindhoven, for instance, was ranked as the best of these universities in the THE ranking while performing worst in the Shanghai ranking. Leiden shows the least variation, but even here the difference between its rank in the Shanghai ranking (70th) and the THE ranking (124th) is enormous.
Earlier this week at the OECD/IMHE conference, Charles Reed, Chancellor of California State University, critiqued the rankings (“rankings are a disease”) and argued that all universities add value. I guess he’s right. And the value measured by one ranking seems to be quite different from the value measured by another…
Studying abroad for a full degree has developed from an elite into a mass phenomenon. Parallel to this development, we have witnessed a commercialization of international higher education in which many institutions have become financially dependent on full-fee-paying international students. To operate in this (global) market, institutions – and especially the lesser-known ones – now frequently turn to agents and recruiters to attract prospective students. Many point to the risks of these third-party agents and plead for more regulation or even abolition.
Abolish or regulate? In Inside Higher Ed, Philip Altbach, Director of the Center for International Higher Education, sheds light on this issue. His viewpoint is clear and unambiguous: “Agents and recruiters are impairing academic standards and integrity — and it’s time for colleges and universities to stop using them.”
These agents recruit prospective students and provide general information, but – according to Altbach – in reality they also make offers to students or actually admit them, often based on murky qualifications (even though the colleges that hire them say that they still control admissions).
Some initiatives have appeared in the United States with the objective of regulating this ‘new profession’, but these organisations lack the powers to monitor compliance or discipline violators. The solution Altbach provides is simple: abolish them! After all, they have no legitimate role in higher education.
Dutch self-regulation
In the Netherlands, the institutions have chosen self-regulation as the prime instrument for managing (the excesses of) international recruitment. The sector-wide ‘code of conduct’ sets out standards for Dutch higher education institutions in their dealings with international students. One chapter in this code of conduct (pdf) deals with the use of agents. The provisions in this chapter stipulate that agents have to act in the spirit of the code, and they clarify the responsibilities of the agents and those of the higher education institutions. One of the starting points is that admission remains the responsibility of the institutions and that institutions have to take immediate action in cases of unethical behavior.
This way of dealing with the risks of student recruitment (in an increasingly commercialized market) is somewhat comparable to the method of self-accreditation or ‘accreditation lite’ in the United States. Altbach criticized this method because of its lack of powers in cases of non-compliance.
In some of the comments below the Inside Higher Ed article, Altbach’s view is portrayed as elitist. Prestigious American schools like Boston College might not need such recruiting agencies. But what about less prestigious universities? What about the ones that are not part of the Ivy League, the Russell Group or the Group of 8? Maybe these institutions do need professional assistance in reaching prospective international students.
For these institutions, abolition might not be acceptable. In addition, it remains questionable whether all of these agents are rogue operators. Is Altbach’s opinion also valid for agents in other parts of the world? Or is this phenomenon only apparent in the more commercialized higher education sectors (Altbach mainly refers to the USA, Australia and the UK)?
Either way, even if the number of malicious operators is small, some form of regulation might be necessary to protect the numerous international students who are about to invest a lot of money in their future. Should we let the market do its work, or does this sector need government protection? Or is there enough trust in the higher education institutions (and in the majority of the agents) that we can rely on soft instruments like codes of conduct and other forms of self-regulation?
Summer holidays are over. In the global field of higher education, this also means that it is ranking season. Last month it was the Shanghai ranking; this week the QS World University Rankings were revealed; and in two weeks the all-new Times Higher Education (THE) ranking will be published. Ranking season also means discussions about the value of rankings and about their methodologies. Two points of critique are addressed here: the volatility of (some) rankings and the overemphasis on research in assessing universities’ performance.
Volatility and stability in international rankings
This year’s discussion has become extra fierce (and nasty now and then) because of THE’s decision to part with consultancy agency QS and to collaborate with Thomson Reuters, a global research-data specialist. The previous joint THE/QS rankings usually received quite some media attention. This was not just because their methodology was heavily criticized (and rightly so) but also because this disputed methodology led to enormous fluctuations in the league tables from year to year. The critique prompted THE to join forces with Thomson Reuters, while QS continues its own ranking.
Although the various rankings differed in their methodology, they all seemed to agree on two things: the hegemony of the United States’ universities and the undisputed leadership of Harvard. This week’s QS ranking again showed the volatility of its methodology. For the first time Cambridge beat Harvard, and for the first time the top ten is not dominated by US universities: it is now occupied by five US universities and five UK universities.
The Shanghai ranking, on the other hand, shows much less fluctuation. This probably reflects reality better, but it also makes the ranking less sensational and thus less attractive for wide media coverage. The two graphs below clearly show the difference between the stable Shanghai rankings and the volatile QS rankings for a selection of Dutch universities.
The graphs show the positions over the past six years for the four Dutch universities that appear in the top 100 of the Shanghai and/or QS ranking. To illustrate the relative meaning of the absolute positions: the Shanghai ranking groups institutions above rank 100 (this also explains the relatively steep drop for Erasmus University in the 2006 ranking). Although Amsterdam has remained fairly stable in the rankings, Leiden and Utrecht show quite some fluctuation. Much more than their real quality would justify.
And for those who think this is volatile: it can be much worse. Simon Marginson, in a 2007 paper, lists dozens of cases where drops or increases of more than 50 positions (sometimes even up to 150 positions) occur in a single year. A case in point is the Universiti Malaya, which went from “close to world class” to “a national shame” in only two years…
It will be interesting to see in the coming years how the new THE/Thomson Reuters methodology will work out in this respect. The Times Higher published its methodology this week. While the QS ranking bases its listing on only 6 indicators (with a 50% weighting going to reputational surveys), the new THE ranking takes into account 13 indicators (grouped in five categories). Considering this higher number of indicators, and considering that the weight of reputational surveys is significantly lower, it is likely that fluctuation will be lower than in the QS ranking. Time will tell…
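The link between a heavy reputational-survey weight and year-to-year volatility can be illustrated with a small simulation. This is a toy model of my own, not the actual QS or THE methodology: each university gets a stable ‘research’ score, the reputation score is that same score plus fresh survey noise every year, and the composite is a weighted mix of the two.

```python
import random

def composite_scores(weight_rep, n=100, noise=0.3, seed=1):
    """Simulate composite scores for n universities in two consecutive years.

    Illustrative assumption: 'research' quality is stable, while the
    reputation survey adds fresh noise each year. The composite weighs
    the noisy reputation score by weight_rep.
    """
    rng = random.Random(seed)
    research = [rng.random() for _ in range(n)]

    def one_year():
        return [weight_rep * (r + rng.gauss(0, noise)) + (1 - weight_rep) * r
                for r in research]

    return one_year(), one_year()

def rank(scores):
    """Convert scores into league-table positions (1 = best)."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    positions = [0] * len(scores)
    for place, i in enumerate(order):
        positions[i] = place + 1
    return positions

def mean_rank_shift(weight_rep):
    """Average absolute position change between the two simulated years."""
    year1, year2 = composite_scores(weight_rep)
    r1, r2 = rank(year1), rank(year2)
    return sum(abs(a - b) for a, b in zip(r1, r2)) / len(r1)

# A QS-like 50% reputation weight produces larger year-to-year rank
# shifts than a lower weight, all else being equal.
print(mean_rank_shift(0.5), mean_rank_shift(0.15))
```

The point of the sketch is only the direction of the effect: with everything else held constant, scaling up the weight of the noisy survey component mechanically increases league-table churn.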
Are international rankings assessing teaching quality?
Another frequently mentioned critique of the existing international rankings is that they put too much emphasis on assessing research and neglect the teaching function of the university. Since the new THE ranking more than doubles the number of indicators, the assessment is likely to correspond better with the complex mission of universities.
If we look at the new methodology, this indeed seems to be the case. The teaching function now constitutes 30% of the overall score and is based on 5 indicators. In the QS ranking, it was based on only 2 indicators (an employer survey and the staff/student ratio).
The 5 indicators are:
- Reputational survey on teaching (15%)
- PhD awards per academic
- Undergraduates admitted per academic (4.5%)
- Income per academic (2.25%)
- PhD and Bachelor awards
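As a minimal sketch of how such a sub-score aggregates, the weighted sum can be written out. Only the three weights actually quoted above are used (the other two indicators are listed without weights, so they are left out), and the indicator values are assumed to be pre-normalised to a 0–100 scale; the dictionary keys are illustrative names, not THE’s own.

```python
# Weights as quoted above; the remaining two teaching indicators have no
# stated weight here and are therefore omitted from this sketch.
TEACHING_WEIGHTS = {
    "reputational_survey": 0.15,      # 15%
    "undergrads_per_academic": 0.045,  # 4.5%
    "income_per_academic": 0.0225,     # 2.25%
}

def teaching_subscore(indicators):
    """Weighted sum of normalised (0-100) teaching indicators.

    Indicators without a listed weight are ignored, so this is an
    illustrative fragment of the full 30% teaching category.
    """
    return sum(TEACHING_WEIGHTS[name] * value
               for name, value in indicators.items()
               if name in TEACHING_WEIGHTS)
```

A university scoring the maximum on all three listed indicators would collect 15 + 4.5 + 2.25 = 21.75 of the 30 teaching points, which shows how dominant the reputational survey is within the teaching category.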
A closer look at these 5 indicators, however, raises the question of how much they actually relate to teaching.
- First of all, one can wonder whether a reputational survey really measures the quality of teaching, or whether it is in reality another proxy for research reputation. Colleagues and peers around the world often have some idea of the quality of research at other institutions, but is it likely that they can seriously evaluate the teaching there? Apart from the institutions where they graduated or worked themselves, it is unlikely that they can give a fair judgment of teaching quality at other institutions, particularly institutions abroad.
- Two other questionable indicators of teaching quality are the number of PhDs awarded and the number of PhD awards per academic. In the Netherlands, and in many other countries in continental Europe and elsewhere, these say much more about the research quality and research intensity of an institution than about its teaching quality.
- The indicator ‘Undergraduates admitted per academic’ seems the same as the old student/staff ratio indicator. Assuming that a lower number is better, this again benefits research-intensive institutions more than others. Research-intensive institutions employ relatively many academics, but many of them will have research-only contracts. Yet in this indicator they will still lead to a higher score on teaching quality.
- ‘Income per academic’ is also a dubious indicator. Assuming this concerns the average annual income of academics, there is no reason to believe that higher salaries benefit the quality of teaching in particular. It could be argued that salaries nowadays are more related to research quality and productivity than to teaching quality. And if income per academic refers to the external financial resources that an academic attracts, it would be even more an indicator of research intensity.
Although the new THE ranking methodology seems to put more emphasis on teaching, on closer inspection this is rather misleading. All this again shows how difficult it is to measure teaching quality. But as long as international rankings do not address teaching quality sufficiently, they cannot fulfill their function as a transparency instrument for international students.
And another academic year begins…
The first Monday in September traditionally marks the start of the academic year in the Netherlands. It’s the occasion where university leaders look ahead to the year to come and where inspiring speakers are invited to present their views and opinions. It is also an opportunity to see what the big issues are in Dutch higher education and how prominent the international dimension is within them. What will upcoming speakers (and past speakers, in cases where the opening of the year took place before today) talk about?
A quick look at the guest speakers for this year and the topics of their speeches reveals that the universities have their eyes set on the future. The future of higher education seems to be the preferred topic in this year’s opening ceremonies.
The future is digital
European Commissioner Neelie Kroes, responsible for the Digital Agenda for the EU, will deliver a speech with the promising title ‘Europe 3.0’ at the Erasmus University in Rotterdam. Although some might claim that Europe has not yet entered the Web 2.0 era, Kroes – also an alumna of Erasmus University – will reveal her ideas on the digital future of Europe. An IT festival with the theme ‘Erasmus Virtual Campus’ will precede the opening ceremony and will include presentations on e-learning and e-research.
At Inholland University of Applied Sciences (UAS), the future is digital. The ongoing digitization of society and the blurring of the boundaries between physical and virtual reality will provide new opportunities for higher education, according to keynote speaker and trend watcher Adjiedj Bakas, speaking in Rotterdam last Wednesday.
Differentiation is the future
The University of Maastricht addresses the question of what the world will look like in 20 years, and what universities should be doing today to gear their education and research towards this outlook. And who better to ask about the future than a historian? In Maastricht, historian and author Bettany Hughes will present her views on the Socratic future of education and society.
Discussions on the future of Dutch higher education focus mainly on the report ‘Differentiation in threefold’, written by an international advisory committee on the future sustainability of Dutch higher education. The committee’s chairman – Cees Veerman – spoke at Saxion UAS and HAN UAS, and will appear at the University of Utrecht today to address the question: “Is knowledge still power?” At Utrecht UAS, the report was discussed in the context of Europe and the position of the universities of applied sciences in Europe.
The future is Europe?
The University of Amsterdam takes the future of Europe as its central theme for this year’s opening. Daniel Cohn-Bendit, chairman of the European Parliament’s Green Party, will be keynote speaker and will explore how ‘The European Dream’ has evolved over the past few decades. Also, three of its professors will consider ‘The End of Europe’ through the lens of their respective disciplines (Eastern European Studies, European Law and Communication Science).
Study abroad is the future
A special mention should go to a very exceptional opening. The Dutch students studying at universities abroad, united in NEWS (Netherlands Worldwide Students), organized their own virtual opening of their global academic year. In a virtual address, Alexander Rinnooy Kan sent them the message that the Netherlands can only survive as a knowledge economy if we excel internationally, and that we need students who are aware of the opportunities abroad.
Some other interesting speeches planned for today are:
- ‘Two Cultures’ by Pieter Winsemius (member of the Scientific Council for Government Policy) at the University of Twente. He addresses the question how the natural sciences and the social sciences can reinforce each other and how the university contributes to society.
- ‘How engineers can save the world’ by Rosalind Williams, Professor in Science, Technology and Society at MIT, speaking at Eindhoven University of Technology.
- ‘Looking further ahead: Research and innovation for the long term’ by Robbert H. Dijkgraaf, President of the Royal Netherlands Academy of Arts and Sciences (KNAW) speaking at Leiden University.
The Nuffic (the Netherlands Organisation for International Cooperation in Higher Education, which also happens to be my current employer) has launched its Nuffic International Education Monitor today. I’m sure this will be a valuable tool for many international educators, higher education/international education researchers, and others interested in the international dimension of higher ed.
The monitor tracks developments in almost 50 countries all over the world. It provides up-to-date country information and explores core themes in international higher education. It offers a selection of the news on international higher education, categorised thematically in seven dossiers and by country, as well as a daily selection of the most interesting international and Dutch news. Furthermore, it presents monthly overviews of Dutch, European and international policy initiatives and a list of upcoming conferences.
The monitor also features a blog on international higher education issues. As one of the blog contributors, I will cross-post my own contributions here. Some forthcoming topics on the Nuffic blog are: foreign-backed universities, regulation of recruiting agents, Russian-Dutch scientific cooperation, mobility statistics and many others…
Let them know what you think of it!