Archive for the 'What?' Category

Goodbye to AHELO?

Posted by Eric on July 1st, 2014

This article was written for ‘HE – Policy and markets in higher education’ from Research Fortnight and was published there on 24 June.

“It is hard to improve what isn’t measured.” So said Andreas Schleicher, who runs a programme to identify the abilities of school children in different countries on behalf of the OECD. Eight years ago, a similar measurement for higher education was proposed. The Assessment of Higher Education Learning Outcomes (AHELO), as it is known, was due to be discussed this week at the sixth annual international symposium on university rankings and quality assurance in Brussels, but has been dropped from the programme. Where did AHELO go?

A clue could be found in where it came from. The idea was first mooted in 2006 when more than 30 ministers of education gathered in Athens to discuss higher education reforms worldwide. In his opening speech, Angel Gurria, the secretary general of the OECD, suggested undertaking “a comparable survey of the skills and abilities of graduates that might measure learning outcomes in higher education and help families, businesses and governments develop an evidence-based understanding of where and how higher education systems are meeting their expectations of quality and where they are not”. The OECD had developed the methodology to do it, he said. All they needed was a mandate. They got it.

This needs to be seen in the context of discussions about rankings that were taking place at the time. In the early 2000s international rankings in higher education were a new phenomenon. In 2003 the first edition of the Academic Ranking of World Universities was published by Shanghai Jiao Tong University. In 2004 the British magazine Times Higher Education published the first edition of the Times Higher Education-QS World University Rankings, which was later split into the THE World University Ranking and the QS World University Ranking. These were followed by the Taiwan Ranking, the Leiden CWTS Ranking and the rankings of a plethora of newspapers.

The international higher education community was critical, to say the least. It disliked what these rankings measured, how they measured it and the possible consequences. There was too much focus on research and, in particular, on those aspects of research that could be quantified and counted: publications, citations and Nobel prizes. The resulting rankings and scores were presented as quality indicators for all activities in universities, including teaching, and an institution’s position in these rankings appeared to be important in the decision-making processes of international students. As a result, rankers searched for indicators that could express the quality of education, preferably in a single, globally comparable number—something that most experts and practitioners in education knew was an impossible goal.

Those education indicators that were included were poor proxies for the quality of teaching and learning. Teachers’ highest degrees or the ratios of staff to students merely said something about the input, not necessarily about quality. Other indicators, such as reputation or employer surveys as a proxy for teaching quality, were highly controversial. Highly selective institutions—which tended to lead the international rankings—might have many graduates in leading positions, but was that due to the institution’s efforts or the quality of the students it admitted? Is the best school the one that delivers the best graduates or the one that adds most value through the quality of its teaching?

Another problem was the homogenising tendency of international rankings. Having a limited set of indicators could cause institutions to focus on them, instead of the underlying objectives. This could mean specific local contexts would be neglected and local demands would not be met, which could lead to decreasing diversity. The only benchmark in the global knowledge economy—according to the international rankings—seemed to be the world-class, research intensive university.

AHELO—together with transparency tools such as U-Map, which looks at which activities individual institutions across Europe are involved in, and U-Multirank, which examines how well each institution does in these activities—emerged as the answer. The aim was to restore the teaching and learning mission of the university to a central position, look at the added value of institutions rather than their graduates, and show that a range of institutions—not just the so-called world-class research universities—could add value. Although the OECD repeatedly stressed that AHELO was never envisaged as a ranking tool, the rankings issue was frequently mentioned in its favour.

But can AHELO really provide an alternative to international rankings? It seems unlikely. Shortly after Schleicher addressed the conference in March 2013, the OECD published the results of a feasibility study. A staggering three volumes containing more than 500 pages concluded: “It is feasible to develop instruments with reliable and valid results across different countries, languages, cultures and institutional settings.” After that came a deafening silence. The feasibility study may have proved that it is scientifically possible, but it is probably not financially or politically practical—or desirable.

According to the New York Times, the OECD spent about $13 million on the study. How much time and money was spent by the 17 governments and 248 universities involved is unknown—not great for an organisation that values and advocates transparency.

And AHELO does not solve the problem that rankings only look at input and output rather than the value added. In the feasibility study, the OECD notes that value-added analysis is possible but the challenges are considerable. Furthermore, it claims that the difficulties inherent in making comparisons across jurisdictions mean the main use of the results of any value-added analysis would be for institutional self-study and comparisons between programmes within a jurisdiction. Although this might be helpful for a university’s internal quality assessment, it still only assesses learning outcomes after three or four years of study. As has been shown with other instruments, this might be more an indicator of the quality of the admitted students and the quality of secondary education than of the quality of the teaching and learning process within the university.

Finally, as critics have suggested, AHELO risks being at least as homogenising as other ranking systems, while standardisation of learning outcomes could lead to conformity and stifle innovation in teaching even more than the existing rankings do. A one-size-fits-all approach to learning outcome assessments that originates largely from the Euro-American world could undervalue local knowledge.

Schleicher may well have been right about the difficulties of improving what isn’t measured. But being able to measure something is no guarantee of improvement and measurement is not worth pursuing at all costs.

The end of the university? Not likely

Posted by Eric on September 23rd, 2013

This article was first published in University World News.

This year has frequently seen apocalyptic headlines about the end of the university as we know it. Three main drivers have been and still are fuelling these predictions: the worldwide massification of higher education; the increasing use of information and communication technology in teaching and the delivery of education; and the ongoing globalisation of higher education. These developments will render the traditional university obsolete by 2038. At least, that’s what some want us to believe.

The massification of higher education worldwide – even more than the massification in Western Europe, the United States and Japan in the post-war period – demands new and more efficient types of delivery. The acceleration in the demand for higher education, especially in China and other parts of South and East Asia, has made it nearly impossible for governments to meet this demand. The increase in demand, together with decreased funding due to the financial crisis, has put pressure on traditional modes of university education.

Innovations in ICT have expanded the possibilities for delivering education and have led to new teaching instruments. The advent of massive open online courses, or MOOCs, in 2012 combined new technologies in order to reach a massive audience. These developments are intensified through the ongoing globalisation of higher education.

Because of the globalisation process, the options for where to study have increased, ranging from attending universities abroad to taking online courses.

The concept of ‘the university’ is gone

The conjunction of these developments has led many to believe that the centuries-old model of the university is coming to an end. If we believe them, the higher education landscape of 2038 will be completely different from the current one. I would argue that these predictions show both a lack of knowledge about the contemporary landscape of higher education and a lack of historical understanding of the development of universities.

The time when the concept of the ‘university’ was clear-cut, referring to a single organisational and educational model, is long gone. Especially since the massification of higher education in the post-war period, this single model has been accompanied by a wide variety of other higher education institutions. More vocationally oriented institutions were established, such as community colleges. Very large distance-education institutions emerged in many Western countries and beyond. What’s more, when the organisational boundaries of the traditional university were reached, new activities and new organisations appeared. One thing is for sure: in not one country in the world is the traditional university model representative of the entire higher education system any more.

But even if the proclaimers of the end of the university are only referring to the traditional model (whatever that is), they will be proven wrong in 2038, and long after that. The traditional university has been one of the most enduring institutions in the modern world. Granted, university research and university teaching have adapted constantly to changes in the economy and society. This process of adaptation might be too slow, according to many, but it is a constant process in the university. Despite this continual change and adaptation, the model of the university as we know it has changed very little.

The organisation of faculties, schools and departments around disciplines, accountability in the form of peer review, comparable tenure and promotion systems, the connection between education and research, the responsibility of academic staff in both education and research and both graduate and undergraduate education, the primacy of face-to-face instruction etc – these are all characteristics that can be found in universities throughout the world and which have existed for many, many decades – if not centuries.

My bet is they will still be there in 2038. It would be rather naive to think that a financial crisis or even a new type of delivery, like MOOCs, will profoundly change these enduring structures and beliefs.

Universities’ DNA

In the words of Clayton Christensen and Henry Eyring, authors of The Innovative University, we are talking about the ‘DNA’ of the university, and saying that this does not change easily. They argue that university DNA is not only similar across institutions but is also highly stable, having evolved over hundreds of years. Replication of that DNA occurs continually, as each retiring employee or graduating student is replaced by someone screened against the same criteria applied to his or her predecessor. The way things are done is determined not by individual preference but by institutional procedure, written into the ‘genetic code’.

New technologies will enable new forms of education and delivery. In the coming 25 years, we will see the emergence of new institutions focusing on specific target groups and we will witness traditional institutions employing these new technologies. But will this make the university as we know it obsolete? No, it will not, because the function of the university as we know it is much more comprehensive than ‘just’ the production and transfer of knowledge.

Students attend universities not simply to ‘consume’ knowledge in the form of a collection of courses. They go there for an academic experience; and they go there for a degree that will provide them with an entry ticket to the labour market and which will give them a certain status. Does the fact that I do not see any substantial changes in 2038 mean that there should be none? The fact that structures and beliefs endure does not always mean they still serve the functions they used to.

This is also the case with many of the traditional structures and beliefs in the university. Holding on to these practices is not an end in itself. At least, it should not be, yet in making policy and in making predictions, it is good to take into account the stabilising character of these structures and beliefs.

25 years from now

Because of this university DNA, there is rarely a revolution of the type so frequently predicted by politicians, business people and consultants. Besides being a major source of universities’ value to a fickle, fad-prone society, the university’s steadiness is also why one cannot make it more responsive to modern economic and social realities merely by regulating its behaviour. A university cannot be made more efficient by simply cutting its operating budget, nor can universities be made by legislative fiat to perform functions for which they are not expressly designed. This is another argument why the university as we know it will still be there in 2038!

Many say that the best way to predict the situation in 25 years is to look back 25 years and see what has changed since then. I was first introduced to university life 25 years ago, in what you could call a traditional university. In the past 25 years I have studied and worked at four universities in and outside The Netherlands. At the time of writing, I work at Leiden University, another traditional university.

Comparing the university of 1988 with the university of 2013, it is remarkable how little these organisations have changed. Of course, the university has adapted to societal, political and economic changes, but at its core the traditional university has remained very much the same. I can safely say that the DNA of the traditional university has not changed in the past 25 years and I can safely predict that it will not change in the coming 25 years. And essentially, that is a good thing.

Mobility Stats: Mapping Mobility & Open Doors

Posted by Eric on November 15th, 2010


Two international education organisations, Nuffic from the Netherlands and the Washington-based Institute of International Education (IIE), published their international student mobility statistics this week. While Open Doors has been published by IIE since 1948, the Nuffic publication – Mapping Mobility – appeared for the first time in 2010. Although Nuffic has published international education statistics before, this is its first publication focused solely on higher education.

Growth

One finding of the Open Doors report was that the influx of international students into the US continued to grow modestly. Compared to the year before, there were 3% more international students coming to the US for the purpose of study (the vast majority for a full degree). The number of foreign students studying for a full undergraduate or graduate degree in the US (excluding non-degree students) in 2009/10 was 568,316. This was almost 3% of the total student population.

The Netherlands witnessed slightly higher growth. In 2009/10, there were 47,226 international degree students in the Netherlands, up 6.3% compared to the year before. Since the total student population in the Netherlands also increased, the percentage of foreign students remained stable, at 7.4%.

If we compare the growth rates of the US and the Netherlands over the past five years, we observe growth of over 40% in the Netherlands since 2005-6, against 15% growth in the US. (Data based on Table D in the Open Doors Fast Facts and Diagram 06 in Mapping Mobility.)
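
As a back-of-the-envelope illustration, the sketch below shows how such growth rates are computed. Only the 2009/10 totals come from the reports cited above; the 2005/06 baselines are placeholders chosen to reproduce the quoted growth rates, not figures from either publication.

```python
# Back-of-the-envelope growth-rate check. Only the 2009/10 totals
# (568,316 for the US; 47,226 for the Netherlands) appear in the text;
# the 2005/06 baselines are illustrative placeholders chosen to
# reproduce the quoted ~15% and ~40% growth rates.

def growth_pct(start: int, end: int) -> float:
    """Percentage growth from start to end."""
    return (end - start) / start * 100

us_2005, us_2009 = 494_000, 568_316   # baseline is an assumption
nl_2005, nl_2009 = 33_500, 47_226     # baseline is an assumption

print(f"US growth 2005/06-2009/10: {growth_pct(us_2005, us_2009):.0f}%")  # ~15%
print(f"NL growth 2005/06-2009/10: {growth_pct(nl_2005, nl_2009):.0f}%")  # ~41%
```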

Countries of Origin

Other interesting dynamics are revealed if we look at the countries of origin. We can conclude that the growth in the US in the past year was caused almost solely by the Chinese international student population. The number of Chinese students in the US increased by almost 30%, now accounting for almost a fifth of the international students in the US. The Netherlands, however, is much more dependent on a single nation. Germany remains the main source country for foreign students in the Netherlands, now accounting for 44% of all international students. The table below shows the main source countries for the US and the Netherlands.

[Table: main source countries of international students in the US and the Netherlands]

Destinations

Not surprisingly, the main destinations of these students are institutions in the Dutch border region with Germany. The University of Maastricht tops the list, followed by four universities of applied science in the southern, central and northern provinces bordering Germany. The US shows a much more dispersed pattern. The most internationalised institutions there are the University of Southern California, the University of Illinois (Urbana-Champaign), New York University, Purdue and Columbia.

Framing International Education

Posted by Eric on October 23rd, 2010


Ten days or so ago, I was in Sydney for the annual Australian International Education Conference. I saw some very interesting presentations there, some real eye-openers. I’ll discuss some specific sessions later (I’ll wait until the presentations are available on the website). For now, I just want to share some general impressions.

Most remarkable for me was that the economic framing of international education now seems to be widely accepted. When I lived in Sydney some years ago, my perception was that the government and parts of university management occasionally dropped terms like the ‘education industry’ and ‘higher education exports’. This was really the language of the marketeers and the recruiters.

Nowadays this language has spread throughout the universities, and even the international educators themselves have adopted it. Should we perceive this as conscious, strategic behavior on their part? Is the framing in economic terms an attempt to convince governmental leaders to invest more in higher education because of its strategic economic importance?

In the Netherlands, the national government explicitly frames international education as a quality issue. International education is to be pursued because it improves the quality of Dutch higher education. On the other hand, the income from full-fee-paying international students has now become a necessary resource for Dutch institutions as well (and especially for some departments or programs).

Does it matter how we frame it? Or is it always about the bottom line anyway? I think it does matter. When international education is framed as an export product, an economic commodity, the recruitment of students becomes the dominant issue. As a result, recruitment and the image of Australia as an education provider have become the dominant issues in Australian international education. But of course, we all know there is so much more to international education…


Dutch universities & the ranking season

Posted by Eric on September 17th, 2010

Ranking season is over. Yesterday, the Times Higher published its new ranking, which also marked the end of this year’s ranking season. After the Shanghai Jiao Tong ranking, the Leiden ranking, the QS ranking and the Taiwan ranking, this was the fifth attempt to illustrate the differences in quality among the world’s universities. Whether they succeeded remains a matter of debate.

Although there are considerable differences in the results of the rankings, a few common observations can be made. First of all, it is clear that the United States is still home to the best universities. In all rankings the US universities are dominant and Harvard is the undisputed leader. Only in the QS ranking did a non-US university – Cambridge – top the list.

Another observation is that none of the rankings manages to sufficiently capture the quality of teaching in its assessment. The THE ranking made an attempt to do so, but most of its indicators still reflect research quality and prestige more than the quality of teaching. The Shanghai, Leiden and Taiwan rankings put most emphasis on research.

Even though the rankings predominantly assess research – although in different ways – the results are very different. To illustrate this point I have mapped the results of the Dutch research universities in the different rankings. The results are shown in the graph below.

[Graph: positions of the Dutch research universities in the different rankings]

The results for the twelve universities (the thirteenth, Tilburg University, somehow doesn’t appear in the rankings) show substantial variation for all of them. For universities like Eindhoven, Twente and Maastricht, the variation seems exceptionally large. Eindhoven, for instance, was ranked as the best Dutch university in the THE ranking while performing worst in the Shanghai ranking. Leiden shows the least variation, but even there the difference between its rank in the Shanghai ranking (70th) and the THE ranking (124th) is enormous.
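
To make this notion of variation concrete, here is a minimal sketch of how such a spread can be quantified. Only Leiden’s two positions (70th in Shanghai, 124th in THE) come from the text above; entries for the other universities would have to be filled in from the underlying data.

```python
# Minimal sketch: quantify how much a university's position varies across
# ranking systems. Only Leiden's two positions (Shanghai 70, THE 124) come
# from the text; other universities would be added from the underlying data.

ranks = {
    "Leiden": {"Shanghai": 70, "THE": 124},
}

def rank_spread(positions: dict) -> int:
    """Difference between a university's worst and best position."""
    return max(positions.values()) - min(positions.values())

for university, positions in ranks.items():
    print(f"{university}: spread of {rank_spread(positions)} places")
# Leiden: spread of 54 places
```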

Earlier this week at the OECD/IMHE conference, Charles Reed, Chancellor of California State University, critiqued the rankings (“rankings are a disease”) and argued that all universities add value. I guess he’s right. And the value measured by one ranking seems to be quite different from the value measured by another…

Regulating recruitment agencies

Posted by Eric on September 13th, 2010

Study abroad for a full degree has developed from an elite into a mass phenomenon. Parallel to this development, we have witnessed a commercialization of international higher education in which many institutions have become financially dependent on full-fee-paying international students. To operate in this (global) market, institutions – and especially the lesser-known ones – now frequently turn to agents and recruiters in order to attract prospective students. Many point to the risks of these third-party agents and plead for more regulation or even abolishment.

Abolish or regulate? In Inside Higher Ed, Philip Altbach, the Director of the Center for International Higher Education, sheds light on this issue. His viewpoint is clear and unambiguous: “Agents and recruiters are impairing academic standards and integrity — and it’s time for colleges and universities to stop using them.”

These agents recruit prospective students and provide general information, but – according to Altbach – in reality they are also making offers to students or actually admitting them, often based on murky qualifications (even though the colleges that hire them say that they still control admissions).

Some initiatives have appeared in the United States with the objective of regulating this ‘new profession’, but these organisations lack the powers to monitor compliance or discipline violators. The solution Altbach provides is simple: abolish them! After all, they have no legitimate role in higher education.

Dutch self-regulation

In the Netherlands, the institutions have chosen self-regulation as the prime instrument for managing (the excesses of) international recruitment. The sector-wide ‘code of conduct’ sets out standards for Dutch higher education institutions in their dealings with international students. One chapter in this code of conduct (pdf) deals with the use of agents. The provisions in this chapter stipulate that agents have to act in the spirit of the code and clarify the responsibilities of the agents and those of the higher education institutions. One of the starting points is that admission remains the responsibility of the institutions and that institutions have to take action immediately in the case of unethical behavior.

This way of dealing with the risks of student recruitment (in an increasingly commercialized market) is somewhat comparable to the method of self-accreditation or ‘accreditation lite’ in the United States. Altbach criticized this method because of its lack of enforcement powers in cases of non-compliance.

Other solutions?

In some of the comments below the Inside Higher Ed article, Altbach’s view is portrayed as elitist. Prestigious American schools like Boston College might not need such recruiting agencies. But what about less prestigious universities? What about the ones that are not part of the Ivy League, the Russell Group or the Group of 8? Maybe, these institutions do need professional assistance in reaching prospective international students.

For these institutions, abolishment might not be acceptable. In addition, the question remains whether all of these agents are rogue operators. Is Altbach’s opinion also valid for agents in other parts of the world? Or is this phenomenon only apparent in the more commercialised higher education sectors (Altbach is mainly referring to the USA, Australia and the UK)?

Either way, even if the number of malicious operators is small, some form of regulation might be necessary to protect the numerous international students who are about to invest a lot of money in their future. Should we let the market do its work, or does this sector need government protection? Or is there enough trust in the higher education institutions (and in the majority of the agents) to rely on soft instruments like codes of conduct and other forms of self-regulation?

Rankings and Reality

Posted by Eric on September 10th, 2010

Summer holidays are over. In the global field of higher education, this also means that it is ranking season. Last month it was the Shanghai ranking; this week the QS World University Rankings were revealed; and in two weeks the all-new Times Higher Education (THE) ranking will be published. Ranking season also means discussions about the value of rankings and about their methodologies. Two points of critique are addressed here: the volatility of (some) rankings and the overemphasis on research in assessing universities’ performance.

Volatility and stability in international rankings 

This year’s discussion has become extra fierce (and nasty now and then) because of THE’s decision to part with consultancy agency QS and to collaborate with Thomson Reuters, a global research-data specialist. The previous joint THE/QS rankings usually received quite some media attention. This was not just because their methodology was heavily criticized (and rightly so) but also because this disputed methodology led to enormous fluctuations in the league tables from year to year. The criticism led THE to join forces with Thomson Reuters, while QS continues its own ranking.

Although the various rankings differed in their methodology, they all seemed to agree on two things: the hegemony of the United States’ universities and the undisputed leadership of Harvard. This week’s QS rankings again showed the volatility of their methodology. For the first time Cambridge beat Harvard, and for the first time the top ten is not dominated by US universities: it is now occupied by five US universities and five UK universities.

The Shanghai ranking, on the other hand, shows much less fluctuation. This probably reflects reality better, but it makes the ranking less sensational and therefore less attractive for wide media coverage. The two graphs below clearly show the difference between the stable Shanghai rankings and the volatile QS rankings for a selection of Dutch universities.

[Graphs: Shanghai and QS ranking positions of four Dutch universities over the past six years]

The graphs show the positions over the past six years for the four Dutch universities that are in the top 100 of the Shanghai and/or QS ranking. To illustrate the relative meaning of the absolute positions: the Shanghai ranking groups institutions ranked outside the top 100 (this also explains the relatively steep drop for Erasmus University in the 2006 ranking). Although Amsterdam has remained fairly stable in the rankings, Leiden and Utrecht show quite some fluctuation – much more than their real quality would justify.

And if you think this is volatile: it can get much worse. Simon Marginson, in a 2007 paper, lists dozens of cases where drops or increases of more than 50 positions (sometimes even up to 150 positions) occur in a single year. A case in point is the Universiti Malaya, which went from “close to world class” to “a national shame” in only two years…

It will be interesting to see in the coming years how the new THE/Thomson Reuters methodology will work out in this respect. The Times Higher published its methodology this week. While the QS ranking based its listing on only 6 indicators (with 50% of the weighting going to reputational surveys), the new THE ranking takes into account 13 indicators (grouped in five categories). Considering this higher number of indicators, and considering that the weight of reputational surveys is significantly lower, it is likely that fluctuation will be lower than in the QS ranking. Time will tell…

Are international rankings assessing teaching quality?  

Another frequently mentioned critique of the existing international rankings is that they put too much emphasis on assessing research and neglect the teaching function of the university. Since the new THE ranking more than doubled the number of indicators, it is likely that the assessment will correspond better with the complex mission of universities.

If we look at the new methodology, this indeed seems to be the case. The teaching function now constitutes 30% of the whole score and is based on 5 indicators. In the QS ranking, it was based on only 2 indicators (an employer survey and the staff/student ratio).

The 5 indicators are:

  • Reputational survey on teaching (15%)
  • PhD awards per academic
  • Undergraduates admitted per academic (4.5%)
  • Income per academic (2.25%)
  • PhD and Bachelor awards
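
To see how such weights combine into a single teaching score, here is a minimal sketch of a weighted composite. The two weights the list above leaves unspecified are placeholders chosen so that the five weights sum to the quoted 30%; the indicator scores are made up.

```python
# Minimal sketch of a weighted composite score, using the teaching weights
# quoted above. The two unspecified weights are placeholders chosen so the
# five weights sum to 0.30 (the 30% teaching share); the scores are made up.

TEACHING_WEIGHTS = {
    "reputational_survey_teaching": 0.15,
    "phd_awards_per_academic": 0.06,          # assumption: not given above
    "undergrads_admitted_per_academic": 0.045,
    "income_per_academic": 0.0225,
    "phd_and_bachelor_awards": 0.0225,        # assumption: not given above
}

def teaching_score(scores: dict) -> float:
    """Weighted sum of normalised indicator scores (each on a 0-100 scale)."""
    return sum(TEACHING_WEIGHTS[name] * value for name, value in scores.items())

# Illustrative: a university scoring 60 on every teaching indicator
example = {name: 60.0 for name in TEACHING_WEIGHTS}
print(f"Teaching contribution to overall score: {teaching_score(example):.1f}")
# 0.30 * 60 = 18.0 points of the overall score
```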

A closer look at these 5 indicators, however, leaves open the question of how much they are related to teaching.

  1. First of all, one can wonder whether a reputational survey really measures the quality of teaching or whether it is in reality another proxy for research reputation. Colleagues and peers around the world often have some idea of the quality of research at other institutions, but can they seriously evaluate the teaching there? Apart from the institutions where they graduated or worked themselves, it is unlikely that they can give a fair judgment about teaching quality elsewhere, particularly at institutions abroad.
  2. Two other questionable indicators of teaching quality are the number of PhDs awarded and the number of PhD awards per academic. In the Netherlands, and in many other countries in continental Europe and elsewhere, these say much more about the research quality and research intensity of an institution than about its teaching quality.
  3. The indicator ‘Undergraduates admitted per academic’ seems to be the same as the old student/staff ratio indicator. Assuming that a lower number is better, this again benefits research-intensive institutions more than others. Research-intensive institutions employ relatively many academics, but many of them will have research-only contracts. Yet in this indicator they will still lead to a higher score on teaching quality.
  4. ‘Income per academic’ is also a dubious indicator. Assuming this concerns the average annual income of academics, there is no reason to believe that higher salaries benefit the quality of teaching in particular. It could be argued that salaries are nowadays more related to research quality and productivity than to teaching quality. If income per academic refers to the external financial resources that an academic attracts, it would be even more an indicator of research intensity.

Although the new THE ranking methodology seems to put more emphasis on teaching, on closer inspection this is rather misleading. All this again shows how difficult it is to measure teaching quality. But as long as international rankings do not address teaching quality sufficiently, they cannot fulfill their function as a transparency instrument for international students.

What does the future hold for (Dutch) higher ed?

Posted by Eric on September 6th, 2010

And another academic year begins…

The first Monday in September traditionally marks the start of the academic year in the Netherlands. It is the occasion on which university leaders look ahead to the year to come and where inspiring speakers are invited to present their views and opinions. It is also an opportunity to see what the big issues are in Dutch higher education and how prominent the international dimension is in these issues. What will upcoming speakers (and past speakers, in those cases where the opening of the year took place prior to today) talk about?

A quick look at the guest speakers for this year and the topics of their speeches reveals that the universities have their eyes set on the future. The future of higher education seems to be the preferred topic in this year’s opening ceremonies.

The future is digital

European Commissioner Neelie Kroes, responsible for the Digital Agenda for the EU, will deliver a speech with the promising title ‘Europe 3.0’ at the Erasmus University in Rotterdam. Although some might claim that Europe has not yet entered the Web 2.0 era, Kroes – also an alumna of Erasmus University – will reveal her ideas on the digital future of Europe. An IT festival with the theme ‘Erasmus Virtual Campus’ will precede the opening ceremony and will include presentations on e-learning and e-research.

At Inholland University of Applied Sciences (UAS), the future is digital. The ongoing digitization of society and the blurring of the boundaries between physical and virtual reality will provide new opportunities for higher education, according to keynote speaker and trendwatcher Adjiedj Bakas, who spoke in Rotterdam last Wednesday.

Differentiation is the future

The University of Maastricht addresses the question of what the world will look like in 20 years, and what universities should be doing today to gear their education and research towards this outlook. And who better to ask about the future than a historian? In Maastricht, historian and author Bettany Hughes will present her views on the Socratic future of education and of society.

Discussions on the future of Dutch higher education focus mainly on the report ‘Differentiation in threefold’. This report was written by an international advisory committee on the future sustainability of Dutch higher education. The chairman of the committee – Cees Veerman – spoke at Saxion UAS and HAN UAS, and will appear at the University of Utrecht today to address the question: “Is knowledge still power?” At Utrecht UAS, the report was discussed in the context of Europe and the position of the universities of applied sciences in Europe.

The future is Europe?

The University of Amsterdam takes the future of Europe as its central theme for this year’s opening. Daniel Cohn-Bendit, chairman of the European Parliament’s Green Party, will be keynote speaker and will explore how ‘The European Dream’ has evolved over the past few decades. Also, three of its professors will consider ‘The End of Europe’ through the lens of their respective disciplines (Eastern European Studies, European Law and Communication Science).

Study abroad is the future

A special mention should be given to a very exceptional opening. The Dutch students studying at universities abroad, united in NEWS (Netherlands Worldwide Students), organized their own virtual opening of the global academic year. In a virtual address, Alexander Rinnooy Kan sent them the message that the Netherlands can only survive as a knowledge economy if we excel internationally and that we need students who are aware of the opportunities abroad.

Some other interesting speeches planned for today are:

  • ‘Two Cultures’ by Pieter Winsemius (member of the Scientific Council for Government Policy) at the University of Twente. He addresses the question of how the natural sciences and the social sciences can reinforce each other and how the university contributes to society.
  • ‘How engineers can save the world’ by Rosalind Williams, Professor in Science, Technology and Society at MIT, speaking at Eindhoven University of Technology.
  • ‘Looking further ahead: Research and innovation for the long term’ by Robbert H. Dijkgraaf, President of the Royal Netherlands Academy of Arts and Sciences (KNAW) speaking at Leiden University.

 

Nuffic International Education Monitor

Posted by Eric on September 6th, 2010

The Nuffic (the Netherlands Organisation for International Cooperation in Higher Education, which also happens to be my current employer) has launched its Nuffic International Education Monitor today. I’m sure this will be a valuable tool for many international educators, higher education and international education researchers, and others interested in the international dimension of higher ed.

The monitor tracks developments in almost 50 countries all over the world. It provides up-to-date country information and explores core themes in international higher education. It offers a selection of the news on international higher education, categorised thematically in seven dossiers and by country, as well as a daily selection of the most interesting international and Dutch news. Furthermore, it presents monthly overviews of Dutch, European and international policy initiatives and a list of upcoming conferences.

The monitor also features a blog on international higher education issues. Being one of the blog contributors, I will also cross post my own contributions here. Some forthcoming issues in the Nuffic Blog are: foreign backed universities, regulation of recruiting agents, Russia-Dutch scientific cooperation, mobility statistics and many others…

Let them know what you think of it!

Recognition and Mobility in the Bologna Process

Posted by Eric on March 11th, 2010

Today and tomorrow, the anniversary of the Bologna Process is celebrated. Actually… it is celebrated by most and protested against by some. A consortium of CHEPS, INCHER and ECOTEC was given the task of preparing an independent assessment of the Bologna Process. The study was conducted together with experts from the University of Bath, the Bayerisches Staatsinstitut für Hochschulforschung and NUFFIC (in this case, myself). Below is Don Westerheijden (CHEPS) presenting the part of the assessment I have been working on: recognition and mobility.

The report is published by the European Commission and can be found here (pdf). Today’s programme was in Budapest; tomorrow, the rest of the programme is brought to you from Vienna. Watch the live stream here.

Podcasting Higher Ed

Posted by Eric on May 3rd, 2009

Some years ago the first podcasts emerged in higher education. Initially these were mostly downloadable lecture series, mainly from US universities. Universities like Berkeley and Stanford took the lead here, but soon many other US universities followed and, later, some UK universities also jumped on the iTunes U bandwagon. In the Netherlands, the universities of Wageningen, Leiden and Rotterdam were the first to podcast lectures. Of course there were fears that these podcasts would make real lectures superfluous, but I don’t think that podcasts ever knocked lectures off the podium.

More recently, several podcasts have emerged that discuss the topic of higher education itself. The Chronicle has its own podcast with weekly interviews with prominent researchers, college leaders, and Chronicle reporters about big ideas in higher education. The Center for International Higher Education at Boston College has a podcast series with a more global scope, bringing key thinkers and leaders in higher education worldwide to a global audience. The series is coordinated by Laura Rumbley and is definitely worth a look.

This past week, some blogs have also entered the world of podcasting. The Center for College Affordability and Productivity presented its first podcast on its blog. It features the center’s director Richard Vedder discussing the role of incentives and power in higher education.

For several years, the students of the Erasmus Mundus Programme on Higher Education have brought you the Hedda blog. I have taught a module on internationalisation, globalisation and the knowledge society in this programme for several years (and loved it every year!). Of course I was pleased to see that they have started their own podcast series as well. Their first podcast features an interview with Peter Maassen, an ex-colleague of mine at CHEPS and now professor of higher education at the University of Oslo. He discusses his new book Borderless Knowledge? Understanding the “New” Internationalisation of Research and Higher Education in Norway.

Update: I was pointed to the podcast series of the Lumina Foundation, the foundation that is also keeping a close American watch on the Bologna Process. They have two podcast sessions on the Bologna Process, featuring Lumina’s Dewayne Matthews and Tim Birtwistle, professor of law and policy of higher education and Jean Monnet chair at Leeds Law School (Leeds Metropolitan University, UK).

Last week, the Dutch Volkskrant reported on an interesting study on the distribution of research funding by the Netherlands Research Council (NWO). Loet Leydesdorff (one of the researchers who introduced the Triple Helix concept) and Peter van den Besselaar – both of the Amsterdam School of Communications Research at the University of Amsterdam – conducted a study of the Council’s grant allocation decisions in the humanities and social sciences in the Netherlands.

Van den Besselaar and Leydesdorff tested whether the grant decisions correlate with the past performance of the applicants, in terms of publications and citations, and with the results of the peer review process organized by the Netherlands Research Council.

In their paper they show that the Council is successful in distinguishing grant applicants with above-average performance from those with below-average performance, but within the former group no correlation could be found between past performance and receiving a grant. When comparing the best performing researchers who were denied funding with the group of researchers who received it, the rejected researchers significantly outperformed the funded ones. Within the top half of the distribution, neither the review outcomes nor past performance measures correlate positively with the decisions of the Council.
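
As a rough illustration of the kind of check involved – not the paper’s actual method, data or results – the sketch below computes a point-biserial correlation between past performance and funding decisions within the top half of a made-up applicant pool.

```python
# Rough illustration (made-up data, not the paper's method or results):
# within the top half of applicants by past performance, test whether
# performance correlates with receiving a grant.
import numpy as np
from scipy.stats import pointbiserialr

rng = np.random.default_rng(0)
past_performance = rng.normal(size=200)        # e.g. a citation-based score
funded = (rng.random(200) < 0.3).astype(int)   # 1 = grant awarded (random here)

# Restrict to the top half of the performance distribution
top_half = past_performance >= np.median(past_performance)
r, p = pointbiserialr(funded[top_half], past_performance[top_half])
print(f"point-biserial r = {r:.2f} (p = {p:.3f})")
# The paper's finding corresponds to r being indistinguishable from zero.
```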

The authors conclude with some questions for further research. They suggest a network analysis of applicants, reviewers, committee members, and Council board members. This might answer the question of whether funding is correlated with the visibility of the applicants within these networks. After all, many factors besides scholarly quality play a role in the social process of granting proposals: bias, old-boys’ networks and other types of social networks, bureaucratic competencies, dominant paradigms, and so on.

If my reading of the paper is correct, it might also point to a discrepancy between the grant decision makers and the international academic community. If we consider that metrics (past performance) and peer review largely emerge in international networks, and the grant distributors make decisions contradicting the metrics and peer review, what does that tell us about the Council members’ involvement in these international networks?

The paper will be published later this year in the journal Research Evaluation.

New Features

Posted by Eric on April 12th, 2009

Due to (happy) family circumstances, posting has been slow recently. I kept finding interesting news items to blog about but often couldn’t find the time to actually write about them. I will try again to post more regularly. After all, plenty is happening in the world of higher education, science and innovation.

Between posts, however, you can follow my tweets and links on Twitter (@beerkens). Enjoy! And suggestions for new news sources are welcome.
