The recent results from the PISA surveys have received a lot of attention. In somewhat apocalyptic terms, the Irish Times referred to “2010: the year Irish education fell to earth” and invited readers’ opinions on Ireland’s “Falling literacy levels”, though the Irish Independent did carry a more balanced assessment. A careful analysis of the data reveals a more complex picture.
Firstly, these assessments do not show in absolute terms how well or badly our students are doing, since in each wave the data are normed to have a mean of 500 and a standard deviation of 100. Comparing the results for 2000 and 2009 will not tell us whether our current 15-year-olds know more or less than the equivalent cohort nine years earlier; it just shows where they stand relative to students in other countries. So even if our current 15-year-olds have slipped in the rankings, they might actually be stronger academically than the earlier Irish cohort and have simply been leap-frogged by other countries. Let us not forget that results in public exams have been trending up. Maybe it’s grade inflation, maybe not.
Of course rankings do matter: foreign investors, for example, may care only about the relative skills of countries’ workforces. However, when our rankings fall, as they have recently, there may be a tendency to use this as a stick to beat the education system. But it can hardly be to blame if our students are actually learning more and other countries have simply got better.
Secondly, PISA includes a diverse range of countries, including, for example, Liechtenstein, Kyrgyzstan and Macau. With no disrespect to these, should we really care where we stand relative to them? Does it not make more sense to pay far more attention to countries that are more relevant to us economically, or are simply bigger?
Thirdly, PISA has changed significantly since it started. In 2000, there were 32 participants, largely OECD countries. This grew to 41 in 2003, and the 2006 wave, which focused on science, included 57 countries. The latest wave has 64 participants, some of which are actually territories rather than countries. So when you look at a ranking, you are counting how far you are from the top; one could equally look at how far you are from the bottom. As other countries join, some will come in above us and some below, so the ranking alone can be misleading, and measuring our distance from the bottom might tell a different story. A fairer comparison over time would restrict attention to those countries that have participated from the beginning, and indeed some commentators have looked at our rankings within the OECD alone.
To look more closely at this, consider the reading results. In 2000, PISA put our students between 3rd and 9th (out of 32). For statistical reasons, they didn’t give exact rankings, but let’s take 6th place as an average. That puts us in about the top 20% of countries. By 2009 our ranking had slipped to 21st. That may seem a precipitous decline, but there are twice as many countries in the frame: we are now in about the top third. So while it is nothing to be happy about, it’s not quite the disaster it appears at first blush. Mathematical skills are often seen as being particularly important, so where do we stand? In 2000 we were ranked between 16th and 19th, in around the top 55% of countries. In 2009 that had fallen to 20th, but given that there are more countries we are just about in the top half: arguably this is an improvement. Looking at the science results tells a similar story.
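The arithmetic behind these relative positions is trivial but worth making explicit. The sketch below (plain Python, using the ranks and country counts quoted above as illustrative inputs) simply divides rank by the number of participants:

```python
# Rough percentile position implied by a ranking, using the figures
# quoted in the text: roughly 6th of 32 in 2000, 21st of 64 in 2009.

def top_share(rank, total):
    """Fraction of participating countries at or above this rank."""
    return rank / total

print(f"2000 reading: top {top_share(6, 32):.0%}")   # top 19%
print(f"2009 reading: top {top_share(21, 64):.0%}")  # top 33%
```

The same slip, from 6th to 21st, looks far less dramatic once the denominator roughly doubles: a drop from about the top fifth to about the top third, not a fall off a cliff.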
So the point is that one has to be careful about jumping to conclusions about how our students and schools are doing from simplistic comparisons. When one studies the detailed reports produced by the OECD, it becomes obvious that there is far more information than hits the headlines. The research is particularly informative on issues of socio-economic disadvantage. In all countries, students from a better-off background do better, but the extent to which this is the case differs markedly. In a fascinating analysis, the OECD considers what it defines as “resilient” students: those in the bottom quarter on the socio-economic scale who nevertheless perform in the top quarter in the assessments. It is a good summary of the extent to which young people are not held back by their background. So who tops this particular league table? It is dominated by the Far East; the top seven are (in order) Shanghai, Hong Kong, Korea, Macau, Singapore, Finland and Japan. And us? Ireland is pretty average, ahead of the US, the UK and many European countries, though not by much. That we are way ahead of Kazakhstan, Panama or Dubai in this respect seems scant consolation.
Anger, we have been advised, does not constitute a policy. Neither do simplistic statistics or headlines. Ireland faces particular challenges in adapting its educational system to changing needs and technologies, a changing international economy and a much less favourable fiscal situation. It is imperative, then, that we carefully monitor the extent to which educational resources produce results, how we fare relative to our competitors, and the extent to which our schools mitigate or reproduce social inequities.