“Travesty of our ‘stagnating’ schools: In a damning indictment of Labour, OECD condemns British education…”
Daily Mail, December 2010
Every three years the Organisation for Economic Cooperation and Development (OECD) publishes the Programme for International Student Assessment (PISA) test results. And every three years they generate “shock horror” headlines such as the Mail’s, quoted above.
Media coverage of PISA results is usually sensational and narrow in its emphasis. It may also rest on statistics that are fundamentally flawed.
Radio 4* examined what the Department for Education (DfE) claims is “robust evidence” and concluded that PISA league tables should be treated with scepticism. That’s not to say the information isn’t useful – such tests provide valuable insights.
However, the programme revealed flaws in methodology:
1 It’s impossible to remove the effect of different contexts.
2 The statistical model is not used correctly.
3 The difficulty of questions posed in different countries varies.
4 A different emphasis in the questions (eg putting greater weight on reflection rather than interpretation) would result in rank order changing.
5 “Dodgy” questions, ie those that produced a wide variation in responses between countries, are removed from analysis.
Andreas Schleicher of the OECD rejected these criticisms. The cross-national comparisons were valid, he said: PISA looks at a broad range of tasks covering many variables and can “approximate” a reliable level.
The programme asked if PISA league table position should be taken at face value.
No, was the answer. It’s a “big leap” to say league table rankings reflect a country’s education system. One participant criticised cherry-picking OECD data especially when, as in England, it led to an “uncontrolled experiment” which was the opposite of an evidence-based approach.
But league tables attract the most comment. They are used to judge school systems and justify reforms. Schleicher downplayed their significance: yes, countries were keen to know how they stood in relation to others, but interest in the tables waned after a short time.
But this isn’t the case in England. Again and again Michael Gove used the “plummeting down the league tables” argument to validate his policies, even though the OECD said no comparison should be made with the UK results for 2000 and 2003 because of sampling problems. And despite the UK Statistics Authority censuring this misuse last year, the CBI still quoted the figures in its 2013 annual report.
It will be interesting to see how many commentators use the flawed data when the 2012 PISA results are released next week.
The BBC asked the DfE to take part in the programme but it declined. However, it sent a written statement.
The DfE said PISA justified Gove’s accountability measures – but the OECD has warned there’s too much emphasis on test results in England.
It said PISA showed the importance of autonomy. It did – but PISA 2009 revealed UK schools already had a greater amount of freedom than schools in most other OECD countries.
It said PISA results allowed the Coalition to “develop policies that will have a real and positive impact on education in this country.”
The policies are indeed having a “real” impact but most of it is not “positive”.
*PISA – Global Education Tables Tested, BBC Radio 4, 25 November 2013. Available on Listen Again until 3 December 2013. Thanks to Patrick Hedley for telling me about the programme.
A more detailed critique of PISA tests, Andreas Schleicher’s background and his impassioned defence of them can be found in this Guardian article. Thanks to Roger Titcombe for sending the link.