England is no longer ‘stagnating’ in PISA tests but is ‘stable’, university minister says

Janet Downs

What a difference a word makes.   The accepted truth since 2010 is that results in the three-yearly OECD PISA tests for England and the UK have ‘stagnated’.

Here’s former education secretary Michael Gove in December 2013:

‘Previous OECD league tables have shown how, under Labour, England’s schools have, at worst, declined, and at best, remained stagnant.’

And schools minister Nick Gibb on the 2012 PISA results: ‘they [Labour] let England stagnate in international league tables, while other countries surged ahead.’ 

And Chris Skidmore MP writing in the Telegraph in December 2013:

‘Essentially results have stagnated since 2009, when we slipped from being 8th for Maths, 7th for reading and 4th for science in 2000, down to 28th for maths, 25th for reading and 16th for science…’

Yesterday, however, Skidmore, now Minister of State for Universities, Science, Research and Innovation, changed emphasis from relative rankings to scores.  Speaking in the Commons, he said:  

‘…according to the latest OECD programme for international student assessment from 2015, while performance has remained stable in England and Northern Ireland since 2006, there has been a sustained decline in science in schools in Wales, and in maths in schools in Scotland…’ (my emphasis).

While I welcome this change in wording – stability implies consistency while stagnation hints at being stuck in muck – the cynic in me wonders what might have caused this change in tone.

The most recent PISA tests were taken in 2018.  Results won’t be known until December this year.  English pupils taking these tests would have studied ‘reformed’ GCSEs in English, Maths and Science, the three subjects assessed by PISA.

It may be that England’s scores or relative rankings will rise.  If so, the Government (unless it’s kicked out by then) will claim post-2010 reforms have been a success.  But if there’s no change, the Government will likely say England’s scores have been stable, constant, consistent, unwavering - any metaphor to avoid the stigma of stagnation.

Rising, falling or stable, PISA scores won’t reveal the real decline in the education system in England – the excessive emphasis on test results and its negative effects: ‘gaming’, off-rolling, teaching to the test, narrowing the curriculum and schools taking measures to persuade low-achieving or SEND pupils to go elsewhere.

FOOTNOTE: Keen-eyed readers will have noticed that Skidmore’s 2013 Telegraph article cited PISA 2000. The OECD warned in 2010 that comparisons should not be made with the UK’s PISA 2000 results because they had not met sampling standards. But this warning didn’t stop Gove from using the false comparison to justify his education reforms. The myth of England ‘plummeting’ down the league tables was born.

Comments

Roger Titcombe
Tue, 05/02/2019 - 13:23

This is very interesting, but to make sense of this development from Chris Skidmore, it is necessary to revisit my articles on PISA scores and national IQs.

https://rogertitcombelearningmatters.wordpress.com/2016/12/18/national-i...

https://rogertitcombelearningmatters.wordpress.com/2016/12/11/national-i...

Sorry, but the following comment will only make sense if you read both of these articles.

My updated article relates to the 2015 PISA maths tests. Put simply, I plot raw PISA test scores against national IQ percentiles and get a linear relationship with a very high correlation, showing yet again that general intelligence/IQ is the strongest driver of attainment in maths tests, especially when, as in PISA, these tests are designed to test understanding rather than recall of knowledge from any specific taught curriculum.

The regression line is then used to produce predicted scores based on the national IQ. The measure of effectiveness of the national education systems is the actual score minus the predicted score.

The top three places in the 2015 PISA round, ranked by this measure, are as follows. The numbers after each country are: national IQ, actual score, predicted score, residual.

1. Poland, 92, 504, 437.9, 66.1

2. Ireland, 93, 504, 445.1, 58.9

3. Vietnam, 94, 495, 452.9, 42.1

The figures for the UK and the USA are as follows:

49. UK, 100, 492, 501.4, -9.4

53. USA, 98, 470, 484.8, -14.8

Note that my top three countries all have raw scores higher than the UK, but their predicted scores are much lower on account of much lower national IQs. There are lots of issues around the validity of the national IQ measures and these are discussed in the article.
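For anyone who wants to see the arithmetic, here is a minimal sketch in Python of the fit-a-line-then-take-residuals calculation described above. It is not the article’s actual code: it uses only the handful of countries quoted in this comment, whereas the article fits the regression across all participating countries, so re-fitting on this small subset will not reproduce the predicted scores and residuals quoted above exactly.

```python
# Minimal sketch: regress actual PISA maths scores on national IQ,
# then treat (actual - predicted) as the 'effectiveness' residual.
# Only the five countries quoted in this comment are included, so the
# fitted line will differ from the full-sample regression in the article.

import numpy as np

# country: (national IQ, actual 2015 PISA maths score)
data = {
    "Poland":  (92, 504),
    "Ireland": (93, 504),
    "Vietnam": (94, 495),
    "UK":      (100, 492),
    "USA":     (98, 470),
}

iq = np.array([v[0] for v in data.values()], dtype=float)
score = np.array([v[1] for v in data.values()], dtype=float)

# Ordinary least-squares line: score = slope * IQ + intercept
slope, intercept = np.polyfit(iq, score, 1)

# Predicted score from national IQ, and residual = actual - predicted
predicted = slope * iq + intercept
residual = score - predicted

for (country, (q, s)), p, r in zip(data.items(), predicted, residual):
    print(f"{country:8s} IQ={q:3d} actual={s} predicted={p:6.1f} residual={r:+6.1f}")

# Rank countries by residual: largest positive residual = system doing
# best relative to what national IQ alone would predict.
ranking = sorted(zip(data, residual), key=lambda t: -t[1])
print([country for country, _ in ranking])
```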

Janet writes:

Rising, falling or stable, PISA scores won’t reveal the real decline in the education system in England – the excessive emphasis on test results and its negative effects: ‘gaming’, off-rolling, teaching to the test, narrowing the curriculum and schools taking measures to persuade low-achieving or SEND pupils to go elsewhere.

She is right. In my article I write as follows:

I have to emphasise that I am using these IQ data for the purpose of interpreting the 2015 international PISA test results. Others may use the data for other purposes and come to conclusions that I do not support. However, Lynn’s IQ data now come fully referenced as to sources and include updates resulting from the Flynn effect, along with confirmation that the Flynn increase in IQs over time has ceased for pupils in the UK, as noted in Section 5.10 of ‘Learning Matters’, where I explain this in terms of the degradation of the English education system caused by marketisation.

