Why did Ofsted ignore TIMSS 2011, which showed English pupils in a more favourable light?

Ofsted’s widely-publicised report alleging that non-selective schools are “failing” the brightest pupils rested on expected progression and on international comparisons.

Henry Stewart has explained how the progression measure is flawed. Is Ofsted’s use of international comparisons similarly unsound?

Ofsted correctly cited results from PISA* 2009:

1.  1% of England’s 15-year-olds reached the highest level in Reading. This was above the OECD* average (0.8%) but lower than in some other English-speaking countries.

2.  In Maths, England’s 15-year-olds were less likely to achieve the highest levels: only 1.7% compared with the OECD average of 3.1%.

3.  1.9% of England’s 15-year-olds reached the highest level in Science. This was above the OECD average (1.1%) but below Singapore (4.6%), Finland (3.3%) and Japan (2.6%).

But Ofsted ignored another, more recent test: the Trends in International Mathematics and Science Study (TIMSS) 2011.

In Maths, 14-year-olds in five Far Eastern jurisdictions** were significantly more likely to reach the Advanced Benchmark: Chinese Taipei (49%), Singapore (48%), South Korea (47%), Hong Kong (34%) and Japan (27%). These jurisdictions streaked ahead of the next-placed country, the Russian Federation, where 14% reached this standard. Then came Israel and Australia (12% and 9% respectively), followed by England (8%). This was more than in New Zealand (5%) and Finland (4%).

In Science, four Far Eastern jurisdictions had the largest proportions of 14-year-olds reaching the Advanced Benchmark: Singapore (40%), Chinese Taipei (24%), South Korea (20%) and Japan (18%). Next came the Russian Federation and England (both 14%). This was more than in Hong Kong (9%).

So, although PISA found that English 15-year-olds were less likely to reach the highest levels in Maths than pupils in many other countries, TIMSS showed a different picture: English 14-year-olds were more likely to reach the Maths Advanced Benchmark than pupils in most other countries, although they lagged behind the performance of pupils in the Far East.

In TIMSS, English 14-year-olds were more likely to achieve the highest levels in Science than pupils in most other countries, even outstripping Hong Kong.

So, TIMSS 2011 painted a more favourable picture of the performance of high-achieving pupils in England.

That’s no reason, of course, to be complacent. The performance of Far Eastern pupils is way ahead (except Hong Kong in Science). But Ofsted should have referred to TIMSS rather than relying on PISA alone; this would have revealed a more rounded, if more confusing, picture.

So the progression measure is flawed, as Henry Stewart has shown, and the international test data used by Ofsted omitted more favourable results.

That’s not to say that schools shouldn’t do their utmost to ensure that pupils achieve the highest level they can, but this applies to all pupils, not just high-attainers. The London Challenge showed how this could be done.

The Challenge recognised that schools perform best when they feel trusted, encouraged and supported. This approach contrasts with recent Ofsted pronouncements.

Lessons learned from the London Challenge should be rolled out across the country. This would be a better use of Ofsted’s time than relying on flawed progression rates and the results of just one international test to conclude that non-selective schools are “failing” the brightest pupils.

 

* The Programme for International Student Assessment (PISA) tests are administered by the Organisation for Economic Co-operation and Development (OECD).

** A jurisdiction can mean a whole country (e.g. Japan) or a region within a country (e.g. Chinese Taipei).

 