Confused by changing data in school performance tables? You’re not alone

Janet Downs

Two days ago I posted a thread about Cuckoo Hall Primary Academy, where results had fallen from 86% in 2012 to 54% in 2013, according to the 2013 Performance Tables.

But I found something odd when I checked Cuckoo Hall in the Performance Tables for 2012. The proportion reaching Level 4 at Cuckoo Hall was not 86% but 94%.

This wasn’t a typo. It wasn’t just Cuckoo Hall where the figures changed. It was all schools.

So, what could account for this?

In 2012, schools were judged on the proportion of children reaching Level 4 in English and Maths. That was the figure published in the 2012 Performance Tables.

In 2013, the way pupils were tested changed. Year 6 pupils now take three tests (SATs):
1. Reading
2. Spelling, punctuation and grammar (SPaG, also known as GPS)
3. Maths (including mental arithmetic)

There are also Teacher Assessments (TAs) in English, Reading, Writing, Maths and Science.

Schools are now judged on the proportion of pupils achieving Level 4 or above in the Reading SAT, the Maths SAT and Writing TA.

These changes make it difficult to compare the performance of schools over time. If you look at the 2012 Performance Tables you'll find the proportion of pupils reaching Level 4 in English and Maths. The results for the previous three years are also provided, so you can see whether results have risen or fallen.

But if you look at the 2013 Performance Tables you'll only be able to compare 2013 with 2012. And the figures given for 2012 don't match the figures shown in the 2012 Performance Tables.

Confused? So am I.

I switched between the Performance Tables for 2012 and 2013 and became increasingly puzzled. Then I found the answer: the 2012 figures in the 2013 tables are NOT the proportion reaching Level 4 in English and Maths in 2012. They are the 2012 results recalculated on the new basis: the Reading and Maths SATs plus the Writing TA. It's not explained why the two sets of figures should differ, but they do.
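
For anyone who wants to see the mechanics, here is a minimal sketch in Python using invented pupil results (not real Cuckoo Hall data). The same cohort, scored once on the 2012 basis and once on the 2013 basis, produces two different headline percentages.

```python
# A minimal sketch with invented data: the same cohort of pupils scored
# under the two different headline measures used in the Performance Tables.
pupils = [
    # English L4+, Maths L4+, Reading SAT L4+, Writing TA L4+
    {"english": True,  "maths": True,  "reading": True,  "writing": True},
    {"english": True,  "maths": True,  "reading": True,  "writing": False},
    {"english": True,  "maths": True,  "reading": False, "writing": True},
    {"english": False, "maths": True,  "reading": True,  "writing": True},
    {"english": True,  "maths": False, "reading": True,  "writing": True},
]

def percentage(cohort, meets_measure):
    """Percentage of pupils in the cohort who meet a given measure."""
    return 100 * sum(meets_measure(p) for p in cohort) / len(cohort)

# Basis used in the 2012 tables: Level 4+ in English and Maths.
old_basis = percentage(pupils, lambda p: p["english"] and p["maths"])

# Basis used in the 2013 tables (and applied back to 2012):
# Level 4+ in the Reading SAT, the Maths SAT and the Writing TA.
new_basis = percentage(
    pupils, lambda p: p["reading"] and p["maths"] and p["writing"]
)

print(f"On the 2012 tables' basis: {old_basis:.0f}%")  # 60%
print(f"On the 2013 tables' basis: {new_basis:.0f}%")  # 40%
```

The gap in these invented figures is exaggerated on purpose. The point is simply that a pupil can count towards one measure and not the other, so the two percentages for the same year and the same school need not match.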

Does it matter? It shouldn't, so long as people compare 2012 and 2013 using only the figures in the 2013 tables. But if number crunchers switch between the 2012 and 2013 tables, or look at earlier years, then any comparison is misleading because the data is not calculated on a consistent basis.

I’m not a statistician. But I can see there might be problems if anyone compares the 2013 figures with earlier years. It could give a misleading impression about progress or the lack of it.