Schools and local authorities (LAs) are constantly told they must close the attainment gap between disadvantaged and advantaged children. Ofsted comments on how well schools and LAs are doing on this measure. It influences inspectors’ judgements.
It would be logical, then, to assume that LAs with good results for pupils who’ve been eligible for free school meals or in LA care any time in the last six years (FSM6) would be praised.
Not so. Bath and North East Somerset was described as a 'problem area' despite results for disadvantaged Key Stage 2 FSM6 pupils being slightly above the national average (68% achieved Level 4 in reading, writing and maths against a national average of 67%). This was because the gap between FSM6 11-year-olds and non-FSM6 pupils (18 percentage points) was still wider than the national average (16).
Why was the gap so wide? It's because the results of non-FSM6 pupils were even further above average: 86% reached the benchmark against 83% nationally. The wide gap is caused by the higher-than-average score of non-FSM6 pupils. Despite the area's FSM6 pupils doing slightly better than the national average, the LA is still deemed to be doing poorly in narrowing the gap. A cynic might say the surest way for LAs and schools to close the gap would be to concentrate on disadvantaged pupils and neglect advantaged ones, depressing the latter's results so that disadvantaged pupils could 'catch up'.
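The arithmetic is worth setting out, because it shows the wider local gap coexisting with a better-than-average FSM6 score. A quick check using the figures quoted above:

```python
# KS2 Level 4 pass rates (%) as quoted above.
fsm6_local, non_fsm6_local = 68, 86        # Bath & NE Somerset
fsm6_national, non_fsm6_national = 67, 83  # England

local_gap = non_fsm6_local - fsm6_local              # 18 points
national_gap = non_fsm6_national - fsm6_national     # 16 points

# FSM6 pupils locally are a point ABOVE the national figure; the wider
# gap comes entirely from non-FSM6 pupils scoring 3 points above average.
print(local_gap, national_gap)  # 18 16
```

An authority can therefore be 'worse' on the gap measure while its disadvantaged pupils do better than their peers nationally.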
However, there is another way of defining disadvantage: pupils who are eligible for free school meals or in LA care in the academic year when tests were taken (FSM1). Nationally, this increases the average gap between advantaged and disadvantaged pupils from 16 to 18 – just two points. The Local Authority Interactive Tool (LAIT), used by Ofsted for its figures in the speech criticising Bath and NE Somerset, defines disadvantage as FSM1.
But measuring Bath and North East Somerset on this FSM1 measure widens the gap from 18 to 26 - a jump of eight points. That's the disproportionate effect of a small sample size: there were just 160 FSM1 pupils* in Bath and NE Somerset against 319 FSM6 pupils, which makes the FSM1 figures considerably more variable.
This variability can inflate a modest gap, one caused in any case by the better-than-average performance of non-FSM6 pupils, into a considerably wider one.
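As a rough sketch of why a cohort of 160 is noisier than one of 319, a simple binomial model gives the standard error of an observed pass rate. (This model is my own illustrative assumption, not the DfE's methodology.)

```python
from math import sqrt

def pct_standard_error(p_pct: float, n: int) -> float:
    """Approximate standard error (in percentage points) of a pass rate
    p_pct observed on a cohort of n pupils, under a binomial model."""
    p = p_pct / 100
    return 100 * sqrt(p * (1 - p) / n)

# Cohort sizes and pass rates near those quoted above.
se_fsm1 = pct_standard_error(59, 160)  # FSM1 cohort: roughly 3.9 points
se_fsm6 = pct_standard_error(68, 319)  # FSM6 cohort: roughly 2.6 points
```

On this sketch, the smaller FSM1 cohort's pass rate could easily swing several percentage points in either direction from year to year through chance alone, which is enough to move an authority across the line between 'fine' and 'problem area'.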
On the latter measure, Bath and North East Somerset is indeed a ‘problem area’ where just 59% of the 160 FSM1 pupils reached the benchmark against a national FSM1 average of 68%.
The publicly available School Performance Tables use the FSM6 definition of disadvantaged. I believe this is fairer. Children are in schools for several years and could sometimes be eligible for FSM and sometimes not. The effect of education is cumulative and not just the result of what happens in the year when exams are taken.
Does it matter if the Tables and LAIT use different definitions of disadvantaged? Yes, it does – the effect on Bath and North East Somerset shows this. A local authority which did slightly better for FSM6 pupils in School Performance Tables at KS2 becomes one that LAIT identifies as doing badly. It turns Bath and North East Somerset from an authority where all KS2 pupils do quite well into a ‘problem area’.
The amount of data available is burgeoning: School Performance Tables, LAIT, RAISEonline and the National Pupil Database (containing information about individual pupils). Data is used by politicians, inspectors, the media and schools themselves to judge 'performance'. Yet few of the data users are statisticians. Data in the hands of the uninformed (or the unscrupulous) can be unsafe.
My colleague Henry writes about 'How to use data badly' here. Icing On the Cake describes the 'dangerously flawed' data in RAISEonline here.
There’s something wrong when definitions in datasets don’t correspond. Conclusions have far-reaching consequences for LAs, schools and academy chains. They can make the difference between success and failure. And in our prevalent culture of blaming, naming-and-shaming, it heaps on more negativity.
Failure, remember, is used to enforce academy conversions, oust heads and pillory local authorities. It’s essential that data is accurate, consistent and used with care. But above all, it’s important to remember that data doesn’t always capture what matters. This seems to have been forgotten – education has been reduced to number crunching.
Update, 4 April: Sir Mike Tomlinson, Chief HMI 2000-2002, says Ofsted inspections are inconsistent and too dependent on data. It's important, therefore, that Ofsted realises the limitations of data. LAIT and the School Performance Tables are two data sets but, as we've seen, they differ in the definitions used. Interpretations can be equally valid depending on which data set has been used. But, as we see here, the conclusions can mean the difference between success and failure. And this can mean the difference between teachers and LA school improvement staff keeping or losing their jobs.
With stakes as high as this, data should be used with care.
*figures in email to author from Jayne Middlemas, from the Department for Education’s Primary Performance and RAISEonline School Performance Data Unit