The gap between the achievement of disadvantaged and advantaged pupils is now a key measure of school performance. Schools are supposed to close the gap.
I’ve argued before that there are circumstances which can prevent pupils learning. These can affect advantaged and disadvantaged pupils alike, and they can be worsened by poverty. It’s unrealistic to expect schools to entirely eradicate the effect of socio-economic background. Even countries which produce a very large proportion of ‘resilient’ pupils – disadvantaged children who overcome their background and perform well in PISA tests – don’t succeed in wiping out the negative effects of disadvantage in all their students*.
That’s not to say schools can do nothing – but they need the help of families. NFER research found being eligible for free school meals (FSM) was ‘negatively associated with achievement, even when a wide range of student-level variables are taken into account’. However, NFER found disadvantaged pupils were more likely to be resilient if, for example, there were books in the home. The OECD found resilient pupils were more likely to be persistent, motivated, punctual and regular in their attendance. These qualities are related to family support – schools can’t do it alone.
Judging schools on eradicating the gap between advantaged and disadvantaged pupils is judging them on circumstances beyond their control.
And such a measure, of course, favours schools with a small proportion of disadvantaged pupils. The OECD found ‘teachers instructing socio-economically disadvantaged children are likely to face greater challenges than teachers with students from more privileged socio-economic backgrounds’. But these challenges appear to be brushed aside in School Performance Tables. Teachers may be reluctant to take on these challenges if they fear being rebuked when the gap isn’t narrowed as fast as the Government deems acceptable.
But it appears the ‘Closing the Gap’ measure is here to stay, and Department for Education (DfE) statisticians have proposed changes to the way it is calculated. The number crunchers want views on how the ‘Disadvantaged Pupils Attainment Gap Index’ could be calculated, how to get over the challenges of ‘communicating a unitless index to users’ (gulp), and whether there are alternative approaches.
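For readers wondering what a ‘unitless index’ might look like in practice, here is one plausible construction – purely my own illustration, not the DfE’s actual methodology, which is defined in the consultation document. The idea is to convert raw test scores into percentile ranks within the whole cohort, so the gap no longer depends on the marking scale of any particular test, and then compare the average rank of disadvantaged pupils with that of their peers. All the numbers below are invented.

```python
# Illustrative sketch only: one way a unitless attainment gap index
# could be built. This is NOT the DfE's published method; scores and
# groups here are made up for demonstration.
from statistics import mean


def percentile_rank(score, cohort):
    """Mid-rank percentile (0-100) of `score` within `cohort`.

    Ties get the average of their positions, so identical
    distributions produce identical mean ranks.
    """
    below = sum(s < score for s in cohort)
    equal = sum(s == score for s in cohort)
    return 100 * (below + 0.5 * equal) / len(cohort)


def gap_index(disadvantaged, others):
    """Difference in mean percentile rank between the two groups.

    Positive values mean disadvantaged pupils rank lower on average;
    zero means no gap. Because ranks are percentiles, the result is
    unitless and comparable across tests with different mark schemes.
    """
    cohort = disadvantaged + others
    mean_rank_others = mean(percentile_rank(s, cohort) for s in others)
    mean_rank_disadv = mean(percentile_rank(s, cohort) for s in disadvantaged)
    return mean_rank_others - mean_rank_disadv


# Invented example: disadvantaged pupils scoring uniformly lower.
print(gap_index([10, 20, 30], [40, 50, 60]))  # a large positive gap
```

A rank-based index like this sidesteps the comparability problem the consultation worries about – a new, harder exam shifts everyone’s raw marks, but not necessarily their relative ranks – though communicating what “a gap of 50 percentile points” means to a general audience is exactly the difficulty the statisticians flag.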
Short of ditching the measure, my suggestion would be to take into account the proportion of disadvantaged pupils in a school, as this affects performance. Both disadvantaged and advantaged children do worse in schools with a large proportion of disadvantaged pupils, the Education Endowment Foundation discovered. The OECD found this was the case globally, not just in the UK.
If you have any ideas and are willing to read through a dense document over, or perhaps after, a glass (or two) of wine, then you can email your suggestions. I’ve no idea when the deadline is – there’s no indication in the document, which was only published today. Perhaps DfE statisticians think they’re unlikely to get much response over Christmas, when teachers are exhausted, schools are closed and other analysts are crunching Christmas food, not numbers.
20 December 2014
I have heard from Kylie Hill, Senior Statistical Officer at the DfE. She said there’s no set time limit to the consultation, as the statisticians were unlikely to make a firm decision until summer 2015. The mailbox would be monitored regularly for any feedback.
Ms Hill also pointed out that the ‘Disadvantaged Pupils Attainment Gap Index’ is not a school-level accountability measure and is not designed to appear in performance tables (that was my misunderstanding). It’s a ‘supplementary measure’ which will track how the gap closes (or not) over time. It’s hoped the proposed methodology will overcome problems caused by educational reforms which affect comparability.
Thanks to Kylie Hill for her prompt feedback.
*This presumes that OECD PISA cohorts in all countries capture the full range of ability and socio-economic background. The OECD has admitted 25% of the cohort was missing from the 2012 PISA test sample in Shanghai, for example. It is likely that the 15-year-olds who didn’t take part were those who left school at 15 to work – and these, in turn, are more likely to be pupils whose economic situation made them seek paid employment. The missing 25% would inevitably affect results.