The new Progress 8 measure helps parents choose the ‘right [secondary] school for their child’, writes the Department for Education. Its media blog says Progress 8 is ‘a fairer measure on how schools are supporting pupils to achieve their best’. While it’s true Progress 8 is fairer than the old, blunt way of judging schools by raw exam results, it still has the potential to mislead.
A report by Bristol University argues that pupil background should be taken into account when calculating Progress 8. It found that the high average Progress 8 score in the capital ‘more than halves’ when background is factored in. That’s because London schools have higher proportions of ‘high progress ethnic groups’ than other areas. Conversely, the low average Progress 8 score in the North East ‘increases substantially after adjustment due to the high proportions of poor pupils taught in this region’.
The university’s research found a ‘dramatic’ decrease in Progress 8 scores in selective schools and faith schools when ‘the educationally advantaged nature of their pupils is considered’. Conversely, the below average progress reported by sponsored academies (and, I would add, non-selective schools in selective areas) rises when ‘the disadvantaged nature of their pupils is recognised’:
‘Progress 8 effectively punishes schools teaching high proportions of disadvantaged pupils for the national underperformance of these groups.’
I would add that Progress 8 also discriminates against schools which are more inclusive and those where the intake comprises a significant proportion of previously low-attaining pupils.
That’s because such pupils are less likely to take at least eight GCSEs (or equivalent) which count towards Progress 8. They are also less likely to achieve high grades in the exams they do take.
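The arithmetic behind this penalty can be sketched in a few lines. Under the DfE’s published method, Attainment 8 sums a pupil’s grades across eight ‘slots’ (English and maths double-weighted, three EBacc slots, three open slots), with any unfilled slot counting as zero, and Progress 8 is the gap between actual and estimated Attainment 8 divided by ten. The pupil grades and the estimated Attainment 8 figure below are invented purely for illustration:

```python
# Sketch: how empty Attainment 8 slots drag down Progress 8.
# Slot structure and double-weighting follow the DfE's published method;
# the grades and the estimated figure are hypothetical.

def attainment8(english, maths, ebacc_slots, open_slots):
    """English and maths count double; each of the three EBacc and
    three open slots counts once. Missing subjects contribute zero,
    which is what penalises pupils sitting fewer than eight
    qualifying GCSEs."""
    def fill(grades, n):
        # Keep the best n grades; pad with zeros if fewer were entered.
        return sorted(grades, reverse=True)[:n] + [0] * max(0, n - len(grades))
    return (2 * english + 2 * maths
            + sum(fill(ebacc_slots, 3))
            + sum(fill(open_slots, 3)))

def progress8(att8, estimated_att8):
    # Progress 8 = (actual Attainment 8 - estimated Attainment 8) / 10.
    return (att8 - estimated_att8) / 10

# Grade 5s across a full set of eight qualifying subjects:
full = attainment8(5, 5, [5, 5, 5], [5, 5, 5])   # 50
# The same grades, but only one EBacc and one open subject entered:
partial = attainment8(5, 5, [5], [5])            # 30

# Against a hypothetical estimated Attainment 8 of 50:
print(progress8(full, 50))     # 0.0
print(progress8(partial, 50))  # -2.0
```

Identical grades, then, can yield sharply different Progress 8 scores simply because a pupil sat fewer qualifying subjects, which is exactly how an inclusive intake depresses a school’s average.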
Bristol University recommends that the Government should revise school league tables to ‘include an adjusted Progress 8 measure side-by-side with Progress 8 to present a more informative picture of school performance.’ This should be published alongside ‘a more detailed explanation as to the limitations of using such scores for school accountability.’
The DfE responded to Bristol University’s report with the usual catch-all statement: ‘We want all pupils to fulfil their potential…’
Nick Brook, deputy general secretary of the National Association of Head Teachers, told the BBC:
‘Test and examination results are only part of the picture when judging a school's performance or a pupil's success. A dip in results one year does not necessarily equate to a decline in school effectiveness as cohorts vary annually. We therefore encourage all parents to take these results with a pinch of salt when choosing a school for their child.'