Again, academies do no better than non-academies

I was a bit surprised this morning to come across, in the Guardian, the headline "Academies outperform council schools at GCSE, study finds", reporting on an NFER publication. I have carried out extensive analysis of the 2011 and 2012 GCSE data sets, on which this report is based, and know that it would be very difficult to come to that conclusion from the data.

However, as a lot of people have pointed out on Twitter, the headline does not reflect the contents of the article. The Guardian notes that for GCSEs alone (not including equivalents such as BTECs) "long-established academies performed worse than maintained schools". (The headline has since been changed in the online version of the article.)

NFER Summary



Three conclusions are stated in the summary of the NFER report:

1) In 2011 and 2012 academy schools attained, on average, higher outcomes and made more progress between KS2 and KS4 than non-academy schools. This is presumably where the Guardian headline came from. However, the report (and the Guardian article) go on to explain that the extra progress disappears when GCSE equivalents are removed. Also, including converter academies here (see below) naturally leads to higher attainment in the results, as these were the schools that were already Good or Outstanding.

2) Analysis of the 2012 data using GCSE results without equivalents identified that, on average, academy schools that had held that status for more than two years had average GCSE scores that were significantly lower than non-academy schools. This result occurred in models with and without the key stage 2 prior attainment measure. This refers to sponsored academies, as there were no converter academies that had held the status for more than two years. It matches my own analysis, which has shown that any apparent strong performance is down to the use of GCSE equivalents.

3) In analysing school-level GCSE data since 2007, no significant improvement is seen in the rate of improvement of GCSE results for academy schools over and above the rate of improvement in all schools. Interesting. I would actually argue this isn't true: the GCSE results of sponsored academies have increased more than those of "all schools". However, they have done no better, and sometimes worse, than similar non-academies (ie, those starting from a similarly low base in GCSE results). This conclusion may be the result of combining sponsored and converter academies.

An Odd Report



The NFER report is a curious one. There are two very different types of academy, which are clearly identified in the DfE data sets: "sponsored academies", generally underachieving schools that have converted (plus a few entirely new schools), and "converter academies", schools that were mainly Good or Outstanding before becoming academies. These are very different sets of schools. Sponsored academies start from a low base and should, like other schools at a low base, see a fast improvement in results. Converter academies were already achieving results well above average and must be analysed in that context.

The only reference to sponsored or converter academies in this report is the suggestion that they should be looked at in the next report! This is sloppy. Instead the NFER analysis uses a proxy measure of the amount of time a school has been an academy (less than a year, one to two years, or more than two years). This is unsatisfactory: while the more-than-two-years category will consist entirely of sponsored academies, the other two categories will include a mixture of sponsored and converter academies.

Also, the report does not include any actual data or figures, making it impossible to check its statements that one group had "significantly higher scores" or that results were "broadly similar". Nowhere does it give the attainment figures for the different types of school, or the actual amount of improvement.

I would suggest that any future report should include clearly separated analyses for the two types of academy, and state the actual results it finds.

Conclusion



Once again an independent report shows that there is no evidence that academies perform better than non-academies in terms of GCSE results for their students. Their results only appear stronger if GCSE equivalents are included, a practice the DfE has described as "gaming" or "artificially inflating" results. It is interesting that some of the most highly regarded academy chains are the greatest users of equivalents. The 21% and 25% boosts for ARK and Harris compare with an average boost of 6% from equivalents for schools overall:

ARK: 64% with equivalents, 43% without
Harris: 68% with equivalents, 43% without

(Based on 2011 results, % of students gaining 5 GCSEs including English and Maths)
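
For anyone who wants to check the arithmetic behind the 21% and 25% figures, the boost is simply the with-equivalents rate minus the without-equivalents rate. Here is a minimal sketch using only the 2011 figures quoted above (the variable names are illustrative, not taken from the NFER report or the DfE data sets):

```python
# Illustrative arithmetic only, using the 2011 figures quoted above:
# % of students gaining 5 GCSEs including English and Maths.
chains = {
    "ARK": {"with_equivalents": 64, "without_equivalents": 43},
    "Harris": {"with_equivalents": 68, "without_equivalents": 43},
}

for name, rates in chains.items():
    # The "boost" is the gap between the two headline measures,
    # ie how much GCSE equivalents inflate the reported pass rate.
    boost = rates["with_equivalents"] - rates["without_equivalents"]
    print(f"{name}: {boost} percentage point boost from equivalents")

# Prints:
#   ARK: 21 percentage point boost from equivalents
#   Harris: 25 percentage point boost from equivalents
# compared with an average boost of around 6 percentage points for schools overall.
```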

In the 2014 GCSE results, most of these qualifications will no longer count as equivalents and some schools may see a big fall in results. It seems likely that sponsored academies, such as these chains, will see the biggest falls. As Toby Young tweeted me this morning, "it will take sponsored academies time to adjust". But those 2014 results will probably be a far more accurate measure of school performance.