The biggest impact of Michael Gove was in the expansion of academies. While he was Secretary of State, the DfE stated that this was the main vehicle for school improvement. Gove made many claims about how much greater the improvement in academies was than in other schools, but normally made a point of comparing academies to all schools (including those where results were already high) rather than to similar schools. When compared to similar schools, the evidence does not appear to be there:
* In the High Court last month the DfE claimed only a "marginally higher" rate of improvement for academies when compared to similar schools, and that only if GCSE equivalents were included. Without those equivalents (which the DfE described as "devaluing" GCSE results) the DfE could provide no evidence of greater improvement.
* The first analysis of improvement in primary academies found that the increase in KS2 results between 2012 and 2013 was less than in similar maintained schools. Indeed, on average the improvement was 5% greater in non-academies.
* The analysis below indicates that, when academies are compared with similar schools using the new WOLF measure, the improvement last year was less in academies than in similar maintained schools.
It is hard to see the justification in the data for the DfE's focus, over the last few years, on academies. Is it in fact the case that policy has been dictated by "ideological dictat" (to quote last week's Evening Standard editorial) rather than based on evidence?
Analysis based on the WOLF measure
The 2014 GCSE benchmark for schools, of 5 A-Cs including English and Maths (5ACEM), will be based on the new WOLF measure. This is the result of the recommendations of Professor Alison Wolf and will restrict the use of GCSE equivalents. Those that remain will count for only one GCSE, instead of two or four.
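The re-counting of equivalents can be illustrated with a short sketch. This is a hypothetical simplification of my own: the function, the tuple layout and the example student are invented for illustration, and the Wolf restriction on *which* equivalents still count at all is not modelled here.

```python
# Hypothetical illustration of the Wolf re-counting rule: a vocational
# equivalent that previously counted as 2 or 4 GCSEs counts as at most 1.
# The data structures and grades here are made up for illustration.

def gcse_count(quals, wolf=True):
    """Count GCSE-sized passes at grade C or better for one student.

    quals: list of (is_equivalent, gcse_size, passed_at_C) tuples.
    Under the WOLF measure, each remaining equivalent counts as
    at most one GCSE, whatever its previous size.
    """
    total = 0
    for is_equiv, size, passed in quals:
        if not passed:
            continue
        total += min(size, 1) if (wolf and is_equiv) else size
    return total

# A student with 3 GCSE passes plus one vocational award that
# previously counted as 4 GCSEs:
student = [(False, 1, True)] * 3 + [(True, 4, True)]
print(gcse_count(student, wolf=False))  # old counting: 7
print(gcse_count(student, wolf=True))   # Wolf counting: 4
```

Under the old counting this student clears the 5-GCSE threshold; under Wolf they do not, which is why schools that relied heavily on equivalents see their benchmark fall furthest.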
In response to my FOI request, the DfE has released data
giving a GCSE benchmark figure – using the WOLF measure – for every school in England. This reveals that, if WOLF had been used, the average 5ACEM result in maintained schools would have fallen by 3.8% and that in sponsored academies by 7.4%, almost twice as much.
The chart below is based on splitting schools into five bands by their previous year 5ACEM results, comparing academies and maintained schools. The increase is highest in the previously under-performing schools while, for schools previously above 60%, results on average fell. This chart alone shows why comparing sponsored academies (largely with lower initial results) with all schools (including those with previously high results) is not statistically sound.
In each of the five bands the increase was greater, or the reduction was less, in the maintained schools. The indication is that, once equivalents are largely removed, the sponsored academies did less well than similar maintained schools. So, for schools in the lowest category (less than 20%), academies grew on average by an impressive 12.4%. But similar non-academies saw their benchmark rise by 16.3%. All schools that showed this level of increase should be congratulated but it is clearly wrong to suggest the increase in academies was greater.
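The banding comparison described above can be sketched in a few lines. The records and figures below are invented for illustration and are not the DfE dataset; only the method (five bands by prior 5ACEM, mean change per band per school type) reflects the analysis.

```python
from statistics import mean

# Hypothetical records: (school_type, prior_5ACEM_%, current_5ACEM_%).
# Illustrative numbers only -- not the DfE release.
schools = [
    ("academy",    15, 27), ("maintained", 18, 34),
    ("academy",    35, 41), ("maintained", 33, 42),
    ("academy",    55, 56), ("maintained", 58, 61),
    ("academy",    65, 60), ("maintained", 68, 66),
    ("academy",    85, 78), ("maintained", 82, 79),
]

def band(prior):
    """Assign one of five bands by previous-year 5ACEM result."""
    edges = [20, 40, 60, 80]           # band boundaries in %
    for i, edge in enumerate(edges):
        if prior < edge:
            return i
    return len(edges)                  # 80% and above

def mean_change(school_type, b):
    """Average change in the benchmark for one type within one band."""
    changes = [cur - prior
               for t, prior, cur in schools
               if t == school_type and band(prior) == b]
    return mean(changes) if changes else None

for b in range(5):
    print(b, mean_change("academy", b), mean_change("maintained", b))
```

Comparing within bands is what makes the comparison fair: a school starting at 15% has far more room to rise than one starting at 85%, so only like-for-like bands tell you which type improved more.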
Which schools overperform?
Yesterday I published a list of the 20 best performing secondary schools, based on how much they exceeded the 5ACEM level that would be expected from the age 11 SATs. The same regression analysis can be used to analyse which types of school over or under-perform:
* Schools overall: 0%
* Sponsored academies: -3.6%
This comparison is a little unfair, as many of the sponsored academies will be newly converted. But taking those schools which have had academy status for at least five years still produces an average under-performance:
* Sponsored academies, at least 5 years old: -1.4%
Indeed analysis by age of academy still shows that even long-standing academies have not, overall, reached the point of overperformance. There are just two blips, which are explained by the spectacular performance of a handful of the most famous academies. On average academies of age 8 years and those of age 10 years do show overperformance, compared to KS2 expectation. In the 8 year cohort this is due to Paddington Academy and Burlington Danes. In the 10 year cohort it is the effect of Mossbourne. Those of 11 years or older revert to, on average, under-performance.
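The regression approach above can be sketched as follows. This is a minimal illustration with invented data: the real analysis uses the DfE performance tables, and the KS2 point scores and 5ACEM figures below are hypothetical. The method is the one described: fit expected 5ACEM from KS2 intake across all schools, then average each school type's residual (actual minus expected).

```python
from statistics import mean

# Hypothetical data: (school_type, avg_KS2_points, 5ACEM_%).
# Illustrative only -- not the DfE performance tables.
schools = [
    ("maintained", 26.0, 40), ("maintained", 28.0, 58),
    ("maintained", 30.0, 75), ("academy",    26.5, 38),
    ("academy",    27.5, 50), ("academy",    29.0, 62),
]

# Ordinary least-squares fit of 5ACEM on KS2 across all schools.
xs = [k for _, k, _ in schools]
ys = [y for _, _, y in schools]
xbar, ybar = mean(xs), mean(ys)
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar

def residual(ks2, acem):
    """Actual result minus the result expected from KS2 intake."""
    return acem - (intercept + slope * ks2)

# Over/under-performance: mean residual for each school type.
for t in ("maintained", "academy"):
    res = mean(residual(k, y) for typ, k, y in schools if typ == t)
    print(t, round(res, 1))
```

By construction the residuals average to zero across all schools, so a negative figure for one school type necessarily means other types are, on average, above the line.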
Note: The Wolf measure is still based on achievement of C or better and so gives no indication of overall value-added or how well schools do at taking the "more able" students to A and A* (which will be possible with the Progress8 measure in 2015). Overperformance indicates that the school has been successful at taking students who achieved a 3 or 4c at age 11, and getting them to 5 A-Cs including English and Maths.
Will academies be a positive long-term legacy?
Michael Gove and his supporters based their focus on academies on the claim that increasing autonomy was the route to improvement (although English schools were already among the most autonomous in the world). Critics feared that, while some schools would perform well, there would be a lack of support and challenge for those that needed it - with the DfE too remote to provide it effectively.
At a recent meeting of the Association of Directors of Children's Services (ADCS), Jo Moxham, Assistant DCS at Doncaster Council, recounted her concern that all the academies in her area were performing worse than before, when they were maintained schools. With no power to intervene she had written to the Secretary of State urging him to take action. Eight weeks later she had not received even an acknowledgement. (Of Doncaster's 13 academies, 9 are under-performing on the above measure.)
This problem was also highlighted in the Trojan Horse reports. Ian Kershaw commented that "It is not possible to discern a relationship between Birmingham City Council, Ofsted, the DfE and the Education Funding Agency in the process of sharing critical data and intelligence". Clarke in turn described the academies involved as suffering a state of "benign neglect" from the DfE.
The Education Endowment Foundation is about to produce an analysis of the contribution of academy chains. I predict that they will find, after all the change and upheaval, that there are some high-performing chains, some low-performing ones and a lot of middling ones.
Supporters of Michael Gove claimed this week that academies (and free schools) will be his enduring legacy. Huge resources, effort and finance have been focused on these changes. However the new data without equivalents now indicates that, when academies are compared to similar maintained schools, their improvement has been less. Perhaps all this energy would have been better focused on leaving the better local authorities to continue their work, and on improving the LAs that were not doing a great job.
Note: None of my analysis of the relative performance of academies and maintained schools has ever been challenged by the DfE, including during the recent High Court case. The DfE produced evidence from different (and generally earlier) periods to make its claim of "marginally higher" improvement for academies, but did not challenge any of the LSN analysis, which was included in considerable detail in the case.